
Amazon Web Services Updated SAP-C02 Exam Questions and Answers by harlan


Amazon Web Services SAP-C02 Exam Overview:

Exam Name: AWS Certified Solutions Architect - Professional
Exam Code: SAP-C02
Vendor: Amazon Web Services
Certification: AWS Certified Professional
Questions: 605 Q&A's
Shared By: harlan
Question 32

A solutions architect needs to improve an application that is hosted in the AWS Cloud. The application uses an Amazon Aurora MySQL DB instance that is experiencing overloaded connections. Most of the application's operations insert records into the database. The application currently stores credentials in a text-based configuration file.

The solutions architect needs to implement a solution so that the application can handle the current connection load. The solution must keep the credentials secure and must provide the ability to rotate the credentials automatically on a regular basis.

Which solution will meet these requirements?

Options:

A.

Deploy an Amazon RDS Proxy layer in front of the DB instance. Store the connection credentials as a secret in AWS Secrets Manager.

B.

Deploy an Amazon RDS Proxy layer in front of the DB instance. Store the connection credentials in AWS Systems Manager Parameter Store.

C.

Create an Aurora Replica. Store the connection credentials as a secret in AWS Secrets Manager.

D.

Create an Aurora Replica. Store the connection credentials in AWS Systems Manager Parameter Store.
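For context on this question: RDS Proxy pools and reuses database connections in front of Aurora, which absorbs an insert-heavy connection load, and AWS Secrets Manager provides built-in automatic credential rotation, which Systems Manager Parameter Store does not. A minimal Python sketch of an application using that combination follows; the secret name, proxy endpoint, and database name are hypothetical placeholders, not values from the question.

import json

import boto3
import pymysql  # assumes the PyMySQL driver is installed

SECRET_ID = "prod/app/aurora-credentials"  # hypothetical secret name
PROXY_ENDPOINT = "app-proxy.proxy-abc123.us-east-1.rds.amazonaws.com"  # hypothetical

def get_connection():
    # Fetch the current credentials at connect time. Rotation is handled by
    # Secrets Manager, so nothing is stored in a configuration file.
    secret = boto3.client("secretsmanager").get_secret_value(SecretId=SECRET_ID)
    creds = json.loads(secret["SecretString"])

    # Connect to the proxy endpoint, not the DB instance. The proxy pools
    # and multiplexes connections, reducing load on the Aurora instance.
    return pymysql.connect(
        host=PROXY_ENDPOINT,
        user=creds["username"],
        password=creds["password"],
        database=creds.get("dbname", "app"),
        connect_timeout=5,
    )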

Question 33

A company uses an Amazon Redshift cluster to ingest data from various sources. The data is shared with other internal applications for analysis and reporting.

The cluster has eight ra3.4xlarge nodes. Data ingestion runs daily from midnight to 8 AM and takes 3 hours. The cluster has 85% average CPU utilization during ingestion. The cluster uses on-demand node pricing and is paused outside of the 8-hour daily ingestion window. Snapshots are enabled on the cluster.

The company wants to optimize this workload to reduce costs.

Which solution will meet these requirements?

Options:

A.

Create a new Redshift cluster with eight ra3.4xlarge nodes in concurrency scaling mode by using the most recent snapshot from the existing cluster. Modify the internal applications to retrieve data from the new Redshift cluster. Shut down the existing Redshift cluster. Purchase eight 1-year All Upfront Redshift reserved nodes.

B.

Create a new Redshift cluster with six ra3.16xlarge nodes by using the most recent snapshot from the existing cluster. Enable auto scaling. Modify the internal applications to retrieve data from the new Redshift cluster. Shut down the existing Redshift cluster.

C.

Create a new Redshift Serverless endpoint with 64 Redshift Processing Units (RPUs) by using the most recent snapshot from the existing Redshift cluster. Update the internal applications to retrieve data from the new Redshift Serverless endpoint. Delete the existing Redshift cluster.

D.

Configure Redshift Spectrum on the existing Redshift cluster. Set up IAM permissions to allow Redshift Spectrum to access Amazon S3. Unload data from the existing cluster to an S3 bucket. Update the internal applications to query the S3 data.
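A note on the cost reasoning behind option C: a provisioned on-demand cluster bills per node-hour for the full window it is running, while Redshift Serverless bills per RPU-hour only while ingestion or queries are actually executing. A back-of-the-envelope comparison in Python, using hypothetical prices rather than current AWS list prices:

NODE_PRICE_PER_HOUR = 3.26  # hypothetical on-demand price for ra3.4xlarge
RPU_PRICE_PER_HOUR = 0.375  # hypothetical price per RPU-hour (serverless)

# Provisioned: 8 nodes billed for the full 8-hour window the cluster runs,
# even though ingestion finishes after 3 hours.
provisioned_daily = 8 * NODE_PRICE_PER_HOUR * 8

# Serverless: 64 RPUs billed only for the ~3 hours of actual ingestion work.
serverless_daily = 64 * RPU_PRICE_PER_HOUR * 3

print(f"provisioned: ${provisioned_daily:.2f}/day")  # $208.64/day at these rates
print(f"serverless:  ${serverless_daily:.2f}/day")   # $72.00/day at these rates

Substitute the published rates for your Region before drawing conclusions; the point is only that paying for 64 RPUs during 3 busy hours can undercut paying for 8 nodes across an 8-hour window.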

Question 34

To abide by industry regulations, a solutions architect must design a solution that will store a company's critical data in multiple public AWS Regions, including in the United States, where the company's headquarters is located. The solutions architect is required to provide access to the data stored in AWS to the company's global WAN network. The security team mandates that no traffic accessing this data should traverse the public internet.

How should the solutions architect design a highly available solution that meets the requirements and is cost-effective?

Options:

A.

Establish AWS Direct Connect connections from the company headquarters to all AWS Regions in use. Use the company WAN to send traffic to the headquarters and then over the respective DX connection to access the data.

B.

Establish two AWS Direct Connect connections from the company headquarters to an AWS Region. Use the company WAN to send traffic over a DX connection. Use inter-Region VPC peering to access the data in other AWS Regions.

C.

Establish two AWS Direct Connect connections from the company headquarters to an AWS Region. Use the company WAN to send traffic over a DX connection. Use an AWS transit VPC solution to access data in other AWS Regions.

D.

Establish two AWS Direct Connect connections from the company headquarters to an AWS Region. Use the company WAN to send traffic over a DX connection. Use a Direct Connect gateway to access data in other AWS Regions.
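Why option D can work: a Direct Connect gateway is a global resource, so a single pair of DX connections can reach virtual private gateways attached to VPCs in multiple AWS Regions without the traffic traversing the public internet. A hedged boto3 sketch of the association step follows; the gateway name and virtual private gateway IDs are hypothetical placeholders.

import boto3

dx = boto3.client("directconnect")  # Direct Connect gateways are global objects

# Create the Direct Connect gateway once.
gateway = dx.create_direct_connect_gateway(
    directConnectGatewayName="corp-dx-gateway"  # hypothetical name
)
dxgw_id = gateway["directConnectGateway"]["directConnectGatewayId"]

# Associate the virtual private gateway of a VPC in each Region; traffic
# from the company WAN then reaches every associated VPC privately.
for vgw_id in ["vgw-0aaa1111bbbb2222c", "vgw-0ddd3333eeee4444f"]:  # hypothetical
    dx.create_direct_connect_gateway_association(
        directConnectGatewayId=dxgw_id,
        virtualGatewayId=vgw_id,
    )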

Question 35

A life sciences company is using a combination of open source tools to manage data analysis workflows and Docker containers running on servers in its on-premises data center to process genomics data. Sequencing data is generated and stored on a local storage area network (SAN), and then the data is processed. The research and development teams are running into capacity issues and have decided to re-architect their genomics analysis platform on AWS to scale based on workload demands and reduce the turnaround time from weeks to days.

The company has a high-speed AWS Direct Connect connection. Sequencers will generate around 200 GB of data for each genome, and individual jobs can take several hours to process the data with ideal compute capacity. The end result will be stored in Amazon S3. The company is expecting 10-15 job requests each day.

Which solution meets these requirements?

Options:

A.

Use regularly scheduled AWS Snowball Edge devices to transfer the sequencing data into AWS. When AWS receives the Snowball Edge device and the data is loaded into Amazon S3, use S3 events to trigger an AWS Lambda function to process the data.

B.

Use AWS Data Pipeline to transfer the sequencing data to Amazon S3. Use S3 events to trigger an Amazon EC2 Auto Scaling group to launch custom-AMI EC2 instances running the Docker containers to process the data.

C.

Use AWS DataSync to transfer the sequencing data to Amazon S3. Use S3 events to trigger an AWS Lambda function that starts an AWS Step Functions workflow. Store the Docker images in Amazon Elastic Container Registry (Amazon ECR) and trigger AWS Batch to run the container and process the sequencing data.

D.

Use an AWS Storage Gateway file gateway to transfer the sequencing data to Amazon S3. Use S3 events to trigger an AWS Batch job that runs on Amazon EC2 instances running the Docker containers to process the data.
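On the mechanics behind option C: S3 can invoke a Lambda function on object creation, and the function can hand off to a Step Functions workflow that submits long-running AWS Batch jobs (Lambda on its own is capped at 15 minutes, so multi-hour genomics jobs cannot run in it directly). A minimal sketch of such a handler, with a hypothetical state machine ARN:

import json
from urllib.parse import unquote_plus

import boto3

STATE_MACHINE_ARN = (
    "arn:aws:states:us-east-1:123456789012:stateMachine:genomics-pipeline"
)  # hypothetical placeholder

sfn = boto3.client("stepfunctions")

def handler(event, context):
    # One S3 event can carry several records; start one workflow per object.
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = unquote_plus(record["s3"]["object"]["key"])  # keys arrive URL-encoded
        sfn.start_execution(
            stateMachineArn=STATE_MACHINE_ARN,
            input=json.dumps({"bucket": bucket, "key": key}),
        )

The workflow itself would then call AWS Batch (for example, via a Step Functions batch:submitJob task state) to run the ECR-hosted container against the object.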
