
Amazon Web Services Updated SAA-C03 Exam Questions and Answers by alaia


Amazon Web Services SAA-C03 Exam Overview:

Exam Name: AWS Certified Solutions Architect - Associate (SAA-C03)
Exam Code: SAA-C03
Vendor: Amazon Web Services
Certification: AWS Certified Associate
Questions: 1039 Q&As
Shared By: alaia
Question 16

A company deployed a web application on AWS. The company hosts the backend database on Amazon RDS for MySQL with a primary DB instance and five read replicas to support scaling needs. The read replicas must lag no more than 1 second behind the primary DB instance. The database routinely runs scheduled stored procedures.

As traffic on the website increases, the replicas experience additional lag during periods of peak load. A solutions architect must reduce the replication lag as much as possible. The solutions architect must minimize changes to the application code and must minimize ongoing overhead.

Which solution will meet these requirements?

Options:

A.

Migrate the database to Amazon Aurora MySQL. Replace the read replicas with Aurora Replicas, and configure Aurora Auto Scaling. Replace the stored procedures with Aurora MySQL native functions.

B.

Deploy an Amazon ElastiCache for Redis cluster in front of the database. Modify the application to check the cache before the application queries the database. Replace the stored procedures with AWS Lambda functions.

C.

Migrate the database to a MySQL database that runs on Amazon EC2 instances. Choose large, compute optimized EC2 instances for all replica nodes. Maintain the stored procedures on the EC2 instances.

D.

Migrate the database to Amazon DynamoDB. Provision the number of read capacity units (RCUs) needed to support the required throughput, and configure on-demand capacity scaling. Replace the stored procedures with DynamoDB Streams.
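For illustration, here is a minimal boto3 sketch of the Aurora Auto Scaling configuration described in option A. Aurora Replica scaling is registered through the Application Auto Scaling API; the cluster identifier my-aurora-cluster and the capacity bounds are hypothetical placeholders.

```python
import boto3

autoscaling = boto3.client("application-autoscaling")

# Register the Aurora cluster's replica count as a scalable target.
# "my-aurora-cluster" is a hypothetical cluster identifier.
autoscaling.register_scalable_target(
    ServiceNamespace="rds",
    ResourceId="cluster:my-aurora-cluster",
    ScalableDimension="rds:cluster:ReadReplicaCount",
    MinCapacity=1,
    MaxCapacity=5,
)

# Target-tracking policy: add or remove Aurora Replicas to hold the
# average CPU of the cluster's readers around 60 percent.
autoscaling.put_scaling_policy(
    PolicyName="aurora-replica-cpu-tracking",
    ServiceNamespace="rds",
    ResourceId="cluster:my-aurora-cluster",
    ScalableDimension="rds:cluster:ReadReplicaCount",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 60.0,
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "RDSReaderAverageCPUUtilization"
        },
    },
)
```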

Question 17

A company's order system sends requests from clients to Amazon EC2 instances. The EC2 instances process the orders and then store the orders in a database on Amazon RDS. Users report that they must reprocess orders when the system fails. The company wants a resilient solution that can process orders automatically if a system outage occurs.

What should a solutions architect do to meet these requirements?

Options:

A.

Move the EC2 instances into an Auto Scaling group. Create an Amazon EventBridge (Amazon CloudWatch Events) rule to target an Amazon Elastic Container Service (Amazon ECS) task.

B.

Move the EC2 instances into an Auto Scaling group behind an Application Load Balancer (ALB). Update the order system to send messages to the ALB endpoint.

C.

Move the EC2 instances into an Auto Scaling group. Configure the order system to send messages to an Amazon Simple Queue Service (Amazon SQS) queue. Configure the EC2 instances to consume messages from the queue.

D.

Create an Amazon Simple Notification Service (Amazon SNS) topic. Create an AWS Lambda function, and subscribe the function to the SNS topic. Configure the order system to send messages to the SNS topic. Send a command to the EC2 instances to process the messages by using AWS Systems Manager Run Command.
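For illustration, here is a minimal sketch of the queue-based decoupling in option C: an EC2 worker long-polls the SQS queue and deletes each message only after the order is stored, so unprocessed orders survive an instance failure. The queue URL and the process_order stub are hypothetical.

```python
import boto3

sqs = boto3.client("sqs")
# Hypothetical queue URL; the real value comes from creating the orders queue.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/orders"

def process_order(body: str) -> None:
    """Parse the order and write it to the Amazon RDS database (stub)."""

while True:
    # Long polling (WaitTimeSeconds) reduces empty receives and cost.
    resp = sqs.receive_message(
        QueueUrl=QUEUE_URL,
        MaxNumberOfMessages=10,
        WaitTimeSeconds=20,
    )
    for msg in resp.get("Messages", []):
        process_order(msg["Body"])
        # Delete only after successful processing. If an instance fails
        # mid-order, the message becomes visible again after the visibility
        # timeout and another instance in the Auto Scaling group retries it.
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```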

Question 18

A company has an application that places hundreds of .csv files into an Amazon S3 bucket every hour. The files are 1 GB in size. Each time a file is uploaded, the company needs to convert the file to Apache Parquet format and place the output file into an S3 bucket.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.

Create an AWS Lambda function to download the .csv files, convert the files to Parquet format, and place the output files in an S3 bucket. Invoke the Lambda function for each S3 PUT event.

B.

Create an Apache Spark job to read the .csv files, convert the files to Parquet format, and place the output files in an S3 bucket. Create an AWS Lambda function for each S3 PUT event to invoke the Spark job.

C.

Create an AWS Glue table and an AWS Glue crawler for the S3 bucket where the application places the .csv files. Schedule an AWS Lambda function to periodically use Amazon Athena to query the AWS Glue table, convert the query results into Parquet format, and place the output files into an S3 bucket.

D.

Create an AWS Glue extract, transform, and load (ETL) job to convert the .csv files to Parquet format and place the output files into an S3 bucket. Create an AWS Lambda function for each S3 PUT event to invoke the ETL job.
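For illustration, here is a minimal sketch of the Lambda handler in option D that starts a Glue ETL job for each uploaded object. The job name csv-to-parquet and the --input_path argument are hypothetical placeholders; the Glue job script itself would perform the CSV-to-Parquet conversion.

```python
import boto3

glue = boto3.client("glue")

def handler(event, context):
    # Invoked once per S3 PUT event notification. "csv-to-parquet" is a
    # hypothetical Glue ETL job name; --input_path is a job argument the
    # job script would read to locate the uploaded .csv object.
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        glue.start_job_run(
            JobName="csv-to-parquet",
            Arguments={"--input_path": f"s3://{bucket}/{key}"},
        )
```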

Question 19

A company has an on-premises volume backup solution that has reached its end of life. The company wants to use AWS as part of a new backup solution and wants to maintain local access to all the data while it is backed up on AWS. The company wants to ensure that the data backed up on AWS is automatically and securely transferred.

Which solution meets these requirements?

Options:

A.

Use AWS Snowball to migrate data out of the on-premises solution to Amazon S3. Configure on-premises systems to mount the Snowball S3 endpoint to provide local access to the data.

B.

Use AWS Snowball Edge to migrate data out of the on-premises solution to Amazon S3. Use the Snowball Edge file interface to provide on-premises systems with local access to the data.

C.

Use AWS Storage Gateway and configure a cached volume gateway. Run the Storage Gateway software application on premises and configure a percentage of data to cache locally. Mount the gateway storage volumes to provide local access to the data.

D.

Use AWS Storage Gateway and configure a stored volume gateway. Run the Storage Gateway software application on premises and map the gateway storage volumes to on-premises storage. Mount the gateway storage volumes to provide local access to the data.
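For illustration, here is a minimal boto3 sketch of attaching a stored volume on the gateway described in option D, which keeps the full dataset on premises while asynchronously backing it up to AWS. The gateway ARN, disk ID, and network interface are hypothetical placeholders for an already-activated stored volume gateway.

```python
import boto3

sgw = boto3.client("storagegateway")

# GatewayARN, DiskId, and NetworkInterfaceId are hypothetical placeholders
# for an already-activated stored volume gateway running on premises.
resp = sgw.create_stored_iscsi_volume(
    GatewayARN="arn:aws:storagegateway:us-east-1:123456789012:gateway/sgw-EXAMPLE",
    DiskId="pci-0000:03:00.0-scsi-0:0:0:0",
    PreserveExistingData=True,   # keep the data already on the local disk
    TargetName="backup-volume",
    NetworkInterfaceId="10.0.0.25",
)
print(resp["VolumeARN"], resp["TargetARN"])
```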
