
Amazon Web Services Updated DBS-C01 Exam Questions and Answers by destiny


Amazon Web Services DBS-C01 Exam Overview:

Exam Name: AWS Certified Database - Specialty
Exam Code: DBS-C01
Vendor: Amazon Web Services
Certification: AWS Certified Database
Questions: 324 Q&As
Shared By: destiny
Question 40

An online gaming company is using an Amazon DynamoDB table in on-demand mode to store game scores. After an intensive advertising campaign in South America, the average number of concurrent users rapidly increases from 100,000 to 500,000 in less than 10 minutes every day around 5 PM.

The on-call software reliability engineer has observed that the application logs contain a high number of DynamoDB throttling exceptions caused by game score insertions around 5 PM. Customer service has also reported that several users are complaining about their scores not being registered.

How should the database administrator remediate this issue at the lowest cost?

Options:

A. Enable auto scaling and set the target usage rate to 90%.
B. Switch the table to provisioned mode and enable auto scaling.
C. Switch the table to provisioned mode and set the throughput to the peak value.
D. Create a DynamoDB Accelerator cluster and use it to access the DynamoDB table.

Discussion
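For reference, the change described in option B (switching the table to provisioned mode and enabling auto scaling) is a two-step operation: update the table's billing mode, then register the table with Application Auto Scaling and attach a target-tracking policy. A minimal boto3 sketch is shown below; the table name, capacity limits, and the 70% target utilization are illustrative assumptions, not values taken from the question.

```python
import boto3

dynamodb = boto3.client("dynamodb")
autoscaling = boto3.client("application-autoscaling")

TABLE_NAME = "GameScores"  # hypothetical table name

# Switch the table from on-demand (PAY_PER_REQUEST) to provisioned billing.
dynamodb.update_table(
    TableName=TABLE_NAME,
    BillingMode="PROVISIONED",
    ProvisionedThroughput={"ReadCapacityUnits": 100, "WriteCapacityUnits": 100},
)

# Register the table's write capacity as a scalable target.
autoscaling.register_scalable_target(
    ServiceNamespace="dynamodb",
    ResourceId=f"table/{TABLE_NAME}",
    ScalableDimension="dynamodb:table:WriteCapacityUnits",
    MinCapacity=100,
    MaxCapacity=5000,
)

# Attach a target-tracking policy that scales on consumed write capacity.
autoscaling.put_scaling_policy(
    ServiceNamespace="dynamodb",
    ResourceId=f"table/{TABLE_NAME}",
    ScalableDimension="dynamodb:table:WriteCapacityUnits",
    PolicyName="GameScoresWriteScaling",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 70.0,  # target utilization in percent; illustrative value
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "DynamoDBWriteCapacityUtilization"
        },
    },
)
```

Target-tracking policies react to consumed capacity over a period of minutes, so for a spike this sharp the minimum capacity would typically be set near the expected 5 PM write rate rather than left at a low baseline.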
Question 41

A company plans to migrate a MySQL-based application from an on-premises environment to AWS. The application performs database joins across several tables and uses indexes for faster query response times. The company needs the database to be highly available with automatic failover.

Which solution on AWS will meet these requirements with the LEAST operational overhead?

Options:

A. Deploy an Amazon RDS DB instance with a read replica.
B. Deploy an Amazon RDS Multi-AZ DB instance.
C. Deploy Amazon DynamoDB global tables.
D. Deploy multiple Amazon RDS DB instances. Use Amazon Route 53 DNS with failover health checks configured.

Discussion
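For reference, a Multi-AZ deployment (option B) keeps a synchronous standby in a second Availability Zone and fails over automatically behind the same DNS endpoint, so the MySQL application, its joins, and its indexes work unchanged. A minimal boto3 sketch is below; the instance identifier, class, storage size, and credentials are placeholder assumptions.

```python
import boto3

rds = boto3.client("rds")

# Create a MySQL DB instance with a synchronous standby in another AZ.
# MultiAZ=True is what provides automatic failover; every other value here
# is an illustrative placeholder.
rds.create_db_instance(
    DBInstanceIdentifier="app-mysql",      # hypothetical name
    Engine="mysql",
    DBInstanceClass="db.m5.large",
    AllocatedStorage=100,
    MasterUsername="admin",
    MasterUserPassword="REPLACE_ME",
    MultiAZ=True,
)
```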
Question 42

To meet new data compliance requirements, a company needs to keep critical data durably stored and readily accessible for 7 years. Data that is more than 1 year old is considered archival data and must automatically be moved out of the Amazon Aurora MySQL DB cluster every week. On average, around 10 GB of new data is added to the database every month. A database specialist must choose the most operationally efficient solution to migrate the archival data to Amazon S3.

Which solution meets these requirements?

Options:

A. Create a custom script that exports archival data from the DB cluster to Amazon S3 using a SQL view, then deletes the archival data from the DB cluster. Launch an Amazon EC2 instance with a weekly cron job to execute the custom script.
B. Configure an AWS Lambda function that exports archival data from the DB cluster to Amazon S3 using a SELECT INTO OUTFILE S3 statement, then deletes the archival data from the DB cluster. Schedule the Lambda function to run weekly using Amazon EventBridge (Amazon CloudWatch Events).
C. Configure two AWS Lambda functions: one that exports archival data from the DB cluster to Amazon S3 using the mysqldump utility, and another that deletes the archival data from the DB cluster. Schedule both Lambda functions to run weekly using Amazon EventBridge (Amazon CloudWatch Events).
D. Use AWS Database Migration Service (AWS DMS) to continually export the archival data from the DB cluster to Amazon S3. Configure an AWS Data Pipeline process to run weekly that executes a custom SQL script to delete the archival data from the DB cluster.

Discussion
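For reference, the single-function approach in option B can be sketched as a Lambda handler that runs Aurora MySQL's SELECT ... INTO OUTFILE S3 statement and then deletes the exported rows, with the function triggered weekly by an EventBridge schedule. The connection details, table and column names, and the S3 URI below are illustrative assumptions; the DB cluster also needs an IAM role that allows it to write to the target bucket.

```python
import pymysql  # assumed to be bundled with the Lambda deployment package
from datetime import datetime, timedelta

DB_HOST = "aurora-cluster.cluster-xxxx.us-east-1.rds.amazonaws.com"   # hypothetical
ARCHIVE_URI = "s3-us-east-1://example-archive-bucket/archive/records"  # hypothetical

def handler(event, context):
    # Compute the cutoff once so the export and the delete cover the same rows.
    cutoff = (datetime.utcnow() - timedelta(days=365)).strftime("%Y-%m-%d %H:%M:%S")

    conn = pymysql.connect(host=DB_HOST, user="admin", password="REPLACE_ME",
                           database="appdb", autocommit=False)
    try:
        with conn.cursor() as cur:
            # Export rows older than one year straight to S3 (Aurora MySQL extension).
            cur.execute(
                "SELECT * FROM records WHERE created_at < %s "
                f"INTO OUTFILE S3 '{ARCHIVE_URI}' "
                "FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n'",
                (cutoff,),
            )
            # Delete the archived rows only after the export statement succeeds.
            cur.execute("DELETE FROM records WHERE created_at < %s", (cutoff,))
        conn.commit()
    finally:
        conn.close()
```

The weekly trigger itself would be an EventBridge rule with a cron or rate schedule and this Lambda function as its target.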
Question 43

A manufacturing company has an inventory system that stores information in an Amazon Aurora MySQL DB cluster. The database tables are partitioned. The database size has grown to 3 TB. Users run one-time queries by using a SQL client. Queries that use an equijoin to join large tables are taking a long time to run.

Which action will improve query performance with the LEAST operational effort?

Options:

A. Migrate the database to a new Amazon Redshift data warehouse.
B. Enable hash joins on the database by setting the variable optimizer_switch to hash_join=on.
C. Take a snapshot of the DB cluster. Create a new DB instance by using the snapshot, and enable parallel query mode.
D. Add an Aurora read replica.

Discussion
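For context, option B is a configuration change rather than a migration: hash joins let the engine join large tables on an equality condition without repeated index lookups. The exact parameter that controls hash joins varies across Aurora MySQL versions, so the sketch below, which mirrors the wording of option B and uses a placeholder connection and query, is illustrative only.

```python
import pymysql

# Placeholder connection details; in practice the setting can also be applied
# cluster-wide through a DB cluster parameter group instead of per session.
conn = pymysql.connect(host="aurora-cluster-endpoint", user="analyst",
                       password="REPLACE_ME", database="inventory")
try:
    with conn.cursor() as cur:
        # Enable hash joins for this session (per option B); the parameter name
        # and availability depend on the Aurora MySQL version in use.
        cur.execute("SET optimizer_switch = 'hash_join=on'")

        # An equijoin between two large partitioned tables; table and column
        # names are illustrative only.
        cur.execute(
            "SELECT i.item_id, i.quantity, w.location "
            "FROM inventory_items i "
            "JOIN warehouse_stock w ON i.item_id = w.item_id"
        )
        rows = cur.fetchall()
finally:
    conn.close()
```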
