
Amazon Web Services Updated DBS-C01 Exam Questions and Answers by alanna


Amazon Web Services DBS-C01 Exam Overview:

Exam Name: AWS Certified Database - Specialty
Exam Code: DBS-C01
Vendor: Amazon Web Services
Certification: AWS Certified Database
Questions: 324 Q&As
Shared By: alanna
Question 36

An application reads and writes data to an Amazon RDS for MySQL DB instance. A new reporting dashboard needs read-only access to the database. When the application and reports are both under heavy load, the database experiences performance degradation. A database specialist needs to improve the database performance.

What should the database specialist do to meet these requirements?

Options:

A.

Create a read replica of the DB instance. Configure the reports to connect to the read replica's endpoint.

B.

Create a read replica of the DB instance. Configure the application and reports to connect to the cluster endpoint.

C.

Enable Multi-AZ deployment. Configure the reports to connect to the standby replica.

D.

Enable Multi-AZ deployment. Configure the application and reports to connect to the cluster endpoint.

Discussion
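To make the read replica options concrete, here is a minimal boto3 sketch of creating an RDS for MySQL read replica and looking up its endpoint for a read-only reporting client. All instance identifiers and the Region are hypothetical; this illustrates the mechanism the options describe, not an answer key.

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")  # hypothetical Region

# Create a read replica of the existing primary DB instance (hypothetical names).
rds.create_db_instance_read_replica(
    DBInstanceIdentifier="reporting-replica",
    SourceDBInstanceIdentifier="app-mysql-primary",
)

# Wait until the replica is available, then hand its own endpoint to the
# read-only reporting dashboard; the application keeps using the primary.
rds.get_waiter("db_instance_available").wait(DBInstanceIdentifier="reporting-replica")
replica = rds.describe_db_instances(DBInstanceIdentifier="reporting-replica")
print(replica["DBInstances"][0]["Endpoint"]["Address"])
```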
Question 37

A company is concerned about the cost of a large-scale, transactional application that uses Amazon DynamoDB and only needs to store data for 2 days before it is deleted. While reviewing the tables, a Database Specialist notices that much of the data is months old, with some of it dating back to when the application was first deployed.

What can the Database Specialist do to reduce the overall cost?

Options:

A.

Create a new attribute in each table to track the expiration time and create an AWS Glue transformation to delete entries more than 2 days old.

B.

Create a new attribute in each table to track the expiration time and enable DynamoDB Streams on each table.

C.

Create a new attribute in each table to track the expiration time and enable time to live (TTL) on each table.

D.

Create an Amazon CloudWatch Events event to export the data to Amazon S3 daily using AWS Data Pipeline and then truncate the Amazon DynamoDB table.

Discussion
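For context on the expiration mechanisms mentioned in the options, here is a minimal boto3 sketch of enabling DynamoDB time to live (TTL) on a table and writing an item that expires after two days. The table name, key, and attribute name are hypothetical.

```python
import time

import boto3

dynamodb = boto3.client("dynamodb", region_name="us-east-1")  # hypothetical Region

# Enable TTL on the table, keyed off an epoch-seconds attribute (hypothetical names).
dynamodb.update_time_to_live(
    TableName="transactions",
    TimeToLiveSpecification={"Enabled": True, "AttributeName": "expires_at"},
)

# New items carry an expiration timestamp two days in the future; DynamoDB
# removes expired items in the background without consuming write capacity.
dynamodb.put_item(
    TableName="transactions",
    Item={
        "pk": {"S": "txn#12345"},
        "expires_at": {"N": str(int(time.time()) + 2 * 24 * 3600)},
    },
)
```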
Question 38

A major automotive manufacturer is migrating the database of a mission-critical finance application to Amazon DynamoDB. The company's risk and compliance policy requires that every update to the database be recorded as a log entry for auditing purposes. The system is expected to generate about 500,000 log entries each minute. Log entries should be stored in Apache Parquet files in batches of at least 100,000 records per file.

How should a database specialist meet these requirements using DynamoDB?

Options:

A.

Enable Amazon DynamoDB Streams on the table. Create an AWS Lambda function triggered by the stream. Write the log entries to an Amazon S3 object.

B.

Create a backup plan in AWS Backup to back up the DynamoDB table once a day. Create an AWS Lambda function that restores the backup in another table and compares both tables for changes. Generate the log entries and write them to an Amazon S3 object.

C.

Enable AWS CloudTrail logs on the table. Create an AWS Lambda function that reads the log files once an hour and filters DynamoDB API actions. Write the filtered log files to Amazon S3.

D.

Enable Amazon DynamoDB Streams on the table. Create an AWS Lambda function triggered by the stream. Write the log entries to an Amazon Kinesis Data Firehose delivery stream with buffering and Amazon S3 as the destination.

Discussion
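As an illustration of the streaming pattern referenced in the options, here is a minimal sketch of an AWS Lambda handler that receives DynamoDB Streams records and forwards them to a Kinesis Data Firehose delivery stream, which buffers the records and writes them to Amazon S3 (Parquet conversion would be configured on the delivery stream itself). The delivery stream name is hypothetical.

```python
import json

import boto3

firehose = boto3.client("firehose")

DELIVERY_STREAM = "ddb-audit-log"  # hypothetical delivery stream name


def handler(event, context):
    """Triggered by a DynamoDB stream; forwards each change record to Firehose."""
    records = [
        {"Data": (json.dumps(record["dynamodb"]) + "\n").encode("utf-8")}
        for record in event.get("Records", [])
    ]
    if records:
        # put_record_batch accepts up to 500 records per call; larger stream
        # batches would need to be chunked.
        firehose.put_record_batch(DeliveryStreamName=DELIVERY_STREAM, Records=records)
```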
Question 39

A company recently migrated its line-of-business (LOB) application to AWS. The application uses an Amazon RDS for SQL Server DB instance as its database engine.

The company must set up cross-Region disaster recovery for the application. The company needs a solution with the lowest possible RPO and RTO.

Which solution will meet these requirements?

Options:

A.

Create a cross-Region read replica of the DB instance. Promote the read replica at the time of failover.

B.

Set up SQL replication from the DB instance to an Amazon EC2 instance in the disaster recovery Region. Promote the EC2 instance as the primary server.

C.

Use AWS Database Migration Service (AWS DMS) for ongoing replication of the DB instance to the disaster recovery Region.

D.

Take manual snapshots of the DB instance in the primary Region. Copy the snapshots to the disaster recovery Region.

Discussion
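For reference, the cross-Region read replica mechanism mentioned in the options looks roughly like the following boto3 sketch: create the replica in the disaster recovery Region from the primary instance's ARN, then promote it at failover time. The Regions, identifiers, and ARN are hypothetical.

```python
import boto3

DR_REGION = "us-west-2"                                               # hypothetical
PRIMARY_ARN = "arn:aws:rds:us-east-1:123456789012:db:lob-sqlserver"   # hypothetical

rds_dr = boto3.client("rds", region_name=DR_REGION)

# Create the cross-Region read replica in the disaster recovery Region,
# referencing the source DB instance by its ARN.
rds_dr.create_db_instance_read_replica(
    DBInstanceIdentifier="lob-sqlserver-dr",
    SourceDBInstanceIdentifier=PRIMARY_ARN,
    SourceRegion="us-east-1",  # lets boto3 build the presigned URL for the copy
)

# At failover time, promote the replica so it becomes a standalone, writable instance.
rds_dr.promote_read_replica(DBInstanceIdentifier="lob-sqlserver-dr")
```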
