
Amazon Web Services DAS-C01 Exam Questions and Answers


Amazon Web Services DAS-C01 Exam Overview:

Exam Name: AWS Certified Data Analytics - Specialty
Exam Code: DAS-C01
Vendor: Amazon Web Services
Certification: AWS Certified Data Analytics
Questions: 207
Question 20

A company uses an Amazon Redshift provisioned cluster for data analysis. The data is not encrypted at rest. A data analytics specialist must implement a solution to encrypt the data at rest.

Which solution will meet this requirement with the LEAST operational overhead?

Options:

A. Use the ALTER TABLE command with the ENCODE option to update existing columns of the Redshift tables to use LZO encoding.

B. Export data from the existing Redshift cluster to Amazon S3 by using the UNLOAD command with the ENCRYPTED option. Create a new Redshift cluster with encryption configured. Load data into the new cluster by using the COPY command.

C. Create a manual snapshot of the existing Redshift cluster. Restore the snapshot into a new Redshift cluster with encryption configured.

D. Modify the existing Redshift cluster to use AWS Key Management Service (AWS KMS) encryption. Wait for the cluster to finish resizing.
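For reference, a minimal boto3 sketch of the in-place approach described in option D: modifying an existing cluster to enable AWS KMS encryption at rest. The cluster identifier and KMS key ARN below are hypothetical placeholders, not values from the question.

```python
import boto3

redshift = boto3.client("redshift")

# Enable at-rest encryption on an existing cluster (option D's approach).
# ClusterIdentifier and KmsKeyId are hypothetical placeholder values.
response = redshift.modify_cluster(
    ClusterIdentifier="analytics-cluster",
    Encrypted=True,
    KmsKeyId="arn:aws:kms:us-east-1:123456789012:key/example-key-id",
)

# The cluster status reflects the in-progress encryption migration.
print(response["Cluster"]["ClusterStatus"])
```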

Question 21

A company’s data analyst needs to ensure that queries executed in Amazon Athena cannot scan more than a prescribed amount of data for cost control purposes. Queries that exceed the prescribed threshold must be canceled immediately.

What should the data analyst do to achieve this?

Options:

A. Configure Athena to invoke an AWS Lambda function that terminates queries when the prescribed threshold is crossed.

B. For each workgroup, set the control limit for each query to the prescribed threshold.

C. Enforce the prescribed threshold on all Amazon S3 bucket policies.

D. For each workgroup, set the workgroup-wide data usage control limit to the prescribed threshold.
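As background, a minimal boto3 sketch of configuring a per-query data usage control on an Athena workgroup, the mechanism options B and D refer to. The workgroup name and byte threshold are hypothetical values.

```python
import boto3

athena = boto3.client("athena")

# Set a per-query scan limit on a workgroup: Athena cancels any query in
# this workgroup that scans more than the configured number of bytes.
# The workgroup name and the 1 TiB threshold are hypothetical values.
athena.update_work_group(
    WorkGroup="analytics-team",
    ConfigurationUpdates={
        "BytesScannedCutoffPerQuery": 1_099_511_627_776,  # 1 TiB
        "EnforceWorkGroupConfiguration": True,
    },
)
```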

Question 22

A company wants to collect and process event data from different departments in near-real time. Before storing the data in Amazon S3, the company needs to clean the data by standardizing the format of the address and timestamp columns. The data volume varies with the overall load at any given point in time. A single data record can range from 100 KB to 10 MB.

How should a data analytics specialist design the solution for data ingestion?

Options:

A. Use Amazon Kinesis Data Streams. Configure a stream for the raw data. Use a Kinesis Agent to write data to the stream. Create an Amazon Kinesis Data Analytics application that reads data from the raw stream, cleanses it, and stores the output to Amazon S3.

B. Use Amazon Kinesis Data Firehose. Configure a Firehose delivery stream with a preprocessing AWS Lambda function for data cleansing. Use a Kinesis Agent to write data to the delivery stream. Configure Kinesis Data Firehose to deliver the data to Amazon S3.

C. Use Amazon Managed Streaming for Apache Kafka. Configure a topic for the raw data. Use a Kafka producer to write data to the topic. Create an application on Amazon EC2 that reads data from the topic by using the Apache Kafka consumer API, cleanses the data, and writes to Amazon S3.

D. Use Amazon Simple Queue Service (Amazon SQS). Configure an AWS Lambda function to read events from the SQS queue and upload the events to Amazon S3.
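As background for option B, a minimal sketch of the record-transformation contract that a preprocessing Lambda function for a Firehose delivery stream follows: records arrive base64-encoded, and each recordId must be returned with a result of Ok, Dropped, or ProcessingFailed. The cleansing rules shown (whitespace-normalized address, UTC ISO 8601 timestamp) are hypothetical stand-ins for the standardization the question describes.

```python
import base64
import json
from datetime import datetime, timezone

def handler(event, context):
    """Firehose data-transformation Lambda (option B's preprocessing step)."""
    output = []
    for record in event["records"]:
        payload = json.loads(base64.b64decode(record["data"]))

        # Hypothetical standardization: collapse whitespace in the address
        # and rewrite the epoch timestamp as UTC ISO 8601.
        payload["address"] = " ".join(str(payload.get("address", "")).split())
        ts = float(payload.get("timestamp", 0))
        payload["timestamp"] = datetime.fromtimestamp(ts, tz=timezone.utc).isoformat()

        output.append({
            "recordId": record["recordId"],
            "result": "Ok",
            "data": base64.b64encode(json.dumps(payload).encode()).decode(),
        })
    return {"records": output}
```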

Question 23

A company owns facilities with IoT devices installed across the world. The company is using Amazon Kinesis Data Streams to stream data from the devices to Amazon S3. The company's operations team wants to get insights from the IoT data to monitor data quality at ingestion. The insights need to be derived in near-real time, and the output must be logged to Amazon DynamoDB for further analysis.

Which solution meets these requirements?

Options:

A. Connect Amazon Kinesis Data Analytics to analyze the stream data. Save the output to DynamoDB by using the default output from Kinesis Data Analytics.

B. Connect Amazon Kinesis Data Analytics to analyze the stream data. Save the output to DynamoDB by using an AWS Lambda function.

C. Connect Amazon Kinesis Data Firehose to analyze the stream data by using an AWS Lambda function. Save the output to DynamoDB by using the default output from Kinesis Data Firehose.

D. Connect Amazon Kinesis Data Firehose to analyze the stream data by using an AWS Lambda function. Save the data to Amazon S3. Then run an AWS Glue job on schedule to ingest the data into DynamoDB.
