
Amazon Web Services Updated DAS-C01 Exam Questions and Answers by blossom

Page: 10 / 14

Amazon Web Services DAS-C01 Exam Overview :

Exam Name: AWS Certified Data Analytics - Specialty
Exam Code: DAS-C01
Vendor: Amazon Web Services
Certification: AWS Certified Data Analytics
Questions: 207 Q&As
Shared By: blossom
Question 40

A large company receives files from external parties in Amazon EC2 throughout the day. At the end of the day, the files are combined into a single file, compressed into a gzip file, and uploaded to Amazon S3. The total size of all the files is close to 100 GB daily. Once the files are uploaded to Amazon S3, an AWS Batch program executes a COPY command to load the files into an Amazon Redshift cluster.

Which program modification will accelerate the COPY process?

Options:

A.

Upload the individual files to Amazon S3 and run the COPY command as soon as the files become available.

B.

Split the number of files so they are equal to a multiple of the number of slices in the Amazon Redshift cluster. Gzip and upload the files to Amazon S3. Run the COPY command on the files.

C.

Split the number of files so they are equal to a multiple of the number of compute nodes in the Amazon Redshift cluster. Gzip and upload the files to Amazon S3. Run the COPY command on the files.

D.

Apply sharding by breaking up the files so the distkey columns with the same values go to the same file. Gzip and upload the sharded files to Amazon S3. Run the COPY command on the files.

Discussion
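For discussion: a minimal sketch of the approach in option B, issuing the COPY through the Amazon Redshift Data API from Python (boto3). The bucket, key prefix, cluster, database, and IAM role names below are placeholders, and it assumes the daily 100 GB has already been split into gzip parts whose count is a multiple of the cluster's slice count.

# Sketch only: run a COPY over many gzip parts with the Redshift Data API.
# All resource names below are placeholders, not values from the question.
import boto3

redshift_data = boto3.client("redshift-data")

# COPY loads every object under the key prefix in parallel, so splitting the
# input into a multiple of the slice count keeps all slices busy at once.
copy_sql = """
COPY billing_staging
FROM 's3://example-bucket/daily/part_'
IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
GZIP
DELIMITER '|';
"""

response = redshift_data.execute_statement(
    ClusterIdentifier="example-cluster",
    Database="analytics",
    DbUser="loader",
    Sql=copy_sql,
)
print(response["Id"])  # statement ID, usable to poll the load status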
Question 41

A company has a data lake on AWS that ingests sources of data from multiple business units and uses Amazon Athena for queries. The storage layer is Amazon S3 using the AWS Glue Data Catalog. The company wants to make the data available to its data scientists and business analysts. However, the company first needs to manage data access for Athena based on user roles and responsibilities.

What should the company do to apply these access controls with the LEAST operational overhead?

Options:

A.

Define security policy-based rules for the users and applications by role in AWS Lake Formation.

B.

Define security policy-based rules for the users and applications by role in AWS Identity and Access Management (IAM).

C.

Define security policy-based rules for the tables and columns by role in AWS Glue.

D.

Define security policy-based rules for the tables and columns by role in AWS Identity and Access Management (IAM).

Discussion
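For discussion: a minimal sketch of option A, granting role-based table access through AWS Lake Formation with boto3. The role ARN, Glue database, and table names are assumptions for illustration only; Athena then enforces these permissions when the role runs queries against the table.

# Sketch only: Lake Formation grant of read access on one catalog table to a role.
import boto3

lakeformation = boto3.client("lakeformation")

lakeformation.grant_permissions(
    Principal={
        "DataLakePrincipalIdentifier": "arn:aws:iam::123456789012:role/DataScientistRole"
    },
    Resource={
        "Table": {
            "DatabaseName": "sales_db",   # placeholder Glue database
            "Name": "orders",             # placeholder table
        }
    },
    Permissions=["SELECT"],               # read-only access for analysts and scientists
    PermissionsWithGrantOption=[],        # the role cannot re-grant access
)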
Question 42

A company has several Amazon EC2 instances sitting behind an Application Load Balancer (ALB). The company wants its IT infrastructure team to analyze the IP addresses coming into the company's ALB. The ALB is configured to store access logs in Amazon S3. The access logs create about 1 TB of data each day, and access to the data will be infrequent. The company needs a solution that is scalable, cost-effective, and has minimal maintenance requirements.

Which solution meets these requirements?

Options:

A.

Copy the data into Amazon Redshift and query the data.

B.

Use Amazon EMR and Apache Hive to query the S3 data.

C.

Use Amazon Athena to query the S3 data.

D.

Use Amazon Redshift Spectrum to query the S3 data.

Discussion
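For discussion: a minimal sketch of option C, running an ad hoc Athena query over the ALB access logs in S3 with boto3. The database, table, and result location are placeholders, and the table is assumed to already be defined over the ALB log format (for example via a Glue crawler or a CREATE EXTERNAL TABLE statement).

# Sketch only: top client IPs hitting the ALB, read straight from the S3 access logs.
import boto3

athena = boto3.client("athena")

query = """
SELECT client_ip, COUNT(*) AS request_count
FROM alb_access_logs
GROUP BY client_ip
ORDER BY request_count DESC
LIMIT 20;
"""

execution = athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={"Database": "alb_logs_db"},   # placeholder database
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
print(execution["QueryExecutionId"])  # use with get_query_results once it finishes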
Question 43

A company is streaming its high-volume billing data (100 MBps) to Amazon Kinesis Data Streams. A data analyst partitioned the data on account_id to ensure that all records belonging to an account go to the same Kinesis shard and order is maintained. While building a custom consumer using the Kinesis Java SDK, the data analyst notices that, sometimes, the messages arrive out of order for account_id. Upon further investigation, the data analyst discovers the messages that are out of order seem to be arriving from different shards for the same account_id and are seen when a stream resize runs.

What is an explanation for this behavior and what is the solution?

Options:

A.

There are multiple shards in a stream and order needs to be maintained in the shard. The data analyst needs to make sure there is only a single shard in the stream and no stream resize runs.

B.

The hash key generation process for the records is not working correctly. The data analyst should generate an explicit hash key on the producer side so the records are directed to the appropriate shard accurately.

C.

The records are not being received by Kinesis Data Streams in order. The producer should use the PutRecords API call instead of the PutRecord API call with the SequenceNumberForOrdering parameter.

D.

The consumer is not processing the parent shard completely before processing the child shards after a stream resize. The data analyst should process the parent shard completely first before processing the child shards.

Discussion
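For discussion: a minimal sketch of the idea behind option D in Python with boto3 (the question's consumer uses the Kinesis Java SDK, so this is illustrative only). After a resize, list_shards exposes the parent/child lineage; reading each parent shard to its end before starting its children preserves per-account_id ordering. The stream name is a placeholder, and the Kinesis Client Library enforces this ordering automatically.

# Sketch only: order shards so parents created before a resize are drained
# before their children are read.
import boto3

kinesis = boto3.client("kinesis")

shards = kinesis.list_shards(StreamName="billing-stream")["Shards"]
shard_ids = {s["ShardId"] for s in shards}

# A shard whose ParentShardId is still listed is a child produced by the resize.
children = [s for s in shards if s.get("ParentShardId") in shard_ids]
parents = [s for s in shards if s not in children]

for shard in parents + children:
    # A real consumer would read records until the shard iterator reports the
    # shard is closed, and only then start reading that shard's children.
    print("process shard", shard["ShardId"])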