
Amazon Web Services Updated DAS-C01 Exam Questions and Answers by edwin


Amazon Web Services DAS-C01 Exam Overview:

Exam Name: AWS Certified Data Analytics - Specialty
Exam Code: DAS-C01
Vendor: Amazon Web Services
Certification: AWS Certified Data Analytics
Questions: 207 Q&As
Shared By: edwin
Question 44

A manufacturing company has been collecting IoT sensor data from devices on its factory floor for a year and is storing the data in Amazon Redshift for daily analysis. A data analyst has determined that, at an expected ingestion rate of about 2 TB per day, the cluster will be undersized in less than 4 months. A long-term solution is needed. The data analyst has indicated that most queries only reference the most recent 13 months of data, yet there are also quarterly reports that need to query all the data generated from the past 7 years. The chief technology officer (CTO) is concerned about the costs, administrative effort, and performance of a long-term solution.

Which solution should the data analyst use to meet these requirements?

Options:

A.

Create a daily job in AWS Glue to UNLOAD records older than 13 months to Amazon S3 and delete those records from Amazon Redshift. Create an external table in Amazon Redshift to point to the S3 location. Use Amazon Redshift Spectrum to join to data that is older than 13 months.

B.

Take a snapshot of the Amazon Redshift cluster. Restore the cluster to a new cluster using dense storage nodes with additional storage capacity.

C.

Execute a CREATE TABLE AS SELECT (CTAS) statement to move records that are older than 13 months to quarterly partitioned data in Amazon Redshift Spectrum backed by Amazon S3.

D.

Unload all the tables in Amazon Redshift to an Amazon S3 bucket using S3 Intelligent-Tiering. Use AWS Glue to crawl the S3 bucket location to create external tables in an AWS Glue Data Catalog. Create an Amazon EMR cluster using Auto Scaling for any daily analytics needs, and use Amazon Athena for the quarterly reports, with both using the same AWS Glue Data Catalog.
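For context on the archival pattern described in option A, here is a minimal sketch using the Amazon Redshift Data API from boto3. The cluster identifier, database, table, S3 path, and IAM role are hypothetical placeholders, not values from the question, and the external schema named spectrum is assumed to already exist.

import boto3

redshift_data = boto3.client("redshift-data")

def run_sql(sql):
    # Submits one statement asynchronously; poll describe_statement for completion.
    return redshift_data.execute_statement(
        ClusterIdentifier="example-cluster",  # hypothetical
        Database="analytics",                 # hypothetical
        DbUser="analyst",                     # hypothetical
        Sql=sql,
    )

# Daily job (e.g. scheduled from AWS Glue): UNLOAD rows older than 13 months
# to S3 as Parquet, then delete them so the cluster holds only hot data.
run_sql(
    "UNLOAD ('SELECT * FROM sensor_data "
    "WHERE reading_ts < DATEADD(month, -13, GETDATE())') "
    "TO 's3://example-bucket/sensor-archive/' "
    "IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftSpectrumRole' "
    "FORMAT AS PARQUET;"
)
run_sql("DELETE FROM sensor_data WHERE reading_ts < DATEADD(month, -13, GETDATE());")

# One-time setup: an external table over the archive lets Redshift Spectrum
# join the cold S3 data with the hot cluster data in a single query.
run_sql(
    "CREATE EXTERNAL TABLE spectrum.sensor_data_archive ("
    "device_id VARCHAR(64), reading_ts TIMESTAMP, reading_value DOUBLE PRECISION) "
    "STORED AS PARQUET "
    "LOCATION 's3://example-bucket/sensor-archive/';"
)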

Question 45

A bank is building an Amazon S3 data lake. The bank wants a single data repository for customer data needs, such as personalized recommendations. The bank needs to use Amazon Kinesis Data Firehose to ingest customers' personal information, bank accounts, and transactions in near real time from a transactional relational database.

All personally identifiable information (PII) that is stored in the S3 bucket must be masked. The bank has enabled versioning for the S3 bucket.

Which solution will meet these requirements?

Options:

A.

Invoke an AWS Lambda function from Kinesis Data Firehose to mask the PII before Kinesis Data Firehose delivers the data to the S3 bucket.

B.

Use Amazon Macie to scan the S3 bucket. Configure Macie to discover PII. Invoke an AWS Lambda function from S3 events to mask the PII.

C.

Configure server-side encryption (SSE) for the S3 bucket. Invoke an AWS Lambda function from S3 events to mask the PII.

D.

Create an AWS Lambda function to read the objects, mask the PII, and store the objects back with the same key. Invoke the Lambda function from S3 events.
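Option A follows the standard Kinesis Data Firehose record-transformation pattern: Firehose buffers incoming records, invokes a Lambda function, and delivers only the transformed output to S3, so unmasked PII never lands in the bucket. A minimal sketch of such a handler follows; the PII field names are hypothetical, and each record is assumed to carry one JSON object.

import base64
import json

# Hypothetical PII field names; a real function would target the bank's schema.
PII_FIELDS = {"ssn", "account_number", "email"}

def lambda_handler(event, context):
    output = []
    for record in event["records"]:
        # Each Firehose record arrives base64-encoded.
        payload = json.loads(base64.b64decode(record["data"]))
        for field in PII_FIELDS & payload.keys():
            payload[field] = "****"  # simple masking; hashing or tokenization also work
        output.append({
            "recordId": record["recordId"],
            "result": "Ok",  # "Dropped" and "ProcessingFailed" are the other statuses
            "data": base64.b64encode(json.dumps(payload).encode()).decode(),
        })
    # Firehose delivers only these returned records to the S3 destination.
    return {"records": output}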

Question 46

A company using Amazon QuickSight Enterprise edition has thousands of dashboards, analyses, and datasets. The company struggles to manage and assign the permissions that grant users access to various items within QuickSight. The company wants to make sharing and permissions management easier.

Which solution should the company implement to simplify permissions management?

Options:

A.

Use QuickSight folders to organize dashboards, analyses, and datasets. Assign individual users permissions to these folders.

B.

Use QuickSight folders to organize dashboards, analyses, and datasets. Assign group permissions by using these folders.

C.

Use AWS IAM resource-based policies to assign group permissions to QuickSight items.

D.

Use QuickSight user management APIs to provision group permissions based on dashboard naming conventions.
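Both folder-based options hinge on the fact that assets placed in a QuickSight folder inherit the folder's permissions; granting access to a group rather than to individual users keeps the number of grants small as people join and leave. A minimal sketch of the folder-plus-group pattern with boto3 follows; the account ID, folder ID, and group name are hypothetical.

import boto3

qs = boto3.client("quicksight")
ACCOUNT_ID = "123456789012"  # hypothetical

# Create a shared folder to hold related dashboards, analyses, and datasets.
qs.create_folder(
    AwsAccountId=ACCOUNT_ID,
    FolderId="shared-reports",  # hypothetical
    Name="Shared Reports",
    FolderType="SHARED",
)

# Grant a QuickSight group read access to the folder. Every asset placed in
# the folder inherits this grant, so membership changes are handled in the
# group, not on thousands of individual assets.
qs.update_folder_permissions(
    AwsAccountId=ACCOUNT_ID,
    FolderId="shared-reports",
    GrantPermissions=[{
        "Principal": f"arn:aws:quicksight:us-east-1:{ACCOUNT_ID}:group/default/analysts",
        "Actions": ["quicksight:DescribeFolder"],
    }],
)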

Question 47

An IoT company is collecting data from multiple sensors and is streaming the data to Amazon Managed Streaming for Apache Kafka (Amazon MSK). Each sensor type has its own topic, and each topic has the same number of partitions.

The company is planning to turn on more sensors. However, the company wants to evaluate which sensor types are producing the most data so that the company can scale accordingly. The company needs to know which sensor types have the largest values for the following metrics: BytesInPerSec and MessagesInPerSec.

Which level of monitoring for Amazon MSK will meet these requirements?

Options:

A.

DEFAULT level

B.

PER TOPIC PER BROKER level

C.

PER BROKER level

D.

PER TOPIC level
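For reference, the DEFAULT and PER BROKER levels publish BytesInPerSec and MessagesInPerSec without a topic dimension; the per-topic breakdown the company needs appears at the PER TOPIC PER BROKER level. A minimal sketch of raising the monitoring level on an existing cluster with boto3, using a hypothetical cluster ARN:

import boto3

kafka = boto3.client("kafka")

# Hypothetical cluster ARN.
CLUSTER_ARN = ("arn:aws:kafka:us-east-1:123456789012:"
               "cluster/sensor-cluster/11111111-2222-3333-4444-555555555555-1")

# update_monitoring requires the cluster's current version for optimistic locking.
current_version = kafka.describe_cluster(
    ClusterArn=CLUSTER_ARN
)["ClusterInfo"]["CurrentVersion"]

kafka.update_monitoring(
    ClusterArn=CLUSTER_ARN,
    CurrentVersion=current_version,
    # The API spells the level with underscores; this adds the topic dimension
    # to CloudWatch metrics such as BytesInPerSec and MessagesInPerSec.
    EnhancedMonitoring="PER_TOPIC_PER_BROKER",
)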

