
Amazon Web Services Updated DOP-C02 Exam Questions and Answers by emil


Amazon Web Services DOP-C02 Exam Overview:

Exam Name: AWS Certified DevOps Engineer - Professional
Exam Code: DOP-C02
Vendor: Amazon Web Services
Certification: AWS Certified Professional
Questions: 250 Q&As
Shared By: emil
Question 40

A company has an application and a CI/CD pipeline. The CI/CD pipeline consists of an AWS CodePipeline pipeline and an AWS CodeBuild project. The CodeBuild project runs tests against the application as part of the build process and outputs a test report. The company must keep the test reports for 90 days.

Which solution will meet these requirements?

Options:

A.

Add a new stage in the CodePipeline pipeline after the stage that contains the CodeBuild project. Create an Amazon S3 bucket to store the reports. Configure an S3 deploy action type in the new CodePipeline stage with the appropriate path and format for the reports.

B.

Add a report group in the CodeBuild project buildspec file with the appropriate path and format for the reports. Create an Amazon S3 bucket to store the reports. Configure an Amazon EventBridge rule that invokes an AWS Lambda function to copy the reports to the S3 bucket when a build is completed. Create an S3 Lifecycle rule to expire the objects after 90 days.

C.

Add a new stage in the CodePipeline pipeline. Configure a test action type with the appropriate path and format for the reports. Configure the report expiration time to be 90 days in the CodeBuild project buildspec file.

D.

Add a report group in the CodeBuild project buildspec file with the appropriate path and format for the reports. Create an Amazon S3 bucket to store the reports. Configure the report group as an artifact in the CodeBuild project buildspec file. Configure the S3 bucket as the artifact destination. Set the object expiration to 90 days.
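For context on the retention mechanism that options B and D rely on, the 90-day cutoff can be expressed as an S3 Lifecycle expiration rule on the bucket that receives the reports. A minimal boto3 sketch, assuming a hypothetical bucket named example-test-reports and a test-reports/ key prefix (neither comes from the question itself):

    import boto3

    s3 = boto3.client("s3")

    # Expire exported CodeBuild test reports 90 days after they are written.
    # Bucket name and prefix are placeholders, not values from the question.
    s3.put_bucket_lifecycle_configuration(
        Bucket="example-test-reports",
        LifecycleConfiguration={
            "Rules": [
                {
                    "ID": "expire-test-reports-after-90-days",
                    "Filter": {"Prefix": "test-reports/"},
                    "Status": "Enabled",
                    "Expiration": {"Days": 90},
                }
            ]
        },
    )

The report group itself is declared under the reports section of the buildspec; how the report data reaches the bucket (an EventBridge rule plus Lambda function versus an artifact configuration) is what distinguishes the options.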

Discussion
Nell
Are these dumps reliable?
Ernie Oct 10, 2024
Yes, very much so. Cramkey Dumps are created by experienced and certified professionals who have gone through the exams themselves. They understand the importance of providing accurate and relevant information to help you succeed.
Georgina
I used Cramkey Dumps to prepare for my recent exam and I have to say, they were a huge help.
Corey Oct 2, 2024
Really? How did they help you? I know these are the same questions that appear in the exam, so I will give them a try. But tell me, do they also help with training?
Amy
I passed my exam and found your dumps 100% relevant to the actual exam.
Lacey Aug 9, 2024
Yeah, definitely. I experienced the same.
Neve
Will I be able to achieve success after using these dumps?
Rohan Oct 24, 2024
Absolutely. It's a great way to increase your chances of success.
Erik
Hey, I passed my exam using Cramkey Dumps!
Freyja Oct 17, 2024
Really? What are they like? Did all of the exam questions come from their pool? Please give me more details; I am going to get access to their subscription.
Question 41

A security review has identified that an AWS CodeBuild project is downloading a database population script from an Amazon S3 bucket using an unauthenticated request. The security team does not allow unauthenticated requests to S3 buckets for this project.

How can this issue be corrected in the MOST secure manner?

Options:

A.

Add the bucket name to the AllowedBuckets section of the CodeBuild project settings. Update the build spec to use the AWS CLI to download the database population script.

B.

Modify the S3 bucket settings to enable HTTPS basic authentication and specify a token. Update the build spec to use cURL to pass the token and download the database population script.

C.

Remove unauthenticated access from the S3 bucket with a bucket policy. Modify the service role for the CodeBuild project to include Amazon S3 access. Use the AWS CLI to download the database population script.

D.

Remove unauthenticated access from the S3 bucket with a bucket policy. Use the AWS CLI to download the database population script using an IAM access key and a secret access key.
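As a reference for the authenticated-access pattern that option C describes, the sketch below uses boto3 to replace the public bucket policy with one scoped to the CodeBuild service role and to grant that role s3:GetObject, so the buildspec can fetch the script with the AWS CLI using the role's temporary credentials. The account ID, role name, bucket, and object key are all placeholders:

    import json

    import boto3

    s3 = boto3.client("s3")
    iam = boto3.client("iam")

    BUCKET = "example-db-scripts"                       # placeholder bucket name
    SCRIPT_ARN = f"arn:aws:s3:::{BUCKET}/populate.sql"  # placeholder object key
    ROLE_NAME = "codebuild-app-service-role"            # placeholder CodeBuild service role
    ROLE_ARN = f"arn:aws:iam::111122223333:role/{ROLE_NAME}"  # placeholder account ID

    # Replace any public-read statement with one scoped to the CodeBuild service role.
    s3.put_bucket_policy(
        Bucket=BUCKET,
        Policy=json.dumps({
            "Version": "2012-10-17",
            "Statement": [{
                "Sid": "AllowCodeBuildServiceRoleOnly",
                "Effect": "Allow",
                "Principal": {"AWS": ROLE_ARN},
                "Action": "s3:GetObject",
                "Resource": SCRIPT_ARN,
            }],
        }),
    )

    # Give the service role permission to read the script, so the buildspec can
    # run "aws s3 cp" without embedding IAM access keys in the project.
    iam.put_role_policy(
        RoleName=ROLE_NAME,
        PolicyName="AllowDbScriptDownload",
        PolicyDocument=json.dumps({
            "Version": "2012-10-17",
            "Statement": [{
                "Effect": "Allow",
                "Action": "s3:GetObject",
                "Resource": SCRIPT_ARN,
            }],
        }),
    )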

Discussion
Question 42

A company must encrypt all AMIs that the company shares across accounts. A DevOps engineer has access to a source account where an unencrypted custom AMI has been built. The DevOps engineer also has access to a target account where an Amazon EC2 Auto Scaling group will launch EC2 instances from the AMI. The DevOps engineer must share the AMI with the target account.

The company has created an AWS Key Management Service (AWS KMS) key in the source account.

Which additional steps should the DevOps engineer perform to meet the requirements? (Choose three.)

Options:

A.

In the source account, copy the unencrypted AMI to an encrypted AMI. Specify the KMS key in the copy action.

B.

In the source account, copy the unencrypted AMI to an encrypted AMI. Specify the default Amazon Elastic Block Store (Amazon EBS) encryption key in the copy action.

C.

In the source account, create a KMS grant that delegates permissions to the Auto Scaling group service-linked role in the target account.

D.

In the source account, modify the key policy to give the target account permissions to create a grant. In the target account, create a KMS grant that delegates permissions to the Auto Scaling group service-linked role.

E.

In the source account, share the unencrypted AMI with the target account.

F.

In the source account, share the encrypted AMI with the target account.
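To make the moving parts concrete, here is a hedged boto3 sketch of the copy, share, and grant steps that the options reference: copying the unencrypted AMI to an encrypted AMI with the customer managed KMS key in the source account, sharing the encrypted copy with the target account, and creating a KMS grant for the Auto Scaling service-linked role from the target account. The AMI ID, key ARN, Region, and account IDs are placeholders, and the final call assumes the source key policy already allows the target account to create grants:

    import boto3

    SOURCE_REGION = "us-east-1"                                      # placeholder Region
    KMS_KEY_ARN = "arn:aws:kms:us-east-1:111122223333:key/EXAMPLE"   # placeholder key ARN
    TARGET_ACCOUNT = "444455556666"                                  # placeholder account ID

    # --- Source account: copy the unencrypted AMI to an encrypted AMI with the KMS key.
    ec2_source = boto3.client("ec2", region_name=SOURCE_REGION)
    copy = ec2_source.copy_image(
        Name="encrypted-app-ami",
        SourceImageId="ami-0123456789abcdef0",   # placeholder unencrypted AMI
        SourceRegion=SOURCE_REGION,
        Encrypted=True,
        KmsKeyId=KMS_KEY_ARN,
    )
    encrypted_ami_id = copy["ImageId"]

    # --- Source account: share the encrypted AMI with the target account.
    ec2_source.modify_image_attribute(
        ImageId=encrypted_ami_id,
        LaunchPermission={"Add": [{"UserId": TARGET_ACCOUNT}]},
    )

    # --- Target account (run with target-account credentials): let the Auto Scaling
    #     service-linked role use the source key so launched instances can decrypt
    #     the AMI's EBS volumes.
    kms_target = boto3.client("kms", region_name=SOURCE_REGION)
    kms_target.create_grant(
        KeyId=KMS_KEY_ARN,
        GranteePrincipal=(
            f"arn:aws:iam::{TARGET_ACCOUNT}:role/aws-service-role/"
            "autoscaling.amazonaws.com/AWSServiceRoleForAutoScaling"
        ),
        Operations=[
            "Decrypt",
            "GenerateDataKeyWithoutPlaintext",
            "ReEncryptFrom",
            "ReEncryptTo",
            "CreateGrant",
            "DescribeKey",
        ],
    )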

Discussion
Question 43

A company uses an organization in AWS Organizations to manage several AWS accounts that the company's developers use. The company requires all data to be encrypted in transit.

Multiple Amazon S3 buckets that were created in developer accounts allow unencrypted connections. A DevOps engineer must enforce encryption of data in transit for all existing S3 buckets that are created in accounts in the organization.

Which solution will meet these requirements?

Options:

A.

Use AWS CloudFormation StackSets to deploy an AWS Network Firewall firewall to each account. Route all outbound requests from the AWS environment through the firewall. Deploy a policy to block access to all outbound requests on port 80.

B.

Use AWS CloudFormation StackSets to deploy an AWS Network Firewall firewall to each account. Route all inbound requests to the AWS environment through the firewall. Deploy a policy to block access to all inbound requests on port 80.

C.

Turn on AWS Config for the organization. Deploy a conformance pack that uses the s3-bucket-ssl-requests-only managed rule and an AWS Systems Manager Automation runbook. Use a runbook that adds a bucket policy statement to deny access to an S3 bucket when the value of the aws:SecureTransport condition key is false.

D.

Turn on AWS Config for the organization. Deploy a conformance pack that uses the s3-bucket-ssl-requests-only managed rule and an AWS Systems Manager Automation runbook. Use a runbook that adds a bucket policy statement to deny access to an S3 bucket when the value of the s3:x-amz-server-side-encryption-aws-kms-key-id condition key is null.
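For reference, the remediation that options C and D describe ultimately comes down to a bucket policy statement that denies any request not made over TLS, which is the condition the s3-bucket-ssl-requests-only managed rule checks. A minimal boto3 sketch of applying that statement, with a placeholder bucket name (a real runbook would merge the statement into the bucket's existing policy rather than overwrite it):

    import json

    import boto3

    BUCKET = "example-developer-bucket"  # placeholder bucket name

    deny_insecure_transport = {
        "Sid": "DenyUnencryptedTransport",
        "Effect": "Deny",
        "Principal": "*",
        "Action": "s3:*",
        "Resource": [
            f"arn:aws:s3:::{BUCKET}",
            f"arn:aws:s3:::{BUCKET}/*",
        ],
        # Deny the request whenever it was not made over HTTPS.
        "Condition": {"Bool": {"aws:SecureTransport": "false"}},
    }

    boto3.client("s3").put_bucket_policy(
        Bucket=BUCKET,
        Policy=json.dumps({"Version": "2012-10-17", "Statement": [deny_insecure_transport]}),
    )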

Discussion