Amazon Web Services DBS-C01 Exam: AWS Certified Database - Specialty

Total Questions: 85 | Last Updated: Oct 16, 2020
  • Updated DBS-C01 Dumps
  • Based on Real DBS-C01 Exam Scenarios
  • Free DBS-C01 PDF Demo Available
  • Check out our DBS-C01 Dumps in a new PDF format
  • Instant DBS-C01 download
  • Guaranteed DBS-C01 success on the first attempt

A Review of Accurate DBS-C01 VCE

Want to know about Certleader DBS-C01 Exam practice test features? Want to learn more about the Amazon Web Services AWS Certified Database - Specialty certification experience? Study virtual Amazon Web Services DBS-C01 answers to the most recent DBS-C01 questions at Certleader. Get success with an absolute guarantee to pass the Amazon Web Services DBS-C01 (AWS Certified Database - Specialty) test on your first attempt.

Check DBS-C01 free dumps before getting the full version:

NEW QUESTION 1
A company is writing a new survey application to be used with a weekly televised game show. The application will be available for 2 hours each week. The company expects to receive over 500,000 entries every week, with each survey asking each user 2-3 multiple-choice questions. A Database Specialist needs to select a platform that is highly scalable for a large number of concurrent writes to handle the anticipated volume.
Which AWS services should the Database Specialist consider? (Choose two.)

  • A. Amazon DynamoDB
  • B. Amazon Redshift
  • C. Amazon Neptune
  • D. Amazon Elasticsearch Service
  • E. Amazon ElastiCache

Answer: AE

NEW QUESTION 2
A financial services company is developing a shared data service that supports different applications from throughout the company. A Database Specialist designed a solution to leverage Amazon ElastiCache for Redis with cluster mode enabled to enhance performance and scalability. The cluster is configured to listen on port 6379.
Which combination of steps should the Database Specialist take to secure the cache data and protect it from unauthorized access? (Choose three.)

  • A. Enable in-transit and at-rest encryption on the ElastiCache cluster.
  • B. Ensure that Amazon CloudWatch metrics are configured in the ElastiCache cluster.
  • C. Ensure the security group for the ElastiCache cluster allows all inbound traffic from itself and inbound traffic on TCP port 6379 from trusted clients only.
  • D. Create an IAM policy to allow the application service roles to access all ElastiCache API actions.
  • E. Ensure the security group for the ElastiCache clients authorizes inbound TCP port 6379 and port 22 traffic from the trusted ElastiCache cluster’s security group.
  • F. Ensure the cluster is created with the auth-token parameter and that the parameter is used in all subsequent commands.

Answer: ACF
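
For reference, the selected controls correspond to settings that can be applied when the replication group is created. The following is a minimal boto3 sketch, not a definitive implementation: the replication group name, node type, security group ID, and AUTH token value are all placeholders, and the security group is assumed to permit TCP 6379 from trusted clients only.

    import boto3

    elasticache = boto3.client("elasticache", region_name="us-east-1")

    # Hypothetical identifiers; replace with real values.
    elasticache.create_replication_group(
        ReplicationGroupId="shared-data-cache",
        ReplicationGroupDescription="Shared data service cache",
        Engine="redis",
        CacheNodeType="cache.r5.large",
        NumNodeGroups=3,                              # cluster mode enabled (sharded)
        ReplicasPerNodeGroup=1,
        Port=6379,
        SecurityGroupIds=["sg-0123456789abcdef0"],    # allows TCP 6379 from trusted clients only
        TransitEncryptionEnabled=True,                # encryption in transit (required for AUTH)
        AtRestEncryptionEnabled=True,                 # encryption at rest
        AuthToken="a-long-random-token-value",        # Redis AUTH token used by all clients
    )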

NEW QUESTION 3
A company has an Amazon RDS Multi-AZ DB instance that is 200 GB in size with an RPO of 6 hours. To meet the company’s disaster recovery policies, the database backup needs to be copied into another Region. The company requires the solution to be cost-effective and operationally efficient.
What should a Database Specialist do to copy the database backup into a different Region?

  • A. Use Amazon RDS automated snapshots and use AWS Lambda to copy the snapshot into another Region
  • B. Use Amazon RDS automated snapshots every 6 hours and use Amazon S3 cross-Region replication to copy the snapshot into another Region
  • C. Create an AWS Lambda function to take an Amazon RDS snapshot every 6 hours and use a second Lambda function to copy the snapshot into another Region
  • D. Create a cross-Region read replica for Amazon RDS in another Region and take an automated snapshot of the read replica

Answer: D

NEW QUESTION 4
An AWS CloudFormation stack that included an Amazon RDS DB instance was accidentally deleted and recent data was lost. A Database Specialist needs to add RDS settings to the CloudFormation template to reduce the chance of accidental instance data loss in the future.
Which settings will meet this requirement? (Choose three.)

  • A. Set DeletionProtection to True
  • B. Set MultiAZ to True
  • C. Set TerminationProtection to True
  • D. Set DeleteAutomatedBackups to False
  • E. Set DeletionPolicy to Delete
  • F. Set DeletionPolicy to Retain

Answer: ADF
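
To illustrate where these settings live, here is a sketch of a template rendered as a Python dictionary. The stack name, instance properties, and the Secrets Manager reference are placeholders chosen for the example, not values from the question.

    import json
    import boto3

    template = {
        "Resources": {
            "AppDatabase": {
                "Type": "AWS::RDS::DBInstance",
                "DeletionPolicy": "Retain",           # keep the DB instance if the stack is deleted
                "Properties": {
                    "Engine": "mysql",
                    "DBInstanceClass": "db.m5.large",
                    "AllocatedStorage": "100",
                    "MasterUsername": "admin",
                    "MasterUserPassword": "{{resolve:secretsmanager:app-db-secret:SecretString:password}}",
                    "DeletionProtection": True,        # block DeleteDBInstance calls
                    "DeleteAutomatedBackups": False,   # keep automated backups after deletion
                },
            }
        }
    }

    boto3.client("cloudformation").create_stack(
        StackName="app-database",
        TemplateBody=json.dumps(template),
    )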

NEW QUESTION 5
A gaming company is designing a mobile gaming app that will be accessed by many users across the globe. The company wants to have replication and full support for multi-master writes. The company also wants to ensure low latency and consistent performance for app users.
Which solution meets these requirements?

  • A. Use Amazon DynamoDB global tables for storage and enable DynamoDB automatic scaling
  • B. Use Amazon Aurora for storage and enable cross-Region Aurora Replicas
  • C. Use Amazon Aurora for storage and cache the user content with Amazon ElastiCache
  • D. Use Amazon Neptune for storage

Answer: A
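
DynamoDB global tables provide the multi-master, multi-Region replication the scenario asks for. A minimal boto3 sketch of adding a replica Region (global tables version 2019.11.21), assuming a hypothetical GameState table already exists in us-east-1 with on-demand capacity or auto scaling enabled:

    import boto3

    dynamodb = boto3.client("dynamodb", region_name="us-east-1")

    # Add a replica Region to the existing table; DynamoDB handles replication.
    dynamodb.update_table(
        TableName="GameState",
        ReplicaUpdates=[
            {"Create": {"RegionName": "ap-southeast-1"}},
        ],
    )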

NEW QUESTION 6
A clothing company uses a custom ecommerce application and a PostgreSQL database to sell clothes to thousands of users from multiple countries. The company is migrating its application and database from its on-premises data center to the AWS Cloud. The company has selected Amazon EC2 for the application and Amazon RDS for PostgreSQL for the database. The company requires database passwords to be changed every 60 days. A Database Specialist needs to ensure that the credentials used by the web application to connect to the database are managed securely.
Which approach should the Database Specialist take to securely manage the database credentials?

  • A. Store the credentials in a text file in an Amazon S3 bucket. Restrict permissions on the bucket to the IAM role associated with the instance profile only. Modify the application to download the text file and retrieve the credentials on start up. Update the text file every 60 days.
  • B. Configure IAM database authentication for the application to connect to the database. Create an IAM user and map it to a separate database user for each ecommerce user. Require users to update their passwords every 60 days.
  • C. Store the credentials in AWS Secrets Manager. Restrict permissions on the secret to only the IAM role associated with the instance profile. Modify the application to retrieve the credentials from Secrets Manager on start up. Configure the rotation interval to 60 days.
  • D. Store the credentials in an encrypted text file in the application AMI. Use AWS KMS to store the key for decrypting the text file. Modify the application to decrypt the text file and retrieve the credentials on start up. Update the text file and publish a new AMI every 60 days.

Answer: C
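
A sketch of the Secrets Manager approach in boto3, assuming a hypothetical secret name and an existing rotation Lambda function ARN; the rotation rule matches the 60-day requirement.

    import json
    import boto3

    secrets = boto3.client("secretsmanager", region_name="us-east-1")

    # Store the database credentials (names and values are placeholders).
    secrets.create_secret(
        Name="ecommerce/rds/postgres",
        SecretString=json.dumps({"username": "app_user", "password": "initial-password"}),
    )

    # Rotate automatically every 60 days using a rotation Lambda function.
    secrets.rotate_secret(
        SecretId="ecommerce/rds/postgres",
        RotationLambdaARN="arn:aws:lambda:us-east-1:123456789012:function:SecretsManagerRotation",
        RotationRules={"AutomaticallyAfterDays": 60},
    )

    # At application start up, read the current credentials.
    creds = json.loads(secrets.get_secret_value(SecretId="ecommerce/rds/postgres")["SecretString"])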

NEW QUESTION 7
A gaming company wants to deploy a game in multiple Regions. The company plans to save local high scores in Amazon DynamoDB tables in each Region. A Database Specialist needs to design a solution to automate the deployment of the database with identical configurations in additional Regions, as needed. The solution should also automate configuration changes across all Regions.
Which solution would meet these requirements and deploy the DynamoDB tables?

  • A. Create an AWS CLI command to deploy the DynamoDB table to all the Regions and save it for future deployments.
  • B. Create an AWS CloudFormation template and deploy the template to all the Regions.
  • C. Create an AWS CloudFormation template and use a stack set to deploy the template to all the Regions.
  • D. Create DynamoDB tables using the AWS Management Console in all the Regions and create a step-by-step guide for future deployments.

Answer: C
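
CloudFormation StackSets deploy one template to many Regions (and accounts) and push later template changes everywhere. A minimal boto3 sketch, assuming the DynamoDB table is defined in a hypothetical leaderboard.yaml template and a single target account:

    import boto3

    cfn = boto3.client("cloudformation", region_name="us-east-1")

    with open("leaderboard.yaml") as f:     # hypothetical template defining the DynamoDB table
        template_body = f.read()

    cfn.create_stack_set(
        StackSetName="game-leaderboard",
        TemplateBody=template_body,
    )

    # Deploy identical stack instances to every required Region.
    cfn.create_stack_instances(
        StackSetName="game-leaderboard",
        Accounts=["123456789012"],
        Regions=["us-east-1", "eu-west-1", "ap-southeast-1"],
    )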

NEW QUESTION 8
A gaming company has recently acquired a successful iOS game, which is particularly popular during the holiday season. The company has decided to add a leaderboard to the game that uses Amazon DynamoDB. The application load is expected to ramp up over the holiday season.
Which solution will meet these requirements at the lowest cost?

  • A. DynamoDB Streams
  • B. DynamoDB with DynamoDB Accelerator
  • C. DynamoDB with on-demand capacity mode
  • D. DynamoDB with provisioned capacity mode with Auto Scaling

Answer: C
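
For illustration, switching an existing table to on-demand capacity is a single UpdateTable call; the table name below is a placeholder.

    import boto3

    dynamodb = boto3.client("dynamodb", region_name="us-east-1")

    # On-demand (PAY_PER_REQUEST) removes the need to forecast the holiday ramp-up.
    dynamodb.update_table(
        TableName="Leaderboard",
        BillingMode="PAY_PER_REQUEST",
    )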

NEW QUESTION 9
A company is concerned about the cost of a large-scale, transactional application using Amazon DynamoDB that only needs to store data for 2 days before it is deleted. In looking at the tables, a Database Specialist notices that much of the data is months old, and goes back to when the application was first deployed.
What can the Database Specialist do to reduce the overall cost?

  • A. Create a new attribute in each table to track the expiration time and create an AWS Glue transformation to delete entries more than 2 days old.
  • B. Create a new attribute in each table to track the expiration time and enable DynamoDB Streams on each table.
  • C. Create a new attribute in each table to track the expiration time and enable time to live (TTL) on each table.
  • D. Create an Amazon CloudWatch Events event to export the data to Amazon S3 daily using AWS Data Pipeline and then truncate the Amazon DynamoDB table.

Answer: C
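
TTL deletes expired items at no extra cost once each item carries an epoch-seconds expiration attribute. A sketch using a hypothetical table name, key, and attribute name:

    import time
    import boto3

    dynamodb = boto3.client("dynamodb", region_name="us-east-1")

    # Enable TTL on the table, keyed off an epoch-seconds attribute.
    dynamodb.update_time_to_live(
        TableName="Transactions",
        TimeToLiveSpecification={"Enabled": True, "AttributeName": "expire_at"},
    )

    # Each new item sets expire_at to two days in the future.
    dynamodb.put_item(
        TableName="Transactions",
        Item={
            "transaction_id": {"S": "txn-0001"},
            "expire_at": {"N": str(int(time.time()) + 2 * 24 * 3600)},
        },
    )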

NEW QUESTION 10
A company is running a finance application on an Amazon RDS for MySQL DB instance. The application is governed by multiple financial regulatory agencies. The RDS DB instance is set up with security groups to allow access to certain Amazon EC2 servers only. AWS KMS is used for encryption at rest.
Which step will provide additional security?

  • A. Set up NACLs that allow the entire EC2 subnet to access the DB instance
  • B. Disable the master user account
  • C. Set up a security group that blocks SSH to the DB instance
  • D. Set up RDS to use SSL for data in transit

Answer: D

NEW QUESTION 11
A marketing company is using Amazon DocumentDB and requires that database audit logs be enabled. A Database Specialist needs to configure monitoring so that all data definition language (DDL) statements performed are visible to the Administrator. The Database Specialist has set the audit_logs parameter to enabled in the cluster parameter group.
What should the Database Specialist do to automatically collect the database logs for the Administrator?

  • A. Enable DocumentDB to export the logs to Amazon CloudWatch Logs
  • B. Enable DocumentDB to export the logs to AWS CloudTrail
  • C. Enable DocumentDB Events to export the logs to Amazon CloudWatch Logs
  • D. Configure an AWS Lambda function to download the logs using the download-db-log-file-portion operation and store the logs in Amazon S3

Answer: A
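
With audit_logs already enabled in the cluster parameter group, the remaining step is to export the audit log type to CloudWatch Logs. A boto3 sketch with a hypothetical cluster identifier:

    import boto3

    docdb = boto3.client("docdb", region_name="us-east-1")

    # Publish the audit log stream to Amazon CloudWatch Logs.
    docdb.modify_db_cluster(
        DBClusterIdentifier="marketing-docdb-cluster",
        CloudwatchLogsExportConfiguration={"EnableLogTypes": ["audit"]},
        ApplyImmediately=True,
    )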

NEW QUESTION 12
A company is using Amazon RDS for MySQL to redesign its business application. A Database Specialist has noticed that the Development team is restoring their MySQL database multiple times a day when Developers make mistakes in their schema updates. The Developers sometimes need to wait hours for the restores to complete.
Multiple team members are working on the project, making it difficult to find the correct restore point for each mistake.
Which approach should the Database Specialist take to reduce downtime?

  • A. Deploy multiple read replicas and have the team members make changes to separate replica instances
  • B. Migrate to Amazon RDS for SQL Server, take a snapshot, and restore from the snapshot
  • C. Migrate to Amazon Aurora MySQL and enable the Aurora Backtrack feature
  • D. Enable the Amazon RDS for MySQL Backtrack feature

Answer: C
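
Backtrack rewinds an Aurora MySQL cluster in place, typically in minutes, without a restore. A sketch assuming the cluster was created with a backtrack window and using a hypothetical cluster identifier and timestamp:

    from datetime import datetime, timezone
    import boto3

    rds = boto3.client("rds", region_name="us-east-1")

    # Rewind the cluster to just before the bad schema change was applied.
    rds.backtrack_db_cluster(
        DBClusterIdentifier="dev-aurora-mysql",
        BacktrackTo=datetime(2020, 10, 16, 14, 30, tzinfo=timezone.utc),
        UseEarliestTimeOnPointInTimeUnavailable=True,
    )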

NEW QUESTION 13
An ecommerce company has tasked a Database Specialist with creating a reporting dashboard that visualizes critical business metrics that will be pulled from the core production database running on Amazon Aurora. Data that is read by the dashboard should be available within 100 milliseconds of an update.
The Database Specialist needs to review the current configuration of the Aurora DB cluster and develop a cost-effective solution. The solution needs to accommodate the unpredictable read workload from the reporting dashboard without any impact on the write availability and performance of the DB cluster.
Which solution meets these requirements?

  • A. Turn on the serverless option in the DB cluster so it can automatically scale based on demand.
  • B. Provision a clone of the existing DB cluster for the new Application team.
  • C. Create a separate DB cluster for the new workload, refresh from the source DB cluster, and set up ongoing replication using AWS DMS change data capture (CDC).
  • D. Add an automatic scaling policy to the DB cluster to add Aurora Replicas to the cluster based on CPU consumption.

Answer: D
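
Aurora Replica auto scaling is configured through Application Auto Scaling against the ReadReplicaCount dimension. A sketch with a hypothetical cluster name, capacity limits, and CPU target:

    import boto3

    autoscaling = boto3.client("application-autoscaling", region_name="us-east-1")

    autoscaling.register_scalable_target(
        ServiceNamespace="rds",
        ResourceId="cluster:reporting-aurora-cluster",
        ScalableDimension="rds:cluster:ReadReplicaCount",
        MinCapacity=1,
        MaxCapacity=5,
    )

    # Add or remove Aurora Replicas to hold average reader CPU near 60%.
    autoscaling.put_scaling_policy(
        PolicyName="reader-cpu-target-tracking",
        ServiceNamespace="rds",
        ResourceId="cluster:reporting-aurora-cluster",
        ScalableDimension="rds:cluster:ReadReplicaCount",
        PolicyType="TargetTrackingScaling",
        TargetTrackingScalingPolicyConfiguration={
            "TargetValue": 60.0,
            "PredefinedMetricSpecification": {
                "PredefinedMetricType": "RDSReaderAverageCPUUtilization"
            },
        },
    )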

NEW QUESTION 14
A company is looking to migrate a 1 TB Oracle database from on-premises to an Amazon Aurora PostgreSQL DB cluster. The company’s Database Specialist discovered that the Oracle database is storing 100 GB of large binary objects (LOBs) across multiple tables. The Oracle database has a maximum LOB size of 500 MB with an average LOB size of 350 MB. The Database Specialist has chosen AWS DMS to migrate the data, using the largest replication instance available.
How should the Database Specialist optimize the database migration using AWS DMS?

  • A. Create a single task using full LOB mode with a LOB chunk size of 500 MB to migrate the data and LOBs together
  • B. Create two tasks: task 1 with LOB tables using full LOB mode with a LOB chunk size of 500 MB and task 2 without LOBs
  • C. Create two tasks: task 1 with LOB tables using limited LOB mode with a maximum LOB size of 500 MB and task 2 without LOBs
  • D. Create a single task using limited LOB mode with a maximum LOB size of 500 MB to migrate data and LOBs together

Answer: C
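
LOB handling is controlled in the DMS task settings JSON. A sketch of the task for the LOB tables using limited LOB mode with a 500 MB (512,000 KB) maximum; the endpoint and instance ARNs, schema name, and task identifier are placeholders:

    import json
    import boto3

    dms = boto3.client("dms", region_name="us-east-1")

    task_settings = {
        "TargetMetadata": {
            "SupportLobs": True,
            "FullLobMode": False,
            "LimitedSizeLobMode": True,
            "LobMaxSize": 512000,   # in KB; LOBs larger than this would be truncated
            "LobChunkSize": 0,
        }
    }

    table_mappings = {
        "rules": [{
            "rule-type": "selection", "rule-id": "1", "rule-name": "lob-tables",
            "object-locator": {"schema-name": "APP", "table-name": "%"},
            "rule-action": "include",
        }]
    }

    dms.create_replication_task(
        ReplicationTaskIdentifier="oracle-to-aurora-lob-tables",
        SourceEndpointArn="arn:aws:dms:us-east-1:123456789012:endpoint:SOURCE",
        TargetEndpointArn="arn:aws:dms:us-east-1:123456789012:endpoint:TARGET",
        ReplicationInstanceArn="arn:aws:dms:us-east-1:123456789012:rep:INSTANCE",
        MigrationType="full-load",
        TableMappings=json.dumps(table_mappings),
        ReplicationTaskSettings=json.dumps(task_settings),
    )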

NEW QUESTION 15
A company has multiple applications serving data from a secure on-premises database. The company is migrating all applications and databases to the AWS Cloud. The IT Risk and Compliance department requires that auditing be enabled on all secure databases to capture all logins, logouts, failed logins, permission changes, and database schema changes. A Database Specialist has recommended Amazon Aurora MySQL as the migration target, leveraging the Advanced Auditing feature in Aurora.
Which events need to be specified in the Advanced Auditing configuration to satisfy the minimum auditing requirements? (Choose three.)

  • A. CONNECT
  • B. QUERY_DCL
  • C. QUERY_DDL
  • D. QUERY_DML
  • E. TABLE
  • F. QUERY

Answer: ABC
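
Advanced Auditing events are set through the DB cluster parameter group. A sketch enabling the three event classes above on a hypothetical parameter group name:

    import boto3

    rds = boto3.client("rds", region_name="us-east-1")

    rds.modify_db_cluster_parameter_group(
        DBClusterParameterGroupName="aurora-mysql-audit",
        Parameters=[
            {"ParameterName": "server_audit_logging",
             "ParameterValue": "1", "ApplyMethod": "immediate"},
            {"ParameterName": "server_audit_events",
             # logins/logouts, permission changes, schema changes
             "ParameterValue": "CONNECT,QUERY_DCL,QUERY_DDL",
             "ApplyMethod": "immediate"},
        ],
    )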

NEW QUESTION 16
A gaming company has implemented a leaderboard in AWS using a Sorted Set data structure within Amazon ElastiCache for Redis. The ElastiCache cluster has been deployed with cluster mode disabled and has a replication group deployed with two additional replicas. The company is planning for a worldwide gaming event and is anticipating a higher write load than what the current cluster can handle.
Which method should a Database Specialist use to scale the ElastiCache cluster ahead of the upcoming event?

  • A. Enable cluster mode on the existing ElastiCache cluster and configure separate shards for the Sorted Set across all nodes in the cluster.
  • B. Increase the size of the ElastiCache cluster nodes to a larger instance size.
  • C. Create an additional ElastiCache cluster and load-balance traffic between the two clusters.
  • D. Use the EXPIRE command and set a higher time to live (TTL) after each call to increment a given key.

Answer: B

NEW QUESTION 17
A Database Specialist is migrating a 2 TB Amazon RDS for Oracle DB instance to an RDS for PostgreSQL DB instance using AWS DMS. The source RDS for Oracle DB instance is in a VPC in the us-east-1 Region. The target RDS for PostgreSQL DB instance is in a VPC in the us-west-2 Region.
Where should the AWS DMS replication instance be placed for the MOST optimal performance?

  • A. In the same Region and VPC of the source DB instance
  • B. In the same Region and VPC as the target DB instance
  • C. In the same VPC and Availability Zone as the target DB instance
  • D. In the same VPC and Availability Zone as the source DB instance

Answer: D

NEW QUESTION 18
A company wants to migrate its existing on-premises Oracle database to Amazon Aurora PostgreSQL. The migration must be completed with minimal downtime using AWS DMS. A Database Specialist must validate that the data was migrated accurately from the source to the target before the cutover. The migration must have minimal impact on the performance of the source database.
Which approach will MOST effectively meet these requirements?

  • A. Use the AWS Schema Conversion Tool (AWS SCT) to convert source Oracle database schemas to the target Aurora DB cluster. Verify the datatype of the columns.
  • B. Use the table metrics of the AWS DMS task created for migrating the data to verify the statistics for the tables being migrated and to verify that the data definition language (DDL) statements are completed.
  • C. Enable the AWS Schema Conversion Tool (AWS SCT) premigration validation and review the premigration checklist to make sure there are no issues with the conversion.
  • D. Enable AWS DMS data validation on the task so the AWS DMS task compares the source and target records, and reports any mismatches.

Answer: D
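
DMS data validation is switched on in the task settings. A brief sketch of the relevant ValidationSettings block applied to an existing task; the task ARN is a placeholder:

    import json
    import boto3

    dms = boto3.client("dms", region_name="us-east-1")

    # Compare source and target rows and report mismatches before cutover.
    dms.modify_replication_task(
        ReplicationTaskArn="arn:aws:dms:us-east-1:123456789012:task:ORACLE2AURORA",
        ReplicationTaskSettings=json.dumps({
            "ValidationSettings": {
                "EnableValidation": True,
                "ThreadCount": 5,
            }
        }),
    )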

NEW QUESTION 19
A company with branch offices in Portland, New York, and Singapore has a three-tier web application that leverages a shared database. The database runs on Amazon RDS for MySQL and is hosted in the us-west-2 Region. The application has a distributed front end deployed in the us-west-2, ap-southeast-1, and us-east-2 Regions.
This front end is used as a dashboard for Sales Managers in each branch office to see current sales statistics. There are complaints that the dashboard performs more slowly in the Singapore location than it does in Portland or New York. A solution is needed to provide consistent performance for all users in each location.
Which set of actions will meet these requirements?

  • A. Take a snapshot of the instance in the us-west-2 Region. Create a new instance from the snapshot in the ap-southeast-1 Region. Reconfigure the ap-southeast-1 front-end dashboard to access this instance.
  • B. Create an RDS read replica in the ap-southeast-1 Region from the primary RDS DB instance in the us-west-2 Region. Reconfigure the ap-southeast-1 front-end dashboard to access this instance.
  • C. Create a new RDS instance in the ap-southeast-1 Region. Use AWS DMS and change data capture (CDC) to update the new instance in the ap-southeast-1 Region. Reconfigure the ap-southeast-1 front-end dashboard to access this instance.
  • D. Create an RDS read replica in the us-west-2 Region where the primary instance resides. Create a read replica in the ap-southeast-1 Region from the read replica located in the us-west-2 Region. Reconfigure the ap-southeast-1 front-end dashboard to access this instance.

Answer: B
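
A cross-Region read replica is created in the destination Region by referencing the source instance ARN. A boto3 sketch with hypothetical identifiers:

    import boto3

    # The client runs in the Region where the replica should live.
    rds_apse1 = boto3.client("rds", region_name="ap-southeast-1")

    rds_apse1.create_db_instance_read_replica(
        DBInstanceIdentifier="sales-dashboard-replica",
        SourceDBInstanceIdentifier="arn:aws:rds:us-west-2:123456789012:db:sales-primary",
        DBInstanceClass="db.m5.large",
        SourceRegion="us-west-2",   # lets boto3 presign the cross-Region request
    )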

NEW QUESTION 20
A retail company is about to migrate its online and mobile store to AWS. The company’s CEO has strategic plans to grow the brand globally. A Database Specialist has been challenged to provide predictable read and write database performance with minimal operational overhead.
What should the Database Specialist do to meet these requirements?

  • A. Use Amazon DynamoDB global tables to synchronize transactions
  • B. Use Amazon EMR to copy the orders table data across Regions
  • C. Use Amazon Aurora Global Database to synchronize all transactions
  • D. Use Amazon DynamoDB Streams to replicate all DynamoDB transactions and sync them

Answer: A

NEW QUESTION 21
A company has a production Amazon Aurora DB cluster that serves both online transaction processing (OLTP) transactions and compute-intensive reports. The reports run for 10% of the total cluster uptime while the OLTP transactions run all the time. The company has benchmarked its workload and determined that a six-node Aurora DB cluster is appropriate for the peak workload.
The company is now looking at cutting costs for this DB cluster, but needs to have a sufficient number of nodes in the cluster to support the workload at different times. The workload has not changed since the previous benchmarking exercise.
How can a Database Specialist address these requirements with minimal user involvement?

  • A. Split up the DB cluster into two different clusters: one for OLTP and the other for reporting. Monitor and set up replication between the two clusters to keep data consistent.
  • B. Review and evaluate the peak combined workload. Ensure that utilization of the DB cluster node is at an acceptable level. Adjust the number of instances, if necessary.
  • C. Use the stop cluster functionality to stop all the nodes of the DB cluster during times of minimal workload. The cluster can be restarted again depending on the workload at the time.
  • D. Set up automatic scaling on the DB cluster. This will allow the number of reader nodes to adjust automatically to the reporting workload, when needed.

Answer: D

NEW QUESTION 22
......

Recommend!! Get the Full DBS-C01 dumps in VCE and PDF From Dumps-hub.com, Welcome to Download: https://www.dumps-hub.com/DBS-C01-dumps.html (New 85 Q&As Version)