It is faster and easier to pass the Amazon AWS Certified SysOps Administrator Associate exam by using Vivid AWS Certified SysOps Administrator Associate questions and answers. Get immediate access to the up-to-date AWS SysOps exam questions, find the same core-area questions with professionally verified answers, and pass your exam with a high score.
Q121. - (Topic 1)
You are managing a legacy application inside a VPC. The application has hard-coded IP addresses in its configuration.
Which two mechanisms will allow the application to fail over to new instances without the need for reconfiguration? Choose 2 answers
A. Create an ELB to reroute traffic to a failover instance
B. Create a secondary ENI that can be moved to a failover instance
C. Use Route53 health checks to fail traffic over to a failover instance
D. Assign a secondary private IP address to the primary ENI that can be moved to a failover instance
Answer: B,D
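As a minimal sketch of how this failover could be scripted (Python with boto3; the ENI IDs, instance ID, and IP address are hypothetical), the hard-coded address is moved to a standby instance either by reassigning a secondary private IP or by reattaching a secondary ENI:

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Option 1: move the secondary private IP that the application is
# hard-coded against to the standby instance's ENI.
ec2.assign_private_ip_addresses(
    NetworkInterfaceId="eni-0standby1234567890",
    PrivateIpAddresses=["10.0.1.25"],
    AllowReassignment=True,  # take the address over from the failed instance
)

# Option 2: attach the secondary ENI (which carries the hard-coded
# address) to the standby instance.
ec2.attach_network_interface(
    NetworkInterfaceId="eni-0failover0987654321",
    InstanceId="i-0standby1234567890ab",
    DeviceIndex=1,
)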
Q122. - (Topic 2)
A user has created a queue named “myqueue” in the US-East region with AWS SQS. The user’s AWS account ID is 123456789012. If the user wants to perform some action on this queue, which of the below queue URLs should he use?
A. http://sqs.us-east-1.amazonaws.com/123456789012/myqueue
B. http://sqs.amazonaws.com/123456789012/myqueue
C. http://sqs.123456789012.us-east-1.amazonaws.com/myqueue
D. http://123456789012.sqs.us-east-1.amazonaws.com/myqueue
Answer: A
Explanation:
When creating a new queue in SQS, the user must provide a queue name that is unique within the scope of all queues of the user’s account. If the user creates queues using both the latest WSDL and a previous version, he will have a single namespace for all his queues. Amazon SQS assigns each queue created by the user an identifier called a queue URL, which includes the queue name and other components that Amazon SQS determines. Whenever the user wants to perform an action on a queue, he must provide its queue URL. The queue URL for the account ID 123456789012 and queue name “myqueue” in the US-East-1 region will be http://sqs.us-east-1.amazonaws.com/123456789012/myqueue.
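As a minimal illustration (Python with boto3), the queue URL can be looked up by name and then passed to any queue action; the queue name and account ID here are the ones from the question:

import boto3

sqs = boto3.client("sqs", region_name="us-east-1")

# SQS returns the full queue URL, in the form
# http(s)://sqs.<region>.amazonaws.com/<account-id>/<queue-name>.
queue_url = sqs.get_queue_url(QueueName="myqueue")["QueueUrl"]
print(queue_url)  # .../123456789012/myqueue

# Every queue action takes the queue URL, not just the queue name.
sqs.send_message(QueueUrl=queue_url, MessageBody="test message")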
Q123. - (Topic 3)
A user has created an application which will be hosted on EC2. The application makes calls to DynamoDB to fetch certain data. The application uses the AWS SDK to connect to DynamoDB from the EC2 instance. Which of the below mentioned statements is true with respect to the best practice for security in this scenario?
A. The user should attach an IAM role with DynamoDB access to the EC2 instance
B. The user should create an IAM user with DynamoDB access and use its credentials within the application to connect with DynamoDB
C. The user should create an IAM role, which has EC2 access so that it will allow deploying the application
D. The user should create an IAM user with DynamoDB and EC2 access. Attach the user with the application so that it does not use the root account credentials
Answer: A
Explanation:
When an application runs on an EC2 instance and makes requests to AWS services such as DynamoDB or S3, it is recommended not to create an IAM user and pass or embed that user's credentials in the application. Instead, the user should create an IAM role for EC2 and give that role access to DynamoDB/S3. When the role is attached to the EC2 instance, it provides temporary security credentials to the application hosted on that instance, which the application uses to connect to DynamoDB/S3.
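A minimal sketch of this pattern (Python with boto3; the table name and key are hypothetical): because the role is attached to the instance, the SDK picks up temporary credentials from the instance profile automatically and no keys appear in the code.

import boto3

# With an IAM role attached to the EC2 instance, boto3 obtains temporary
# security credentials automatically; no access keys are embedded in the
# application or its configuration.
dynamodb = boto3.resource("dynamodb", region_name="us-east-1")

# "Orders" is a hypothetical table name used only for illustration.
table = dynamodb.Table("Orders")
response = table.get_item(Key={"OrderId": "1001"})
print(response.get("Item"))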
Q124. - (Topic 3)
An organization has created a queue named “modularqueue” with SQS. The organization is not performing any operations on the queue, such as SendMessage, ReceiveMessage, DeleteMessage, GetQueueAttributes, SetQueueAttributes, AddPermission, or RemovePermission. What can happen in this scenario?
A. AWS SQS sends notification after 15 days for inactivity on queue
B. AWS SQS can delete queue after 30 days without notification
C. AWS SQS marks queue inactive after 30 days
D. AWS SQS notifies the user after 2 weeks and deletes the queue after 3 weeks.
Answer: B
Explanation:
Amazon SQS can delete a queue without notification if one of the following actions hasn't been performed on it for 30 consecutive days: SendMessage, ReceiveMessage, DeleteMessage, GetQueueAttributes, SetQueueAttributes, AddPermission, and RemovePermission.
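If keeping an idle queue is important, one option is to perform a lightweight action on it periodically; the sketch below (Python with boto3) calls GetQueueAttributes on the queue from the question, which is one of the listed actions and therefore counts as activity.

import boto3

sqs = boto3.client("sqs", region_name="us-east-1")
queue_url = sqs.get_queue_url(QueueName="modularqueue")["QueueUrl"]

# Running this periodically (e.g. from a scheduled job) keeps the queue
# from going 30 consecutive days without any of the listed actions.
attrs = sqs.get_queue_attributes(
    QueueUrl=queue_url,
    AttributeNames=["ApproximateNumberOfMessages", "LastModifiedTimestamp"],
)
print(attrs["Attributes"])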
Q125. - (Topic 3)
A user has set up an Auto Scaling group. The group has failed to launch a single instance for more than 24 hours. What will happen to Auto Scaling in this condition?
A. Auto Scaling will keep trying to launch the instance for 72 hours
B. Auto Scaling will suspend the scaling process
C. Auto Scaling will start an instance in a separate region
D. The Auto Scaling group will be terminated automatically
Answer: B
Explanation:
If Auto Scaling repeatedly fails to launch an instance, it will suspend the scaling processes for the Auto Scaling group. This is known as an administrative suspension. It commonly applies to an Auto Scaling group that has no running instances and has been trying, without success, to launch instances for more than 24 hours.
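To see whether a group is under such a suspension, and to resume scaling after fixing the launch problem, something like the following sketch can be used (Python with boto3; "web-asg" is a hypothetical group name).

import boto3

autoscaling = boto3.client("autoscaling", region_name="us-east-1")

# SuspendedProcesses lists each suspended scaling process and the reason,
# which indicates an administrative suspension after repeated failures.
group = autoscaling.describe_auto_scaling_groups(
    AutoScalingGroupNames=["web-asg"]
)["AutoScalingGroups"][0]

for proc in group["SuspendedProcesses"]:
    print(proc["ProcessName"], "-", proc["SuspensionReason"])

# Once the underlying launch issue is fixed, scaling can be resumed.
autoscaling.resume_processes(AutoScalingGroupName="web-asg")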
Q126. - (Topic 3)
A user is planning to use AWS services for his web application. If the user is trying to set up his own billing management system for AWS, how can he configure it?
A. Set up programmatic billing access. Download and parse the bill as per the requirement
B. It is not possible for the user to create his own billing management service with AWS
C. Enable the AWS CloudWatch alarm which will provide APIs to download the alarm data
D. Use AWS billing APIs to download the usage report of each service from the AWS billing console
Answer: A
Explanation:
AWS provides an option to have programmatic access to billing. Programmatic Billing Access leverages the existing Amazon Simple Storage Service (Amazon S3) APIs. Thus, the user can build applications that reference his billing data from a CSV (comma-separated values) file stored in an Amazon S3 bucket. AWS uploads the bill to the bucket every few hours, and the user can download the bill CSV from the bucket, parse it, and create a billing system as per the requirement.
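A minimal parsing sketch (Python with boto3), assuming a hypothetical billing bucket and report-file name and the ProductCode/CostBeforeTax columns of the detailed billing CSV:

import csv
import io
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket and key; the actual names depend on how programmatic
# billing access was configured for the account.
bucket = "my-billing-bucket"
key = "123456789012-aws-billing-csv-2014-05.csv"

body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")

# Sum the cost per service as a simple home-grown billing report.
totals = {}
for row in csv.DictReader(io.StringIO(body)):
    service = row.get("ProductCode") or "Other"
    try:
        totals[service] = totals.get(service, 0.0) + float(row.get("CostBeforeTax") or 0)
    except ValueError:
        continue

for service, cost in sorted(totals.items()):
    print(service, round(cost, 2))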
Q127. - (Topic 2)
A user is trying to set up a recurring Auto Scaling process. The user has set up one process to scale up every day at 8 AM and scale down at 7 PM. The user is trying to set up another recurring process which scales up on the 1st of every month at 8 AM and scales down the same day at 7 PM. What will Auto Scaling do in this scenario?
A. Auto Scaling will execute both processes but will add just one instance on the 1st
B. Auto Scaling will add two instances on the 1st of the month
C. Auto Scaling will schedule both the processes but execute only one process randomly
D. Auto Scaling will throw an error since there is a conflict in the schedule of two separate Auto Scaling Processes
Answer: D
Explanation:
Auto Scaling based on a schedule allows the user to scale the application in response to predictable load changes. The user can also configure the recurring schedule action which will follow the Linux cron format. As per Auto Scaling, a scheduled action must have a unique time value. If the user attempts to schedule an activity at a time when another existing activity is already scheduled, the call will be rejected with an error message noting the conflict.
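The conflict can be reproduced with the API; in the sketch below (Python with boto3, hypothetical group and action names) both scheduled actions fire at 8 AM on the 1st of the month, so the second call is expected to be rejected with the conflict error described above.

import boto3

autoscaling = boto3.client("autoscaling", region_name="us-east-1")

# Daily scale-up at 8 AM (cron format) on a hypothetical group.
autoscaling.put_scheduled_update_group_action(
    AutoScalingGroupName="web-asg",
    ScheduledActionName="daily-scale-up",
    Recurrence="0 8 * * *",
    DesiredCapacity=4,
)

# Monthly scale-up, also at 8 AM on the 1st; this overlaps the daily
# schedule on that day and is expected to fail with a conflict error.
autoscaling.put_scheduled_update_group_action(
    AutoScalingGroupName="web-asg",
    ScheduledActionName="monthly-scale-up",
    Recurrence="0 8 1 * *",
    DesiredCapacity=6,
)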
Q128. - (Topic 3)
A user wants to upload a complete folder to AWS S3 using the S3 Management console. How can the user perform this activity?
A. Just drag and drop the folder using the flash tool provided by S3
B. Use the Enable Enhanced Folder option from the S3 console while uploading objects
C. The user cannot upload the whole folder in one go with the S3 management console
D. Use the Enable Enhanced Uploader option from the S3 console while uploading objects
Answer: D
Explanation:
AWS S3 provides a console to upload objects to a bucket. The user can use the file upload screen to upload the whole folder in one go by clicking on the Enable Enhanced Uploader option. When the user uploads a folder, Amazon S3 uploads all the files and subfolders from the specified folder to the user’s bucket. It then assigns a key value that is a combination of the uploaded file name and the folder name.
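Outside the console, the same result can be scripted; the sketch below (Python with boto3, hypothetical bucket and folder names) walks a local folder and uploads each file with a key built from the folder name and the file path, mirroring the console's key naming.

import os
import boto3

s3 = boto3.client("s3")

# Hypothetical names used only for illustration.
bucket = "my-bucket"
folder = "reports"

for root, _dirs, files in os.walk(folder):
    for name in files:
        path = os.path.join(root, name)
        # The object key combines the folder name and the file name,
        # e.g. "reports/2014/summary.csv".
        key = path.replace(os.sep, "/")
        s3.upload_file(path, bucket, key)
        print("uploaded", path, "as", key)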