Google Associate Cloud Engineer Exam Questions 2024
By itexambyte.com / 25 February 2024

GCP Associate Cloud Engineer Certification Exam Dumps

Exam section weights:
Section 1: Setting up a cloud solution environment (17.5%)
Section 2: Planning and configuring a cloud solution (17.5%)
Section 3: Deploying and implementing a cloud solution (25%)
Section 4: Ensuring successful operation of a cloud solution (20%)
Section 5: Configuring access and security (20%)

1. You receive an error message when you try to start a new VM: "You have exhausted the IP range in your subnet." You want to resolve the error with the least amount of effort. What should you do?
A) Create a new subnet and start your VM there.
B) Expand the CIDR range in your subnet, and restart the VM that issued the error.
C) Create another subnet, and move several existing VMs into the new subnet.
D) Restart the VM using exponential backoff until the VM starts successfully.

2. You have a Google Cloud Function that is triggered by HTTP requests and performs a computationally intensive task. You want to ensure that the function can handle a high volume of requests without being overwhelmed. Which of the following scaling strategies should you use?
A) Use a fixed number of instances
B) Use automatic scaling with a minimum and maximum number of instances
C) Use manual scaling with a load balancer
D) Use a Kubernetes cluster

3. A financial institution is required to maintain compliant and secure backup practices for its virtual machines in GCP. Which of the following options represents the correct method to create snapshots of the VM disks while adhering to compliance requirements?
A) Schedule a script within the VM instances to periodically create and upload snapshots to an external server.
B) Grant all employees access to the VM instances so they can manually create snapshots when needed.
C) Rely on GCP's built-in daily automated backups without any additional configuration.
D) Use the GCP Snapshot API to trigger automatic snapshots of the VM disks based on predefined policies.

4. A software development team needs to ensure that they are promptly notified of any disk space issues on their GCP virtual machines. What steps should they take to set up alerts for low disk space conditions?
A) Manually check disk space usage on each VM instance and take action when needed.
B) Utilize Cloud Monitoring to create alerting policies based on disk space metrics exceeding thresholds.
C) Implement a cron job on each VM instance to monitor disk space and trigger alerts accordingly.
D) Enable default logging in GCP to capture disk space events and send alerts automatically.

5. You are creating a Cloud IoT application requiring data storage of up to 10 petabytes (PB). The application must support high-speed reads and writes of small pieces of data, but your data schema is simple. You want to use the most economical solution for data storage. What should you do?
A) Store the data in Cloud Spanner, and add an in-memory cache for speed.
B) Store the data in Cloud Storage, and distribute the data through Cloud CDN for speed.
C) Store the data in Cloud Bigtable, and implement the business logic in the programming language of your choice.
D) Use BigQuery, and implement the business logic in SQL.
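The Bigtable option in question 5 is easy to try from the command line. A minimal sketch, assuming hypothetical project, instance, and table names (gcloud and the cbt tool both ship with the Google Cloud SDK):

```bash
# Create a small Bigtable instance (names and zone are placeholders).
gcloud bigtable instances create iot-instance \
    --display-name="IoT telemetry" \
    --cluster-config=id=iot-cluster,zone=us-central1-b,nodes=3

# Create a table and a column family with the cbt CLI.
cbt -project my-project -instance iot-instance createtable sensor_data
cbt -project my-project -instance iot-instance createfamily sensor_data readings

# Write and read one small cell to verify access.
cbt -project my-project -instance iot-instance set sensor_data device1#1718000000 readings:temp=21.5
cbt -project my-project -instance iot-instance read sensor_data count=1
```

The row key here (device ID plus timestamp) reflects Bigtable's typical pattern for high-speed writes of small records; the node count is illustrative, not sized for 10 PB.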
6. A company wants to monitor and analyze the activity within their Google Cloud environment for security and compliance purposes. They need to track changes made to IAM policies, resource creations, and deletions. Which Google Cloud service provides centralized access to audit logs for such activities?
A) Stackdriver Logging
B) Cloud Monitoring
C) Cloud Audit Logs
D) Access Transparency logs

7. As a cloud engineer, you've been tasked with upgrading the free trial account to a production-inventory system and renaming it accordingly. However, you're encountering a "permission denied" error when attempting to make these changes. Which of the following permissions is most likely to resolve the issue?
A) Billing.accounts.update
B) Billing.account.upgrade
C) Billing.account.update
D) Billing.accounts.upgrade

8. Your application needs to process a significant rate of transactions. The rate of transactions exceeds the processing capabilities of a single virtual machine (VM). You want to spread transactions across multiple servers in real time and in the most cost-effective manner. What should you do?
A) Send transactions to BigQuery. On the VMs, poll for transactions that do not have the ‘processed’ key, and mark them ‘processed’ when done.
B) Set up Cloud SQL with a memory cache for speed. On your multiple servers, poll for transactions that do not have the ‘processed’ key, and mark them ‘processed’ when done.
C) Send transactions to Pub/Sub. Process them in VMs in a managed instance group.
D) Record transactions in Cloud Bigtable, and poll for new transactions from the VMs.

9. A cloud architect is designing a serverless application using Google Cloud Functions and Cloud Trace for performance monitoring. They want to set up alerts based on predefined latency thresholds to proactively detect and address performance deviations. Which Cloud Trace feature can they use to define custom latency thresholds and trigger alerts for anomalous behavior?
A) Latency tracking policies
B) Latency monitoring policies
C) Performance deviation alerts
D) Trace anomaly detection

10. You have been tasked with setting up a relational database solution on the Google Cloud Platform for a finance company. The database will support a limited amount of operational data in a specific geographic location. The company needs the database to be highly reliable, provide point-in-time recovery, and minimize operating costs. What should you do?
A) Choose Cloud SQL (MySQL) and verify that the enable binary logging option is selected.
B) Choose Cloud SQL (MySQL) and select the create failover replicas option.
C) Choose Cloud Spanner and configure your instance with 2 nodes.
D) Choose Cloud Spanner and set up your instance as multi-regional.

11. You have a directory with many subdirectories and files that you want to upload to a Google Cloud Storage bucket using the gsutil command-line tool. You want to preserve the directory structure when uploading the files. Which of the following upload methods should you use?
A) gsutil cp -r
B) gsutil rsync -r
C) gsutil mv -r
D) gsutil cp -n

12. A media company is looking to optimize their advertisement targeting strategy by analyzing user engagement data stored in BigQuery. They want to segment users based on their behavior and preferences. What approach should they take to perform this user segmentation in BigQuery?
A) Utilize clustering algorithms available in BigQuery ML
B) Export the data to a separate machine learning platform for segmentation
C) Use SQL queries with conditional statements for segmentation
D) Implement custom user segmentation scripts in Cloud Functions
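The clustering approach from question 12 can be expressed directly in BigQuery ML with a k-means model. A sketch, with hypothetical dataset, table, and column names:

```bash
# Train a k-means model over engagement features (all names are placeholders).
bq query --use_legacy_sql=false '
CREATE OR REPLACE MODEL mydataset.user_segments
OPTIONS (model_type = "kmeans", num_clusters = 4) AS
SELECT sessions_per_week, avg_watch_minutes, ads_clicked
FROM mydataset.user_engagement'

# Assign each user to a segment; extra columns such as user_id pass through.
bq query --use_legacy_sql=false '
SELECT user_id, centroid_id
FROM ML.PREDICT(MODEL mydataset.user_segments,
                (SELECT * FROM mydataset.user_engagement))'
```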
13. You want to deploy a CFT (Cloud Foundation Toolkit) template that includes multiple Google Cloud resources across different regions. Which of the following features should you leverage in the CFT template to achieve this?
A) Resource Groups
B) Deployment Manager
C) Deployment Manager Templates
D) Deployment Manager Configurations

14. You are setting up billing for your project. You want to prevent excessive consumption of resources due to an error or malicious attack and prevent billing spikes or surprises. What should you do?
A) Set up budgets and alerts in your project.
B) Set up quotas for the resources that your project will be using.
C) Set up a spending limit on the credit card used in your billing account.
D) Label all resources according to best practices, regularly export the billing reports, and analyze them with BigQuery.

15. Your application allows users to upload pictures. You need to convert each picture to your internal optimized binary format and store it. You want to use the most efficient, cost-effective solution. What should you do?
A) Store uploaded files in Cloud Bigtable, monitor Bigtable entries, and then run a Cloud Function to convert the files and store them in Bigtable.
B) Store uploaded files in Firestore, monitor Firestore entries, and then run a Cloud Function to convert the files and store them in Firestore.
C) Store uploaded files in Filestore, monitor Filestore entries, and then run a Cloud Function to convert the files and store them in Filestore.
D) Save uploaded files in a Cloud Storage bucket, and monitor the bucket for uploads. Run a Cloud Function to convert the files and to store them in a Cloud Storage bucket.

16. You are responsible for the user-management service for your global company. The service will add, update, delete, and list addresses. Each of these operations is implemented by a Docker container microservice. The processing load can vary from low to very high. You want to deploy the service on Google Cloud for scalability and minimal administration. What should you do?
A) Deploy your Docker containers into Cloud Run.
B) Start each Docker container as a managed instance group.
C) Deploy your Docker containers into Google Kubernetes Engine.
D) Combine the four microservices into one Docker image, and deploy it to the App Engine instance.

17. Your team needs to directly connect your on-premises resources to several virtual machines inside a virtual private cloud (VPC). You want to provide your team with fast and secure access to the VMs with minimal maintenance and cost. What should you do?
A) Set up Cloud Interconnect.
B) Use Cloud VPN to create a bridge between the VPC and your network.
C) Assign a public IP address to each VM, and assign a strong password to each one.
D) Start a Compute Engine VM, install a software router, and create a direct tunnel to each VM.

18. You are migrating your workload from on-premises deployment to Google Kubernetes Engine (GKE). You want to minimize costs and stay within budget. What should you do?
A) Configure Autopilot in GKE to monitor node utilization and eliminate idle nodes.
B) Configure the needed capacity; the sustained use discount will make you stay within budget.
C) Scale individual nodes up and down with the Horizontal Pod Autoscaler.
D) Create several nodes using Compute Engine, add them to a managed instance group, and set the group to scale up and down depending on load.
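Autopilot mode, referenced in question 18, bills per running pod rather than per node, so idle node capacity stops being your cost problem. Creating an Autopilot cluster is a one-liner; a sketch with placeholder names:

```bash
# Create a GKE Autopilot cluster; node provisioning and scaling are fully managed.
gcloud container clusters create-auto migration-cluster \
    --region=us-central1 --project=my-project

# Point kubectl at the new cluster.
gcloud container clusters get-credentials migration-cluster \
    --region=us-central1 --project=my-project
```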
19. As a system administrator overseeing numerous GCP projects, your task involves automating the deployment of compute instances across these projects through command-line tools. In this scenario, which specific tool or utility would be most appropriate for fulfilling this objective?
A) Google Cloud Shell
B) Google Cloud Console
C) Google Cloud SDK (gcloud)
D) Google Cloud Deployment Manager

20. Your company is experiencing slow response times for testers using the Quality Center testing tool due to 5 TB of testing data stored in the production database. To enhance database performance and alleviate the load, what steps should you take?
A) Set up Multi-AZ
B) Set up a read replica
C) Scale the database instance
D) Run the analytics queries only on weekends

21. You are managing your company’s first Google Cloud project. Project leads, developers, and internal testers will participate in the project, which includes sensitive information. You need to ensure that only specific members of the development team have access to sensitive information. You want to assign the appropriate Identity and Access Management (IAM) roles that also require the least amount of maintenance. What should you do?
A) Assign a basic role to each user.
B) Create groups. Assign a basic role to each group, and then assign users to groups.
C) Create groups. Assign a Custom role to each group, including those who should have access to sensitive data. Assign users to groups.
D) Create groups. Assign an IAM Predefined role to each group as required, including those who should have access to sensitive data. Assign users to groups.

22. You are implementing Cloud Storage for your organization. You need to follow your organization’s regulations. They include:
1) Archive data older than one year.
2) Delete data older than 5 years.
3) Use standard storage for all other data.
You want to implement these guidelines automatically and in the simplest manner available. What should you do?
A) Set up Object Lifecycle Management policies.
B) Run a script daily. Copy data that is older than one year to an archival bucket, and delete five-year-old data.
C) Run a script daily. Set storage class to ARCHIVE for data that is older than one year, and delete five-year-old data.
D) Set up default storage classes for three buckets named STANDARD, ARCHIVE, and DELETED. Use a script to move the data into the appropriate bucket when its condition matches your company guidelines.

23. Your team is building the development, test, and production environments for your project deployment in Google Cloud. You need to efficiently deploy and manage these environments and ensure that they are consistent. You want to follow Google-recommended practices. What should you do?
A) Create a Cloud Shell script that uses gcloud commands to deploy the environments.
B) Create one Terraform configuration for all environments. Parameterize the differences between environments.
C) For each environment, create a Terraform configuration. Use them for repeated deployment. Reconcile the templates periodically.
D) Use the Cloud Foundation Toolkit to create one deployment template that will work for all environments, and deploy with Terraform.
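The single-configuration approach in question 23 is usually driven from the Terraform CLI with one variable file per environment. A sketch, assuming a hypothetical environments/ directory of .tfvars files:

```bash
# One Terraform configuration; per-environment differences live in .tfvars files.
terraform init

# Deploy each environment from the same configuration (state isolation, e.g.
# separate workspaces or backends, is assumed but omitted here).
terraform apply -var-file=environments/dev.tfvars
terraform apply -var-file=environments/test.tfvars
terraform apply -var-file=environments/prod.tfvars
```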
24. You are running several related applications on Compute Engine virtual machine (VM) instances. You want to follow Google-recommended practices and expose each application through a DNS name. What should you do?
A) Use the Compute Engine internal DNS service to assign DNS names to your VM instances, and make the names known to your users.
B) Assign each VM instance an alias IP address range, and then make the internal DNS names public.
C) Assign Google Cloud routes to your VM instances, assign DNS names to the routes, and make the DNS names public.
D) Use Cloud DNS to translate your domain names into your IP addresses.

25. You provide a service that you need to open to everyone in your partner network. You have a server and an IP address where the application is located. You do not want to have to change the IP address on your DNS server if your server crashes or is replaced. You also want to avoid downtime and deliver a solution for minimal cost and setup. What should you do?
A) Create a script that updates the IP address for the domain when the server crashes or is replaced.
B) Reserve a static internal IP address, and assign it using Cloud DNS.
C) Reserve a static external IP address, and assign it using Cloud DNS.
D) Use the Bring Your Own IP (BYOIP) method to use your own IP address.

26. Your project team needs to estimate the spending for your Google Cloud project for the next quarter. You know the project requirements. You want to produce your estimate as quickly as possible. What should you do?
A) Build a simple machine learning model that will predict your next month’s spend.
B) Estimate the number of hours of compute time required, and then multiply by the VM per-hour pricing.
C) Use the Google Cloud Pricing Calculator to enter your predicted consumption for all groups of resources.
D) Use the Google Cloud Pricing Calculator to enter your consumption for all groups of resources, and then adjust for volume discounts.

27. A multinational corporation is deploying a globally distributed application on Google Cloud Run to serve users in different regions. They want to optimize latency by directing users to the nearest available instance of the application. Which feature of Google Cloud Run can help achieve this goal?
A) Anycast IP addresses for geo-proximity routing
B) Traffic splitting for A/B testing across regions
C) Multi-regional deployment with global load balancing
D) Custom domain mapping with CDN integration

28. A startup company is launching a new mobile app that requires real-time updates and notifications for users based on their geographic location. They want to resolve DNS queries dynamically to direct users to the nearest app server for optimal performance. Which Cloud DNS feature can help achieve this dynamic routing based on client location?
A) Global load balancing
B) Geolocation-based routing
C) Region-based routing
D) Geofenced routing

29. A project team is deploying a machine learning model on Google Cloud AI Platform that requires access to Google Cloud Storage for reading training data and writing model checkpoints. They need to securely authenticate the model with the storage bucket. What is the recommended practice for managing credentials in this scenario?
A) Embed service account key directly in the model code
B) Use OAuth 2.0 tokens for authentication
C) Create a service account and grant appropriate permissions
D) Store sensitive keys in plaintext configuration files
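For question 29, the recommended pattern is a dedicated service account granted only the storage access the model needs. A sketch with placeholder project, account, and bucket names:

```bash
# Create a dedicated service account for the training job.
gcloud iam service-accounts create model-trainer \
    --project=my-project \
    --display-name="AI Platform training"

# Grant it object read/write on just the training bucket, not the whole project.
gsutil iam ch \
    serviceAccount:model-trainer@my-project.iam.gserviceaccount.com:roles/storage.objectAdmin \
    gs://my-training-bucket
```

Binding the role at the bucket level rather than the project level is what keeps this within the principle of least privilege.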
30. You work in a small company where everyone should be able to view all resources of a specific project. You want to grant them access following Google’s recommended practices. What should you do?
A) Create a new Google Group and add all users to the group. Use “gcloud projects add-iam-policy-binding” with the Project Viewer role and the Group email address.
B) Create a script that uses “gcloud projects add-iam-policy-binding” for all users’ email addresses and the Project Viewer role.
C) Create a script that uses “gcloud iam roles create” for all users’ email addresses and the Project Viewer role.
D) Create a new Google Group and add all members to the group. Use “gcloud iam roles create” with the Project Viewer role and the Group email address.

31. An organization wants to allocate separate budgets for development, testing, and production environments within the same GCP project. How can they achieve this?
A) Create separate billing accounts for each environment
B) Use labels to categorize resources and set up budget alerts based on labels
C) Assign billing administrators for each environment
D) Configure budget alerts for the entire project and rely on email notifications

32. You have an instance template with a web application and need to deploy it to scale based on HTTP traffic. What steps should you take?
A) Create a VM from the instance template, then create a custom image from the VM's disk, export the image to Cloud Storage, and set up an HTTP load balancer with the Cloud Storage bucket as its backend service.
B) Create a VM from the instance template and set up an App Engine application in Automatic Scaling mode to route all traffic to the VM.
C) Set up a managed instance group using the instance template, configure autoscaling based on HTTP traffic, and designate the instance group as the backend service for an HTTP load balancer.
D) Provision the required number of instances for peak user traffic using the instance template, create an unmanaged instance group, add the instances to the group, and configure the group as the backend service for an HTTP load balancer.

33. A development team is working on a new project that requires access to BigQuery datasets for running ad-hoc queries and creating new tables, but they should not have the ability to delete existing datasets or tables. Which approach should be used to grant the developers the least privilege necessary?
A) Assign the predefined role of BigQuery Data Viewer
B) Create a custom role with the necessary permissions
C) Grant project-level roles with read-only access to BigQuery
D) Use IAM policy hierarchy to limit access based on resource location

34. A company is developing a new application that needs to access various Google Cloud services programmatically. They want to follow the principle of least privilege and ensure secure communication between components. Which approach should they take to grant only the necessary permissions to the application while maintaining security?
A) Use a user account with elevated privileges
B) Share project-level credentials for simplicity
C) Create a dedicated service account with custom roles
D) Assign owner roles to the application for flexibility

35. A software development team is collaborating across different departments and needs to share files securely between Cloud Storage buckets. What is the recommended method to achieve this?
A) Share the object URL through email or instant messaging.
B) Use the gsutil rsync command to sync objects between the two buckets.
C) Use the gsutil cp command to copy the objects from one bucket to another.
D) Generate a signed URL with a limited time window for access to the requested objects.
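Two of the options in question 35 are single commands. A sketch of each, with placeholder bucket and key-file names (gsutil signurl requires a service account key):

```bash
# Copy objects directly between buckets; the data never leaves Google Cloud.
gsutil cp gs://team-a-bucket/reports/* gs://team-b-bucket/reports/

# Or grant time-limited access to a single object with a signed URL.
gsutil signurl -d 1h sa-key.json gs://team-a-bucket/reports/q3.pdf
```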
36. You are migrating your on-premises solution to Google Cloud. As a first step, the new cloud solution will need to ingest 100 TB of data. Your daily uploads will be within your current bandwidth limit of 100 Mbps. You want to follow Google-recommended practices for the most cost-effective way to implement the migration. What should you do?
A) Set up Partner Interconnect for the duration of the first upload.
B) Obtain a Transfer Appliance, copy the data to it, and ship it to Google.
C) Set up Dedicated Interconnect for the duration of your first upload, and then drop back to regular bandwidth.
D) Divide your data between 100 computers, and upload each data portion to a bucket. Then run a script to merge the uploads together.

37. A large organization with multiple projects and service accounts in Google Cloud needs to enable a specific service account to impersonate another service account across different projects for seamless access control. Which GCP feature allows for cross-project service account impersonation while maintaining strong security controls?
A) Identity and Access Management (IAM) roles
B) Credential Access Management (CAM)
C) Impersonation Service API
D) Service Account Delegation

38. As you embark on a client's project, they require a horizontally scalable database service within Google Cloud capable of accommodating gigabyte-sized relational data while also supporting ACID (Atomicity, Consistency, Isolation, and Durability) for reliable data storage. Which service would you recommend?
A) Datastore
B) BigQuery
C) Cloud SQL
D) Cloud Spanner

39. You are creating an environment for researchers to run ad hoc SQL queries. The researchers work with large quantities of data. Although they will use the environment for an hour a day on average, the researchers need access to the functional environment at any time during the day. You need to deliver a cost-effective solution. What should you do?
A) Store the data in Cloud Bigtable, and run SQL queries provided by Bigtable schema.
B) Store the data in BigQuery, and run SQL queries in BigQuery.
C) Create a Dataproc cluster, store the data in HDFS storage, and run SQL queries in Spark.
D) Create a Dataproc cluster, store the data in Cloud Storage, and run SQL queries in Spark.

40. You have created a Kubernetes deployment on Google Kubernetes Engine (GKE) that has a backend service. You also have pods that run the frontend service. You want to ensure that there is no interruption in communication between your frontend and backend service pods if they are moved or restarted. What should you do?
A) Create a service that groups your pods in the backend service, and tell your frontend pods to communicate through that service.
B) Create a DNS entry with a fixed IP address that the frontend service can use to reach the backend service.
C) Assign static internal IP addresses that the frontend service can use to reach the backend pods.
D) Assign static external IP addresses that the frontend service can use to reach the backend pods.

41. Your task is to establish a storage policy for a designated Cloud Storage Regional bucket that houses CCTV videos for your company. The requirement is to transfer the files to Coldline storage after 3 months (90 days) and subsequently delete them automatically after a year from their creation date. What specific policy should you establish?
A) Specify Data Lifecycle Management conditions on the Cloud Storage bucket, then configure the SetStorageClass action to 90 days and configure the Delete action to 365 days.
B) Specify Object Lifecycle Management conditions on the Cloud Storage bucket, then configure the SetStorageClass action to 90 days and configure the Delete action to 365 days.
C) Utilize the gsutil tool in Cloud Shell and execute the gsutil rewrite command, then set the Delete action to 275 days.
D) Utilize the gsutil tool in Cloud Shell and execute the gsutil rewrite command, then set the Delete action to 365 days.
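The Object Lifecycle Management policy described in question 41 is set with a small JSON document. A sketch matching the 90-day Coldline transition and 365-day deletion (the bucket name is a placeholder):

```bash
# Lifecycle rules: move to Coldline after 90 days, delete after 365 days.
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
      "condition": {"age": 90}
    },
    {
      "action": {"type": "Delete"},
      "condition": {"age": 365}
    }
  ]
}
EOF

gsutil lifecycle set lifecycle.json gs://cctv-archive
```

Because both conditions key off object age, each video transitions and expires on its own schedule with no scripts or cron jobs involved.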
42. A development team is troubleshooting an incident related to unexpected resource modifications in their GCP project. They need to review the log entries specifically related to changes made to Google Cloud Storage buckets over the last 24 hours. Which approach should they take to access and filter the relevant audit log data?
A) Use custom log filters in Stackdriver Logging
B) Query the Cloud Monitoring API for storage activity logs
C) Access the Cloud Data Loss Prevention dashboard for audit logs
D) Review the audit logs directly from the Cloud Storage console

43. Your organization plans to migrate its financial transaction monitoring application to Google Cloud. Auditors need to view the data and run reports in BigQuery, but they are not allowed to perform transactions in the application. You are leading the migration and want the simplest solution that will require the least amount of maintenance. What should you do?
A) Assign roles/bigquery.dataViewer to the individual auditors.
B) Create a group for auditors and assign roles/viewer to them.
C) Create a group for auditors, and assign roles/bigquery.dataViewer to them.
D) Assign a custom role to each auditor that allows view-only access to BigQuery.

44. You are responsible for monitoring all changes in your Cloud Storage and Firestore instances. For each change, you need to invoke an action that will verify the compliance of the change in near real time. You want to accomplish this with minimal setup. What should you do?
A) Use the trigger mechanism in each datastore to invoke the security script.
B) Use Cloud Function events, and call the security script from the Cloud Function triggers.
C) Use a Python script to get logs of the datastores, analyze them, and invoke the security script.
D) Redirect your data-changing queries to an App Engine application, and call the security script from the application.

45. In preparation for migrating on-premises servers and data to Google Cloud gradually, your company needs to establish a VPN tunnel between the on-premises infrastructure and Google Cloud. To facilitate a seamless setup, which additional service would you use alongside Cloud VPN?
A) Cloud CDN
B) Cloud NAT
C) Cloud Run
D) Cloud Router

46. A healthcare organization is using Google Cloud Storage to store patient records. They want to ensure the integrity and security of these records by implementing object change notification events. Which of the following options demonstrates a suitable use case in this scenario?
A) Trigger an automatic backup of the patient records to a secondary storage location whenever a file is modified.
B) Notify all patients via email whenever a file is accessed by a healthcare provider.
C) Delete the patient records immediately after any modification is detected.
D) Encrypt the patient records with a new key every time they are accessed.

47. You are distributing traffic among a fleet of VMs in your VPC using an Internal TCP/UDP Load Balancer. Among the specifications listed, which one is not supported by this particular load balancing type?
A) Preserved Client IP
B) Global Availability
C) Internal Load Balancing
D) Any Destination Ports

48. A startup company is developing an application with varying storage performance requirements across different zones within a region. They want to optimize costs while ensuring performance. Which disk type should they consider for their deployment?
A) Zonal persistent disk
B) Regional balanced persistent disk
C) SSD persistent disk
D) Standard persistent disk
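The disk types in question 48 map directly to gcloud flags. A sketch of creating a zonal disk versus a regional (cross-zone replicated) disk; all names, sizes, and zones are placeholders:

```bash
# Zonal balanced disk: cheaper, lives in a single zone.
gcloud compute disks create app-disk-zonal \
    --size=100GB --type=pd-balanced --zone=us-central1-a

# Regional balanced disk: synchronously replicated across two zones
# in the same region, at a higher price per GB.
gcloud compute disks create app-disk-regional \
    --size=100GB --type=pd-balanced \
    --region=us-central1 \
    --replica-zones=us-central1-a,us-central1-b
```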
49. You are charged with optimizing Google Cloud resource consumption. Specifically, you need to investigate the resource consumption charges and present a summary of your findings. You want to do it in the most efficient way possible. What should you do?
A) Rename resources to reflect the owner and purpose. Write a Python script to analyze resource consumption.
B) Attach labels to resources to reflect the owner and purpose. Export Cloud Billing data into BigQuery, and analyze it with Data Studio.
C) Assign tags to resources to reflect the owner and purpose. Export Cloud Billing data into BigQuery, and analyze it with Data Studio.
D) Create a script to analyze resource usage based on the project to which the resources belong. In this script, use the IAM accounts and service accounts that control given resources.

50. You want to deploy a Cloud Marketplace solution that requires access to private services in a VPC network. Which of the following options should you choose to ensure secure and private access to the services?
A) Use a public IP address for the solution
B) Create a VPN connection between the VPC network and the on-premises network
C) Use Cloud NAT to provide egress traffic for the solution
D) Deploy the solution in a subnet with a VPC Network Peering connection
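VPC Network Peering, from question 50, is configured from both sides before traffic can flow. A sketch with placeholder network and project names:

```bash
# In the consumer project: peer your VPC with the producer's network.
gcloud compute networks peerings create marketplace-peering \
    --network=my-vpc \
    --peer-project=producer-project \
    --peer-network=producer-vpc

# Check peering state; traffic flows only once both sides show ACTIVE.
gcloud compute networks peerings list --network=my-vpc
```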