GCP PCA · Question 48 · Domain 5: Managing Implementation and Ensuring Solution and Operations Reliability
Your company runs a large on-premises Hadoop cluster and wants to migrate to Google Cloud. The workloads are ephemeral Spark jobs that run for 2 hours every night to process logs. Cost optimization is a primary concern. Which GCP services/features should you use? (Select TWO)
Answer options:
Cloud Dataproc
Cloud Dataflow
Spot VMs (Preemptible VMs)
Committed Use Discounts (CUDs)
Compute Engine with custom Hadoop installation
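Hint: the cost angle of the scenario comes down to simple arithmetic on short-lived, interruptible capacity. The sketch below compares on-demand versus Spot pricing for the nightly 2-hour run; all rates, machine sizes, and the discount percentage are hypothetical placeholders, not real GCP pricing.

```python
# Illustrative cost comparison for the scenario above: a 2-hour Spark job
# every night on an assumed 10-worker cluster. All numbers are placeholder
# assumptions for illustration, not actual GCP list prices.
ON_DEMAND_RATE = 0.10     # $/vCPU-hour (assumed)
SPOT_DISCOUNT = 0.70      # Spot VMs assumed ~70% cheaper (actual discount varies)
VCPUS = 10 * 4            # 10 workers x 4 vCPUs each (assumed)
HOURS_PER_MONTH = 2 * 30  # 2-hour run, every night

on_demand = VCPUS * HOURS_PER_MONTH * ON_DEMAND_RATE
spot = on_demand * (1 - SPOT_DISCOUNT)
print(f"on-demand: ${on_demand:.2f}/mo, spot: ${spot:.2f}/mo")
# → on-demand: $240.00/mo, spot: $72.00/mo
```

Note the shape of the workload: because the jobs are short and ephemeral, interruptible capacity captures the savings, whereas a commitment-based discount assumes sustained usage.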
50 questions · hints · full answers · grading