Multiple Choice · Medium · 1 mark
Subtask 4.1: Technical Processes · Topics: Analytics, Pub/Sub, Dataflow, BigQuery
Case study: TechStream Gaming (Case 01)

CASE STUDY: TechStream Gaming

Company Overview: TechStream Gaming is a global gaming company with 500 employees and $50M annual revenue.
Current Environment: On-premises data centers in US and EU. 200 servers. MySQL databases (5 TB). Peak users: 2M. Cost: $100K/mo.
Business Requirements: Reduce costs by 40%. Support 5x user growth. Launch in APAC, SA, Africa. Improve deployment to daily.
Executive Statements: CEO: 'Scale rapidly.' CFO: 'Max $100K/mo, ROI 18mo.' CTO: 'Limited cloud exp, 99.95% uptime.'
Technical Requirements: <100ms latency globally. Real-time analytics. 5x traffic spikes. EU data residency. DDoS protection. CI/CD.
Constraints: 12-month migration. Max 4-hour downtime. 20 devs (Java/MySQL), 5 ops (limited cloud). Budget $2M.


GCP PCA · Question 04 · Technical Processes

QUESTION:
How should you design the real-time analytics pipeline for player behavior data?

Answer options:

A. Ingest data directly into Cloud Storage, process nightly with Cloud Dataproc, and load into Cloud SQL.
B. Ingest data with Cloud Pub/Sub, process with Cloud Dataflow, and store in BigQuery.
C. Write data directly from game servers to BigQuery using the streaming API.
D. Use Cloud IoT Core for ingestion, Cloud Functions for processing, and Cloud Bigtable for storage.

How to approach this question

Identify the standard GCP streaming analytics pipeline components: a message bus for ingestion, a stream processor, and an analytics warehouse.

Full Answer

B. Ingest data with Cloud Pub/Sub, process with Cloud Dataflow, and store in BigQuery. ✓ Correct
Cloud Pub/Sub provides scalable message ingestion and buffers events during traffic spikes, Cloud Dataflow performs real-time stream processing and transformation, and BigQuery serves analytical queries over the results as they arrive. This Pub/Sub → Dataflow → BigQuery pattern is the standard GCP architecture for streaming analytics and directly satisfies the real-time analytics and 5x-spike requirements.
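The three-stage flow can be sketched conceptually in plain Python. This is a minimal simulation, not real GCP client code: the class and function names below are illustrative stand-ins for Cloud Pub/Sub, Cloud Dataflow, and BigQuery, which in practice are accessed through their own client libraries and the Apache Beam SDK.

```python
import json
import queue


class PubSubTopic:
    """Stand-in for a Pub/Sub topic: buffers raw events so downstream
    processing can lag behind producers during traffic spikes."""

    def __init__(self):
        self._queue = queue.Queue()

    def publish(self, event: dict):
        self._queue.put(json.dumps(event).encode("utf-8"))

    def pull(self):
        msgs = []
        while not self._queue.empty():
            msgs.append(self._queue.get())
        return msgs


def dataflow_transform(raw_messages):
    """Stand-in for the Dataflow step: parse, filter, and enrich events."""
    rows = []
    for msg in raw_messages:
        event = json.loads(msg)
        if event.get("type") != "heartbeat":  # drop noise events
            event["score_bucket"] = "high" if event["score"] >= 100 else "low"
            rows.append(event)
    return rows


class BigQueryTable:
    """Stand-in for a BigQuery table: an append-only analytics sink."""

    def __init__(self):
        self.rows = []

    def insert_rows(self, rows):
        self.rows.extend(rows)


# Wire the stages together: game servers publish, the pipeline
# pulls, transforms, and streams the results into the warehouse.
topic = PubSubTopic()
table = BigQueryTable()
topic.publish({"type": "match_end", "player": "p1", "score": 150})
topic.publish({"type": "heartbeat", "player": "p1", "score": 0})
table.insert_rows(dataflow_transform(topic.pull()))
print(len(table.rows))                 # → 1 (heartbeat filtered out)
print(table.rows[0]["score_bucket"])   # → high
```

The key design point the question tests is the decoupling: producers only ever talk to the topic, so a surge in game-server traffic fills the buffer rather than overwhelming the transform or the sink.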

Common mistakes

Selecting direct-to-BigQuery streaming (option C), which lacks the resilience of a Pub/Sub buffer during massive traffic spikes and couples the game servers directly to the sink.

Choosing the nightly Dataproc batch pipeline (option A), which is batch processing and cannot meet the real-time analytics requirement.

Reaching for Cloud IoT Core (option D), which targets device telemetry rather than game-server events; Cloud Bigtable also lacks the SQL analytics interface that BigQuery provides.
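The spike-resilience point behind the direct-streaming mistake can be illustrated with a toy comparison. The numbers and function names here are made up for illustration only; the real BigQuery streaming API has its own quota and retry semantics.

```python
import queue

# Pretend the sink accepts at most 100 rows per call (illustrative limit).
SINK_BATCH_LIMIT = 100


def direct_streaming(events):
    """Option C (no buffer): overflow beyond the sink's limit is rejected."""
    accepted = events[:SINK_BATCH_LIMIT]
    rejected = events[SINK_BATCH_LIMIT:]
    return accepted, rejected


def buffered_streaming(events):
    """Option B (Pub/Sub-style buffer): queue everything, then drain
    in sink-sized batches until the buffer is empty."""
    buf = queue.Queue()
    for e in events:
        buf.put(e)
    delivered = []
    while not buf.empty():
        batch = [buf.get() for _ in range(min(SINK_BATCH_LIMIT, buf.qsize()))]
        delivered.extend(batch)
    return delivered


spike = list(range(350))                  # a burst well above the limit
ok, dropped = direct_streaming(spike)
print(len(ok), len(dropped))              # → 100 250
print(len(buffered_streaming(spike)))     # → 350 (nothing lost)
```

With a buffer, a burst simply lengthens the queue and is drained at the sink's pace; without one, whatever exceeds the sink's capacity in the moment is lost or throttled.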

Source: GCP Professional Cloud Architect Practice Exam 1.
