Medium · 1 mark · Multiple Choice
Subtask 4.1: Technical Processes · Data & Analytics · Pub/Sub · Dataflow · BigQuery
This question is part of a case study (Case 01).

CASE STUDY: TechStream Gaming

Overview:
Industry: Gaming
Size: 500 employees, $100M revenue

Environment:

  • On-prem US/EU
  • 200 servers
  • MySQL (5 TB)
  • 2M peak users
  • $100K/mo cost

Requirements:

  • Reduce costs 40%
  • 5x growth
  • Launch APAC/SA/Africa
  • Daily deployments

Exec Statements:

  • CEO: Scale rapidly.
  • CFO: Max $100K/mo, 18mo ROI.
  • CTO: Limited cloud exp, 99.95% uptime.

Tech Reqs:

  • <100ms latency globally
  • Real-time analytics
  • 5x seasonal spikes
  • EU data residency
  • DDoS protection
  • CI/CD

Constraints:

  • 12mo migration
  • <4hr downtime
  • 20 Java/MySQL devs, 5 ops
  • $2M budget


GCP PCA · Question 04 · Technical Processes


QUESTION: How should you design the real-time analytics pipeline for player behavior?

Answer options:

A.

Ingest events with Cloud Storage, process with Dataproc, and store in Cloud SQL.

B.

Ingest events with Pub/Sub, process with Dataflow, and store in BigQuery.

C.

Ingest events with Cloud IoT Core, process with Cloud Functions, and store in Cloud Spanner.

D.

Write events directly to BigQuery using the streaming API from the game clients.

How to approach this question

Identify the standard GCP serverless data pipeline components for streaming data.

Full Answer

B. Ingest events with Pub/Sub, process with Dataflow, and store in BigQuery. ✓ Correct
The combination of Pub/Sub (messaging buffer), Dataflow (Apache Beam stream processing), and BigQuery (serverless data warehouse) provides a fully managed, auto-scaling real-time analytics pipeline that requires minimal operational overhead.
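The three stages of option B can be sketched end to end in plain Python, with simple functions standing in for each managed service (the event fields, the `raw_events` sample, and the one-minute window size are illustrative assumptions, not details from the case study):

```python
import json
from collections import defaultdict


def parse_event(message: bytes) -> dict:
    """Stand-in for the Dataflow parse step: decode a Pub/Sub message payload."""
    return json.loads(message.decode("utf-8"))


def window_counts(events: list, window_s: int = 60) -> dict:
    """Stand-in for a fixed-window aggregation: events per player per window."""
    counts = defaultdict(int)
    for e in events:
        window_start = (e["ts"] // window_s) * window_s
        counts[(e["player_id"], window_start)] += 1
    return dict(counts)


# Simulated Pub/Sub messages (in production these arrive on a subscription,
# buffered so spikes never hit the downstream stages directly).
raw_events = [
    json.dumps({"player_id": "p1", "action": "login", "ts": 30}).encode(),
    json.dumps({"player_id": "p1", "action": "match_start", "ts": 45}).encode(),
    json.dumps({"player_id": "p2", "action": "login", "ts": 75}).encode(),
]

rows = window_counts([parse_event(m) for m in raw_events])
# Each (player, window) count would then be streamed into a BigQuery table.
print(rows)  # → {('p1', 0): 2, ('p2', 60): 1}
```

In a real deployment the parse and window steps would be Apache Beam transforms running on Dataflow, which scale workers automatically as the Pub/Sub backlog grows.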

Common mistakes

Choosing direct BigQuery streaming (D), which fails to account for traffic spikes that could overwhelm the system without a Pub/Sub buffer, and which would require distributing BigQuery credentials to untrusted game clients.

Choosing Cloud Storage with Dataproc (A), which is a batch pattern and cannot satisfy the real-time analytics requirement.

Choosing Cloud IoT Core (C), which targeted device telemetry (and has since been retired by Google) rather than general game-event ingestion.

Practice the full GCP Professional Cloud Architect Practice Exam 6

