Medium · 1 mark · Multiple Choice
Subtask 1.2: Technical Requirements
Topics: Pub/Sub, Dataflow, BigQuery, Real-time Analytics
This question is part of Case Study 01; the full scenario follows.

CASE STUDY: TechStream Gaming

Company Overview:
TechStream Gaming is a global multiplayer game developer with 500 employees and $100M annual revenue. They recently launched a hit mobile game that is growing rapidly.

Current Technical Environment:

  • On-premises data centers in US and EU.
  • 200 bare-metal servers running Linux.
  • Self-managed MySQL databases (5 TB total) for player profiles and inventory.
  • Peak concurrent users: 2 million.
  • Current monthly infrastructure cost: $150K.

Business Requirements:

  • Reduce infrastructure costs by 40%.
  • Support 5x user growth over 2 years.
  • Launch in 3 new regions (APAC, SA, Africa).
  • Improve deployment cadence from once a week to daily.

Executive Statements:

  • CEO: "We need to scale rapidly to compete. Cloud migration is critical."
  • CFO: "Cost reduction is paramount. We cannot exceed $100K/month. ROI must be achieved in 18 months."
  • CTO: "Our team has limited cloud experience. Reliability is non-negotiable - 99.95% uptime minimum."

Technical Requirements:

  • Sub-100ms latency for players globally.
  • Real-time analytics on player behavior.
  • Seasonal traffic spikes (5x during holidays).
  • CI/CD pipeline for daily deployments.

Constraints:

  • Migration must complete in 12 months.
  • Cannot exceed 4-hour downtime during cutover.
  • Dev team: 20 engineers (Java, MySQL).
  • Ops team: 5 engineers (limited cloud experience).


GCP PCA · Question 02 · Technical Requirements


QUESTION:
To meet the requirement for real-time analytics on player behavior, which architecture should you recommend?

Answer options:

A. Export database backups nightly to Cloud Storage and load them into BigQuery.

B. Stream player events to Pub/Sub, process with Dataflow, and analyze in BigQuery.

C. Write player events directly to Cloud SQL and use Datastream to replicate to BigQuery.

D. Send events to Cloud Logging and create a sink to Cloud Storage for analysis with Dataproc.

How to approach this question

Identify the keywords 'real-time analytics' and 'player behavior', then look for GCP's canonical streaming ingestion and processing pattern among the options.

Full Answer

B. Stream player events to Pub/Sub, process with Dataflow, and analyze in BigQuery. ✓ Correct

The Pub/Sub -> Dataflow -> BigQuery pipeline is the canonical GCP architecture for real-time analytics. All three services are fully managed and scale automatically, which absorbs the 5x user growth and seasonal traffic spikes while keeping operational overhead minimal for the 5-person ops team with limited cloud experience.

Common mistakes

Selecting batch processing (Option A) ignores the 'real-time' requirement: nightly exports yield insight that is up to a day old. Option C puts the full event write load on Cloud SQL, which suits transactional data but not high-volume event ingestion at 2 million concurrent users, and Datastream is change-data-capture replication, not an analytics pipeline. Option D is also effectively batch (Cloud Storage log sinks are flushed periodically) and requires the small ops team to manage Dataproc clusters, increasing operational overhead.
