Multiple Choice · Medium · 1 mark
Subtask 1.1: Business Requirements — Analytics (Pub/Sub, Dataflow, BigQuery)
This question is part of a case study (Case 01); the full scenario is included below.

CASE STUDY: TechStream Gaming
Overview: Gaming company, 500 employees, $100M revenue. 200 on-prem servers (US/EU), MySQL 5TB. 2M peak users. $150K/mo cost.
Business Req: Reduce cost 40%, 5x growth, 3 new regions, daily deployments.
Execs: CEO wants scale; CFO caps budget at $100K/mo; CTO needs 99.95% uptime, notes team has limited cloud skills.
Tech Req: <100ms global latency, real-time analytics, 5x seasonal spikes, EU data residency, DDoS protection.
Constraints: 12-month migration, max 4-hour downtime.

QUESTION:
Which migration strategy should you recommend for the legacy gaming application to meet the 12-month timeline and team skill constraints?

GCP PCA · Question 04 · Business Requirements

QUESTION:
To support the requirement for real-time analytics on player behavior, which architecture should you recommend?

Answer options:

A. Write events directly to Cloud SQL and use read replicas for analytics.
B. Stream events to Pub/Sub, process with Dataflow, and analyze in BigQuery.
C. Upload batch logs to Cloud Storage hourly and load into BigQuery.
D. Send events to Cloud Logging and create log-based metrics.

How to approach this question

Identify the GCP services designed for streaming data ingestion and real-time analytics.

Full Answer

B. Stream events to Pub/Sub, process with Dataflow, and analyze in BigQuery. ✓ Correct
Pub/Sub ingests high-throughput streaming events, Dataflow processes and transforms them in real time, and BigQuery serves as the scalable data warehouse for interactive analytics.
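As a minimal sketch of what the Dataflow stage in option B does, the snippet below parses one raw Pub/Sub message into a row suitable for streaming into a BigQuery table. In a real pipeline this logic would live inside an Apache Beam DoFn; the event schema here (`player_id`, `event_type`, `ts_ms`) is an assumed, illustrative one, not part of the case study.

```python
import json
from datetime import datetime, timezone

def parse_event(raw: bytes) -> dict:
    """Parse one raw Pub/Sub message payload into a BigQuery-ready row.

    Field names are a hypothetical player-event schema chosen for
    illustration only.
    """
    event = json.loads(raw.decode("utf-8"))
    return {
        "player_id": event["player_id"],
        "event_type": event["event_type"],
        # Convert epoch milliseconds to an ISO-8601 UTC string, the kind
        # of value a BigQuery TIMESTAMP column accepts.
        "event_time": datetime.fromtimestamp(
            event["ts_ms"] / 1000, tz=timezone.utc
        ).isoformat(),
    }

# Example: one message as it might arrive from a Pub/Sub subscription.
msg = b'{"player_id": "p42", "event_type": "match_start", "ts_ms": 1700000000000}'
row = parse_event(msg)
print(row)
```

The same per-event transform scales horizontally under Dataflow, which is why this pattern absorbs the 5x seasonal spikes in the scenario without schema changes.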

Common mistakes

Choosing batch loading (C), which ignores the 'real-time' requirement.
Writing events to Cloud SQL (A), which cannot sustain high-throughput event ingestion or large-scale analytical queries.
Relying on log-based metrics (D), which produce monitoring aggregates rather than queryable behavioral data.
