Medium · 1 mark · Multiple Choice
Subtask 2.2: Storage Systems · Cloud Storage · Cost Optimization · Lifecycle Policies
This question is part of a case study (Case 11).

CASE STUDY: TerramEarth

Company Overview: TerramEarth manufactures heavy equipment. 2 million vehicles in the field.
Current Environment: Vehicles send telemetry via cellular. Processing 100,000 msgs/sec. On-prem Hadoop cluster.
Business Requirements: Predict equipment failure. Reduce warranty costs. Provide fleet dashboard.
Executive Statements: CEO: 'Monetize data.' CFO: 'Storage costs spiraling.' CTO: 'Need scalable ingestion and ML.'
Technical Requirements: Ingest 500,000 msgs/sec. Store petabytes cost-effectively. Train ML models. Real-time anomaly detection.
Constraints: Intermittent connectivity. Strict vehicle authentication.


GCP PCA · Question 12 · Storage Systems

QUESTION:
To address the CFO's concern about spiraling storage costs for petabytes of historical telemetry data, what should you recommend?

Answer options:

A. Store all data in Cloud Spanner to ensure strong consistency.

B. Store data in Cloud Storage Standard class, and use Object Lifecycle Management to move data older than 90 days to Coldline or Archive.

C. Store all data in Persistent Disk SSDs attached to Compute Engine instances.

D. Delete data older than 30 days to save costs.

How to approach this question

Identify the GCP service designed for cheap, petabyte-scale object storage and its cost-optimization features.

Full Answer

B. Store data in Cloud Storage Standard class, and use Object Lifecycle Management to move data older than 90 days to Coldline or Archive. ✓ Correct
Cloud Storage is the optimal service for petabyte-scale data lakes. Object Lifecycle Management allows you to define rules to automatically downgrade the storage class of objects (e.g., from Standard to Nearline, Coldline, or Archive) based on age, drastically reducing costs for historical data.
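The lifecycle rule described above can be sketched as a bucket lifecycle configuration (a minimal example: the 90-day threshold and Coldline target follow the answer; a second rule moving data to Archive at a later age could be added the same way):

```json
{
  "rule": [
    {
      "action": {
        "type": "SetStorageClass",
        "storageClass": "COLDLINE"
      },
      "condition": {
        "age": 90
      }
    }
  ]
}
```

Saved as `lifecycle.json`, this can be applied to a bucket (bucket name is a placeholder) with `gsutil lifecycle set lifecycle.json gs://YOUR_BUCKET` or `gcloud storage buckets update gs://YOUR_BUCKET --lifecycle-file=lifecycle.json`.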

Common mistakes

Selecting a database (Spanner/SQL) for raw telemetry storage, which is cost-prohibitive at petabyte scale.

Choosing Persistent Disk SSDs (option C), one of the most expensive storage options, for cold historical data.

Deleting data after 30 days (option D), which conflicts with the requirement to train ML models on historical telemetry and the CEO's goal of monetizing the data.
