Medium · 1 mark · Multiple Choice
This question is part of a case study (Case 11).

CASE STUDY: AutoMakers Inc. 1M connected cars, 100GB/day telemetry. Req: Predictive maintenance, real-time driver dashboard, monetize data. CEO: Data is new engine. CFO: Cut 3rd-party IoT costs. CTO: Highly scalable ingest. Tech: MQTT ingest, stream processing, ML models, 7-yr cold storage, handle intermittent connectivity. Constraints: Anonymize data, low vehicle compute, strict analytics budget.

GCP PCA · Question 13 · Domain 3: Designing for Security and Compliance

Which service should you integrate into the streaming pipeline to automatically anonymize Vehicle Identification Numbers (VINs) before data scientists access it?

Answer options:

A. Cloud Key Management Service (KMS)
B. Cloud Data Loss Prevention (DLP) API
C. Identity and Access Management (IAM)
D. VPC Service Controls

How to approach this question

Identify the GCP service whose purpose is inspecting data for sensitive content and redacting or anonymizing it — not one that merely encrypts data or controls access to it.

Full Answer

B. Cloud Data Loss Prevention (DLP) API ✓ Correct

The Cloud DLP API can be integrated into a Dataflow streaming pipeline to automatically detect and tokenize or redact sensitive fields such as VINs before the data lands in BigQuery, so data scientists only ever see de-identified records.
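For intuition, the field-level tokenization that DLP performs inside the pipeline can be sketched locally with a keyed, deterministic tokenizer. This is a minimal standard-library stand-in, not the DLP API itself; the HMAC key, record format, and token prefix are illustrative assumptions:

```python
import hmac
import hashlib
import re

# 17-character VIN pattern (valid VINs never contain I, O, or Q).
VIN_RE = re.compile(r"\b[A-HJ-NPR-Z0-9]{17}\b")

def tokenize_vins(record: str, key: bytes) -> str:
    """Replace each VIN with a keyed, deterministic token.

    Deterministic tokens (like DLP's crypto-based transformations)
    preserve joins across datasets while keeping raw VINs out of
    the analysts' hands.
    """
    def _token(match: re.Match) -> str:
        digest = hmac.new(key, match.group(0).encode(), hashlib.sha256)
        return "VIN_" + digest.hexdigest()[:16]  # truncated for readability
    return VIN_RE.sub(_token, record)

# Illustrative telemetry record with a sample VIN.
print(tokenize_vins('{"vin": "1HGCM82633A004352", "speed_kmh": 88}', key=b"demo-key"))
```

In the real pipeline, a Dataflow transform would call DLP's de-identification with a custom VIN infoType instead of this local regex, but the effect on the downstream data is the same: tokens in, raw VINs gone.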

Common mistakes

Choosing KMS: it manages encryption keys for data at rest and in transit, but anyone authorized to read the data still sees plaintext VINs — it cannot redact or tokenize individual fields.

Choosing IAM or VPC Service Controls: these restrict who can access the data and from where, but they do not transform or anonymize the data itself.
