    GCP Professional Cloud Architect · Practice Topics
    Domain 2: Manage & Provision

    Subtask 2.2: Storage Systems

    11 questions across 4 exams

    Other subtopics in Domain 2: Manage & Provision
    • Subtask 2.1: Configure network topologies (1 question)
    • Subtask 2.1: Network topologies (4 questions)
    • Subtask 2.1: Network Topology (5 questions)
    • Subtask 2.3: Compute Systems (15 questions)
    • Subtask 2.3: Configure compute systems (2 questions)

    Covered in these exams

    • GCP Professional Cloud Architect Practice Exam 1
    • GCP Professional Cloud Architect Practice Exam 5
    • GCP Professional Cloud Architect Practice Exam 6
    • GCP Professional Cloud Architect Practice Exam 7

    All questions (11)

    Q12 · Medium · 1 mark · GCP Professional Cloud Architect Practice Exam 1

    **CASE STUDY: TerramEarth**
    **Company Overview:** TerramEarth manufactures heavy equipment. 2 million vehicles in the field.
    **Current Environment:** Vehicles send telemetry via cellular. Processing 100,000 msgs/sec. On-prem Hadoop cluster.
    **Business Requirements:** Predict equipment failure. Reduce warranty costs. Provide fleet dashboard.
    **Executive Statements:** CEO: 'Monetize data.' CFO: 'Storage costs spiraling.' CTO: 'Need scalable ingestion and ML.'
    **Technical Requirements:** Ingest 500,000 msgs/sec. Store petabytes cost-effectively. Train ML models. Real-time anomaly detection.
    **Constraints:** Intermittent connectivity. Strict vehicle authentication.
    **QUESTION:** To address the CFO's concern about spiraling storage costs for petabytes of historical telemetry data, what should you recommend?

    Q25 · Easy · 1 mark · GCP Professional Cloud Architect Practice Exam 1

    A development team is building a new application that requires a relational database. The database will be deployed in a single region, needs to support up to 10 TB of data, and requires automated backups and high availability. Which GCP service should you recommend?

    Q12 · Easy · 1 mark · GCP Professional Cloud Architect Practice Exam 5

    CASE STUDY: AeroMech
    Overview: Aviation manufacturer, 5000 employees, $2B revenue. 100 engines, 10k sensors/engine, 1 GB data/flight. On-prem Hadoop.
    Business Req: Predictive maintenance, secure data sharing with airlines, monetize data.
    Execs: CEO wants new revenue; CFO demands ML ROI; CTO says on-prem storage unfeasible.
    Tech Req: High-throughput ingestion, PB-scale storage, train ML on historical data, deploy ML to edge (aircraft).
    Constraints: Intermittent low-bandwidth flight connectivity, aviation data compliance, data scientists use Python/Jupyter.
    QUESTION: To manage the PB-scale storage of historical flight data cost-effectively, what should you implement?

    Q37 · Medium · 1 mark · GCP Professional Cloud Architect Practice Exam 5

    Your company generates daily log files that must be kept for 5 years for compliance reasons. The logs are accessed frequently for the first 30 days, rarely accessed between 30 and 365 days, and almost never accessed after 1 year. Which TWO Cloud Storage classes should you use in your Object Lifecycle policy to optimize costs? (Select TWO)

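    For readers who want to see how such lifecycle transitions are expressed in practice, below is a minimal sketch using the google-cloud-storage Python client. The bucket name, storage classes, and ages are illustrative assumptions only, not the graded answer to this question.

    ```python
    # Minimal sketch: attaching Object Lifecycle rules with the
    # google-cloud-storage client. Bucket name, ages, and the storage
    # classes shown are placeholders, not the graded answer.
    from google.cloud import storage

    client = storage.Client()
    bucket = client.get_bucket("example-compliance-logs")  # hypothetical bucket

    # Transition objects to a colder storage class after 30 days ...
    bucket.add_lifecycle_set_storage_class_rule(storage_class="NEARLINE", age=30)
    # ... and to an even colder class after one year.
    bucket.add_lifecycle_set_storage_class_rule(storage_class="COLDLINE", age=365)

    bucket.patch()  # persist the updated lifecycle configuration
    ```
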
    Q42 · Medium · 1 mark · GCP Professional Cloud Architect Practice Exam 5

    You are configuring a Cloud SQL for PostgreSQL instance for a production application. The application is read-heavy. You need to ensure the database survives a zone failure and that read queries do not impact the performance of write queries. Which THREE configurations should you implement? (Select THREE)

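    As background for this scenario, the sketch below shows the general shape of Cloud SQL Admin API calls that request regional (HA) availability and create a read replica, using the google-api-python-client discovery interface. Project, instance names, and machine tiers are hypothetical, and the sketch is illustrative rather than the graded answer.

    ```python
    # Illustrative sketch, not the graded answer: a Cloud SQL primary with
    # regional availability plus a read replica, via the Cloud SQL Admin API.
    from googleapiclient import discovery

    sqladmin = discovery.build("sqladmin", "v1")

    # Primary instance with regional availability (survives a zone failure).
    primary_body = {
        "name": "orders-pg-primary",  # hypothetical name
        "region": "us-central1",
        "databaseVersion": "POSTGRES_15",
        "settings": {
            "tier": "db-custom-4-16384",
            "availabilityType": "REGIONAL",
        },
    }
    sqladmin.instances().insert(project="example-project", body=primary_body).execute()

    # Read replica to serve read-heavy traffic separately from the primary.
    replica_body = {
        "name": "orders-pg-replica-1",
        "region": "us-central1",
        "masterInstanceName": "orders-pg-primary",
        "settings": {"tier": "db-custom-4-16384"},
    }
    sqladmin.instances().insert(project="example-project", body=replica_body).execute()
    ```
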
    Q13 · Medium · 1 mark · GCP Professional Cloud Architect Practice Exam 6

    CASE STUDY: HealthData Inc
    Overview:
    Industry: Healthcare Analytics
    Size: 1000 employees
    Environment:
    - Co-located data center
    - Hadoop cluster
    - SFTP servers
    - 50 TB patient data
    Requirements:
    - ML models for diagnostics
    - Secure data sharing portals
    - Break data silos
    Exec Statements:
    - CEO: Need compute for ML.
    - CRO: HIPAA compliance is top priority.
    - CTO: Managed services needed to replace Hadoop.
    Tech Reqs:
    - Strict HIPAA compliance
    - Automated PHI de-identification
    - Comprehensive audit logging
    - CMEK
    - Network isolation (no public internet)
    Constraints:
    - US data sovereignty
    - 7-year retention (immutable)
    - Easy auditor access
    QUESTION: To meet the 7-year immutable data retention requirement for patient records, how should you configure Cloud Storage?

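    For context on how immutability is typically enforced in Cloud Storage, here is a minimal sketch that sets and locks a bucket retention policy with the google-cloud-storage client. The bucket name is hypothetical and the sketch is not presented as the graded answer.

    ```python
    # Minimal sketch, not the graded answer: set a 7-year retention period
    # on a bucket and lock it so the policy can no longer be reduced or removed.
    from google.cloud import storage

    SEVEN_YEARS_SECONDS = 7 * 365 * 24 * 60 * 60

    client = storage.Client()
    bucket = client.get_bucket("example-patient-records")  # hypothetical bucket

    bucket.retention_period = SEVEN_YEARS_SECONDS
    bucket.patch()

    # Locking is irreversible: objects cannot be deleted or overwritten
    # until they have existed for the full retention period.
    bucket.lock_retention_policy()
    ```
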
    Q17 · Medium · 1 mark · GCP Professional Cloud Architect Practice Exam 6

    CASE STUDY: ManuIoT
    Overview:
    Industry: Manufacturing
    Size: 100 factories globally
    Environment:
    - 100,000 sensors
    - Local SCADA
    - Fragmented SQL Server DBs
    - No central analytics
    Requirements:
    - Predictive maintenance
    - Real-time global dashboards
    - Edge computing
    Exec Statements:
    - CEO: Monetize telemetry.
    - CFO: Costs must scale linearly.
    - VP Ops: Factory lines need local control if internet drops.
    Tech Reqs:
    - Ingest 1M msgs/sec
    - Stream processing
    - Offline factory capabilities
    - Train ML centrally, deploy to edge
    Constraints:
    - Low bandwidth/high latency at factories
    - Legacy MQTT protocol
    - Zero IT staff at factories
    QUESTION: Which database service should you select to store the high-throughput time-series telemetry data for real-time dashboarding?

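    One service often evaluated for high-throughput time-series telemetry on GCP is Bigtable. Purely as an illustration of what writing a telemetry sample looks like with the google-cloud-bigtable Python client, and not as the graded answer, here is a short sketch; the project, instance, table, column family, and row-key scheme are all assumptions.

    ```python
    # Illustrative sketch, not the graded answer: write one telemetry sample
    # to a hypothetical Bigtable table keyed by sensor id and timestamp.
    import datetime

    from google.cloud import bigtable

    client = bigtable.Client(project="example-project", admin=False)
    instance = client.instance("telemetry-instance")    # hypothetical instance
    table = instance.table("sensor-readings")           # hypothetical table

    now = datetime.datetime.now(datetime.timezone.utc)
    row_key = f"sensor-042#{now.isoformat()}".encode()  # hypothetical key scheme

    row = table.direct_row(row_key)
    # Assumes a column family named "metrics" already exists on the table.
    row.set_cell("metrics", b"temperature_c", b"73.4", timestamp=now)
    row.commit()
    ```
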
    Q38 · Medium · 1 mark · GCP Professional Cloud Architect Practice Exam 6

    You are migrating a mission-critical MySQL database to Cloud SQL. The business requires that the database remains available even if an entire GCP zone goes offline, and they need the ability to recover the database to a specific point in time if a developer accidentally drops a table. Which TWO features must you enable? (Select TWO)

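    For readers who want to see how these two properties are commonly expressed through the Cloud SQL Admin API, below is a short sketch built with the google-api-python-client discovery interface. Instance name, region, and tier are hypothetical, and the sketch is illustrative rather than the graded answer.

    ```python
    # Illustrative sketch, not the graded answer: request regional availability
    # and automated backups with binary logging (the basis for point-in-time
    # recovery on MySQL) when creating a Cloud SQL instance.
    from googleapiclient import discovery

    sqladmin = discovery.build("sqladmin", "v1")

    body = {
        "name": "inventory-mysql",  # hypothetical name
        "region": "us-central1",
        "databaseVersion": "MYSQL_8_0",
        "settings": {
            "tier": "db-custom-2-8192",
            "availabilityType": "REGIONAL",  # survives a zone outage
            "backupConfiguration": {
                "enabled": True,
                "binaryLogEnabled": True,    # enables point-in-time recovery
            },
        },
    }
    sqladmin.instances().insert(project="example-project", body=body).execute()
    ```
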
    Q12 · Easy · 1 mark · GCP Professional Cloud Architect Practice Exam 7

    CASE STUDY: AutoMakers Inc
    Company Overview: AutoMakers Inc is a leading vehicle manufacturer transitioning to connected and autonomous vehicles. They need a platform to ingest, process, and analyze telemetry data from millions of cars.
    Current Technical Environment:
    - Legacy MQTT brokers on-premises.
    - Hadoop cluster for batch processing (nightly runs).
    - 100,000 connected cars sending 1 KB of data every minute.
    - On-premises data warehouse reaching capacity.
    Business Requirements:
    - Support 5 million connected cars within 3 years.
    - Enable real-time alerting for critical vehicle faults.
    - Provide predictive maintenance insights to customers.
    - Monetize anonymized traffic data.
    Executive Statements:
    - CEO: "Data is our new engine. We need real-time insights to improve safety."
    - CFO: "The platform must scale cost-effectively. We only want to pay for what we use."
    - CTO: "We need a fully managed serverless data pipeline to minimize operational overhead."
    Technical Requirements:
    - Ingest up to 1 million messages per second with low latency.
    - Process data in real-time for anomaly detection.
    - Store raw telemetry data indefinitely for machine learning model training.
    - Provide a scalable data warehouse for business intelligence analysts.
    Constraints:
    - Strict data privacy regulations (GDPR) require masking of PII.
    - Limited data engineering staff; prefer managed services.
    - Must integrate with existing on-premises identity provider (Active Directory).
    QUESTION: To meet the requirement to store raw telemetry data indefinitely for machine learning model training while adhering to the CFO's cost constraints, which storage solution should you use?

    Q24 · Easy · 1 mark · GCP Professional Cloud Architect Practice Exam 7

    A media company stores millions of high-resolution images in Cloud Storage. Images are accessed frequently during the first 30 days after publication. After 30 days, they are rarely accessed but must be kept for 5 years for compliance. After 5 years, they should be deleted. How can you automate this process most cost-effectively?

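    This scenario combines a storage-class transition with an age-based deletion, which an Object Lifecycle policy can express together; the sketch below shows one way to do that with the google-cloud-storage client. Bucket name, class, and ages are placeholder assumptions, not the graded answer.

    ```python
    # Minimal sketch, not the graded answer: one lifecycle policy containing
    # both a storage-class transition and an age-based delete rule.
    from google.cloud import storage

    client = storage.Client()
    bucket = client.get_bucket("example-media-images")  # hypothetical bucket

    # Move objects to a colder class once the 30-day hot period ends ...
    bucket.add_lifecycle_set_storage_class_rule(storage_class="COLDLINE", age=30)
    # ... and delete them after the 5-year (1825-day) compliance window.
    bucket.add_lifecycle_delete_rule(age=1825)

    bucket.patch()
    ```
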
    Q38 · Hard · 1 mark · GCP Professional Cloud Architect Practice Exam 7

    You are designing the database architecture for a mission-critical application using Cloud SQL for PostgreSQL. The application requires High Availability (HA) within the primary region to survive zone failures, and Disaster Recovery (DR) in a secondary region to survive a full region outage. Which TWO configurations must you implement? (Select TWO)


    Practice these questions with detailed guidance

    Full answers, grading, and explanations of why each answer is correct.

    Sign up free · Browse exams