Medium · 1 mark · Multiple Choice
Tags: BigQuery · bq CLI · Data Ingestion · Cloud Storage

GCP ACE · Question 27 · Domain 3.4: Deploying and implementing data solutions

You have a large CSV file containing historical sales data stored in a Cloud Storage bucket (gs://my-data-bucket/sales.csv). You need to load this data into an existing BigQuery table named 'sales_history' in the 'analytics' dataset.

Which command-line tool and command should you use?

Answer options:

A.

gsutil cp gs://my-data-bucket/sales.csv bq://analytics/sales_history

B.

gcloud bigquery import gs://my-data-bucket/sales.csv analytics.sales_history

C.

bq load analytics.sales_history gs://my-data-bucket/sales.csv

D.

bq query --destination_table=analytics.sales_history 'SELECT * FROM gs://my-data-bucket/sales.csv'

How to approach this question

Remember that BigQuery has its own dedicated CLI tool (`bq`). The command to ingest data is `bq load`.

Full Answer

C. `bq load analytics.sales_history gs://my-data-bucket/sales.csv` ✓ Correct
To interact with BigQuery from the command line, you use the `bq` command-line tool, which is included in the Google Cloud SDK. To load data from a file (such as a CSV in Cloud Storage) into a BigQuery table, the correct command is `bq load`, with the syntax `bq load [DATASET].[TABLE] [SOURCE_URI]`. CSV is `bq load`'s default source format, and because the `sales_history` table already exists, its schema is used. If the file includes a header row, add `--skip_leading_rows=1` so the header is not ingested as data.
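As a minimal sketch, the full command with commonly used flags might look like the following. The bucket, dataset, and table names come from the question; the flag values (header skip, explicit source format) are illustrative assumptions about the file:

```shell
# Load the CSV from Cloud Storage into the existing table.
# CSV is bq load's default source format; --source_format is shown for clarity.
# --skip_leading_rows=1 assumes the file has a single header row.
bq load \
  --source_format=CSV \
  --skip_leading_rows=1 \
  analytics.sales_history \
  gs://my-data-bucket/sales.csv

# Optionally verify the load by counting rows in the table:
bq query --use_legacy_sql=false \
  'SELECT COUNT(*) AS row_count FROM analytics.sales_history'
```

Both commands require an authenticated Cloud SDK session with access to the project that owns the `analytics` dataset.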

Common mistakes

Trying to use `gsutil cp` or `gcloud` to interact with BigQuery tables. `gsutil cp` copies objects between Cloud Storage buckets and local files, and `bq://` is not a valid URL scheme (option A). There is no `gcloud bigquery import` command (option B), and `bq query` cannot SELECT directly from a `gs://` URI (option D).

Practice the full GCP Associate Cloud Engineer Practice Exam 6
