You can create Dataflow jobs very quickly using the built-in templates. Which types of Dataflow templates are available from the Dataflow template dropdown menu?
Process data continuously (stream)
Process data in bulk (batch)
Utilities
:
When transferring data into a Google Cloud Storage bucket, which sources can you select to transfer your data from?
Google Cloud Storage bucket
Amazon S3 bucket
Azure storage container
:
When changing the storage class of a Cloud Storage bucket, you sometimes need to transform the existing objects so the new storage class takes effect. Which gsutil command is used to accomplish this?
rewrite
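As a minimal sketch of the answer above (the bucket name is a placeholder), the command rewrites each object in place so it adopts the new storage class:

```shell
# Rewrite every object in the bucket so it takes on the Nearline
# storage class; the bucket name below is hypothetical.
gsutil rewrite -s nearline gs://my-example-bucket/**
```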
:
Which history options are available when viewing the history of your jobs in BigQuery?
Project history
Personal history
:
Cloud SQL performs two distinctly different types of backups. What are these two types?
On-Demand backups
Automated backups
:
When importing Firestore entities, which file type must be used to start the import?
overall_export_metadata
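As a hedged illustration, the import is typically started by pointing gcloud at the export prefix whose folder contains the .overall_export_metadata file (the bucket and folder names here are placeholders):

```shell
# Start an import from a previous export; the named folder must contain
# the .overall_export_metadata file produced by that export.
gcloud firestore import gs://my-example-bucket/2023-01-01T00:00:00_12345/
```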
:
When using BigQuery to query data held in other GCP services, which external data sources are supported?
Bigtable
Cloud SQL
Google Cloud Storage
:
A system administrator is using the command line to view information about a bucket called storagebucket709281. The administrator wants to view all the labels that are associated with that bucket; which command should they run?
gsutil ls -L -b gs://storagebucket709281
:
To view all buckets in your Google Cloud project using gsutil, which command should you specify?
gsutil ls
:
You need to set up a policy so that files stored in a Cloud Storage bucket are moved to Nearline after 30 days and then deleted one year after creation. How should you set up the policy?
Use Cloud Storage object lifecycle management. Set the SetStorageClass action at 30 days and the Delete action at 365 days
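One way to implement the answer above (a sketch; the bucket and file names are placeholders) is to write the two lifecycle rules as JSON and apply them with `gsutil lifecycle set`:

```shell
# lifecycle.json — move objects to Nearline at 30 days, delete at 365 days
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "NEARLINE"},
      "condition": {"age": 30}
    },
    {
      "action": {"type": "Delete"},
      "condition": {"age": 365}
    }
  ]
}
EOF

# Apply the policy to the bucket (bucket name is hypothetical)
gsutil lifecycle set lifecycle.json gs://my-example-bucket
```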
:
When trying to estimate query costs for BigQuery, which two options are available to a user?
Query validator
Google cloud pricing calculator
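The query validator's per-query byte estimate can also be reproduced from the command line with a dry run (the query and public table below are illustrative):

```shell
# Dry run: validates the query and reports how many bytes it would
# process, without actually running it or incurring any cost.
bq query --use_legacy_sql=false --dry_run \
  'SELECT name FROM `bigquery-public-data.usa_names.usa_1910_2013` LIMIT 10'
```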
:
When running Dataproc jobs, which job types are available to select from the dropdown menu?
Spark
Hadoop
Hive
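As a hedged example, a job of the first type above (Spark) can also be submitted from the command line rather than the dropdown menu; the cluster name and region here are placeholders:

```shell
# Submit the bundled SparkPi example job to an existing Dataproc cluster.
gcloud dataproc jobs submit spark \
  --cluster=my-cluster \
  --region=us-central1 \
  --class=org.apache.spark.examples.SparkPi \
  --jars=file:///usr/lib/spark/examples/jars/spark-examples.jar \
  -- 1000
```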