Google BigQuery is a RESTful web service that the Google Cloud Platform provides.
Before you use Google BigQuery Connector, you must complete the following prerequisite tasks:
Create a Google account to access Google BigQuery.
On the Credentials page, navigate to the APIs and auth section, and create a service account. After you create the service account, you can download a JSON file that contains the client_email, project_id, and private_key values. You will need to enter these details when you create a Google BigQuery connection in Data Integration.
The following image shows the Credentials page where you can create the service account and key:
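For illustration, the downloaded key file is a JSON document. The following minimal Python sketch, which assumes a hypothetical key.json path and uses only the standard library, extracts the three values that the connection needs:

import json

# Hypothetical path to the service account key file downloaded
# from the Credentials page.
KEY_FILE = "key.json"

with open(KEY_FILE) as f:
    key = json.load(f)

# The three values that you enter when you create a Google BigQuery
# connection in Data Integration.
print("client_email:", key["client_email"])
print("project_id:  ", key["project_id"])
print("private_key present:", bool(key.get("private_key")))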
Enable the APIs for Google BigQuery and Google Cloud Storage. Google BigQuery Connector uses the Google APIs to integrate with Google BigQuery and Google Cloud Storage.
The following image shows the Dashboard page where you can enable the APIs:
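As a sanity check after enabling the APIs, you can make any call against the project. The following sketch assumes the google-cloud-bigquery client library, which is not part of the connector, and a hypothetical key.json path; the call fails with a descriptive error if the BigQuery API is not enabled for the project:

from google.cloud import bigquery  # pip install google-cloud-bigquery

# The project ID is taken from the hypothetical key file.
client = bigquery.Client.from_service_account_json("key.json")

# Any API call raises an error such as "API ... is disabled" if the
# BigQuery API is not enabled for the project.
for dataset in client.list_datasets():
    print(dataset.dataset_id)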
Create a project and dataset in Google BigQuery. Verify that the dataset contains the source table and the target table. You will need to enter the project ID, dataset ID, source table name, and target table name when you create tasks and mappings in Data Integration.
The following image shows a project:
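If you prefer to create the dataset and tables programmatically rather than in the console, a minimal sketch with the google-cloud-bigquery client library might look like the following. The dataset, table, and column names are hypothetical placeholders:

from google.cloud import bigquery

client = bigquery.Client.from_service_account_json("key.json")

# Hypothetical IDs; substitute your own dataset and table names.
dataset_id = f"{client.project}.demo_dataset"
schema = [
    bigquery.SchemaField("id", "INTEGER"),
    bigquery.SchemaField("name", "STRING"),
]

# Create the dataset, then a source table and a target table in it.
client.create_dataset(dataset_id, exists_ok=True)
for table_name in ("source_table", "target_table"):
    table = bigquery.Table(f"{dataset_id}.{table_name}", schema=schema)
    client.create_table(table, exists_ok=True)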
Verify that you have read and write access to the Google BigQuery dataset that contains the source table and target table.
When you read data from or write data to a Google BigQuery table, you must have the following permissions:
bigquery.datasets.create
bigquery.datasets.get
bigquery.datasets.getIamPolicy
bigquery.datasets.updateTag
bigquery.models.*
bigquery.routines.*
bigquery.tables.create
bigquery.tables.delete
bigquery.tables.export
bigquery.tables.get
bigquery.tables.getData
bigquery.tables.list
bigquery.tables.update
bigquery.tables.updateData
bigquery.tables.updateTag
resourcemanager.projects.get
resourcemanager.projects.list
bigquery.jobs.create
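One way to confirm that the service account actually holds the table-scoped subset of the permissions above is the BigQuery testIamPermissions call. The following sketch assumes the google-cloud-bigquery client library and a hypothetical table name; only table-level permissions can be tested this way, not the dataset- or project-level ones:

from google.cloud import bigquery

client = bigquery.Client.from_service_account_json("key.json")

# Hypothetical fully qualified table name.
table = client.get_table("my_project.my_dataset.target_table")

# Only table-scoped permissions can be checked against a table.
wanted = [
    "bigquery.tables.get",
    "bigquery.tables.getData",
    "bigquery.tables.update",
    "bigquery.tables.updateData",
    "bigquery.tables.delete",
]
response = client.test_iam_permissions(table, wanted)
granted = set(response.get("permissions", []))
for permission in wanted:
    print(permission, "granted" if permission in granted else "MISSING")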
When you only read data from a Google BigQuery table, you must have the following permissions:
bigquery.datasets.get
bigquery.datasets.getIamPolicy
bigquery.models.getData
bigquery.models.getMetadata
bigquery.models.list
bigquery.routines.get
bigquery.routines.list
bigquery.tables.export
bigquery.tables.get
bigquery.tables.getData
bigquery.tables.list
resourcemanager.projects.get
resourcemanager.projects.list
bigquery.jobs.create
bigquery.tables.create
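With the read permission set in place, a minimal read from the source table might look like the following sketch; the table and column names are hypothetical. Running the query requires bigquery.jobs.create, and fetching the rows requires bigquery.tables.getData on the table:

from google.cloud import bigquery

client = bigquery.Client.from_service_account_json("key.json")

# Hypothetical table and column names.
query = "SELECT id, name FROM `my_project.my_dataset.source_table` LIMIT 10"
for row in client.query(query).result():
    print(row["id"], row["name"])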
If your organization passes data through a proxy or protective firewall, you must configure the firewall to allow the www.googleapis.com URI so that Google BigQuery Connector can transfer data through the proxy or firewall.
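The Google client libraries, like most HTTP stacks, honor the standard proxy environment variables. The following sketch, with a hypothetical proxy address, routes traffic through a proxy and checks that www.googleapis.com is reachable through the firewall:

import os
import urllib.request

# Hypothetical proxy address; urllib and the Google client libraries
# pick up the standard proxy environment variables.
os.environ["HTTPS_PROXY"] = "http://proxy.example.com:3128"

# A quick connectivity check: if the firewall does not allow
# www.googleapis.com, this request fails or times out.
url = "https://www.googleapis.com/discovery/v1/apis"
with urllib.request.urlopen(url, timeout=10) as resp:
    print("reachable, HTTP status:", resp.status)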
If you use bulk mode, verify that you have write access to the Google Cloud Storage path where the Secure Agent creates the staging file.
If you use staging mode, verify that you have read access to the Google Cloud Storage path where the Secure Agent creates the staging file to store the data from the Google BigQuery source.
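For both modes, you can check access to the staging path up front. The following sketch assumes the google-cloud-storage client library and a hypothetical bucket and prefix, and round-trips a small object: the upload exercises the write access that bulk mode needs, and the download exercises the read access that staging mode needs:

from google.cloud import storage  # pip install google-cloud-storage

client = storage.Client.from_service_account_json("key.json")

# Hypothetical bucket and staging prefix.
bucket = client.bucket("my-staging-bucket")
blob = bucket.blob("staging/connector_access_check.txt")

# Bulk mode needs write access: create a small object...
blob.upload_from_string("access check")

# ...and staging mode needs read access: read it back, then clean up.
print(blob.download_as_bytes().decode())
blob.delete()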