Google BigQuery Connectors

Administration of Google BigQuery Connector
Google BigQuery is a RESTful web service provided by Google Cloud Platform.
Before you use Google BigQuery Connector, you must complete the following prerequisite tasks:
  • Create a Google account to access Google BigQuery.
  • On the Credentials page, navigate to the APIs and auth section and create a service account. After you create the service account, you can download a JSON key file that contains the client_email, project_id, and private_key values. You will need to enter these details when you create a Google BigQuery connection in Data Integration. To confirm that the key file works, see the first example after this list.
    [Image: the Credentials page where you create a service account and obtain a service account key.]
  • On the Dashboard page of the Google API Console, https://console.developers.google.com/, enable the BigQuery API and the Google Cloud Storage JSON API. Google BigQuery Connector uses these Google APIs to integrate with Google BigQuery and Google Cloud Storage. To confirm that both APIs are enabled, see the second example after this list.
    [Image: the Dashboard page where you enable the APIs.]
  • Create a project and dataset in Google BigQuery. Verify that the dataset contains the source table and the target table. You will need to enter the project ID, dataset ID, source table name, and target table name when you create tasks and mappings in Data Integration. The third example after this list sketches how to create a dataset and tables.
    [Image: a sample project.]
  • Verify that you have read and write access to the Google BigQuery dataset that contains the source table and target table.
  • When you read data from or write data to a Google BigQuery table, you must have the following permissions (the fourth example after this list shows how to spot-check the table-level permissions):
    • bigquery.datasets.create
    • bigquery.datasets.get
    • bigquery.datasets.getIamPolicy
    • bigquery.datasets.updateTag
    • bigquery.models.*
    • bigquery.routines.*
    • bigquery.tables.create
    • bigquery.tables.delete
    • bigquery.tables.export
    • bigquery.tables.get
    • bigquery.tables.getData
    • bigquery.tables.list
    • bigquery.tables.update
    • bigquery.tables.updateData
    • bigquery.tables.updateTag
    • resourcemanager.projects.get
    • resourcemanager.projects.list
    • bigquery.jobs.create
  • When you only read data from a Google BigQuery table, you must have the following permissions:
    • bigquery.datasets.get
    • bigquery.datasets.getIamPolicy
    • bigquery.models.getData
    • bigquery.models.getMetadata
    • bigquery.models.list
    • bigquery.routines.get
    • bigquery.routines.list
    • bigquery.tables.export
    • bigquery.tables.get
    • bigquery.tables.getData
    • bigquery.tables.list
    • resourcemanager.projects.get
    • resourcemanager.projects.list
    • bigquery.jobs.create
    • bigquery.tables.create
  • If your organization passes data through a proxy or protective firewall, configure the firewall to allow the www.googleapis.com URI so that Google BigQuery Connector can transfer data through the proxy or firewall.
  • If you use bulk mode, verify that you have write access to the Google Cloud Storage path where the Secure Agent creates the staging file.
  • If you use staging mode, verify that you have read access to the Google Cloud Storage path where the Secure Agent creates the staging file to store the data from the Google BigQuery source. The last example after this list sketches a read and write check on a staging path.
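
The following examples are optional verification sketches, not part of Google BigQuery Connector itself. They assume Python with the google-cloud-bigquery and google-cloud-storage client libraries installed, and all project, dataset, table, bucket, and file names in them are placeholders that you must replace with your own values. The first sketch confirms that the downloaded JSON key file authenticates successfully:

    from google.cloud import bigquery
    from google.oauth2 import service_account

    # Placeholder path to the JSON key downloaded from the Credentials page.
    KEY_PATH = "service-account.json"

    # The client_email, project_id, and private_key values are read from
    # the key file automatically.
    credentials = service_account.Credentials.from_service_account_file(KEY_PATH)
    client = bigquery.Client(credentials=credentials, project=credentials.project_id)
    print("Authenticated as", credentials.service_account_email)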
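
The second sketch makes one lightweight call to each service to confirm that the BigQuery API and the Google Cloud Storage JSON API are enabled. Each call fails with an error that names the disabled API if it has not been enabled yet.

    from google.cloud import bigquery, storage

    PROJECT = "my-project"  # placeholder project ID

    # Listing datasets exercises the BigQuery API.
    bq = bigquery.Client(project=PROJECT)
    print("Datasets:", [d.dataset_id for d in bq.list_datasets()])

    # Listing buckets exercises the Google Cloud Storage JSON API.
    gcs = storage.Client(project=PROJECT)
    print("Buckets:", [b.name for b in gcs.list_buckets()])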
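
The third sketch creates a dataset with a source table and a target table. The two-column schema is a minimal placeholder; real tables must match the schema of the data that you move.

    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")  # placeholder project ID

    # Create the dataset that holds the source and target tables.
    client.create_dataset("my_dataset", exists_ok=True)

    # Minimal placeholder schema.
    schema = [
        bigquery.SchemaField("id", "INTEGER"),
        bigquery.SchemaField("name", "STRING"),
    ]
    for name in ("source_table", "target_table"):
        table = bigquery.Table(f"my-project.my_dataset.{name}", schema=schema)
        client.create_table(table, exists_ok=True)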
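
The fourth sketch spot-checks permissions with the BigQuery testIamPermissions API. Only the table-level subset of the permissions listed above can be tested this way; dataset-level and project-level permissions must be verified in the IAM console.

    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")  # placeholder project ID

    # Table-level subset of the permissions listed above.
    wanted = [
        "bigquery.tables.get",
        "bigquery.tables.getData",
        "bigquery.tables.update",
        "bigquery.tables.updateData",
    ]
    response = client.test_iam_permissions("my-project.my_dataset.source_table", wanted)
    granted = set(response.get("permissions", []))
    print("Missing permissions:", sorted(set(wanted) - granted) or "none")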
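
The last sketch verifies read and write access to a Google Cloud Storage staging path by round-tripping a small object. The bucket name and prefix are placeholders for the path that the Secure Agent uses for the staging file.

    from google.cloud import storage

    client = storage.Client(project="my-project")  # placeholder project ID
    bucket = client.bucket("my-staging-bucket")    # placeholder bucket name

    # Bulk mode requires write access: upload a small object under the prefix.
    blob = bucket.blob("staging/access_check.txt")
    blob.upload_from_string("write check")

    # Staging mode requires read access: download the object back.
    assert blob.download_as_bytes() == b"write check"
    blob.delete()
    print("Read and write access to the staging path verified.")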
