Table of Contents

  1. Preface
  2. Introduction to PowerExchange for Google BigQuery
  3. PowerExchange for Google BigQuery Configuration
  4. Google BigQuery Sources and Targets
  5. Google BigQuery Mappings
  6. Google BigQuery Sessions
  7. Google BigQuery as CDC Target
  8. Google BigQuery Pushdown Optimization
  9. Appendix A: Google BigQuery Data Type Reference

PowerExchange for Google BigQuery User Guide for PowerCenter

Import Google BigQuery Source and Target Definitions

Use the Create PowerExchange for BigQuery Source wizard or the Create PowerExchange for BigQuery Target wizard to import Google BigQuery source and target definitions into the PowerCenter repository.
You must import Google BigQuery source and target objects before you create a mapping.
  1. Start PowerCenter Designer, and connect to a PowerCenter repository configured with a Google BigQuery instance.
  2. Open a source or target folder.
  3. Select Source Analyzer or Target Designer.
    • In the Source Analyzer, click Sources > Create PowerExchange for BigQuery Source, as shown in the following image:
      The image shows the location of the Create PowerExchange For Google BigQuery Source option.
      The Google BigQuery Connection wizard appears.
    • In the Target Designer, click Targets > Create PowerExchange for BigQuery Target, as shown in the following image:
      The image shows the location of the Create PowerExchange For Google BigQuery Target option.
      The Google BigQuery Connection dialog box appears.
  4. Configure the following connection parameters:
    Service Account ID
    Specifies the client_email value present in the JSON file that you download after you create a service account.
    Service Account Key
    Specifies the private_key value present in the JSON file that you download after you create a service account. For a sketch of where these values appear in the downloaded key file, see the example that follows this procedure.
    Connection mode
    The mode that you want to use to read data from or write data to Google BigQuery.
    Select one of the following connection modes:
    • Simple. Flattens each field within the Record data type field as a separate field in the mapping.
    • Hybrid. Displays all the top-level fields in the Google BigQuery table including Record data type fields. PowerExchange for Google BigQuery displays the top-level Record data type field as a single field of the String data type in the mapping.
    • Complex. Displays all the columns in the Google BigQuery table as a single field of the String data type in the mapping.
    Default is Simple.
    For an illustration of how a Record data type column appears in each mode, see the sketch that follows this procedure.
    Schema Definition File Path
    Specifies a directory on the client machine where the PowerCenter Integration Service must create a JSON file with the sample schema of the Google BigQuery table. The JSON file name is the same as the Google BigQuery table name.
    Alternatively, you can specify a storage path in Google Cloud Storage where the PowerCenter Integration Service must create a JSON file with the sample schema of the Google BigQuery table. You can download the JSON file from the specified storage path in Google Cloud Storage to a local machine.
    Project ID
    Specifies the project_id value present in the JSON file that you download after you create a service account.
    If you have created multiple projects with the same service account, enter the ID of the project that contains the dataset that you want to connect to.
    Storage Path
    This property applies when you read or write large volumes of data.
    Path in Google Cloud Storage where the PowerCenter Integration Service creates a local stage file to store the data temporarily.
    You can either enter the bucket name or the bucket name and folder name.
    For example, enter gs://<bucket_name> or gs://<bucket_name>/<folder_name>.
    Use Legacy SQL For Custom Query
    Uses Legacy SQL to define a custom query. If you clear this option, you must use Standard SQL to define a custom query. For an example of the table-reference syntax in each dialect, see the sketch that follows this procedure.
    Dataset Name for Custom Query
    When you define a custom query, you must specify a Google BigQuery dataset.
    Region ID
    The region name where the Google BigQuery dataset resides.
    For example, if you want to connect to a Google BigQuery dataset that resides in the Las Vegas region, specify us-west4 as the Region ID.
    In the Storage Path connection property, ensure that you specify a bucket name, or the bucket name and folder name, that resides in the same region as the dataset in Google BigQuery.
    For more information about the regions supported by Google BigQuery, see the following Google BigQuery documentation: https://cloud.google.com/bigquery/docs/locations
    Optional Properties
    Specifies whether you can configure certain source and target functionalities through custom properties.
    You can select one of the following options:
    • None. Select None if you do not want to configure any custom properties.
    • Required. Select Required if you want to specify custom properties to configure the source and target functionalities.
    Default is None.
    Provide Optional Properties
    Comma-separated key-value pairs of custom properties in the Google BigQuery connection to configure certain source and target functionalities.
    Appears only when you select Required in the Optional Properties field.
  5. Click Test to test the connection.
  6. Click Finish to add the connection.
    The Select Objects from tab appears.
  7. Select the dataset in Package Explorer.
    A list of tables appears, as shown in the following image:
    The image shows the Package Explorer tab where a list of tables appears after you select the dataset.
  8. Select the table that you want to import, and then click Import. You can import multiple tables from a Google BigQuery dataset.
    To view the table metadata, select the table and double-click the table name.
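
The Service Account ID, Service Account Key, and Project ID connection parameters map to the client_email, private_key, and project_id fields of the JSON key file that you download when you create a Google Cloud service account. The following Python sketch, which assumes a standard service account key file and uses a placeholder file name, shows one way to read those values before you enter them in the wizard:

    import json

    # Placeholder path to the service account key file downloaded from Google Cloud.
    key_file = "service-account-key.json"

    with open(key_file, encoding="utf-8") as f:
        key = json.load(f)

    # client_email -> Service Account ID connection parameter
    # project_id   -> Project ID connection parameter
    # private_key  -> Service Account Key connection parameter
    print("Service Account ID :", key["client_email"])
    print("Project ID         :", key["project_id"])
    print("Service Account Key:", key["private_key"][:40] + "...")  # truncated for display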
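As a rough illustration of the three connection modes, assume a hypothetical Google BigQuery table with a top-level Record column named address. All names in this sketch are placeholders, not fields from your table:

    # Hypothetical row from a table that contains a Record column "address".
    row = {
        "id": 1,
        "name": "Jane",
        "address": {"city": "Las Vegas", "zip": "89101"},  # Record data type column
    }

    # Simple mode: each field within the Record column becomes a separate mapping field.
    simple_fields = ["id", "name", "address.city", "address.zip"]

    # Hybrid mode: top-level fields are kept, and the Record column appears as a
    # single String field that holds the record as JSON text.
    hybrid_fields = ["id", "name", "address"]

    # Complex mode: all columns appear as a single String field that holds the
    # entire row as JSON text (the field name shown here is a placeholder).
    complex_fields = ["row_as_json"]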
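The Use Legacy SQL For Custom Query option controls which BigQuery SQL dialect your custom query must use. As a minimal sketch, with placeholder project, dataset, and table names, the two dialects reference tables differently:

    # Legacy SQL (option selected): square brackets, with a colon after the project ID.
    legacy_sql_query = "SELECT id, name FROM [my-project:my_dataset.my_table] LIMIT 10"

    # Standard SQL (option cleared): backticks, with dots throughout.
    standard_sql_query = "SELECT id, name FROM `my-project.my_dataset.my_table` LIMIT 10"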
