Separate multiple options with a space. To enter a value that contains a space or other non-alphanumeric character, enclose the value in quotation marks. Use the following syntax: -o option_name=value option_name=value ...
./infacmd.sh createconnection -dn Domain_Google -un Administrator -pd Administrator -cn GBQ_cmd -ct BIGQUERY -o "CLIENTEMAIL=email@example.com PRIVATEKEY='-----BEGIN PRIVATE KEY-----\nabcd1234322dsa\n-----END PRIVATE KEY-----\n' PROJECTID=api-project-80697026669 CONNECTORTYPE=Complex SCHEMALOCATION='gs://01bucket' STORAGEPATH='gs://01bucket'"
Required. Specifies the client_email value present in the JSON file that you download after you create a service account in Google BigQuery.
Required. Specifies the private_key value present in the JSON file that you download after you create a service account in Google BigQuery.
Required. The connection mode that you want to use to read data from or write data to Google BigQuery.
Enter one of the following connection modes:
- Simple
- Hybrid
- Complex
Default is Simple.
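As a rough illustration of how the connection mode changes the shape of the data, the Python sketch below flattens a nested record into separate fields versus serializing the whole record to a single JSON string. The row and field names are invented for the example and do not come from the product.

```python
import json

# Hypothetical row from a Google BigQuery table with a nested RECORD
# column; the names here are illustrative only.
row = {"id": 7, "address": {"city": "Kyoto", "zip": "600"}}

def flattened_view(record):
    """Sketch of a flattened (Simple-style) output: each nested field
    becomes a separate top-level field such as address.city."""
    flat = {}
    for key, value in record.items():
        if isinstance(value, dict):
            for sub_key, sub_value in value.items():
                flat[f"{key}.{sub_key}"] = sub_value
        else:
            flat[key] = value
    return flat

def json_string_view(record):
    """Sketch of a Complex-style output: the entire record arrives as a
    single field holding a JSON string."""
    return {"record": json.dumps(record)}

print(flattened_view(row))
print(json_string_view(row))
```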
Schema Definition File Path
Required. Specifies a directory on the client machine where PowerExchange for Google BigQuery must create a JSON file with the sample schema of the Google BigQuery table. The JSON file name is the same as the Google BigQuery table name.
Alternatively, you can specify a storage path in Google Cloud Storage where PowerExchange for Google BigQuery must create the JSON file with the sample schema of the Google BigQuery table. You can download the JSON file from the specified storage path in Google Cloud Storage to a local machine.
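Because the generated schema file takes the table's name, you can predict where it will land for either a local directory or a Google Cloud Storage path. A minimal sketch; the directory and table names below are placeholders, not values from the product:

```python
def schema_file_path(location, table_name):
    # The schema JSON file is named after the table; this helper is an
    # assumption-based illustration that works for both a local
    # directory and a gs:// storage path.
    return location.rstrip("/") + "/" + table_name + ".json"

print(schema_file_path("/tmp/schemas", "customers"))
print(schema_file_path("gs://01bucket", "customers"))
```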
Required. Specifies the project_id value present in the JSON file that you download after you create a service account in Google BigQuery.
If you have created multiple projects with the same service account, enter the ID of the project that contains the dataset that you want to connect to.
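The client_email, private_key, and project_id values all come from the same service account key file that you download from Google. A minimal Python sketch, assuming a downloaded key file named key.json with the standard Google service-account layout; the sample values are invented:

```python
import json

# Write a stand-in key file so the example is self-contained; a real
# key.json is downloaded from the Google Cloud console.
sample = {
    "type": "service_account",
    "project_id": "api-project-80697026669",
    "client_email": "email@example.com",
    "private_key": "-----BEGIN PRIVATE KEY-----\nabcd\n-----END PRIVATE KEY-----\n",
}
with open("key.json", "w") as f:
    json.dump(sample, f)

with open("key.json") as f:
    key = json.load(f)

# These three values map to the CLIENTEMAIL, PRIVATEKEY, and PROJECTID
# connection options.
print(key["client_email"])
print(key["project_id"])
```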
Required when you read or write large volumes of data.
Path in Google Cloud Storage where PowerExchange for Google BigQuery creates a local stage file to store the data temporarily.
You can either enter the bucket name or the bucket name and folder name.
For example, enter gs://&lt;bucket_name&gt; or gs://&lt;bucket_name&gt;/&lt;folder_name&gt;.
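To sanity-check a storage path before creating the connection, you could test it against the bucket-or-bucket-and-folder form described above. The pattern below is an illustration of that form, not the product's actual validation rule:

```python
import re

def is_valid_storage_path(path):
    # Accepts gs://<bucket> or gs://<bucket>/<folder...>; the bucket
    # character class here is a simplification of GCS naming rules.
    return re.fullmatch(r"gs://[a-z0-9][a-z0-9._-]*(/[^/]+)*", path) is not None

print(is_valid_storage_path("gs://01bucket"))          # bucket only
print(is_valid_storage_path("gs://01bucket/staging"))  # bucket and folder
print(is_valid_storage_path("01bucket"))               # missing gs:// prefix
```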