Data Integration Connectors
CR | Description
--- | ---
CCON-45305 | When the unconnected lookup contains a function in the :LKP expression, the mapping fails with an SQL compilation error. (October 2022)
CCON-44975 | When you update a Google BigQuery target using an update column of the Float data type and select the Disable Duplicate Update Rows target advanced property, the mapping fails with the following error: `[ERROR] The [QUERY] job failed with the following error: [Partitioning by expressions of type FLOAT64 is not allowed.` See the sketch after this table.
CCON-42544 | When you enable bulk mode with the CSV staging file format in a mapping that uses a Google BigQuery V2 connection in complex mode to write data to a Google BigQuery target, the error message that appears when the mapping fails differs between mappings that stage data locally in a flat file and mappings that stage data on Google Cloud Storage. (October 2022)
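The error quoted for CCON-44975 reports a BigQuery-side restriction: a FLOAT64 expression cannot be used as a table's partitioning key. A minimal sketch of that restriction, with illustrative dataset, table, and column names:

```sql
-- Illustrative only: BigQuery rejects FLOAT64 as a partitioning key,
-- which is the restriction behind the CCON-44975 error.
CREATE TABLE my_dataset.stage_by_float (id INT64, amount FLOAT64)
PARTITION BY amount;
-- Fails: Partitioning by expressions of type FLOAT64 is not allowed.

-- A supported partitioning type, such as DATE, is accepted:
CREATE TABLE my_dataset.stage_by_date (id INT64, amount FLOAT64, updated DATE)
PARTITION BY updated;
```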
CR | Description
--- | ---
CCI-2446 | When you run a mapping with multiple pipelines and staging enabled, where one pipeline has a cached source lookup and another has an uncached source lookup, the mapping fails with the following error: `TM_6085 A fatal error occurred at transformation [Lookup1], and the session is terminating.` When you run a mapping with multiple pipelines and staging disabled, and the pipelines have uncached source lookups, the mapping fails with the following error: `[ERROR] Error occurred while writing to output buffer in Lookup Transformation - Row index [1] is out of the buffer capacity range.` (February 2023)
CCON-49268 | When you run a mapping in advanced mode to read data from a Google BigQuery source and write data to a Google BigQuery target, you see a data mismatch or the mapping fails when the boundary values of the following data types go out of range: (October 2022)
CCON-49302, CCON-49301 | When you run a mapping in advanced mode to read data from a Google BigQuery source and write data to a Google BigQuery target, and you configure a simple filter on columns with the following data types in the Google BigQuery source, you see a data mismatch: (October 2022)
CCON-41646 | A mapping configured in advanced mode that reads data of the Numeric data type with 28 digits and precision set to 28 writes null values to the Google BigQuery target column with precision set to 28. Workaround: To write the data to the column of the Numeric data type successfully, set the precision to 29 and the scale to 9. See the first sketch after this table.
CCON-33770 | When you run a mapping in advanced mode and configure the Google Cloud Storage path in the Google BigQuery V2 connection, the mapping runs successfully, but the Spark Driver log displays the following error: `GoogleHadoopFileSystemBase - No working directory configured, using default.` Workaround: Ignore the error message.
CCORE-1534 | When you enable the staging property in a mapping that reads multiple objects from a Google BigQuery V2 source, empty strings are written as null values in the target. See the second sketch after this table. (October 2022)
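For context on CCON-41646: BigQuery's NUMERIC type is a fixed-point decimal with 38 significant digits and a fixed scale of 9, which leaves room for up to 29 digits before the decimal point. A minimal sketch of those bounds; the literal values are illustrative:

```sql
-- Illustrative only: NUMERIC holds 38 digits total, 9 of them after the
-- decimal point, so both values below fit within the type.
SELECT
  CAST('9999999999999999999999999999' AS NUMERIC) AS digits_28,  -- 28 integer digits: valid
  CAST('0.123456789' AS NUMERIC)                  AS scale_9;    -- 9 fractional digits: valid
```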
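The second sketch shows why CCORE-1534 is a data-fidelity issue rather than a cosmetic one: BigQuery treats an empty string and NULL as distinct values, so writing one as the other changes the data. The column aliases are illustrative:

```sql
-- Illustrative only: an empty string is not NULL in BigQuery.
SELECT
  ''                    AS empty_string,
  CAST(NULL AS STRING)  AS null_value,
  '' IS NULL            AS empty_is_null;  -- evaluates to false
```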
CR | Description
--- | ---
CCON-40658 | When you run a mapping in advanced mode to read DateTime data from a Google BigQuery source column of the Record data type with the `YYYY-MM-DD HH24:MI:SS.US` format and write the data to a Google BigQuery target column of the Record data type that contains data of the DateTime data type, the Secure Agent truncates the microsecond values and writes the DateTime values in the `YYYY-MM-DD HH24:MI:SS.MS` format. See the first sketch after this table. Google ticket reference number: 208344974
CCON-39566 | Mappings enabled with source or full pushdown that use a Google BigQuery V2 connection fail when you configure a Filter transformation with the simple filter `col_dateTime > TO_DATE('2020-08-15 12:03:55.700000','YYYY-MM-DD HH24:MI:SS.US')` on a DateTime column in the Google BigQuery source and the DateTime column contains a value of 9999-12-31 23:59:59.999999. See the second sketch after this table. Google ticket reference number: 29137970
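The first sketch illustrates the precision loss described in CCON-40658: the source value carries microseconds, while the value that reaches the target keeps only milliseconds. The DATETIME literals are illustrative:

```sql
-- Illustrative only: microsecond precision (.US) at the source versus
-- the millisecond precision (.MS) written to the target.
SELECT
  DATETIME '2020-08-15 12:03:55.123456' AS source_value,   -- YYYY-MM-DD HH24:MI:SS.US
  DATETIME '2020-08-15 12:03:55.123'    AS written_value;  -- YYYY-MM-DD HH24:MI:SS.MS
```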
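The second sketch shows the shape of the CCON-39566 case: the filter itself is ordinary, and the failure surfaces only when the column holds BigQuery's maximum DATETIME value, 9999-12-31 23:59:59.999999. The table expression and alias are illustrative:

```sql
-- Illustrative only: a row at the DATETIME boundary value combined with
-- the comparison that the Filter transformation pushes down.
SELECT ts
FROM UNNEST([DATETIME '9999-12-31 23:59:59.999999']) AS ts  -- boundary value in the column
WHERE ts > DATETIME '2020-08-15 12:03:55.700000';           -- the simple filter condition
```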
Updated March 09, 2023