Release Notes for IDMC
| Issue | Description |
|---|---|
| CCON-106328 | When you run a mapping enabled for SQL ELT optimization to write multi-line string fields to a flat file and the data is enclosed in double quotes, the data is incorrectly written to the target. (October 2025) |
| CCON-105920 | When you run a Databricks mapping and use the Permanent IAM credentials authentication to stage data in Amazon S3, the mapping fails with the following error if the temporary security credentials for Amazon S3 expire: `[ERROR] com.amazonaws.services.s3.model.AmazonS3Exception: The provided token has expired. Service: Amazon S3; Status Code: 400; Error Code: ExpiredToken;` (October 2025) |
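As a side note on the ExpiredToken condition reported for CCON-105920: client code that stages data in Amazon S3 can detect this error code programmatically and refresh its temporary credentials before retrying. The sketch below is a hypothetical illustration, not part of IDMC; the helper name is an assumption, and the error dictionary simply mirrors the standard AWS service error format.

```python
# Hypothetical sketch: detect an AWS "ExpiredToken" error response so the
# caller can refresh temporary security credentials before retrying.
# The dictionary layout mirrors the standard AWS service error format.

def is_expired_token_error(error_response: dict) -> bool:
    """Return True if the AWS error response indicates expired credentials."""
    error = error_response.get("Error", {})
    return error.get("Code") == "ExpiredToken"

# Example payload matching the message reported for CCON-105920.
response = {
    "Error": {
        "Code": "ExpiredToken",
        "Message": "The provided token has expired.",
    },
    "ResponseMetadata": {"HTTPStatusCode": 400},
}

print(is_expired_token_error(response))  # True: refresh credentials and retry
```

A caller would typically wrap the S3 operation in a retry loop, refreshing the session credentials whenever this check returns True.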
| Issue | Description |
|---|---|
| CCI-5192 | When you run a mapping in advanced mode to write data to multiple Databricks targets and view the job results, the Individual Task Results section doesn't display results for each target separately. Instead, it displays a single consolidated result for all targets. However, the data is correctly written to all targets. (April 2025) |
| Issue | Description |
|---|---|
| CCON-109034 | When you stage data in a Volume in a mapping that writes to a Databricks target, the mapping might fail with the following error or result in partial data loss in the target table: `Error running query: [PATH_NOT_FOUND] org.apache.spark.sql.AnalysisException: [PATH_NOT_FOUND] Path does not exist: dbfs:/<VolumePath>. SQLSTATE: 42K03` (October 2025) Databricks ticket number: 00710956 |
| CCON-70459 | When you read more than 10 GB of data from Databricks, the mappings intermittently fail with the following error: `[ERROR] Read from Delta table failed` Databricks ticket number: 2310230030003014 |