Mapping fails if the temporary security credentials for Amazon S3 expire
When you run a Databricks mapping and use the Permanent IAM credentials authentication to stage data in Amazon S3, the mapping fails with the following error if the temporary security credentials for Amazon S3 expire:
[ERROR] com.amazonaws.services.s3.model.AmazonS3Exception: The provided token has expired. (Service: Amazon S3; Status Code: 400; Error Code: ExpiredToken;
To troubleshoot this issue, set the dbx.s3.sts.duration=7200 property either in the JVM options for the Secure Agent or in the custom properties for the mapping task.
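The property value presumably specifies the requested lifetime of the temporary security credentials in seconds (7200 seconds is two hours). For context only, the following Java sketch shows how a session token with that lifetime could be requested from AWS STS with the AWS SDK for Java v1. It is not the Secure Agent implementation; the property lookup and the StsDurationExample class are illustrative assumptions.

import com.amazonaws.services.securitytoken.AWSSecurityTokenService;
import com.amazonaws.services.securitytoken.AWSSecurityTokenServiceClientBuilder;
import com.amazonaws.services.securitytoken.model.Credentials;
import com.amazonaws.services.securitytoken.model.GetSessionTokenRequest;

public class StsDurationExample {
    public static void main(String[] args) {
        // Read the duration the way a -D JVM option is typically consumed,
        // defaulting to 7200 seconds (2 hours). Illustrative only.
        int durationSeconds = Integer.parseInt(
                System.getProperty("dbx.s3.sts.duration", "7200"));

        AWSSecurityTokenService sts = AWSSecurityTokenServiceClientBuilder.defaultClient();

        // Request temporary credentials that remain valid long enough for the
        // Amazon S3 staging to finish instead of expiring mid-mapping.
        Credentials credentials = sts
                .getSessionToken(new GetSessionTokenRequest().withDurationSeconds(durationSeconds))
                .getCredentials();

        System.out.println("Temporary credentials expire at: " + credentials.getExpiration());
    }
}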
To configure the JVM option in the Secure Agent properties, perform the following steps:
1. Log in to Informatica Intelligent Cloud Services.
2. Select Administrator > Runtime Environments.
3. On the Runtime Environments page, select the Secure Agent machine that runs the mapping.
4. Click Edit.
5. In the System Configuration Details section, select Data Integration Server as the service and DTM as the type.
6. Edit the JVM option and set the value to -Ddbx.s3.sts.duration=7200.
7. Click Save.
To configure the custom properties for the mapping task, perform the following steps:
1. In Data Integration, edit the mapping task where you want to configure the custom property.
2. On the Runtime Options tab, add the following property in the Advanced Session Properties section:
   Session Property Name: Custom Properties
   Session Property Value: dbx.s3.sts.duration=7200
3. Save the mapping task.
Incorrect data is written to a flat file for multi-line string fields in a mapping enabled for SQL ELT optimization
When you run a mapping enabled for SQL ELT optimization to write multi-line string fields to a flat file and the data is enclosed in double quotes, the data is incorrectly written to the target.
To troubleshoot this issue, set the -DDatabricksMatchQuotesPastEndOfLine JVM option to true in the Secure Agent properties.
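A quoted string field can legally contain line breaks, so its closing quote may appear on a later physical line; the JVM option tells the reader to keep matching quotes past the end of the current line. The following Java sketch illustrates that behavior only; it is not the connector's parser, and the class and method names are hypothetical.

import java.io.BufferedReader;
import java.io.StringReader;

public class MultiLineQuoteExample {

    // Returns one logical record: if a physical line leaves a double quote
    // unmatched, keep appending lines until the quote is closed.
    static String readLogicalRecord(BufferedReader reader) throws Exception {
        String line = reader.readLine();
        if (line == null) {
            return null;
        }
        StringBuilder record = new StringBuilder(line);
        // An odd number of quotes means a quoted field is still open.
        while (record.chars().filter(c -> c == '"').count() % 2 != 0) {
            String next = reader.readLine();
            if (next == null) {
                break;                            // unterminated quote at end of file
            }
            record.append('\n').append(next);     // keep the embedded line break
        }
        return record.toString();
    }

    public static void main(String[] args) throws Exception {
        String flatFile = "1,\"first line\nsecond line\",done\n2,\"single line\",done";
        BufferedReader reader = new BufferedReader(new StringReader(flatFile));
        String record;
        while ((record = readLogicalRecord(reader)) != null) {
            System.out.println("Record: " + record.replace("\n", "\\n"));
        }
    }
}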
To configure the JVM option in the Secure Agent properties, perform the following steps:
1. Log in to Informatica Intelligent Cloud Services.
2. Select Administrator > Runtime Environments.
3. On the Runtime Environments page, select the Secure Agent machine that runs the mapping.
4. Click Edit.
5. In the System Configuration Details section, select Data Integration Server as the service and DTM as the type.
6. Edit the JVM option and set the value to -DDatabricksMatchQuotesPastEndOfLine=true.
7. Click Save.
The mapping fails with a Read Timed Out error if the Databricks cluster is not up and running and the Databricks connection fails to connect to the cluster
When the Databricks cluster is not up and running and the Databricks connection fails to connect to the cluster, the mapping fails with a Read Timed Out error after approximately four and a half minutes.
To troubleshoot this issue, set the -DDatabricksRetryForClusterStart JVM option to true in the Secure Agent properties.
After you configure the JVM option, if the connection fails, the Secure Agent retries the connection to the Databricks SQL Warehouse cluster up to 3 times, with a 4.5-minute wait between attempts.
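The retry behavior can be pictured as the loop in the following Java sketch. Only the flag name, the retry count, and the 4.5-minute wait come from this section; the connection helper, the JDBC URL handling, and the class name are illustrative assumptions rather than the Secure Agent's actual code.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class ClusterStartRetryExample {

    private static final int MAX_ATTEMPTS = 3;        // up to 3 tries, as described above
    private static final long WAIT_MILLIS = 270_000L; // 4.5 minutes between attempts

    // Hypothetical helper that opens a JDBC connection to Databricks.
    static Connection connect(String jdbcUrl) throws SQLException {
        return DriverManager.getConnection(jdbcUrl);
    }

    static Connection connectWithRetry(String jdbcUrl) throws Exception {
        // Boolean.getBoolean reads the -DDatabricksRetryForClusterStart system property.
        int attempts = Boolean.getBoolean("DatabricksRetryForClusterStart") ? MAX_ATTEMPTS : 1;

        SQLException lastFailure = null;
        for (int attempt = 1; attempt <= attempts; attempt++) {
            try {
                return connect(jdbcUrl);
            } catch (SQLException e) {
                lastFailure = e;                       // cluster may still be starting
                if (attempt < attempts) {
                    Thread.sleep(WAIT_MILLIS);         // wait 4.5 minutes, then retry
                }
            }
        }
        throw lastFailure;                             // every attempt timed out
    }
}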
To configure the JVM option in the Secure Agent properties, perform the following steps:
1. Log in to Informatica Intelligent Cloud Services.
2. Select Administrator > Runtime Environments.
3. On the Runtime Environments page, select the Secure Agent machine that runs the mapping.