Rules and guidelines for JDBC V2 objects in mappings
Consider the following rules and guidelines for JDBC V2 objects used as sources and targets in mappings:
Schema names and table names
If you change the schema name in the connection, the updated schema name is not reflected in the user interface for the existing mapping object. However, the updated schema name is used at runtime.
In a mapping that uses the Create Target option, you can't override the schema name and table name.
The mapping fails when the target table name contains double quotes.
Import objects
When you import a JDBC V2 object, searching for object names that contain special characters returns no results.
Multiple objects
You can't parameterize multiple JDBC V2 source objects.
When you join multiple Azure SQL Database tables, the table names must be unique across schemas.
Data types
When you preview data for the time(4), time(5), and time(6) data types, the data is truncated beyond precision 3.
Do not specify a filter in a mapping for an Azure SQL Database source that contains the Datetime data type.
When you read or write data that contains the time(4), time(5), or time(6) data types, the data is truncated beyond precision 3.
When you write data that contains the Timestamp data type to a PostgreSQL or Azure SQL Database target using the Create New at Runtime option, the time zone is appended to the Timestamp data type value.
When you create a new Azure SQL Database or PostgreSQL target at runtime and the source data contains the Time data type, the Secure Agent writes the date and time value only up to microsecond precision.
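As an illustration of the precision limits above, a target table can declare time columns with precision 3 so that no fractional seconds are lost in transit. The table and column names here are hypothetical:

```sql
-- Hypothetical table: declaring TIME(3) keeps the stored precision
-- within the supported limit. TIME(4) through TIME(6) values would
-- be truncated beyond precision 3 when read or written.
CREATE TABLE event_log (
    event_id   INT     NOT NULL,
    event_time TIME(3) NOT NULL  -- millisecond precision is preserved
);
```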
Partitioning
You can't configure partitioning in a mapping.
Salesforce Data Cloud read
When you read data from Salesforce Data Cloud and the source contains the Date or Datetime data type, the mapping stops responding. To resolve the issue, change the Date or Datetime data type fields to String in the source object and in the corresponding field mapping.
SQL query
Ensure that the list of selected columns, the data types, and the order of the columns in the query match the columns, data types, and order in which they appear in the source object.
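For example, if a hypothetical source object CUSTOMERS exposes the columns CUST_ID (integer), CUST_NAME (varchar), and CREATED_AT (timestamp) in that order, a custom query must select the same columns, with the same data types, in the same order:

```sql
-- Matches the source object's column list, data types, and order.
SELECT CUST_ID, CUST_NAME, CREATED_AT
FROM CUSTOMERS;

-- A query that reorders or omits columns, such as
--   SELECT CUST_NAME, CUST_ID FROM CUSTOMERS;
-- does not match the source object and can cause the mapping to fail.
```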
You can't use an SQL query to read data from a Salesforce data lake.