Data Services All Products
The Web Services Consumer transformation and the REST Web Services Consumer transformation do not support the Timestamp with Time Zone data type.
When you run multiple concurrent instances of the same workflow, Mapping tasks might fail to update a persisted mapping output.
Workaround: Start the workflows with a ten-second delay between them.
On AIX operating systems, when you use an SSL-enabled Oracle connection and the Oracle 12C client to connect to an Oracle database, the mapping fails.
If you assign a workflow variable to a Human task output, the Data Integration Service does not update the Human task output value when the Human task runs.
When you try to delete a parameter set that is part of an application or workflow, the Developer tool generates a null pointer exception and does not delete the parameter set.
You cannot embed single or double quotes in the infacmd dis updateParameterSetEntries command or the infacmd dis addParameterSetEntries command if you run either command on a Linux machine from the C shell.
Workaround: You can embed single or double quotes in either command if you run the command from the bash shell.
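As an illustration of the shell-quoting difference behind this workaround (a sketch, not product code; the parameter entry name and value are hypothetical), Python's shlex module shows how a POSIX shell such as bash tokenizes an argument with embedded quotes:

```python
import shlex

# Hypothetical parameter set entry that embeds single quotes.
entry = "regionParam='EMEA'"

# shlex.quote wraps the string so a POSIX shell (such as bash) passes it
# through unchanged; C shell quoting rules differ and can reject this form.
token = shlex.quote(entry)

# shlex.split simulates POSIX-shell tokenization of the command line:
# the original value, embedded quotes included, comes back intact.
assert shlex.split(token)[0] == entry
print(token)
```

A C shell user can apply the same idea by handing the fully quoted command line to bash with `bash -c "..."` instead of running infacmd from csh directly.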
A mapping fails when the following circumstances are true:
Workaround: Do not change the data object type in a Read transformation if the mapping has a port list parameter or a sort list parameter in the parameter set.
When you include a Lookup transformation in a mapping, the Developer tool collapses the Lookup ports under the group name Lookup Columns. The editor does not show the links between the lookup ports and the downstream transformation because the lookup ports are not visible.
Workaround: In the Developer tool, click. Then click .
A mapping that performs single-source identity match analysis completes with errors when the following conditions are true:
Data preview fails for a Normalizer transformation with multiple occurring columns or records imported from PowerCenter.
Importing a mapping fails when you configure the domain twice in the Developer tool.
When you change the default parameter value for the resource parameter on the Data Object tab, the Data Object tab does not show the correct default value. The Data Object tab shows the original default parameter value for the resource parameter.
Workaround: Browse for and select a different resource parameter on the Data Object tab. Then browse again and select the original resource parameter. The correct default parameter value appears.
When you run a data preview on a Lookup transformation with a customized data object lookup source, an unexpected error might occur if a table is deleted or replaced from the customized data object.
Workaround: Create a Lookup transformation using the modified customized data object as the lookup source.
When you view a historical version of a mapping that is not valid and then select View optimized mapping, the Developer tool returns a null pointer exception. Close the error and open the Validation Log view to view the problems with the mapping.
You cannot preview or run a mapping that contains a Java transformation with an unconnected output port of the Timestamp with Time Zone data type.
When you use parameters for the control file path and name in a flat file data object and you use the resource parameter for the flat file source in the mapping, the mapping fails.
When you switch between the Parameter option and the Value option on the Data Object tab, the Developer tool opens the transformation General tab after you choose a new data object value. The Developer tool should continue to show the Data Object tab.
Workaround: Click the Data Object tab to view your changes.
If you set the ODBC provider to MongoDB or Cassandra to connect to the source, the Data Integration Service cannot push transformation logic to the source and returns a null pointer exception.
Workaround: Specify the ODBC provider in the ODBC connection object as Other and run the mapping.
The optimized mapping contains unconnected ports and the data preview fails when a mapping contains an output expression.
When you deploy a workflow that has a parameterized source and target, the workflow fails to create a target when the resource parameter is in an associated parameter set, and you run the workflow from the Developer tool.
Workaround: The first time you run the workflow, run the workflow using the infacmd wfs startWorkflow command. The next time you run the workflow you can run it from the Developer tool.
When you use a Decimal port with a precision of 38 digits in a mapping output expression, a Decimal overflow error occurs. The mapping does not fail.
Workaround: Set the mapping output data type to Double.
When you run multiple concurrent mappings from the infacmd command line for a long time, the mapping run might fail with an error.
If the connection between the Model repository and the Subversion version control system is dropped during initial synchronization, an attempt to repeat the synchronization operation may fail with an error like:
This occurs when the Model Repository Service encounters a file that was already synchronized.
To respond to this problem, perform the following steps:
When you drag a port from a transformation with multiple port groups to another object in a mapping, the Developer tool does not display the port links. The issue arises in data quality transformations such as the Address Validator transformation and the Match transformation.
When the Data Integration Service applies the cost-based optimization method to a mapping with an Aggregator transformation, it might add an extra Sorter transformation even if the data is sorted before the Joiner transformation and the Aggregator transformation appears after the Joiner transformation.
When you run a midstream profile on a multiple-group transformation such as the Router transformation or the Match transformation, the results of the first group are displayed for all groups.
If you create a parameterized lookup source and you change the input column data type in the lookup condition, the Developer tool returns the following unexpected error message:
Unable to validate cell value.
Workaround: To remove the message, press ESC to change the focus in the Developer tool user interface. Create another port with a valid data type for the lookup condition.
When you create a logical data object mapping from a flat file data source, the mapping fails when you include a non-reusable Sequence Generator transformation. You can use a reusable Sequence Generator transformation.
When you create a port selector in a reusable transformation and you choose to select ports by name, the Developer tool does not list available ports.
When you do not specify the date format in the Run Configurations dialog box or when you do not specify the Timestamp with Time Zone formats in the target file, the Data Integration Service rejects rows randomly during implicit conversion of a large data set.
Workaround: Verify that the data contains the date format specified in the Run Configurations dialog box and the Timestamp with Time Zone formats in the target file. You can use a data set with fewer than 100,000 rows.
Scorecard results do not appear when you create and run a new scorecard on a JSON or XML profile, and you receive a null pointer exception when you run the scorecard in the Analyst tool.
When the number of input rows is greater than 100,000 and the mapping contains a Java transformation with a Timestamp with Time Zone port, the mapping sometimes fails unexpectedly.
The Data Integration Service does not apply the cost-based optimization method to a mapping that contains an unspecified row limit or a LIMIT clause in the SQL transformation even if the mapping is configured to use the cost-based optimization method.
When you use the DATE_COMPARE(), GET_DATE_PART(), or LENGTH() function and enable full pushdown for a Teradata database, the Data Integration Service does not successfully push down the transformation logic. This issue occurs when you use an ODBC connection.
When you use an ODBC connection and write data to a Netezza target, the Data Integration Service rejects data of the Boolean and Timestamp data types.
When you parameterize a resource in a dynamic source and you choose to update the data object columns at run time, the Data Integration Service fails to resolve any system parameter that you configure in the source. The Data Integration Service might also fail to resolve a parameter for the flat file control file.
Workaround: Use a constant value instead of a system parameter for the source directory or the control file directory.
You cannot use the keyboard shortcut Ctrl+L to link ports.
Workaround: Use the mouse to drag a port from an input object or transformation to an output object or transformation.
You cannot bind a workflow parameter to a mapping parameter if the mapping parameter is one of the following parameter types: port, port list, sort list, expression, resource, or input linkset.
When the target for a Write transformation includes two database tables with a parent-child relationship, the mapping fails if you enable the option to Create or replace table at run time. The Data Integration Service drops and recreates the tables in an order that prevents recreation of the correct primary key-foreign key relationship between the parent and child tables.
If the connection between the Model repository and the Perforce version control system is dropped during check in of multiple objects, some objects will not be checked in. After the connection is re-established, these objects still cannot be checked in, because the Perforce workspace is corrupted when the connection drops.
To respond to this problem, restart the Model Repository Service. When you restart the Model Repository Service, the service automatically recreates the Perforce workspace. You can then check in the checked-out files and perform other version control system operations.
When a mapping enabled for partitioning contains a Normalizer transformation, the Data Integration Service always uses one thread to run the transformation. The Data Integration Service can use multiple threads to run the remaining mapping pipeline stages.
Default value does not always appear for the Timestamp with Time Zone input port in the testing panel of the Expression Editor.
Workaround: Verify that the source data contains the following format for Timestamp with Time Zone:
MM/DD/YYYY HH24:MI:SS TZR
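As a quick way to check that source data matches this mask (a sketch; the helper name and sample value are hypothetical), the mask maps directly onto Python's strptime directives, with the trailing TZR element handled as an IANA time zone region:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def parse_ts_tz(value: str) -> datetime:
    """Parse 'MM/DD/YYYY HH24:MI:SS TZR', e.g. '06/15/2015 14:30:00 America/New_York'."""
    # Split off the trailing time zone region (TZR); the rest is the timestamp.
    stamp, _, region = value.rpartition(" ")
    naive = datetime.strptime(stamp, "%m/%d/%Y %H:%M:%S")
    return naive.replace(tzinfo=ZoneInfo(region))

# Sample value is hypothetical; a ValueError or ZoneInfoNotFoundError here
# flags a row that does not conform to the expected format.
ts = parse_ts_tz("06/15/2015 14:30:00 America/New_York")
print(ts.isoformat())
```

Running every incoming value through a check like this before the mapping run identifies rows that would otherwise fail the default-value substitution.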
On AIX 6.1, a mapping fails with an unexpected condition when the mapping contains a Timestamp with Time Zone data type.
The Data Integration Service does not apply the cost-based optimization method to the mapping that contains a Timestamp with Time Zone data type even if the mapping is configured with the full optimizer level.
If the disk where the version control system stores Model repository objects runs out of space during a version control action, the action fails, and you may see a message that says the connection was lost. If an administrator attempts the same action in the Administrator tool, the repository logs have the correct error message about the disk space problem.
When you use Timestamp with Time Zone data type in the mapping, the data gets truncated if the precision exceeds seconds. The issue occurs when you enable data object caching on the logical data object mappings and the data object caching database is on IBM DB2 or Microsoft SQL Server.
Nanoseconds are ignored for Timestamp with Time Zone data in the expression result at the bottom of the testing panel in the Expression Editor.
The Developer tool does not display the tabs in the Properties view of transformations after you preview data.
Workaround: Click an empty area in the mapping editor and then select the transformation to view the tabs in the Properties view.
When you configure a mapping that contains a TO_BIGINT function and the function converts decimal values to bigint values for pushdown optimization, the mapping writes incorrect data to the target.
Workaround: Do not configure pushdown optimization for the mapping and run the mapping again.
When you export a mapping with a parameterized source or target and you import it into another project, the mapping fails. The issue occurs because the resource parameter default value references the original project name in the path.
Workaround: Update the resource parameter default value after you import the mapping.
When Data Transformation cannot process JSON or XML input files, the profile run fails.
The Match transformation performs identity analysis on the key field port and ignores the match strategy ports when you do not include the key field port in the match strategy. The transformation does not perform match analysis correctly on the key field data and does not create accurate clusters.
When you run an identity match mapping that performs dual-source analysis on columns with different names, the Match transformation fails and generates an error message. For example, the Match transformation fails when you compare a ZIP Code column in one data source with a Postcode column in another data source.
You cannot import a mapping with a Stored Procedure transformation from PowerCenter into the Developer tool.
When you parameterize a lookup source and an input port has a name conflict with a port in the lookup source, the Developer tool renames one of the ports. If the Developer tool renames a lookup port, the Developer tool issues a warning to change the port name. If you do not resolve the name conflict in the Developer tool, the transformation is valid, but you might receive unexpected results when you run the dynamic mapping.
Workaround: Change the name of the port in the lookup source to avoid a name conflict.
Column profile run fails when the following conditions are true:
You cannot create some types of parameters on a mapping Parameters tab if the mapping does not contain a transformation that supports the parameter type. The Developer tool shows a list of parameter types that includes only the parameter types available for the transformations in the mapping. For example, you cannot create a sort list parameter on the Parameters tab unless the mapping contains a Sorter transformation.
Workaround: Add the transformation to the mapping before you create the mapping parameter.
The Developer tool Progress view shows tasks in random order.
Expression format validation fails for the Timestamp with Time Zone functions: CREATE_TIMESTAMP_TZ, GET_TIMEZONE, GET_TIMESTAMP, and TO_TIMESTAMP_TZ.
You cannot use the keyboard to add an HTTP web connection.
Workaround: Use the mouse to add an HTTP web connection.
You cannot use the keyboard to add a web service connection.
Workaround: Use the mouse to add a web service connection.
A validated mapping fails to run with an expression parsing error because an expression contains Unicode punctuation characters in field names.
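To pre-flight field names for this problem, a hypothetical helper (not part of the product) can scan for non-ASCII characters whose Unicode category is punctuation, which is what trips the expression parser:

```python
import unicodedata

def nonascii_punct(name: str) -> list[str]:
    """Return non-ASCII characters in a field name whose Unicode
    general category is punctuation (P*)."""
    return [ch for ch in name
            if ord(ch) > 127 and unicodedata.category(ch).startswith("P")]

# U+2019 (right single quotation mark) is Unicode punctuation that can
# break expression parsing; ASCII letters and underscores pass through.
print(nonascii_punct("customer\u2019s_name"))
```

Renaming any flagged field before validation avoids the run-time parsing error.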
The Data Integration Service does not apply the cost-based optimization method when you configure the mapping to use load order constraints with the full optimizer level.
You cannot copy fields to the Ports view of a REST Web Service Consumer transformation.
Workaround: Manually add the ports to the REST Web Service Consumer transformation.
No validation error occurs if you create a workflow parameter name with a leading dollar sign ($).
Expression validation fails for dynamic expressions with functions that require arguments of a specific data type. For example, a REVERSE() function fails to validate because it requires an argument of the CHAR data type.
Workaround: Use a conversion function in the dynamic expression to specify the data type. For example, add the dynamic port within a TO_CHAR function.
You cannot specify a Timestamp with Time Zone data type with a time zone region in Daylight Saving Time (TZD) format.
When you configure a lookup condition on a Decimal (38,38) column of an SAP HANA source, the data preview fails and the mapping terminates.
You cannot use a delimiter other than a colon when specifying the time zone offset with the Timestamp with Time Zone data type.
Workaround: Change the delimiter to a colon for the time zone offset for the Timestamp with Time Zone data type.
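A sketch of normalizing the offset delimiter in source data before the run (the helper is hypothetical, and the alternate delimiters shown are assumptions about how non-conforming data might look):

```python
import re

def normalize_offset(value: str) -> str:
    """Rewrite a trailing UTC offset such as '+05.30' or '+0530' into the
    colon-delimited form '+05:30' expected for Timestamp with Time Zone."""
    # Match a signed two-digit hour, an optional dot/space delimiter, and
    # two minute digits at the end of the value; rejoin them with a colon.
    return re.sub(r"([+-]\d{2})[.\s]?(\d{2})$", r"\1:\2", value)

print(normalize_offset("06/15/2015 14:30:00 +05.30"))
```

Values that already use a colon are left unchanged, so the helper is safe to apply to an entire column.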
A mapping that reads from a flat file source might not be fully optimized at run time when the following conditions are true:
Workaround: Configure the flat file data object to use a string value or a user-defined parameter for the source file directory.
The Developer tool generates columns incorrectly from a control file that contains the Binary and Timestamp with Time Zone data types.
Workaround: Set the precision and scale to (36,9) for the Timestamp with Time Zone data type in the control file. Remove the columns containing the binary data type from the control file.
When you copy a transformation that has parameters, the Developer tool does not include the parameters in the copy of the transformation. This issue also occurs when you copy a mapping that contains the transformation with parameters.
When you create an Oracle connection with a case-sensitive user name, the Developer tool does not display the default schema.
Unable to read SAP HANA data for the columns of the Decimal data type with precision from 35 digits to 38 digits.
The infacmd dis addParameterSetEntries command fails if you run the infacmd dis deleteParameterSetEntries command immediately after it and then repeat the commands multiple times.
If you do not define the input schema or map the request input group element to the root element of the REST consumer input, the REST Web Service Consumer transformation fails without displaying an error message.
When you add custom ports, a non-reusable REST Web Service Consumer transformation incorrectly appends the new custom ports to deleted custom ports.
Workaround: Recreate the transformation.
You cannot use a non-reusable Sequence Generator transformation in a mapplet or a logical data object mapping. However, the Developer tool does not show any validation or run-time error if you copy a non-reusable Sequence Generator transformation from a mapping and paste it into a mapplet or a logical data object mapping.
When you use the ABORT() function in an Expression transformation, the Data Integration Service does not process the Expression transformation.
Workaround: Change the default value of the output port to 0 and run the mapping again.
A partitioned mapping fails if you use the default merge file name to sequentially merge the target output for all partitions.
Workaround: Change the default name of the merge file.
When you run data preview on an Oracle table with a native SSL connection or you run a mapping that has an Oracle data object with a native SSL connection, the Developer tool shuts down unexpectedly.
If you create a probabilistic model that contains multibyte data values, the values can break across lines in the Data view of the model. The issue arises if you resize the Developer tool views so that a data value moves from one line to another in the Data view. If you assign a label to a data value that breaks over two lines, the label might not attach to the correct value. The value that you label might overwrite another value in the Data view.
Workaround: Resize the Developer tool views so that the data values do not break in the Data view.
When the Data Integration Service performs a cached lookup and an uncached lookup on Microsoft SQL Server Uniqueidentifier data types, it does not return the same number of rows.
When an SQL data service query generates a long WHERE clause, pushdown to the source fails. For example, if an SQL query generates a WHERE clause of 61 KB or higher, pushdown to source fails.
Workaround: You can reduce the optimizer level for the query or increase memory for the JVM that runs the Data Integration Service.
When a mapping contains multiple Match transformations, a change to the settings of one Match transformation might affect the settings in another Match transformation. The issue occurs when the following conditions are true:
Workaround: Reconfigure the affected Match transformation.
The Key Generator transformation cannot generate unique sequence ID values in a Hadoop environment.