Table of Contents

  1. Abstract
  2. Informatica Installation
  3. Informatica Upgrade
  4. 10.1 Fixed Limitations and Closed Enhancements
  5. 10.1 Known Limitations
  6. Informatica Global Customer Support

Mapping and Workflow Known Limitations

The following known limitations are listed by CR number, each followed by its description and any workaround:
461315
When you perform a search and replace operation on a date-time value in an exception task, the Analyst tool does not update the date-time value with the value that you entered. The issue arises when the Analyst tool browser uses an English (United Kingdom) locale.
Workaround: Set the browser locale to English (United States).
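The workaround matters because the same digit string is read with day-first ordering under the English (United Kingdom) convention and month-first ordering under the English (United States) convention. The sketch below is not Analyst tool code; it only illustrates that ambiguity with Python's standard library, using an invented date-time value.

```python
from datetime import datetime

value = "03/04/2016 10:30"  # invented example of a date-time value entered in search and replace

# English (United Kingdom) reading: day/month/year
uk_reading = datetime.strptime(value, "%d/%m/%Y %H:%M")
# English (United States) reading: month/day/year
us_reading = datetime.strptime(value, "%m/%d/%Y %H:%M")

print(uk_reading.date())  # 2016-04-03
print(us_reading.date())  # 2016-03-04
```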
460888
You cannot validate a workflow that contains multiple Mapping tasks if you replace the data target in a mapping that one of the tasks uses.
460871
The search and replace options in an exception task do not recognize the Float data type as numeric data when the task data is in a Microsoft SQL Server database.
460729
You cannot use the All Numbers option to replace a numeric value on all pages of exception task data in a Microsoft SQL Server database.
Workaround: Select a column of numeric data. Replace the numeric value in the column that you select on all pages of the task. Repeat the process for any other column of numeric data.
460715
You cannot search and replace data in a column in an exception task if the workflow that created the task used the column data to distribute task instances to users.
459911
If you create the workflow database contents on a Data Integration Service that runs on a grid and the Data Integration Service stops unexpectedly, the Administrator tool displays the following message:
Workflow database contents do not exist.
The issue arises when you enable operating system profiles on the Data Integration Service.
459791
Workflow log file names do not include a time stamp.
459488
If a Teradata or Netezza mapping contains an SQL query that reads decimal columns with precision less than 15, the mapping fails to run after you import the mapping from PowerCenter to the Developer tool.
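To identify the affected columns before importing such a mapping, a simple metadata scan is enough. The sketch below assumes you already have the column names, data types, and precisions as plain values; the sample metadata is invented, and the 15-digit threshold comes from this limitation.

```python
# Invented (name, data type, precision) metadata for the columns that the SQL query reads.
columns = [
    ("order_total", "DECIMAL", 12),
    ("line_amount", "DECIMAL", 18),
    ("quantity", "INTEGER", None),
]

# Decimal columns with a precision below 15 are the ones this limitation affects.
at_risk = [name for name, dtype, precision in columns
           if dtype == "DECIMAL" and precision is not None and precision < 15]
print(at_risk)  # ['order_total']
```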
458284
You cannot deploy a workflow application if the workflow contains more than 12 Mapping tasks between two Inclusive gateways.
457765
When you run a workflow with a Mapping task under an operating system profile, the workflow does not create the directories that the operating system profile specifies. In addition, the Mapping task fails to run.
457624
You cannot use the Scheduler to run mappings, workflows, or any other jobs in a domain where Kerberos authentication is enabled.
456589
When a workflow that includes a Command task and a Human task recovers from a Data Integration Service interruption, the workflow monitor might not show the correct state of the Command task. The workflow monitor might show that the Command task is running, although the task restarted and completed on recovery.
The issue arises when the following conditions are true:
  • The workflow runs the Command task and the Human task in parallel between two Inclusive gateways.
  • The Human task generates a large number of task instances, for example 600 task instances.
443810
When you run multiple concurrent instances of the same workflow, the Mapping tasks might fail to update a persisted mapping output.
Workaround: Start the workflow instances with a ten-second delay between them.
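If the concurrent instances are started from a script, the delay in the workaround can be applied there. The sketch below is a generic Python pattern, not Informatica tooling: start_workflow_instance() is a hypothetical placeholder for however you launch an instance (for example, an infacmd call), and the instance names are invented.

```python
import time

def start_workflow_instance(instance_name):
    """Hypothetical placeholder: launch one instance of the workflow here,
    for example by shelling out to the command you normally use."""
    print(f"starting {instance_name}")

# Per the workaround for CR 443810, stagger the starts by ten seconds.
instance_names = ["wf_load_run1", "wf_load_run2", "wf_load_run3"]  # invented names
for index, name in enumerate(instance_names):
    if index > 0:
        time.sleep(10)
    start_workflow_instance(name)
```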
443730
On AIX operating systems, when you use an SSL-enabled Oracle connection and the Oracle 12c client to connect to an Oracle database, the mapping fails.
442040
If you set the ODBC provider to MongoDB or Cassandra in the connection to the source, the Data Integration Service cannot push transformation logic to the source, and a null pointer exception occurs.
Workaround: Specify the ODBC provider in the ODBC connection object as Other and run the mapping.
440849
When the Data Integration Service applies the cost-based optimization method to a mapping with an Aggregator transformation, it might add an extra Sorter transformation even if the data is sorted before the Joiner transformation and the Aggregator transformation appears after the Joiner transformation.
440275
The Data Integration Service does not apply the cost-based optimization method to a mapping that contains an unspecified row limit or a LIMIT clause in the SQL transformation even if the mapping is configured to use the cost-based optimization method.
439979
When you use an ODBC connection and write data to a Netezza target, the Data Integration Service rejects data of the Boolean and Timestamp data types.
439220
When the target for a Write transformation includes two database tables with a parent-child relationship, the mapping fails if you enable the Create or replace table at run time option. The Data Integration Service drops and recreates the tables in an order that prevents re-creation of the correct primary key-foreign key relationship between the parent and child tables.
431685
A validated mapping fails to run with an expression parsing error because an expression contains Unicode punctuation characters in field names.
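To locate the field names involved before the mapping runs, you can scan them for punctuation characters outside the ASCII range. The sketch below uses Python's unicodedata module; treating non-ASCII punctuation as the suspect characters is an assumption, and the field names are invented.

```python
import unicodedata

def non_ascii_punctuation(name):
    """Return the characters in a field name that are punctuation (Unicode category P*)
    and fall outside the ASCII range."""
    return [ch for ch in name
            if ord(ch) > 127 and unicodedata.category(ch).startswith("P")]

# Invented field names; the second one contains a fullwidth comma (U+FF0C).
field_names = ["CUSTOMER_ID", "ORDER\uff0cDATE"]
for name in field_names:
    suspect = non_ascii_punctuation(name)
    if suspect:
        print(name, "->", suspect)
```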
429231
No validation error occurs if you create a workflow parameter name with a leading dollar sign ($).
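Because the tools do not flag this case, a quick check of parameter names before deployment can catch it. The names below are invented; the only rule applied is the one this limitation describes, that a workflow parameter name should not begin with a dollar sign.

```python
# Invented workflow parameter names; "$InputDir" starts with a leading dollar sign.
parameter_names = ["InputDir", "$InputDir", "RunDate"]

suspect = [name for name in parameter_names if name.startswith("$")]
print(suspect)  # ['$InputDir']
```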
426806
A mapping that reads from a flat file source might not be fully optimized at run time when the following conditions are true:
  • The flat file data object uses the SourceDir system parameter for the source file directory.
  • The mapping runs on a Data Integration Service grid configured to run jobs in separate remote processes.
Workaround: Configure the flat file data object to use a string value or a user-defined parameter for the source file directory.
393416
A partitioned mapping fails if you use the default merge file name to sequentially merge the target output for all partitions.
Workaround: Change the default name of the merge file.
375473
When an SQL data service query generates a long WHERE clause, for example a clause of 61 KB or larger, pushdown to the source fails.
Workaround: Reduce the optimizer level for the query, or increase the memory for the JVM that runs the Data Integration Service.
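When it is unclear whether a generated query is near that limit, measuring the clause is a quick check. The sketch below builds a long IN list as a stand-in for a generated WHERE clause; the 61 KB figure is the one cited in this limitation, and the key values are invented.

```python
# Build a stand-in for a generated WHERE clause: a long IN list of invented key values.
keys = [f"'CUST{i:07d}'" for i in range(5000)]
where_clause = "WHERE customer_id IN (" + ", ".join(keys) + ")"

size_kb = len(where_clause.encode("utf-8")) / 1024
print(f"WHERE clause is about {size_kb:.0f} KB")

# Clauses of roughly 61 KB or larger are the ones this limitation describes, so a
# result near that size suggests lowering the optimizer level or increasing the
# JVM memory for the Data Integration Service.
if size_kb >= 61:
    print("The clause exceeds the 61 KB threshold.")
```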