Informatica Data Quality
- Informatica Data Quality 10.5.1
- All Products
Issue | Description
---|---
MDX-22184 | You can use the Oracle Data Integrator scanner only with Oracle Knowledge Modules.
MDX-22942 | You cannot view detailed lineage for the Azure data flow in Azure Data Factory.
MDX-23850 | Power Query processing fails for a report that you create from multiple Amazon S3 data sets that belong to different file paths. For example, DataSet1: edc-qa-bucket/DemoCSV/CSVSubFolder/100 Sales Records.csv and DataSet2: edc-qa-bucket/Customer/ScannersQA/CSV/1000 Sales Records.csv
MDX-22476 | The DefaultValues parameter in Azure Data Factory does not process the section keys hierarchically.
MDX-24050 | You cannot view the detailed lineage for Databricks Notebooks when a connection name contains certain special characters.
MDX-23969 | When a Databricks Notebooks resource calls another Databricks Notebooks resource using the %run <callee notebook> command, the command links on the Enterprise Data Catalog Search Results page do not open the Command page.
EIC-56548 | Automatic connection assignment for the Teradata resource is degraded.
EIC-57275 | The PowerCenter parameter file utility logs show the stack trace for the file-not-found exception.
EIC-57245 | In the lineage for Microsoft Azure Data Lake Store and Amazon S3 resources, a default icon appears for the folder asset type.
MDX-23693 | When you run an existing Advanced Scanners configuration after an upgrade, you cannot see the processing information in the Advanced Scanners tool. Workaround: You can view the processing information in the following directory: <Informatica installation directory>/AdvScannersWorkspace/processings
EIC-57422 | When you assign connections to multiple links, certain links incorrectly move to the Auto Assigned Connections tab.
EIC-57352 | After you migrate a non-SSL domain to custom SSL, the Catalog Service fails with the following error: Caused by: com.mongodb.MongoCommandException: Command failed with error 11 (UserNotFound): 'Could not find user . Workaround: Perform the following steps after you migrate a non-SSL domain to custom SSL:
EIC-57025 | When you run the SAP Business Objects resource, the scanner logs contain MITI errors.
EIC-56989 | When you run the Google BigQuery high volume resource, performance in the staging phase is degraded.
EIC-56956 | When you run the SAP Business Objects resource, performance in the staging and metadata load phases is degraded.
EIC-57474 | Rescanning of the JDBC resource fails after you select the empty case sensitivity option.
EIC-57394 | The catalog backup using the infacmd ldm backupContents command fails if you have not set the INFA_TRUSTSTORE and INFA_TRUSTSTORE_PASSWORD environment variables. The error message that appears does not contain sufficient information.
EIC-57353 | If you have replaced the custom SSL certificates, the catalog restore fails. Workaround: Enable the Informatica Cluster Service and the Catalog Service after you replace the custom SSL certificates.
EIC-57375 | After you upgrade to version 10.5.1, duplicate data domain groups are missing in the Enterprise Data Catalog tool.
EIC-57353 | After you replace the custom SSL certificates, the infacmd ldm restoreContents command fails. Workaround:
EIC-57308 | You cannot accept or reject data domains for assets of a custom resource in the data domain Overview tab.
EIC-57274 | When the migration of the column similarity data fails, the migration log file does not display an error or warning message.
EIC-57269 | The Enterprise Data Catalog tool displays an incorrect URL in the System Attributes section for an Amazon S3 Parquet file.
EIC-57241 | After you upgrade the Catalog Service, assets from the Amazon S3 resource are missing in the Enterprise Data Catalog tool.
EIC-57185 | When you run a profile on a large data set in a non-Hadoop environment, the profiling job fails with an error message.
EIC-57134 | After you upgrade and re-index the Catalog Service, a duplicate foreign key constraint is appended to the table.
EIC-57511 | After you upgrade from version 10.5 to version 10.5.1, the UpgradeJobs log location does not contain the log files of all resources.
EIC-57510 | If a resource name includes more than 280 characters, an error appears in the LDM_Upgrade.log file: ERROR [pool-23-thread-1] - java.io.IOException: Directory '/data/Informatica/LDM1050/logs/node01/services/CatalogService/CSQEREPO/UpgradeJobs/ObjectSubscriberUpgradeModule/LONG_ESS_INTEGRATED_PAYMENT_SYSTEM_SQL_SERVER_DGD_EVENT_LOG_BILLING_IPS_RCC_GLSP3306_PROD_ESS_INTEGRATED_PAYMENT_SYSTEMS_SQLSERVER_DGD_EVENT_LOG_BILLING_IPS_BATCH_GLSP3306_1_PROD_GLSP3306_PROD_ESS_INTEGRATED_PAYMENT_SYSTEMS_SQLSERVER_DGD_EVENT_LOG_BILLING_IPS_BATCH_GLSP3306_1_PROD/255dcef7-cc92-4a89-9fb4-ca61d2274363/Re-Publish' could not be created
EIC-57336 | You cannot view the Data Domain, Null\|Distinct\|Non-Distinct %, and Source Data Type columns in the Overview tab of a Salesforce asset.
EIC-57549 | If a resource name contains more than 200 characters, you cannot download the log file for the resource.
EIC-57621 | After you upgrade from version 10.4.1 to version 10.5, the Enterprise Data Catalog tool does not display business glossary recommendations for the assets on the Resource page.
EIC-57623 | When a non-administrator user searches for an asset in the Enterprise Data Catalog tool, the Search Results page fails to load.
EIC-57625 | When you export asset data from a resource to a Tableau Data Extract (TDE) file, an error message appears.
MDX-23699 | You cannot use default values for pipelines and functions in the Azure Data Factory scanner. You can use default parameters only for activities.
MDX-23878 | When a Databricks Notebooks resource calls another Databricks Notebooks resource, the job fails with the following error: Failed to resolve expression dbutils.notebook.run
MDX-23967 | Lineage for Databricks Notebooks fails when you launch the column-level lineage from the Delta file to the Delta table.
MDX-24043 | When you enter a database connection key for Databricks Notebooks and the key contains an equal sign (=), half of the key value shifts to the Resolved MDREPO entry field.
MDX-24178 | If a job that scans the Databricks Notebooks SQL contains a create command, the job fails with the following error: Unable to parse script
MDX-24161 | The Databricks Notebooks scanner job fails because of a non-dependent MDREPO entry failure with the following error: No suitable driver found
EIC-57495 | After you apply the Enterprise Data Catalog 10.5.1 hotfix to the 10.5, 10.5.0.0.1, or 10.5.0.0.2 cumulative patch on a custom SSL enabled cluster and enable the Informatica Cluster Service, the client certificates are not copied to the cluster nodes. Workaround: Perform the following steps:
EIC-57675 | Enabling the Informatica Cluster Service hangs unexpectedly at the host type validation phase. Workaround: Delete the contents of the /tmp directory on the domain and cluster nodes, and then recycle the Informatica Cluster Service.
EIC-55684 | When a CSV file contains Dutch characters, the FileSystem scanner cannot detect column names. The resource fails with the following message: Header not detected for file .
MDX-25099 | The Talend connector uses an incorrect account name to calculate the connection key for the Snowflake resource.
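The symptom described in MDX-24043, where a key value that contains an equal sign is split and half of it lands in the wrong field, is the classic key=value parsing pitfall. A minimal sketch (an illustration only, not Informatica's code) of the difference between splitting on every "=" and splitting only on the first one:

```python
def parse_naive(entry: str) -> tuple:
    # Splits on every "=", so a value that itself contains "=" is cut:
    # everything after the second "=" is lost or ends up elsewhere.
    parts = entry.split("=")
    return parts[0], parts[1]

def parse_robust(entry: str) -> tuple:
    # Split at most once; the value keeps any embedded "=" signs intact.
    key, value = entry.split("=", 1)
    return key, value

print(parse_naive("token=abc=123"))   # ('token', 'abc') -- value truncated
print(parse_robust("token=abc=123"))  # ('token', 'abc=123') -- value intact
```

A parser that passes the remainder of the entry through `split("=", 1)` would keep such a key value in one piece.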