Table of Contents

  1. Abstract
  2. Informatica 10.2.2 HotFix 1 Service Pack 2 Installation
  3. Emergency Bug Fixes Merged into 10.2.2 HotFix 1 Service Pack 2
  4. 10.2.2 HotFix 1 Service Pack 2 Fixed Limitations
  5. 10.2.2 HotFix 1 Service Pack 2 Known Limitations
  6. 10.2.2 HotFix 1 Service Pack 1 Fixed Limitations
  7. 10.2.2 HotFix 1 Service Pack 1 Known Limitations
  8. 10.2.2 HotFix 1 Fixed Limitations
  9. 10.2.2 HotFix 1 Known Limitations
  10. 10.2.2 Service Pack 1 Fixed Limitations
  11. 10.2.2 Service Pack 1 Known Limitations
  12. 10.2.2 Fixed Limitations and Closed Enhancements
  13. 10.2.2 Known Limitations
  14. Informatica Global Customer Support

Big Data Release Notes (10.2.2 HotFix 1 Service Pack 2)

Big Data Management Known Limitations (10.2.2 HotFix 1)

The following table describes known limitations:
Bug: BDM-26248
Description: If you specify a compression codec in a custom query, the Blaze engine fails to compress HDFS files using the codec on every Hadoop distribution except Hortonworks HDP 3.1.

Bug: BDM-26206
Description: A mapping with flat file sources and targets fails when it runs on the Spark engine on a WANdisco-enabled Hortonworks HDP 2.6.5 cluster.
Workaround (a Python sketch of step 1 follows this table):
  1. Copy the following .jar files from the usr/hdp/<version>/hadoop/client directory on the cluster to the <Informatica home>/services/shared/spark/lib_spark_<version> directory on each node where the Informatica Data Integration Service is installed:
    • hadoop-common-x.x.x.jar
    • hadoop-auth-x.x.x.jar
    where "x.x.x" is the .jar file version that WANdisco uses. For example, find the file hadoop-common-2.7.3.2.6.5.0-292.jar in the /usr/hdp/2.6.5.0-292/hadoop/client directory on the cluster.
  2. Restart the Data Integration Service on each node.
  3. On the cluster node where the HDFS source and target files reside, grant 733 permission to HDFS.

Bug: BDM-25475
Description: When you delete the Erasure Coding (EC) contents in the cluster staging directory and then run a mapping, the mapping might fail because the auto-installer copy sometimes fails to create the XOR codec for raw EC.
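
The following is a minimal sketch of step 1 of the BDM-26206 workaround, written in Python for illustration only. The source and target paths, the Spark library version, and the assumption that the cluster client directory is reachable from the Data Integration Service node are all hypothetical; substitute the paths from your own environment, and complete steps 2 and 3 (restarting the service and granting HDFS permissions) with your usual administration tools.

    #!/usr/bin/env python3
    # Hedged sketch of step 1 of the BDM-26206 workaround. All paths and version
    # strings below are illustrative assumptions, not values shipped with
    # Informatica or WANdisco; adjust them to match your environment.

    import shutil
    import sys
    from pathlib import Path

    # Assumed source: the Hadoop client directory copied (or mounted) from the
    # cluster onto the Data Integration Service node.
    CLUSTER_CLIENT_DIR = Path("/usr/hdp/2.6.5.0-292/hadoop/client")

    # Assumed target: the Spark library directory of the Data Integration Service.
    INFA_SPARK_LIB_DIR = Path("/opt/Informatica/services/shared/spark/lib_spark_2.3.1")

    # The workaround names two jars; the x.x.x portion depends on the build that
    # WANdisco uses, so match by prefix rather than hard-coding a version.
    JAR_PREFIXES = ("hadoop-common-", "hadoop-auth-")


    def copy_wandisco_jars(source_dir, target_dir):
        """Copy the hadoop-common and hadoop-auth client jars into target_dir."""
        copied = []
        for prefix in JAR_PREFIXES:
            for jar in sorted(source_dir.glob(prefix + "*.jar")):
                destination = target_dir / jar.name
                shutil.copy2(jar, destination)  # copy2 keeps timestamps for auditing
                copied.append(destination)
        if not copied:
            raise FileNotFoundError(
                "No hadoop-common/hadoop-auth jars found under %s" % source_dir
            )
        return copied


    if __name__ == "__main__":
        try:
            for path in copy_wandisco_jars(CLUSTER_CLIENT_DIR, INFA_SPARK_LIB_DIR):
                print("copied", path)
        except (FileNotFoundError, OSError) as exc:
            sys.exit(str(exc))
        print("Now restart the Data Integration Service on this node (step 2).")

Matching the jars by name prefix avoids hard-coding the exact version string that WANdisco uses, which is the reason the workaround refers to the files as hadoop-common-x.x.x.jar and hadoop-auth-x.x.x.jar.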
