Effective in version 10.2, the Data Integration Service automatically installs the Big Data Management binaries on the cluster.
When you run a mapping, the Data Integration Service checks for the binary files on the cluster. If the files are missing or out of sync, the Data Integration Service prepares them for transfer and copies them to the distributed cache through the Informatica Hadoop staging directory on HDFS. By default, the staging directory is /tmp. This process replaces the requirement to install distribution packages on the Hadoop cluster.
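To confirm that the binaries were transferred, you can inspect the staging directory on HDFS. The following sketch assumes the default staging directory of /tmp and a working `hdfs` client on the cluster; the exact subdirectory name that Informatica creates under the staging directory may vary by environment.

```shell
# List the contents of the default Informatica Hadoop staging directory.
# The binaries appear under /tmp after the first mapping run.
hdfs dfs -ls /tmp

# Check how much HDFS space the staged files consume (optional).
hdfs dfs -du -h /tmp
```

If your environment restricts /tmp, the staging directory can be changed in the cluster connection properties; consult the integration guide for the supported configuration.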
For more information, see the Informatica Big Data Management 10.2 Hadoop Integration Guide.