application allows data analysts to create data preparation projects. Each step of the data preparation project is stored in a recipe that is translated into a mapping for execution on the Informatica platform.
When an analyst uploads data, the Enterprise Data Lake Service connects to the HDFS system in the Hadoop cluster to temporarily stage the data. When an analyst previews data, the Enterprise Data Lake Service connects to the Hadoop cluster to read from the Hive table.
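The following is a minimal sketch of these two interactions, written against the standard Hadoop FileSystem and Hive JDBC client APIs rather than the Enterprise Data Lake Service internals. The host names (namenode, hiveserver), ports, staging path, table name, and credentials are placeholder assumptions for illustration only.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class DataLakeFlowSketch {

        // Stage an uploaded local file into a temporary HDFS directory.
        static void stageUpload(String localFile, String hdfsStagingDir) throws Exception {
            Configuration conf = new Configuration();
            conf.set("fs.defaultFS", "hdfs://namenode:8020"); // assumed NameNode address
            try (FileSystem fs = FileSystem.get(conf)) {
                fs.copyFromLocalFile(new Path(localFile), new Path(hdfsStagingDir));
            }
        }

        // Preview the first rows of a Hive table over JDBC (HiveServer2).
        static void previewTable(String table, int rows) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            String url = "jdbc:hive2://hiveserver:10000/default"; // assumed HiveServer2 endpoint
            try (Connection conn = DriverManager.getConnection(url, "analyst", "");
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery(
                         "SELECT * FROM " + table + " LIMIT " + rows)) {
                while (rs.next()) {
                    System.out.println(rs.getString(1)); // print first column of each preview row
                }
            }
        }

        public static void main(String[] args) throws Exception {
            stageUpload("/tmp/sales.csv", "/staging/edl/sales"); // placeholder paths
            previewTable("sales", 10);                           // placeholder table name
        }
    }

In the product itself, the staging directory and Hive connection details come from the Hadoop connection configured for the Enterprise Data Lake Service; the sketch simply shows the shape of the HDFS write and Hive read that the text describes.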
You can create the Enterprise Data Lake Service when you install Enterprise Data Lake, or you can use the Administrator tool to create the service after installation.