to learn to create Informatica mappings that read data from cloud or on-premises data sources, perform calculations and transformations on the data in a native, Hadoop, or Databricks environment, and write the results to S3, HDFS, Hive, Azure Data Lake, or other data stores. Learn to configure mappings to run on the Spark engine or the proprietary Blaze engine, and to create workflows that deploy multiple mappings to a cluster.