Table of Contents


  1. Preface
  2. Introduction to Informatica Big Data Management
  3. Connections
  4. Mappings in the Hadoop Environment
  5. Mapping Objects in the Hadoop Environment
  6. Monitoring Mappings in the Hadoop Environment
  7. Mappings in the Native Environment
  8. Profiles
  9. Native Environment Optimization
  10. Data Type Reference
  11. Function Reference
  12. Parameter Reference
  13. Multiple Blaze Instances on a Cluster

Hive Targets

A mapping that is running in the Hadoop environment can write to a Hive target.
Consider the following limitations when you configure a Hive target in a mapping that runs in the Hadoop environment:
  • The Data Integration Service does not run pre-mapping or post-mapping SQL commands against a Hive target. You cannot validate and run a mapping with PreSQL or PostSQL properties for a Hive target.
  • A mapping fails to run if the Hive target definition differs in the number and order of the columns from the relational table in the Hive database.
  • A mapping fails to run when you use Unicode characters in a Hive target definition.
  • To overwrite data in a Hive table with Hive version 0.7, you must truncate the target table. The Data Integration Service ignores the write, update override, delete, insert, and update strategy properties when it writes data to a Hive target.
  • The Data Integration Service can truncate the partition in the Hive target into which the data is being inserted. To do so, you must choose both to truncate the partition in the Hive target and to truncate the target table.
In a mapping that runs on the Spark engine or the Blaze engine, you can create a custom DDL query that creates or replaces a Hive table at run time. However, with the Blaze engine, you cannot use a backtick (`) character in the DDL query. The backtick character is required in HiveQL when you include special characters or keywords in a query.
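For example, a custom DDL query for a Hive target might look like the following sketch. All database, table, and column names here are illustrative, not taken from the guide. Because the column name `select` is a HiveQL keyword, it must be enclosed in backticks, so a query like this one can run on the Spark engine but cannot be used with the Blaze engine:

```sql
-- Hypothetical custom DDL query; object names are illustrative.
-- The backtick-quoted column name is a HiveQL keyword, so this query
-- requires backticks and therefore cannot run on the Blaze engine.
CREATE TABLE IF NOT EXISTS sales_db.order_totals (
  order_id INT,
  `select` STRING,          -- keyword used as a column name needs backticks
  total    DECIMAL(10,2)
)
STORED AS ORC;
```

To keep a DDL query compatible with the Blaze engine, avoid HiveQL keywords and special characters in object names so that no backtick quoting is needed.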
When a mapping creates or replaces a Hive table, the type of table that the mapping creates depends on the run-time engine that you use to run the mapping.
The following table shows the table type for each run-time engine:
Run-Time Engine    Resulting Table Type

Updated July 03, 2018