Databricks Delta table as streaming mapping target
For Data Engineering Streaming, you can use a Databricks Delta table as the target of a streaming mapping to ingest streaming data.
Dynamic streaming mapping
You can configure dynamic streaming mappings to change Kafka sources and targets at run time based on the parameters and rules that you define in a Confluent Schema Registry.
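Informatica resolves the evolving Kafka message schemas from the Confluent Schema Registry at run time. The sketch below is not the product's mapping interface; it is a minimal illustration of the underlying idea, using hypothetical registry payloads held in a local dictionary (a real registry would be queried over its REST API) to show how a versioned Avro schema lets a dynamic mapping discover column changes:

```python
import json

# Simulated Confluent Schema Registry contents (hypothetical payloads):
# the registry stores Avro schemas per "subject" and versions them,
# which is what lets a dynamic mapping detect new columns at run time.
REGISTRY = {
    ("orders-value", 1): json.dumps({
        "type": "record", "name": "Order",
        "fields": [{"name": "id", "type": "long"},
                   {"name": "amount", "type": "double"}],
    }),
    ("orders-value", 2): json.dumps({
        "type": "record", "name": "Order",
        "fields": [{"name": "id", "type": "long"},
                   {"name": "amount", "type": "double"},
                   {"name": "currency", "type": "string"}],  # added in v2
    }),
}

def fields(subject, version):
    """Return the column names defined by a schema version."""
    schema = json.loads(REGISTRY[(subject, version)])
    return [f["name"] for f in schema["fields"]]

# A dynamic mapping would compare the latest registered schema against
# the one it was deployed with and adjust source and target ports.
added = set(fields("orders-value", 2)) - set(fields("orders-value", 1))
print(sorted(added))  # → ['currency']
```

In the product, this comparison and port adjustment happens automatically according to the parameters and rules you define; no user code is required.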
Hive Warehouse Connector and Hive LLAP
For Data Engineering products, use the Hive Warehouse Connector and Hive LLAP with Azure HDInsight 4.x and Hortonworks HDP 3.x clusters to enable Spark code to interact with Hive tables and to support ACID-enabled Hive tables on the Spark engine.
For more information, see the "Mapping Optimizations" chapter in the Data Engineering Integration 10.4.1 User Guide.
Snowflake as a streaming mapping target
For Data Engineering Streaming, you can configure Snowflake as the target of a streaming mapping to write streaming data to Snowflake tables.