Table of Contents

  1. Preface
  2. Introduction to Data Engineering Streaming
  3. Data Engineering Streaming Administration
  4. Sources in a Streaming Mapping
  5. Targets in a Streaming Mapping
  6. Streaming Mappings
  7. Window Transformation
  8. Appendix A: Connections
  9. Appendix B: Monitoring REST API Reference
  10. Appendix C: Sample Files

Microsoft Azure Data Lake Storage Gen1 Data Object

A Microsoft Azure Data Lake Storage Gen1 data object is a physical data object that represents data in a Microsoft Azure Data Lake Storage Gen1 table. After you create a Microsoft Azure Data Lake Storage Gen1 connection, create a Microsoft Azure Data Lake Storage Gen1 data object write operation to write to a Microsoft Azure Data Lake Storage Gen1 table.
You can use Microsoft Azure Data Lake Storage Gen1 to store data irrespective of size, structure, and format. Use Microsoft Azure Data Lake Storage Gen1 to process large volumes of data to achieve faster business outcomes.
When you configure the data operation properties, specify the format in which the data object writes data. You can specify XML, JSON, or Avro as the format. When you specify the XML format, you must provide an XSD file. When you specify the Avro format, you must provide a sample Avro schema in a .avsc file. When you specify the JSON format, you must provide a sample JSON file.
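For example, a minimal .avsc file might look like the following sketch. The record name and field names here are illustrative only; use a schema that matches the data that you want to write.
    {
      "type": "record",
      "name": "transaction",
      "fields": [
        {"name": "transaction_id", "type": "string"},
        {"name": "amount", "type": "double"},
        {"name": "event_time", "type": "long"}
      ]
    }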
You cannot run a mapping with a Microsoft Azure Data Lake Storage Gen1 data object on a MapR distribution.
You can pass any payload format directly from source to target in streaming mappings. You can project columns in binary format to pass a payload from source to target in its original form or to pass a payload format that is not supported.
Streaming mappings can read, process, and write hierarchical data. You can use array, struct, and map complex data types to process the hierarchical data. You assign complex data types to ports in a mapping to flow hierarchical data. Ports that flow hierarchical data are called complex ports.
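For example, a source record such as the following illustrative JSON sample contains hierarchical data. In a mapping, the customer object could be projected as a struct port and the items list as an array of structs; the field names are hypothetical.
    {
      "order_id": 1001,
      "customer": {"name": "ABC Retail", "region": "West"},
      "items": [
        {"sku": "A-100", "quantity": 2},
        {"sku": "B-200", "quantity": 5}
      ]
    }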
For more information about processing hierarchical data, see the Data Engineering Integration User Guide.
