Table of Contents

  1. Preface
  2. Part 1: Version 10.5.6 - Version 10.5.6.x
  3. Part 2: Version 10.5.5 - 10.5.5.x
  4. Part 3: Version 10.5.4 - 10.5.4.x
  5. Part 4: Version 10.5.3 - 10.5.3.x
  6. Part 5: Version 10.5.2 - 10.5.2.1.x
  7. Part 6: Version 10.5.1 - 10.5.1.1
  8. Part 7: Versions 10.5 - 10.5.0.1
  9. Part 8: Versions 10.4.1 - 10.4.1.3
  10. Part 9: Versions 10.4 - 10.4.0.2
  11. Part 10: Versions 10.2.2 - 10.2.2 HotFix 1
  12. Part 11: Version 10.2.1
  13. Part 12: Version 10.2 - 10.2 HotFix 2

What's New and Changed (10.5.6.1)

New Data Types Support

Effective in version 10.4.0, you can use the following new data types for complex files:
  • When you run a mapping that reads from or writes to Avro and Parquet complex file objects in the native environment or in the Hadoop environment, you can use the following data types:
    • Date
    • Decimal
    • Timestamp
  • You can use the Time data type to read and write Avro or Parquet complex file objects in the native environment or on the Blaze engine.
  • You can use the Date, Time, Timestamp, and Decimal data types when you run a mapping on the Databricks Spark engine.
The new data types are applicable to the following adapters:
  • PowerExchange for HDFS
  • PowerExchange for Amazon S3
  • PowerExchange for Google Cloud Storage
  • PowerExchange for Microsoft Azure Blob Storage
  • PowerExchange for Microsoft Azure Data Lake Storage Gen1
  • PowerExchange for Microsoft Azure Data Lake Storage Gen2
For more information about data types, see the "Data Type Reference" chapter in the Data Engineering Integration 10.4.0 User Guide.
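
The adapters handle these data types through the mappings themselves, so no code is required. Purely as an illustration of how the underlying file formats represent these types, the following minimal sketch uses the open-source pyarrow library (not an Informatica or PowerExchange API) to write a Parquet file whose schema includes Date, Time, Decimal, and Timestamp columns. The file path and column names are arbitrary examples.

import datetime
import decimal

import pyarrow as pa
import pyarrow.parquet as pq

# Schema with Date, Time, Decimal, and Timestamp columns (hypothetical names).
schema = pa.schema([
    ("order_date", pa.date32()),            # Date
    ("order_time", pa.time64("us")),        # Time, microsecond precision
    ("order_total", pa.decimal128(10, 2)),  # Decimal(10, 2)
    ("updated_at", pa.timestamp("us")),     # Timestamp
])

table = pa.table(
    {
        "order_date": [datetime.date(2024, 1, 15)],
        "order_time": [datetime.time(13, 45, 30)],
        "order_total": [decimal.Decimal("199.99")],
        "updated_at": [datetime.datetime(2024, 1, 15, 13, 45, 30)],
    },
    schema=schema,
)

# A mapping that writes to HDFS, Amazon S3, or another supported object store
# produces Parquet files with an equivalent schema.
pq.write_table(table, "orders.parquet")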
