Effective in version 10.4.0, you can use the following new data types for complex files:
When you run a mapping that reads from or writes to Avro and Parquet complex file objects in the native environment or in the Hadoop environment, you can use the following data types (illustrated in the sketch after this list):
Date
Decimal
Timestamp
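As a concrete illustration of how these types surface in Parquet files, the following minimal sketch uses the third-party pyarrow library (not part of the Informatica product) to write a Parquet file whose schema carries date, decimal, and timestamp columns. The file name and column names are hypothetical.

```python
import datetime
import decimal

import pyarrow as pa
import pyarrow.parquet as pq

# Parquet schema that uses the date, decimal, and timestamp types
schema = pa.schema([
    ("order_date", pa.date32()),            # Date
    ("amount", pa.decimal128(10, 2)),       # Decimal with precision 10, scale 2
    ("created_at", pa.timestamp("ms")),     # Timestamp, millisecond precision
])

table = pa.table(
    {
        "order_date": [datetime.date(2019, 12, 1)],
        "amount": [decimal.Decimal("1299.99")],
        "created_at": [datetime.datetime(2019, 12, 1, 8, 30)],
    },
    schema=schema,
)

# Write a Parquet file; a complex file reader sees the typed columns above
pq.write_table(table, "orders.parquet")
```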
You can use the Time data type to read and write Avro or Parquet complex file objects in the native environment or on the Blaze engine.
The Date, Time, Timestamp, and Decimal data types are applicable when you run a mapping on the Databricks Spark engine.
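In Avro, these types are expressed as logical types that annotate primitive types. The following sketch, which assumes the third-party fastavro library and hypothetical record and field names, shows an Avro schema that uses the date, time-millis, timestamp-millis, and decimal logical types:

```python
import datetime
import decimal

from fastavro import parse_schema, writer

# Avro schema with date, time, timestamp, and decimal logical types
schema = {
    "type": "record",
    "name": "Order",
    "fields": [
        {"name": "order_date", "type": {"type": "int", "logicalType": "date"}},
        {"name": "order_time", "type": {"type": "int", "logicalType": "time-millis"}},
        {"name": "created_at", "type": {"type": "long", "logicalType": "timestamp-millis"}},
        {"name": "amount", "type": {"type": "bytes", "logicalType": "decimal",
                                    "precision": 10, "scale": 2}},
    ],
}

records = [{
    "order_date": datetime.date(2019, 12, 1),
    "order_time": datetime.time(8, 30),
    "created_at": datetime.datetime(2019, 12, 1, 8, 30,
                                    tzinfo=datetime.timezone.utc),
    "amount": decimal.Decimal("1299.99"),
}]

# Write an Avro container file with the typed fields above
with open("orders.avro", "wb") as out:
    writer(out, parse_schema(schema), records)
```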
The new data types are applicable to the following adapters:
PowerExchange for HDFS
PowerExchange for Amazon S3
PowerExchange for Google Cloud Storage
PowerExchange for Microsoft Azure Blob Storage
PowerExchange for Microsoft Azure Data Lake Storage Gen1
PowerExchange for Microsoft Azure Data Lake Storage Gen2
For more information about data types, see the "Data Type Reference" chapter in the