Table of Contents


  1. Preface
  2. Part 1: Version 10.5.5
  3. Part 2: Versions 10.5.4 - 10.5.4.x
  4. Part 3: Versions 10.5.3 - 10.5.3.x
  5. Part 4: Versions 10.5.2 - 10.5.2.1.x
  6. Part 5: Versions 10.5.1 - 10.5.1.1
  7. Part 6: Versions 10.5 - 10.5.0.1
  8. Part 7: Versions 10.4.1 - 10.4.1.3
  9. Part 8: Versions 10.4 - 10.4.0.2
  10. Part 9: Versions 10.2.2 - 10.2.2 HotFix 1
  11. Part 10: Version 10.2.1
  12. Part 11: Version 10.2 - 10.2 HotFix 2

What's New and Changed (10.5.5)

Data Preview

Effective in version 10.4.0, the Data Integration Service uses Spark Jobserver to preview data on the Spark engine. Spark Jobserver allows for faster data preview jobs because it maintains a running Spark context instead of refreshing the context for each job. Mappings configured to run with Amazon EMR, Cloudera CDH, and Hortonworks HDP use Spark Jobserver to preview data.
Previously, the Data Integration Service used spark-submit scripts for all data preview jobs on the Spark engine. Mappings configured to run with Azure HDInsight and MapR continue to use spark-submit scripts to preview data on the Spark engine; data preview for these mappings is available for technical preview.
For more information, see the "Data Preview" chapter in the
Data Engineering Integration 10.4.0 User Guide
.

Union Transformation

Effective in version 10.4.0, you can choose a Union transformation as the preview point when you preview data. Previously, the Union transformation was not supported as a preview point.

infacmd dp Commands

You can use the infacmd dp plugin to perform data preview operations. Use infacmd dp commands to manually start and stop the Spark Jobserver.
The following table describes infacmd dp commands:
startSparkJobServer
    Starts the Spark Jobserver on the Data Integration Service machine. By default, the Spark Jobserver starts when you preview hierarchical data.
stopSparkJobServer
    Stops the Spark Jobserver running on the specified Data Integration Service. By default, the Spark Jobserver stops if it is idle for 60 minutes or when the Data Integration Service is stopped or recycled.
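As an illustration only, the following Python sketch wraps the two commands above with subprocess. The infacmd.sh path, the connection options (-dn, -sn, -un, -pd), and all example values are assumptions rather than details taken from this document; verify the required options for the dp plugin in the Informatica 10.4.0 Command Reference before using anything like this.

# Illustrative only: a small wrapper around the infacmd dp commands above.
# The infacmd.sh path, option names, and values below are assumptions;
# check the Informatica 10.4.0 Command Reference for the required options.
import subprocess

INFACMD = "/opt/informatica/isp/bin/infacmd.sh"  # assumed install location


def run_dp_command(command, domain, service, user, password):
    """Run an infacmd dp command such as startSparkJobServer or stopSparkJobServer."""
    args = [
        INFACMD, "dp", command,
        "-dn", domain,    # domain name (assumed option)
        "-sn", service,   # Data Integration Service name (assumed option)
        "-un", user,      # user name (assumed option)
        "-pd", password,  # password (assumed option)
    ]
    result = subprocess.run(args, capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError(f"infacmd dp {command} failed: {result.stderr.strip()}")
    return result.stdout


# Start the Spark Jobserver before a series of data preview jobs, then stop it
# when the preview session is finished.
if __name__ == "__main__":
    print(run_dp_command("startSparkJobServer", "ExampleDomain", "ExampleDIS",
                         "Administrator", "example_password"))
    print(run_dp_command("stopSparkJobServer", "ExampleDomain", "ExampleDIS",
                         "Administrator", "example_password"))

Because the Spark Jobserver also starts automatically when you preview hierarchical data, starting it manually can be useful mainly to avoid the startup delay on the first preview job.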
For more information, see the "infacmd dp Command Reference" chapter in the
Informatica 10.4.0 Command Reference
.
