Table of Contents

  1. Preface
  2. Introduction to Transformations
  3. Transformation Ports
  4. Transformation Caches
  5. Address Validator Transformation
  6. Aggregator Transformation
  7. Association Transformation
  8. Bad Record Exception Transformation
  9. Case Converter Transformation
  10. Classifier Transformation
  11. Comparison Transformation
  12. Consolidation Transformation
  13. Data Masking Transformation
  14. Data Processor Transformation
  15. Decision Transformation
  16. Duplicate Record Exception Transformation
  17. Expression Transformation
  18. Filter Transformation
  19. Hierarchical to Relational Transformation
  20. Java Transformation
  21. Java Transformation API Reference
  22. Java Expressions
  23. Joiner Transformation
  24. Key Generator Transformation
  25. Labeler Transformation
  26. Lookup Transformation
  27. Lookup Caches
  28. Dynamic Lookup Cache
  29. Match Transformation
  30. Match Transformations in Field Analysis
  31. Match Transformations in Identity Analysis
  32. Normalizer Transformation
  33. Merge Transformation
  34. Parser Transformation
  35. Python Transformation
  36. Rank Transformation
  37. Read Transformation
  38. Relational to Hierarchical Transformation
  39. REST Web Service Consumer Transformation
  40. Router Transformation
  41. Sequence Generator Transformation
  42. Sorter Transformation
  43. SQL Transformation
  44. Standardizer Transformation
  45. Union Transformation
  46. Update Strategy Transformation
  47. Web Service Consumer Transformation
  48. Parsing Web Service SOAP Messages
  49. Generating Web Service SOAP Messages
  50. Weighted Average Transformation
  51. Window Transformation
  52. Write Transformation
  53. Appendix A: Transformation Delimiters

Developer Transformation Guide

Advanced Properties

Configure advanced properties to determine how the Data Integration Service processes data for the Write transformation.
Configure the following properties on the Advanced tab:
Tracing level
Controls the amount of detail in the mapping log file.
Target load type
Type of target loading. Select Normal or Bulk. You can set the target load type for relational resources or customized data objects.
If you select Normal, the Data Integration Service loads targets normally. You can choose Bulk when you load to DB2, Sybase, Oracle, or Microsoft SQL Server. If you specify Bulk for other database types, the Data Integration Service reverts to a normal load. Bulk loading can increase mapping performance, but it limits the ability to recover because no database logging occurs. When you write to an Oracle target with bulk loading, you can optimize performance by disabling constraints in the Oracle database.
Choose Normal mode if the mapping contains an Update Strategy transformation. If you choose Normal and the Microsoft SQL Server target name includes spaces, configure the following environment SQL in the connection object:
SET QUOTED_IDENTIFIER ON
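For example, if you bulk load to an Oracle target that has referential integrity constraints, you might disable a constraint before the load and re-enable it afterward, for instance through the PreSQL and PostSQL properties described below. The table and constraint names in this sketch are hypothetical:
-- PreSQL: disable the constraint so bulk loading is not blocked (hypothetical names)
ALTER TABLE SALES_TGT DISABLE CONSTRAINT FK_SALES_CUSTOMER;
-- PostSQL: re-enable the constraint after the load completes
ALTER TABLE SALES_TGT ENABLE CONSTRAINT FK_SALES_CUSTOMER;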
Update override
Overrides the default UPDATE statement for the target.
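For example, you can override the default statement to change which columns are updated or to modify the WHERE clause. The following sketch assumes a hypothetical target table T_SALES; the :TU prefix refers to target ports, as in the default UPDATE statement that the Developer tool generates:
UPDATE T_SALES
SET EMP_NAME = :TU.EMP_NAME, DATE_SHIPPED = :TU.DATE_SHIPPED
WHERE EMP_ID = :TU.EMP_ID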
Delete
Deletes all rows flagged for delete.
Default is enabled.
The Databricks Spark engine ignores this property. To delete rows, use an Update Strategy transformation.
Insert
Inserts all rows flagged for insert.
Default is enabled.
The Databricks Spark engine ignores this property. To insert rows, use an Update Strategy transformation.
Target Schema Strategy
Type of target schema strategy for the relational or Hive target table.
You can select one of the following target schema strategies:
  • RETAIN - Retain existing target schema. The Data Integration Service retains the existing target schema.
  • CREATE - Create or replace table at run time. The Data Integration Service drops the target table at run time and replaces it with a table based on a target table that you identify.
  • Assign Parameter. You can assign a parameter to represent the value for the target schema strategy and then change the parameter at run time.
DDL Query to create or replace
Creates or replaces the target table at run time based on a DDL query that you define. Applicable when you select the CREATE - Create or replace table at run time target schema strategy option.
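For illustration, a DDL query for a hypothetical relational target might look like the following. Because the query replaces the table at run time, make sure that the column list matches the ports of the Write transformation:
CREATE TABLE CUST_TGT (
    CUST_ID INTEGER,
    CUST_NAME VARCHAR(120),
    CREATED_DT DATE
)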
Truncate target table
Truncates the target before it loads data.
Default is enabled.
Truncate target partition
Truncates an internal or external partitioned Hive target before it loads data. You must choose Truncate target table before you choose this option.
Default is disabled.
Update strategy
Update strategy for existing rows. You can select one of the following strategies:
  • Update as update. The Data Integration Service updates all rows flagged for update.
  • Update as insert. The Data Integration Service inserts all rows flagged for update. You must also select the Insert target option.
  • Update else insert. The Data Integration Service updates rows flagged for update if they exist in the target and then inserts any remaining rows marked for insert. You must also select the Insert target option.
PreSQL
SQL command that the Data Integration Service runs against the target database before it reads the source.
The Developer tool does not validate the SQL.
PostSQL
SQL command that the Data Integration Service runs against the target database after it writes to the target.
The Developer tool does not validate the SQL.
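As an illustration, PreSQL might drop an index before a large load and PostSQL might re-create it afterward. The statements below use hypothetical Oracle object names; because the Developer tool does not validate the SQL, test such commands directly against the target database first:
-- PreSQL: drop the index so the load does not maintain it row by row
DROP INDEX IDX_ORDERS_DT;
-- PostSQL: re-create the index after the load completes
CREATE INDEX IDX_ORDERS_DT ON ORDERS_TGT (ORDER_DT);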
Maintain row order
Maintain the row order of the input data to the target. Select this option if the Data Integration Service should not perform any optimization that can change the row order.
When the Data Integration Service performs optimizations, it might lose the row order that was established earlier in the mapping. You can establish row order in a mapping with a sorted flat file source, a sorted relational source, or a Sorter transformation. When you configure a target to maintain row order, the Data Integration Service does not perform optimizations for the target.
Constraints
SQL statements for table-level referential integrity constraints. Applies to relational targets only.
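For illustration only, a constraints entry might contain table-level clauses such as the following, with hypothetical table and column names. The exact syntax that the target database accepts governs what you can specify:
PRIMARY KEY (ORDER_ID),
FOREIGN KEY (CUST_ID) REFERENCES CUSTOMERS (CUST_ID)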
