Common Content for Data Engineering 10.2.2 Service Pack 1
All Products
This document contains important information about restricted functionality, known limitations, and bug fixes in Informatica 10.2.2 Service Pack 1.
The Informatica New Features and Enhancements Guide is written for all Informatica software users. This guide lists the new features and enhancements in Informatica products.
The Informatica Release Guide lists new features and enhancements, behavior changes between versions, and tasks you might need to perform after you upgrade from a previous version. The Informatica Release Guide is written for all types of users who are interested in the new features and changed behavior. This guide …
The Informatica Developer Transformation Guide contains information about transformation functionality in the Developer tool. It is written for data quality, big data, and data services developers. This guide assumes that you have an understanding of data quality concepts, flat file and relational database concepts, and the database engines in …
The Informatica Web Services Guide is written for data quality and data services developers. This guide assumes that you have an understanding of web services concepts.
The Informatica Application Service Guide is written for Informatica users who need to configure application services. The Informatica Application Service Guide assumes you have a basic working knowledge of Informatica and of the environment in which the application services run.
The Informatica Command Reference is written for Informatica administrators and developers who manage the repositories and administer the domain and services. This guide assumes you have knowledge of the operating systems in your environment. This guide also assumes you are familiar with the interface requirements for the supporting applications.
The Informatica Developer Transformation Language Reference is written for developers who are responsible for building mappings. The Informatica Developer Transformation Language Reference assumes you have knowledge of SQL, relational database concepts, and the interface requirements for your supporting applications.
The Informatica Installation and Configuration Guide is written for the system administrator who is responsible for installing the Informatica product. This guide assumes you have knowledge of operating systems, relational database concepts, and the database engines, flat files, or mainframe systems in your environment. This guide also …
The Big Data Management™ Administrator Guide is written for Informatica administrators. The guide contains information that you need to administer the integration of the Informatica domain with the compute clusters in non-native environments. It includes information about security, connections, and cluster configurations. This guide assumes that …
The Informatica Big Data Management™ Integration Guide is written for the system administrator who is responsible for integrating the native environment of the Informatica domain with a non-native environment, such as Hadoop or Databricks. This guide contains instructions to integrate the Informatica and non-native environments. Integration …
You can integrate a Model repository with a Perforce, Subversion, or Git version control system. This article discusses how to integrate a Git system with a Model Repository Service in 10.2 HotFix 1.
A parameter file is an .xml file that lists user-defined parameters and their assigned values. Parameter files provide the flexibility to change parameter values each time that you run a mapping or a workflow. Generate a parameter file based on a mapping or workflow using the Developer tool or the command line. Edit the contents of the file …
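As an illustrative sketch only, a parameter file might resemble the following. The project, mapping, and parameter names here are hypothetical, and the exact schema elements can vary by version, so verify the layout against a file generated by your own Developer tool or command line before editing one by hand:

```xml
<!-- Hypothetical parameter file sketch. Element names follow the general
     Developer tool parameter file layout; confirm against a generated file. -->
<root xmlns="http://www.informatica.com/Parameterization/1.0">
    <project name="SampleProject">
        <mapping name="m_LoadCustomers">
            <!-- User-defined parameters and their assigned values -->
            <parameter name="Param_SourceDir">/data/incoming</parameter>
            <parameter name="Param_BatchDate">2019-05-01</parameter>
        </mapping>
    </project>
</root>
```

Because the values live outside the mapping, you can point a run at a different source directory or batch date by editing this file rather than the mapping itself.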
You can create an Informatica REST web service that returns data to a web service client in JSON or XML format. The article explains how to define a REST web service in the Developer tool. The REST web service runs a mapping that returns hierarchical data in JSON format to a web service client browser.
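For example, a REST web service that runs such a mapping might return hierarchical data to the client as a JSON document shaped like the following (the resource and field names are hypothetical):

```json
{
  "customer": {
    "id": 1001,
    "name": "Acme Corp",
    "orders": [
      { "orderId": "A-501", "total": 250.00 },
      { "orderId": "A-502", "total": 99.95 }
    ]
  }
}
```

The nested "orders" array illustrates the hierarchical structure: one parent customer record with its child order records returned in a single response.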
You can enable users to log into Informatica web applications using single sign-on. This article explains how to configure single sign-on in an Informatica domain using Security Assertion Markup Language (SAML) v2.0 and the Citrix NetScaler 13.0 identity provider.
You can use a workflow to automate the creation of a cluster on supported cloud platforms. The workflow creates a cluster and runs mappings and other workflow tasks. When you include a Delete Cluster task so that the cluster terminates after the workflow tasks are complete, the cluster is known as an ephemeral cluster.
When you create a Microsoft SQL Server connection, you can use the OLE DB or ODBC provider types. If required, you can migrate the OLE DB provider type to the ODBC provider type. This article explains how to migrate Microsoft SQL Server connections from the OLE DB provider type to the ODBC provider type.
An operating system profile is a security mechanism that the Data Integration Service uses to run mappings as a specific operating system user. Use operating system profiles to increase security and to isolate the run-time environment for users.
You can run SQL queries against a relational database midstream in a mapping. This article describes how to configure an SQL transformation in a logical data object mapping in the Developer tool.
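As a sketch, a midstream query in an SQL transformation typically binds input ports into the SQL statement; in the Developer tool this is commonly written with `?port?` notation. The table, column, and port names below are hypothetical:

```sql
-- Hypothetical midstream query for an SQL transformation.
-- ?Customer_Id? is bound to the transformation's input port,
-- so the query runs once per input row.
SELECT credit_limit,
       account_status
FROM   customer_accounts
WHERE  customer_id = ?Customer_Id?
```

The returned columns become output ports on the transformation, so downstream transformations in the mapping can consume the looked-up values row by row.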
You can take advantage of cloud computing efficiencies and power by deploying the Informatica Big Data Management solution in the Microsoft Azure environment. You can use a hybrid solution to offload or extend on-premises applications to the cloud. You can also use a lift-and-shift strategy to move an existing on-premises big data solution …
Follow the steps to install Python on each Data Integration Service machine so that the Spark engine can run the Python transformation. This article uses Python 3.6.5 on Cloudera CDH 6.1, but you can follow similar steps for other Python versions and other Hadoop distributions, such as Amazon EMR.
The Model repository is a relational database that contains metadata about connections, applications and workflows, transformations and functions, and other objects. You can perform several tasks to improve the performance of the Model repository and its interactions with other Informatica services, and with databases and external clients.
This article provides standardized naming conventions for repository objects. Naming conventions improve readability for anyone reviewing or carrying out maintenance on repository objects. The application and enforcement of naming standards establishes consistency in the repository and creates a developer-friendly environment. In addition, …
The Data Integration Service runs concurrent web service requests according to the properties that you configure on the Data Integration Service and the application properties that you configure for each web service object. When you optimize the properties that affect web service concurrency, you can improve performance.
You can tune the hardware and the Hadoop cluster for better performance of Informatica big data products. This article provides tuning recommendations for Hadoop administrators and system administrators who set up the Hadoop cluster and hardware for Informatica big data products.
Learn how the Informatica domain and application services in Data Engineering meet disaster recovery and high availability requirements.
The Informatica domain consists of one or more servers, one or more installations of the Informatica software, and at least one relational database. This article discusses how nodes work with the database, how nodes communicate with each other, what happens when a node fails, and basic troubleshooting for your domain.
You can enable users to log into Informatica web applications using single sign-on. This article explains how to configure single sign-on in an Informatica 10.2.x domain using Security Assertion Markup Language (SAML) and Microsoft Active Directory Federation Services (AD FS).
Effective in version 10.2.2, Informatica dropped support for the Hive engine. You can run mappings on the Blaze and Spark engines in the Hadoop environment or on the Databricks Spark engine in the Databricks environment. This article describes how to change the validation and run-time environments for mappings, and it describes processing …
You can enable users to log into the Administrator tool, the Analyst tool, and the Monitoring tool using single sign-on. This article explains how to configure single sign-on in an Informatica domain using Security Assertion Markup Language (SAML) and Microsoft Active Directory Federation Services (AD FS).
The monitoring Model repository is a relational database instance. The monitoring Model Repository Service monitors the Data Integration Service jobs, and stores the statistics in the monitoring Model repository. This article discusses the methods that you can use to improve monitoring Model repository performance.
Released May 2019
Updated January 2022