Table of Contents

  1. Preface
  2. Introduction to Informatica Data Engineering Integration
  3. Mappings
  4. Mapping Optimization
  5. Sources
  6. Targets
  7. Transformations
  8. Python Transformation
  9. Data Preview
  10. Cluster Workflows
  11. Profiles
  12. Monitoring
  13. Hierarchical Data Processing
  14. Hierarchical Data Processing Configuration
  15. Hierarchical Data Processing with Schema Changes
  16. Intelligent Structure Models
  17. Blockchain
  18. Stateful Computing
  19. Appendix A: Connections Reference
  20. Appendix B: Data Type Reference
  21. Appendix C: Function Reference

Step 1. Collect the Data

Identify the data sources from which you need to collect data.
Data Engineering Integration provides several ways to move data into and out of Hadoop based on data type, data volume, and data latency.
You can use PowerExchange adapters to connect to multiple data engineering sources. You can schedule batch loads that move data from multiple source systems to HDFS without staging the data. You can move changed data from relational and mainframe systems into HDFS or the Hive warehouse. For real-time feeds, you can move data off message queues and into HDFS, as in the sketch below.
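
For illustration only, the following Python sketch shows what a hand-rolled version of the last pattern, streaming messages off a queue directly into HDFS, might look like. Data Engineering Integration performs this through PowerExchange adapters rather than custom code; the broker address, topic name, NameNode URL, user, and HDFS path below are hypothetical, and the sketch assumes the open-source kafka-python and hdfs (HdfsCLI) packages.

    from kafka import KafkaConsumer   # assumes the kafka-python package
    from hdfs import InsecureClient   # assumes the hdfs (HdfsCLI) package

    # Hypothetical broker and topic; stop after 10 seconds with no new messages.
    consumer = KafkaConsumer(
        "sensor-events",
        bootstrap_servers="broker:9092",
        auto_offset_reset="earliest",
        consumer_timeout_ms=10000,
    )

    # Hypothetical WebHDFS endpoint and user.
    client = InsecureClient("http://namenode:9870", user="etl")

    # Stream each message straight into an HDFS file, one record per line,
    # with no local staging step.
    with client.write("/landing/sensor-events.jsonl", overwrite=True,
                      encoding="utf-8") as writer:
        for message in consumer:
            writer.write(message.value.decode("utf-8") + "\n")

In this pattern the consumer writes through to HDFS as messages arrive, which is the essential point of queue-based collection: the data never lands on a local disk between the source system and the Hadoop cluster.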
You can collect the following types of data:
  • Transactional
  • Interactive
  • Log file
  • Sensor device
  • Document and file
  • Industry format
