Deploy and Run the Workflow

After you complete the cluster workflow, deploy and run it.
You can monitor AWS and Azure cluster workflows on the web console. If you configured a log location in the Create Cluster task properties, view the logs at that location.
You can monitor Databricks cluster workflows in the Databricks workspace.
You can also monitor Data Integration Service jobs in the Administrator tool.
After the workflow begins to run, the task that provisions the cluster can take several minutes to complete.
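As a rough sketch, a deployed cluster workflow can be started from the command line with the infacmd utility that ships with the Informatica installation. The domain, service, application, and workflow names below (Domain_Main, DIS_1, App_ClusterWF, wf_Create_Cluster) are placeholders, and the command is echoed as a dry run rather than executed, so no domain is contacted:

```shell
# Hypothetical object names; replace with the names from your environment.
# infacmd wfs startWorkflow starts a workflow that is deployed as part of
# an application to a Data Integration Service.
CMD="infacmd.sh wfs startWorkflow \
  -DomainName Domain_Main \
  -ServiceName DIS_1 \
  -Application App_ClusterWF \
  -Workflow wf_Create_Cluster \
  -Wait true"

# Dry run: print the command instead of running it against a live domain.
echo "$CMD"
```

With -Wait set to true, the command blocks until the workflow completes, which is convenient when a scheduler or script needs the exit status; omit it to return immediately after the workflow starts.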


Updated September 28, 2020