CloudConnect Designer enables project developers to assemble, test, and deploy ETL processes and logical data models to their projects in the GoodData Platform.
CloudConnect projects are developed locally in the Java-based CloudConnect Designer application, which enables the development and testing of the following:
- ETL Graphs - For each data source that you want to integrate, develop an ETL graph to extract the data, transform it, and load it into the project. CloudConnect Designer enables rapid development of ETL graphs through a drag-and-drop interface.
ETL graphs can be tested locally in CloudConnect Designer before you publish them to your project.
- Logical Data Model - Each project requires a logical data model (LDM), which defines the attributes, facts, and datasets stored in the project and their relationships. Project developers can build logical data models in the LDM Modeler, an integrated component of CloudConnect Designer.
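To make the LDM concepts concrete, the sketch below models two datasets with attributes, facts, and a reference between them. The structure and all identifiers are hypothetical, purely for illustration; CloudConnect stores the LDM in its own format, and the LDM Modeler builds it visually.

```python
# Hypothetical, simplified representation of a logical data model:
# each dataset declares attributes (dimensional values), facts
# (numeric measures), and references to other datasets.
ldm = {
    "dataset.customers": {
        "attributes": ["attr.customers.id", "attr.customers.region"],
        "facts": [],
        "references": [],
    },
    "dataset.orders": {
        "attributes": ["attr.orders.id", "attr.orders.status"],
        "facts": ["fact.orders.amount"],
        # Each order belongs to a customer, so reports can slice
        # order amounts by customer region.
        "references": ["dataset.customers"],
    },
}

# A basic consistency check: every reference must point to a defined dataset.
for name, dataset in ldm.items():
    for ref in dataset["references"]:
        assert ref in ldm, f"{name} references unknown dataset {ref}"
```

The reference from `dataset.orders` to `dataset.customers` is the kind of relationship the platform uses both to generate the physical tables and to resolve report queries.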
From the CloudConnect Designer interface, you can publish your ETL graphs and data models to the project or projects of your choice.
When published, a logical data model is applied to an individual project in the GoodData Platform.
- When an LDM is applied to the project, the platform creates or modifies the physical data model, which defines the actual tables used to store the data in the datastore. In this manner, developers do not need to manage the technical details of building a BI data mart.
- The LDM is also used to process any query for report data issued from the project in the GoodData Portal.
Similarly, ETL graphs become processes in the GoodData Platform.
- Published processes can be scheduled for execution and monitored through the Data Integration Console, a web console accessible from the GoodData Portal. See Data Integration Console.
- ETL graphs can be deployed to other projects through CloudConnect Designer. You can also deploy them via API. See the API Reference.
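As a rough illustration of deploying a graph programmatically, the sketch below assembles a deployment request. The endpoint path, payload shape, and field names here are assumptions for illustration only; consult the API Reference for the actual contract and authentication requirements.

```python
import json

# Hypothetical sketch of deploying an ETL graph as a process via the API.
# The project identifier, endpoint, and payload fields are placeholders.
project_id = "my_project_id"
endpoint = f"/gdc/projects/{project_id}/dataload/processes"

payload = {
    "process": {
        "name": "Load Orders",   # display name shown in the console
        "type": "GRAPH",         # deploy a CloudConnect ETL graph
    }
}

body = json.dumps(payload)
# An HTTP client would POST `body`, together with the packaged graph,
# to `endpoint`; the response would identify the newly created process.
```

Once deployed this way, the process appears alongside those published from CloudConnect Designer and can be scheduled through the Data Integration Console.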
After the CloudConnect project has been deployed and its ETL processes have been scheduled, data is gathered from the configured Data Sources on that schedule. Within the platform, the data is transformed according to the ETL process definition and then loaded into the datastore based on the logical data model.