-
In this course, we will show you how to use KNIME software to test and implement a data transformation workflow, automate its deployment and enable subsequent data monitoring and maintenance.
We will look at a use case: creating a data pipeline to manage order data for a restaurant franchise company that receives data from different outlets. We will show how to deploy the data transformation workflow manually or automatically, and how to schedule and trigger the execution of data pipelines in a production environment.
In the first session of this course, you will learn how to prepare a data transformation workflow for deployment. In the second session, you will be introduced to KNIME Business Hub and learn how to deploy a data pipeline as a scheduled or triggered execution. In the third session, you will learn about the different types of data pipelines - ETL and ELT - and how to use the Continuous Deployment for Data Science (CDDS) extension framework to enable automated deployment on KNIME Business Hub. Finally, in the fourth session, you will learn best practices for productionizing data pipelines: the principles of data governance (quality, security, and cataloging), orchestration, and performance optimization.
-
Course Contents
-
- Preparing a Data Pipeline for Deployment
- Introduction to KNIME Business Hub
- ETL and ELT; Data Pipeline Validation and Deployment Automation
- Best Practices when Productionizing Data Pipelines
- Optional follow-up Q&A
Please note: This course consists of four 75-minute online sessions led by a KNIME data scientist. Each session comes with an exercise that you can do at home; the solution is reviewed together at the beginning of the next session. On day 5, the course ends with a 15-30 minute final session.
-
Target Group
-
You should be an advanced KNIME user.
-
Knowledge Prerequisites
-
You should already know how to create workflows and components with KNIME Analytics Platform. Knowledge and experience equivalent to our advanced KNIME Analytics Platform courses (L2 level) are recommended.
You should have the latest version of KNIME Analytics Platform installed on your laptop. You can download it here: knime.com/downloads
-
Online training
- Would you like to attend this course online? We offer online dates for this course topic. To attend, you need a PC with Internet access (minimum data rate of 1 Mbps), a headset when working via VoIP, and optionally a camera. For further information and technical recommendations, please refer to.
