

Building ETL and Data Pipelines with Bash, Airflow and Kafka

Description

This course provides you with practical skills to build and manage data pipelines and Extract, Transform, Load (ETL) processes using shell scripts, Airflow and Kafka.

Well-designed and automated data pipelines and ETL processes are the foundation of a successful Business Intelligence platform. Defining your data workflows, pipelines and processes early in the platform design ensures the right raw data is collected, transformed and loaded into the desired storage layers, and made available for processing and analysis when required.

This course is designed to provide you with the critical knowledge and skills needed by Data Engineers and Data Warehousing specialists to create and manage ETL, ELT, and data pipeline processes.
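To give a concrete sense of the kind of pipeline the course covers, here is a minimal illustrative sketch (not course material) of an Airflow DAG that chains extract, transform and load steps as Bash tasks. It assumes Airflow 2.x; the DAG name and script paths are placeholders.

```python
# A minimal ETL pipeline sketch: an Airflow DAG wiring extract, transform
# and load shell scripts together. Assumes Airflow 2.x; paths are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="example_etl_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # run the pipeline once a day
    catchup=False,
) as dag:
    # Extract: pull raw data from a source system into a staging area.
    extract = BashOperator(
        task_id="extract",
        bash_command="bash /opt/scripts/extract_raw_data.sh",
    )

    # Transform: clean and reshape the staged data.
    transform = BashOperator(
        task_id="transform",
        bash_command="bash /opt/scripts/transform_data.sh",
    )

    # Load: write the transformed data into the target storage layer.
    load = BashOperator(
        task_id="load",
        bash_command="bash /opt/scripts/load_to_warehouse.sh",
    )

    # Run the tasks strictly in ETL order.
    extract >> transform >> load
```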

This resource is provided by an affiliate partner. If you pay for training, we may earn a commission that supports this site.

Career relevance by data job role

The technologies and tools covered in Building ETL and Data Pipelines with Bash, Airflow and Kafka most closely match the requirements listed in Data Engineer job postings.

Similarity score (out of 100)

Learning sequence

Building ETL and Data Pipelines with Bash, Airflow and Kafka is part of a structured learning path.

DataKwery

17 Courses

Free Data Engineer