Install Airflow: 4 Easy Steps

Last Modified: September 6th, 2023


Apache Airflow is a tool for creating, organizing, and monitoring workflows. It is open source, so it is free to use and backed by a wide support community. It is one of the most trusted platforms for orchestrating workflows and is widely used and recommended by top data engineers. The tool provides detailed visualization of data pipelines and workflows, along with workflow status, logs, and code. The process to Install Airflow is not a trivial one, but it provides the following benefits.

Airflow provides a rich user interface that makes visualizing and monitoring pipelines a breeze. The interface also shows the status of each workflow, making it easy to troubleshoot issues when required. Airflow ships with many readily available connections to external sources, and it can send alerts when a transfer succeeds or when it fails. Because Airflow is a distributed, highly scalable system that connects to a variety of sources, it is flexible enough to orchestrate complex workflows and data pipelines efficiently.

This article provides a step-by-step guide to Install Airflow on your system.


What is Apache Airflow?


Apache Airflow is a workflow engine that helps in scheduling and running complex data pipelines. Airflow makes sure that every step of a pipeline executes in the predefined order and that each task gets the resources it requires.

Apache Airflow is a platform for monitoring, scheduling, and executing complex workflows. It is an open-source platform that is useful in designing workflow architectures, and it is one of the most powerful open-source data pipeline platforms currently on the market.

Airflow uses DAGs (Directed Acyclic Graphs) to structure and represent workflows, where the nodes of a DAG represent the tasks. The idea behind Airflow's design is that any data pipeline can be expressed as code, and Airflow soon became a platform where workflows can iterate quickly by taking advantage of this code-first approach.
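Airflow works this ordering out for you from the DAG's edges. As a standalone illustration (plain Python, no Airflow required, with hypothetical task names), here is a sketch of how a DAG's dependency edges determine a valid execution order:

```python
from collections import deque

# Hypothetical pipeline: each task lists the tasks it depends on.
dag = {
    "extract": [],
    "clean": ["extract"],
    "aggregate": ["clean"],
    "load": ["aggregate"],
    "report": ["load"],
}

def execution_order(dag):
    """Return the tasks in an order that respects every dependency (Kahn's algorithm)."""
    pending = {task: len(deps) for task, deps in dag.items()}   # unmet dependency counts
    downstream = {task: [] for task in dag}
    for task, deps in dag.items():
        for dep in deps:
            downstream[dep].append(task)
    ready = deque(t for t, n in pending.items() if n == 0)      # tasks with no dependencies
    order = []
    while ready:
        task = ready.popleft()
        order.append(task)
        for nxt in downstream[task]:                            # unlock downstream tasks
            pending[nxt] -= 1
            if pending[nxt] == 0:
                ready.append(nxt)
    if len(order) != len(dag):
        raise ValueError("cycle detected - not a valid DAG")
    return order

print(execution_order(dag))
# -> ['extract', 'clean', 'aggregate', 'load', 'report']
```

The "acyclic" part is what makes this work: if two tasks depended on each other, no valid order would exist, which is why Airflow rejects cyclic graphs.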

Features of Apache Airflow

  • Ease of use: Deploying Airflow is easy, as it requires just a little knowledge of Python.
  • Open Source: It is a free, open-source platform with a large and active user community.
  • Good Integrations: It has readily available integrations for working with platforms like Google Cloud, Amazon AWS, and many more.
  • Standard Python for coding: Relatively little knowledge of Python is enough to create complex workflows.
  • User Interface: Airflow’s UI helps in monitoring and managing workflows. It also provides a view of the status of tasks.
  • Dynamic: Any task that can be written in Python can be performed in Airflow, since it is based on Python itself.
  • Highly Scalable: Airflow allows the execution of thousands of different tasks per day. 
Simplify Data Analysis with Hevo’s No-code Data Pipeline

Hevo Data, a No-code Data Pipeline, helps load data from any data source such as databases, SaaS applications, cloud storage, SDKs, and streaming services, and simplifies the ETL process. It supports 100+ data sources (including 30+ free data sources) like Asana, and setup is a 3-step process: select the data source, provide valid credentials, and choose the destination. Hevo not only loads the data into the desired Data Warehouse/destination but also enriches the data and transforms it into an analysis-ready form without your having to write a single line of code.


Its completely automated pipeline delivers data in real time without any loss from source to destination. Its fault-tolerant and scalable architecture ensures that the data is handled in a secure, consistent manner with zero data loss, and it supports different forms of data. The solutions provided are consistent and work with different BI tools as well.

Check out why Hevo is the Best:

  • Secure: Hevo has a fault-tolerant architecture that ensures that the data is handled in a secure, consistent manner with zero data loss.
  • Schema Management: Hevo takes away the tedious task of schema management & automatically detects the schema of incoming data and maps it to the destination schema.
  • Minimal Learning: Hevo’s simple and interactive UI makes it extremely easy for new customers to get started and perform operations.
  • Hevo Is Built To Scale: As the number of sources and the volume of your data grows, Hevo scales horizontally, handling millions of records per minute with very little latency.
  • Incremental Data Load: Hevo allows the transfer of data that has been modified in real-time. This ensures efficient utilization of bandwidth on both ends.
  • Live Support: The Hevo team is available round the clock to extend exceptional support to its customers through chat, email, and support calls.
  • Live Monitoring: Hevo allows you to monitor the data flow and check where your data is at a particular point in time.

How to Install Airflow?

Apache Airflow helps in workflow management. Installing Airflow is a difficult task, but the benefits it provides often outweigh the difficulties, since it is mostly a one-time process.

The process to install Airflow takes 4 steps. Let us understand each step in detail.

1) Installing Ubuntu:

Ubuntu is a Linux operating system. It provides a more controlled environment with access to all the functionality of the system. On Windows, you can run Ubuntu through the Windows Subsystem for Linux (WSL).

  • Before installing Ubuntu, turn on Developer Mode on the Windows system.
  • Enable the Windows Subsystem for Linux option located in Windows Features.
  • Download the Visual C++ module.
  • Install Ubuntu from the Microsoft Store or use an ISO file, and start the installation process.
  • A terminal will open where you need to enter your username and password. The terminal doesn’t show the password that you type in.
Installing, this may take a few minutes...
Please create a default UNIX user account. The username does not need to match your Windows username.
For more information visit:
Enter new UNIX username: bull87
  • The bash command can be used if you closed the terminal after the last step and want to reopen it. Bash helps you communicate with the computer.
C:\Users\jacks>bash
To run a command as administrator (user "root"), use "sudo <command>". See "man sudo_root" for details.

2) Installing PIP

Pip is a tool designed to install and manage packages written for, and in, Python. Pip is required to download Apache Airflow. Run the following commands to implement this step:

sudo apt-get install software-properties-common
sudo apt-add-repository universe
sudo apt-get update
sudo apt-get install python-setuptools
sudo apt install python3-pip
sudo -H pip3 install --upgrade pip

3) Install Airflow Dependencies:

For Airflow to work properly, you need to install all of its dependencies. Without dependencies, Airflow cannot function to its full potential: many features would be missing, and you may even run into bugs. To avoid this, run the following commands and install all the dependencies.

sudo apt-get install libmysqlclient-dev
sudo apt-get install libssl-dev
sudo apt-get install libkrb5-dev

Airflow uses SQLite as its default database, so no separate database setup is needed to get started.

4) Install Airflow:

Run the following commands to finally install Airflow on your system.

export AIRFLOW_HOME=~/airflow
pip3 install apache-airflow
pip3 install typing_extensions

# initialize the database
airflow initdb

# start the web server, default port is 8080
airflow webserver -p 8080

# start the scheduler. I recommend opening up a separate terminal window for this step
airflow scheduler

# visit localhost:8080 in the browser and enable the example dag on the home page

After you execute the preceding commands, the process to install Airflow on your system is complete. You can now access and utilize the full potential of the Airflow tool.
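To try the installation end to end, you can drop a small DAG file of your own into $AIRFLOW_HOME/dags and watch it appear in the UI. The sketch below assumes the Airflow 1.10.x series, to match the airflow initdb command used above; the DAG id, task ids, and file name are hypothetical, and on Airflow 2.x the BashOperator import path is airflow.operators.bash instead.

```python
# Save as ~/airflow/dags/hello_pipeline.py (hypothetical file name).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator  # airflow.operators.bash on 2.x

default_args = {
    "owner": "airflow",
    "start_date": datetime(2023, 1, 1),  # earliest date the scheduler considers
}

# Tasks created inside this `with` block are attached to the DAG automatically.
with DAG("hello_pipeline", default_args=default_args, schedule_interval="@daily") as dag:
    extract = BashOperator(task_id="extract", bash_command='echo "extracting"')
    load = BashOperator(task_id="load", bash_command='echo "loading"')

    # >> declares the dependency: extract must finish before load starts.
    extract >> load
```

Once the scheduler picks the file up, the DAG shows up on the localhost:8080 home page, where you can toggle it on just like the bundled example DAGs.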


Airflow is one of the most powerful workflow management tools on the market. By utilizing its potential, companies are able to solve their problems and complete tasks on time and efficiently. This article gave a comprehensive overview of Airflow and then a step-by-step guide to installing Airflow in an easy manner.

Airflow is a trusted platform that a lot of companies use, as it is open source. But creating pipelines, installing them on the system, and monitoring them are all difficult on Airflow, as it is a completely code-driven platform that requires a lot of expertise to run properly. This issue can be solved by a platform that creates data pipelines without any code. An automated data pipeline can be used in place of fully manual solutions to reduce effort and attain maximum efficiency, and this is where Hevo comes into the picture. Hevo Data is a No-code Data Pipeline with 100+ pre-built integrations that you can choose from.

Visit our website to explore Hevo.

Hevo can help you integrate your data from numerous sources and load it into a destination to analyze real-time data with a BI tool such as Tableau. It will make your life easier and data migration hassle-free. It is user-friendly, reliable, and secure.

SIGN UP for a 14-day free trial and see the difference!

Share your experience of learning about the steps to Install Airflow in the comments section below.

Former Research Analyst, Hevo Data

Arsalan is a data science enthusiast with a keen interest in data analysis and architecture, and he enjoys writing highly technical content. He has written around 100 articles on various topics related to the data industry.
