How to Stop or Kill Airflow Tasks: 2 Easy Methods

Published: July 29, 2022


In today’s data-driven world, enterprises extensively use data pipelines to enable quick data exploration for essential business insights.

Pipelines built around extract, transform, and load (ETL) processes pose a challenge to businesses: a single error can lead to problems such as low data quality, reduced consumer confidence, and tedious maintenance.

To mitigate such challenges, companies can use tools like Apache Airflow to monitor, schedule, and stop data pipelines. 

In this article, you will learn how to stop or kill Airflow tasks via the user interface of Apache Airflow.


Prerequisites

Understanding of data pipelines and workflows.

What are DAGs?


DAGs, commonly known as directed acyclic graphs, are a well-established structure in computer science and mathematics. A DAG is a directed graph without any cycles, meaning it is impossible to start at one vertex, follow the directed edges, and arrive back at the same vertex.

Because every edge points in a single direction, you can only traverse the graph one way. DAGs are closely tied to topological sorting, which arranges the nodes of the graph so that every node appears after the nodes it depends on.

For example, a spreadsheet can be modeled as a DAG: each cell is a vertex, and an edge connects one cell to another whenever a formula in one cell references the other. Directed acyclic graphs are widely used in circuit design, Bayesian networks, compiler construction, and scheduling.
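As a small illustration of topological ordering, here is a sketch using Python's standard-library graphlib module (available since Python 3.9); the cell names and formulas are hypothetical:

from graphlib import TopologicalSorter  # Python 3.9+ standard library

# Hypothetical spreadsheet: each cell is a vertex, and an edge points from a
# referenced cell to the cell whose formula uses it.
graph = {
    "C1": {"A1", "B1"},  # C1 = A1 + B1, so C1 depends on A1 and B1
    "D1": {"C1"},        # D1 = C1 * 2, so D1 depends on C1
}

# A valid topological order guarantees every cell is computed only after the
# cells it references.
print(list(TopologicalSorter(graph).static_order()))
# Example output: ['A1', 'B1', 'C1', 'D1'] (the order of A1 and B1 may vary)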

In Apache Airflow, DAGs are used to plan, design, structure, and implement complex workflows. A DAG is a collection of the tasks you want to execute, organized in a way that makes the relationships and dependencies between those tasks explicit.

DAGs are defined in Python code and placed in the DAG_FOLDER. Apache Airflow executes the code in each file dynamically to build DAG objects. There is no upper limit; you can create as many DAGs as you want and attach an arbitrary number of tasks to them. Ideally, however, each DAG should represent a single logical data pipeline or workflow.
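For illustration, here is a minimal sketch of what such a DAG file might look like, assuming Airflow 2.x; the dag_id, task names, and commands are hypothetical:

# example_pipeline.py, placed in DAG_FOLDER (a minimal sketch, assuming Airflow 2.x)
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="example_pipeline",      # hypothetical name
    start_date=datetime(2022, 7, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(task_id="extract", bash_command="echo extracting")
    load = BashOperator(task_id="load", bash_command="echo loading")

    extract >> load  # load runs only after extract succeeds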

What are Airflow Tasks?

In Apache Airflow, a task is the basic unit of execution, and tasks are organized into DAGs. Upstream and downstream dependencies are set between them to define the order in which they should be executed.

An upstream task is one that must reach a particular state before a dependent task can be executed. A downstream task, in contrast, is one that cannot run until its upstream task reaches that state.

There is no limit to the number of tasks that can be added to a single DAG. Users can set an execution timeout for tasks, limit the number of concurrent runs of a particular DAG, and cap the maximum number of its tasks that run in parallel, as the sketch below shows.
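The following is a hedged sketch of how those limits might be expressed, assuming Airflow 2.2+ parameter names (older releases used concurrency on the DAG and task_concurrency on the task instead); the dag_id and task names are made up:

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="limited_pipeline",       # hypothetical name
    start_date=datetime(2022, 7, 1),
    schedule_interval="@hourly",
    catchup=False,
    max_active_runs=1,               # only one run of this DAG at a time
    max_active_tasks=4,              # at most four of its tasks in parallel
) as dag:
    for i in range(8):
        BashOperator(
            task_id=f"chunk_{i}",
            bash_command=f"echo processing chunk {i}",
            execution_timeout=timedelta(minutes=10),  # fail the task if it runs too long
        )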

Three types of Apache Airflow tasks exist: operators, sensors, and TaskFlow-decorated tasks. Operators are predefined task templates that can be combined to build the majority of your DAGs, while sensors are a special subclass of operators that wait for an external event or condition before completing.

A TaskFlow-decorated @task, on the other hand, is a custom Python function bundled up as a task. All of these tasks are subclasses of BaseOperator; operators and sensors are ready-made templates you can call directly in a DAG file.
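Here is a sketch that combines all three task types in one hypothetical DAG, assuming Airflow 2.x; the file path, dag_id, and task names are invented for illustration:

from datetime import datetime

from airflow import DAG
from airflow.decorators import task
from airflow.operators.bash import BashOperator
from airflow.sensors.filesystem import FileSensor

with DAG(
    dag_id="task_types_demo",        # hypothetical name
    start_date=datetime(2022, 7, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Sensor: waits for an external condition (here, a file landing on disk)
    wait_for_file = FileSensor(task_id="wait_for_file", filepath="/tmp/input.csv")

    # Operator: a predefined task template
    archive = BashOperator(task_id="archive", bash_command="echo archiving")

    # TaskFlow-decorated task: a plain Python function bundled as a task
    @task
    def transform():
        return "transformed"

    wait_for_file >> transform() >> archive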

Scale your data integration effortlessly with Hevo’s Fault-Tolerant No Code Data Pipeline

As the ability of businesses to collect data explodes, data teams have a crucial role to play in fueling data-driven decisions. Yet, they struggle to consolidate the scattered data in their warehouse to build a single source of truth. Broken pipelines, data quality issues, bugs and errors, and lack of control and visibility over the data flow make data integration a nightmare.

1000+ data teams rely on Hevo’s Data Pipeline Platform to integrate data from 150+ sources in a matter of minutes. Billions of data events from sources as varied as SaaS apps, databases, file storage, and streaming sources can be replicated in near real-time with Hevo’s fault-tolerant architecture. What’s more, Hevo puts complete control in the hands of data teams with intuitive dashboards for pipeline monitoring, auto-schema management, and custom ingestion/loading schedules.

All of this combined with transparent pricing and 24×7 support makes us the most loved data pipeline software on review sites.

Take our 14-day free trial to experience a better way to manage data pipelines.

Get started for Free with Hevo!

How to Stop or Kill Airflow Tasks?

Apache Airflow does not ingest or output data itself; it focuses on orchestrating the flow of work. You can manage workflow processes, trigger tasks, check their current status, and schedule tasks through code and visualizations, and keep your project and its related pipelines organized with Apache Airflow.

It is also helpful for systematically building parts of machine learning workflows: the same code can be reused across models or datasets to solve complex problems and extract meaningful insights.

Apache Airflow tasks are structured in the form of DAGs, but there are scenarios where you might need to stop or kill tasks. You can do so from the Airflow user interface using either of the following methods:

  • Using DAGs Screen
  • Setting the Airflow Task to a Failed State

Method 1: Using DAGs Screen

  • Go to the DAGs screen, where you can see the currently running tasks.
  • Click on the running icon under the Recent Tasks section.
  • Airflow will automatically run the search query with the appropriate filters for the selected DAG ID and state. You can also do this manually by opening Task Instances under the Browse tab.
  • The task will be displayed on the Task Instances screen.
  • Select the task you want to delete.

Method 2: Setting the Airflow Task to a Failed State

  • Select the task you want to kill.
  • Mark the task as “Failed”.
  • All the subsequent tasks will also be marked as failed.
  • Click on the Okay button.

Limitations

Ideally, tasks and DAGs should not be stopped or deleted in Apache Airflow, because once a task’s execution has started, stopping it cleanly is a tedious process. If you stop a DAG and clear the task from the UI, the tasks already running in the executor will not stop. When a task is in the running state, clicking CLEAR calls the job.kill() function on the task; this sets the task’s status to shut_down, which is shifted to up_for_retry immediately.

Conclusion

In this article, you learned the fundamentals of how to stop or kill Apache Airflow tasks. That said, Apache Airflow is not an ETL tool itself, but it can manage and organize ETL pipelines via DAGs to help enterprises with the different steps involved in batch jobs. With Apache Airflow, businesses can monitor how their workflows are built and what their current status is, giving them insight into areas they need to improve.

Visit our Website to Explore Hevo

Hevo Data will automate your data transfer process, allowing you to focus on other aspects of your business like analytics and customer management. Hevo provides a wide range of sources, 150+ data sources (including 40+ free sources), connects them with 15+ destinations, and loads your data for real-time analysis at transparent pricing, making data replication hassle-free.

Want to take Hevo for a ride? Sign Up for a 14-day free trial and simplify your Data Integration process. Do check out the pricing details to understand which plan fulfills all your business needs.

Share your experience of learning how to stop or kill Airflow DAG tasks in the comments section below! We would love to hear your thoughts.

Freelance Technical Content Writer, Hevo Data

Vidhi possesses a deep enthusiasm for data science, with a passion for writing about data, software architecture, and integration. She loves to solve business problems through tailored content for data teams.
