Passing a Dictionary to a Custom Operator in Apache Airflow

Introduction to the Topic

In this comprehensive guide, we will delve into the process of passing a dictionary to a custom operator in Apache Airflow. As you navigate through the world of Airflow, there comes a time when the standard operators fall short, and you need to customize your workflow. This customization often involves passing intricate data structures like dictionaries between tasks.

What You Will Learn

By following this tutorial, you will learn how to create a custom operator and effectively pass a dictionary as an argument. This knowledge is crucial for developing advanced workflows that demand dynamic inputs and intricate data handling.

Understanding the Problem and Solution

The challenge lies in ensuring that our custom operator not only accepts a dictionary as input but also utilizes it efficiently within its execution logic. The solution entails defining our custom operator to inherit from Airflow’s BaseOperator while implementing additional handling for dictionary-type arguments.

To implement this solution successfully, we will explore Python’s class inheritance concept and harness Airflow’s templating capabilities. This approach guarantees that our custom operator remains flexible and robust enough to handle diverse use cases involving dictionaries seamlessly.


from airflow.models import BaseOperator
# Note: the apply_defaults decorator (formerly imported from
# airflow.utils.decorators) is deprecated since Airflow 2.0 and is
# no longer needed.

class MyCustomOperator(BaseOperator):
    def __init__(self, my_dict: dict, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.my_dict = my_dict

    def execute(self, context):
        # Example use of the passed dictionary: log each key-value pair.
        for key, value in self.my_dict.items():
            self.log.info(f"{key}: {value}")


Our implementation begins by importing BaseOperator, the base class from Airflow's library for crafting custom operators. (Older examples also import the apply_defaults decorator from airflow.utils.decorators, but it is deprecated since Airflow 2.0 and no longer required.)

Subsequently, we define MyCustomOperator, where:

- The __init__ method accepts my_dict as an argument alongside any other args or kwargs, which are forwarded to BaseOperator.
- The execute method iterates over the dictionary and logs each key-value pair, showing how you can access and use your passed data within your task's logic.

This example illustrates how dynamic inputs can be seamlessly integrated into your workflows, enhancing their versatility across various scenarios requiring more than static configurations.

Frequently Asked Questions

  1. How do I pass multiple dictionaries to my custom operator?

  Modify the constructor (__init__) signature of your custom operator to accept additional parameters, one per dictionary.

  2. Can I use Jinja templating with my dictionaries?

  Yes. Ensure the relevant attributes are listed in your custom operator's template_fields; Airflow will then render Jinja expressions inside those dictionaries at runtime.

  3. Is there any limitation on what kind of data my dictionary can hold?

  Not particularly; however, complex objects may need special consideration if they are not serializable or do not play well with Jinja templating.

  4. How do I make sure my passed-in dictionary doesn't affect concurrency or parallelism negatively?

  Concurrency depends more on overall DAG design than on individual task inputs. Still, minimizing side effects during execution and avoiding changes to global state are good practices.

  5. What happens if I pass an empty dictionary?

  Design your operator to handle this case gracefully, either by skipping certain operations or by providing defaults where applicable.


Mastering the creation and use of custom operators significantly improves your ability to orchestrate complex workflows with Apache Airflow. By understanding techniques such as passing dynamic structures like dictionaries between tasks, you can tailor automation pipelines precisely to the needs at hand.
