The default Airflow worker pod resource requests and limits are as follows:
- CPU requests = 1
- CPU limits = No limit
- Memory requests = 2 Gi
- Memory limits = 2 Gi
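To make the units above concrete, the following sketch parses Kubernetes quantity strings such as "2Gi", "1024Mi", and "500m" into plain numbers. The parse_quantity helper is hypothetical, written here for illustration only; it is not part of Airflow or the Kubernetes client.

```python
def parse_quantity(quantity: str) -> float:
    """Convert a Kubernetes quantity string to a plain number.

    Binary suffixes (Ki, Mi, Gi) scale by powers of 1024; the "m"
    suffix means millis, i.e. one thousandth of a CPU core.
    Hypothetical helper, covering only the suffixes used on this page.
    """
    suffixes = {"Ki": 1024, "Mi": 1024**2, "Gi": 1024**3, "m": 1e-3}
    for suffix, factor in suffixes.items():
        if quantity.endswith(suffix):
            return float(quantity[: -len(suffix)]) * factor
    return float(quantity)

# "1024Mi" and "1Gi" are the same amount of memory, so a task that
# requests 1024Mi asks for exactly half of the 2 Gi worker default.
print(parse_quantity("1024Mi") == parse_quantity("1Gi"))  # True
print(parse_quantity("500m"))  # 0.5 (CPU cores)
```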
You can also set custom resource requests and limits for an individual Airflow task through executor_config, which applies when the task runs on the Kubernetes executor. For example, a sample DAG file is as follows:
from airflow import DAG
from airflow.operators.python import PythonOperator
from kubernetes.client import models as k8s
from pendulum import datetime

config_resource_requirements = {
    "pod_override": k8s.V1Pod(
        spec=k8s.V1PodSpec(
            containers=[
                k8s.V1Container(
                    # "base" is the name of the main task container
                    # in the Airflow worker pod template.
                    name="base",
                    resources=k8s.V1ResourceRequirements(
                        # Quantities are strings; "500m" means 0.5 CPU cores.
                        requests={"cpu": "500m", "memory": "1024Mi"},
                        limits={"cpu": "500m", "memory": "1024Mi"},
                    ),
                )
            ]
        )
    )
}

with DAG(
    dag_id="python_operator_custom_resources",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
):
    PythonOperator(
        task_id="hello_task",
        python_callable=lambda: print("Hello!"),
        executor_config=config_resource_requirements,
    )
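Conceptually, the fields you set in pod_override replace the matching fields of the default worker pod spec, while everything you leave out keeps its default. The sketch below illustrates that behavior with plain dictionaries; it is an assumption-laden illustration, not Airflow's actual pod-reconciliation code, and the merge_resources helper is hypothetical.

```python
# Default worker pod resources (from the list above) as a plain dict.
defaults = {
    "requests": {"cpu": "1", "memory": "2Gi"},
    "limits": {"memory": "2Gi"},  # no default CPU limit
}

# Per-task values supplied through executor_config's pod_override.
override = {
    "requests": {"cpu": "500m", "memory": "1024Mi"},
    "limits": {"cpu": "500m", "memory": "1024Mi"},
}

def merge_resources(defaults: dict, override: dict) -> dict:
    """Fields present in the override win; untouched defaults survive.

    Hypothetical illustration of override semantics only.
    """
    merged = {section: dict(values) for section, values in defaults.items()}
    for section, values in override.items():
        merged.setdefault(section, {}).update(values)
    return merged

effective = merge_resources(defaults, override)
print(effective["requests"]["cpu"])  # 500m
print(effective["limits"]["cpu"])    # 500m (limit added by the override)
```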