I am using the Airflow stable Helm chart with the Kubernetes Executor. A new pod gets scheduled for the DAG, but it fails with a "dag_id could not be found" error. I am using git-sync to fetch the DAGs. Below are the error and the Kubernetes config values. Can someone please help me resolve this issue?

Error:

```
[2020-07-01 23:18:36,939] {__init__.py:51} INFO - Using executor LocalExecutor
[2020-07-01 23:18:36,940] {dagbag.py:396} INFO - Filling up the DagBag from /opt/airflow/dags/dags/etl/sampledag_dag.py
Traceback (most recent call last):
  File "/home/airflow/.local/bin/airflow", line 37, in <module>
    args.func(args)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/cli.py", line 75, in wrapper
    return f(*args, **kwargs)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/bin/cli.py", line 523, in run
    dag = get_dag(args)
  File "/home/airflow/.local/lib/python3.6/site-packages/airflow/bin/cli.py", line 149, in get_dag
    'parse.'.format(args.dag_id))
airflow.exceptions.AirflowException: dag_id could not be found: sampledag . Either the dag did not exist or it failed to parse.
```

Config:

```
AIRFLOW__KUBERNETES__DELETE_WORKER_PODS: false
AIRFLOW__KUBERNETES__GIT_REPO: git@git.com/dags.git
AIRFLOW__KUBERNETES__GIT_BRANCH: master
AIRFLOW__KUBERNETES__GIT_DAGS_FOLDER_MOUNT_POINT: /dags
AIRFLOW__KUBERNETES__GIT_SSH_KEY_SECRET_NAME: git-secret
AIRFLOW__KUBERNETES__WORKER_CONTAINER_REPOSITORY: airflow-repo
AIRFLOW__KUBERNETES__WORKER_CONTAINER_TAG: tag
AIRFLOW__KUBERNETES__RUN_AS_USER: "50000"
```

sampledag:

```python
import datetime
import logging
import os

from airflow import models
from airflow.contrib.operators import kubernetes_pod_operator

args = {
    'owner': 'airflow'
}

YESTERDAY = datetime.datetime.now() - datetime.timedelta(days=1)

try:
    print("Entered try block")
    with models.DAG(
            dag_id='sampledag',
            schedule_interval=datetime.timedelta(days=1),
            start_date=YESTERDAY) as dag:
        print("Initialized dag")
        kubernetes_min_pod = kubernetes_pod_operator.KubernetesPodOperator(
            # The ID specified for the task.
            task_id='trigger-task',
            # Name of the task you want to run, used to generate the pod ID.
            name='trigger-name',
            namespace='scheduler',
            in_cluster=True,
            cmds=["./docker-run.sh"],
            is_delete_operator_pod=False,
            image='imagerepo:latest',
            image_pull_policy='Always',
            dag=dag)
        print("done")
except Exception as e:
    print(str(e))
    logging.error("Error at {}, error={}".format(__file__, str(e)))
    raise
```

Solution:

I had the same issue. I solved it by adding the following to my config:

```
AIRFLOW__KUBERNETES__DAGS_VOLUME_SUBPATH: repo/
```

What was happening is that the init container downloads your DAGs into `[AIRFLOW__KUBERNETES__GIT_DAGS_FOLDER_MOUNT_POINT]/[AIRFLOW__KUBERNETES__GIT_SYNC_DEST]`, and `AIRFLOW__KUBERNETES__GIT_SYNC_DEST` defaults to `repo` (https://airflow.apache.org/docs/stable/configurations-ref.html#git-sync-dest). Without the subpath, the worker pod looks for the DAG file directly under the mount point, never finds it, and fails with "dag_id could not be found".
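For reference, here is a sketch of how the git-sync settings line up once the fix is applied. It assumes the default `AIRFLOW__KUBERNETES__GIT_SYNC_DEST` of `repo` mentioned above; all other values are copied from the question's config:

```
# git-sync clones the repo into <GIT_DAGS_FOLDER_MOUNT_POINT>/<GIT_SYNC_DEST>,
# i.e. /dags/repo in this setup.
AIRFLOW__KUBERNETES__GIT_REPO: git@git.com/dags.git
AIRFLOW__KUBERNETES__GIT_BRANCH: master
AIRFLOW__KUBERNETES__GIT_DAGS_FOLDER_MOUNT_POINT: /dags
AIRFLOW__KUBERNETES__GIT_SSH_KEY_SECRET_NAME: git-secret
# Point the worker at the subdirectory git-sync actually writes to,
# so it loads DAGs from /dags/repo instead of /dags:
AIRFLOW__KUBERNETES__DAGS_VOLUME_SUBPATH: repo/
```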