I am running Airflow with Docker Compose.
My directory structure looks like this:
```
.
├── configs/
│   └── config.yaml
└── main.py
```
My main.py looks like
```python
from airflow import DAG
import dagfactory
import os

CUR_DIR = os.path.abspath(os.path.dirname(__file__))
print(CUR_DIR)

dag_factory = dagfactory.DagFactory(f"{CUR_DIR}/configs/config.yaml")

dag_factory.clean_dags(globals())
dag_factory.generate_dags(globals())
```
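As a side note, a small guard in front of the dag-factory call would fail early with a clear message if the config path were wrong. This is just a hypothetical sketch of such a check; `resolve_config` is my own helper, not part of dag-factory:

```python
import os

def resolve_config(base_dir: str) -> str:
    """Build the absolute path to configs/config.yaml under base_dir
    and verify it exists (hypothetical helper, not part of dag-factory)."""
    path = os.path.join(os.path.abspath(base_dir), "configs", "config.yaml")
    if not os.path.isfile(path):
        raise FileNotFoundError(f"config.yaml not found at {path}")
    return path

# In main.py this would be called as:
# config_path = resolve_config(os.path.dirname(__file__))
```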
And my config.yaml looks like
```yaml
example_dag1:
  default_args:
    owner: 'example_owner'
    start_date: 2018-01-01  # or '2 days'
    end_date: 2018-01-05
    retries: 1
    retry_delay_sec: 300
  schedule_interval: '0 3 * * *'
  concurrency: 1
  max_active_runs: 1
  dagrun_timeout_sec: 60
  default_view: 'tree'
  orientation: 'LR'
  description: 'this is an example dag!'
  tasks:
    task_1:
      operator: airflow.operators.bash_operator.BashOperator
      bash_command: 'echo 1'
    task_2:
      operator: airflow.operators.bash_operator.BashOperator
      bash_command: 'echo 2'
      dependencies: [task_1]
    task_3:
      operator: airflow.operators.bash_operator.BashOperator
      bash_command: 'echo 3'
      dependencies: [task_1]
```
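One detail worth noting about this config: PyYAML parses unquoted ISO dates such as `2018-01-01` into `datetime.date` objects rather than strings, which is why they may look different once loaded. A quick check (assuming PyYAML is installed):

```python
import datetime
import yaml  # PyYAML

# Unquoted ISO dates are resolved by YAML's implicit timestamp tag.
parsed = yaml.safe_load("start_date: 2018-01-01")
print(type(parsed["start_date"]))  # <class 'datetime.date'>
```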
In the Airflow UI I get the following error:
```
Broken DAG: [/opt/airflow/dags/main.py] Traceback (most recent call last):
  File "<frozen importlib._bootstrap_external>", line 859, in get_code
  File "<frozen importlib._bootstrap_external>", line 916, in get_data
FileNotFoundError: [Errno 2] No such file or directory: '/usr/local/airflow/dags/print_hello.py'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.7/site-packages/dagfactory/dagfactory.py", line 149, in clean_dags
    dags: Dict[str, Any] = self.build_dags()
  File "/home/airflow/.local/lib/python3.7/site-packages/dagfactory/dagfactory.py", line 116, in build_dags
    ) from err
Exception: Failed to generate dag example_dag1. verify config is correct
```
I am not sure what the issue is. Where is `/usr/local/airflow/dags/print_hello.py` coming from?
I changed the content of the Python file to:
```python
# from airflow import DAG
# import dagfactory
import os

CUR_DIR = os.path.abspath(os.path.dirname(__file__))
print(CUR_DIR)

# dag_factory = dagfactory.DagFactory(f"{CUR_DIR}/configs/config.yaml")
# dag_factory.clean_dags(globals())
# dag_factory.generate_dags(globals())

# Temporary: check if loading the config works with the same path.
with open(f"{CUR_DIR}/configs/config.yaml", "r") as stream:
    try:
        import yaml
        configYaml = yaml.safe_load(stream)
        print(configYaml)
    except yaml.YAMLError as exc:
        print(exc)
```
Then I executed the file from the Docker container's shell and got the expected output:
```
/opt/airflow/dags
{'example_dag1': {'default_args': {'owner': 'example_owner', 'start_date': datetime.date(2018, 1, 1), 'end_date': datetime.date(2018, 1, 5), 'retries': 1, 'retry_delay_sec': 300}, 'schedule_interval': '0 3 * * *', 'concurrency': 1, 'max_active_runs': 1, 'dagrun_timeout_sec': 60, 'default_view': 'tree', 'orientation': 'LR', 'description': 'this is an example dag!', 'tasks': {'task_1': {'operator': 'airflow.operators.bash_operator.BashOperator', 'bash_command': 'echo 1'}, 'task_2': {'operator': 'airflow.operators.bash_operator.BashOperator', 'bash_command': 'echo 2', 'dependencies': ['task_1']}, 'task_3': {'operator': 'airflow.operators.bash_operator.BashOperator', 'bash_command': 'echo 3', 'dependencies': ['task_1']}}}}
```
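One thing I notice in the traceback: the error references `/usr/local/airflow/dags/print_hello.py` even though my script prints `/opt/airflow/dags` as its own directory. To rule out a stale file or a leftover reference to `print_hello`, something like this from the container shell might help (just a guess at a debugging step, not a confirmed fix):

```shell
# Search the dags folder for anything that still mentions print_hello
grep -rn "print_hello" /opt/airflow/dags || echo "no references found"
```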