
python - Airflow BashOperator OSError: [Errno 2] No such file or directory


Tags: airflow-scheduler, python, permissions, airflow


I keep getting the same error from a scheduled BashOperator that is currently backfilling (it is more than a month "behind").

[2018-06-10 22:06:33,558] {base_task_runner.py:115} INFO - Running: ['bash', '-c', u'airflow run dag_name task_name 2018-03-14T00:00:00 --job_id 50 --raw -sd DAGS_FOLDER/dag_file.py']
Traceback (most recent call last):
  File "/anaconda/bin//airflow", line 27, in <module>
    args.func(args)
  File "/anaconda/lib/python2.7/site-packages/airflow/bin/cli.py", line 387, in run
    run_job.run()
  File "/anaconda/lib/python2.7/site-packages/airflow/jobs.py", line 198, in run
    self._execute()
  File "/anaconda/lib/python2.7/site-packages/airflow/jobs.py", line 2512, in _execute
    self.task_runner.start()
  File "/anaconda/lib/python2.7/site-packages/airflow/task_runner/bash_task_runner.py", line 29, in start
    self.process = self.run_command(['bash', '-c'], join_args=True)
  File "/anaconda/lib/python2.7/site-packages/airflow/task_runner/base_task_runner.py", line 120, in run_command
    universal_newlines=True
  File "/anaconda/lib/python2.7/subprocess.py", line 394, in __init__
    errread, errwrite)
  File "/anaconda/lib/python2.7/subprocess.py", line 1047, in _execute_child
    raise child_exception
OSError: [Errno 2] No such file or directory
[2018-06-10 22:06:33,633] {sequential_executor.py:47} ERROR - Failed to execute task Command 'airflow run dag_name task_name 2018-03-14T00:00:00 --local -sd /var/lib/airflow/dags/dag_file.py' returned non-zero exit status 1.

I remember seeing suggestions that this could be a permissions problem, but I can't figure out which permissions might be involved.

I'm using a systemd configuration and, at my wit's end, I have started running the airflow webserver and scheduler as root.

I can take the list from the first log line and feed those args verbatim to a subprocess.Popen instance in an ipython shell (just as it is done in airflow/task_runner/base_task_runner.py, minus the saved envs), and it not only runs, it correctly notifies the airflow db that the task is finished. I can do this as the airflow, root, or ubuntu user.
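For reference, a minimal sketch of that manual check (my reconstruction; the command string is copied from the log line above, and dag_name/task_name are the post's placeholders):

import subprocess

# Mirrors airflow/task_runner/base_task_runner.py: join_args=True joins the
# task command into one string that is handed to `bash -c`.
cmd = ['bash', '-c',
       'airflow run dag_name task_name 2018-03-14T00:00:00 '
       '--job_id 50 --raw -sd DAGS_FOLDER/dag_file.py']
proc = subprocess.Popen(cmd, universal_newlines=True)
proc.wait()  # exits 0 from an interactive shell where PATH resolves bash and airflow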

I have added /anaconda/bin to the PATH in .bashrc for the airflow, root, and ubuntu users as well as in /etc/bash.bashrc, along with the value of AIRFLOW_HOME, which is also set in my env file /etc/airflow.

Here is my systemd unit:

[Unit]
Description=Airflow scheduler daemon
After=network.target postgresql.service mysql.service redis.service rabbitmq-server.service
Wants=postgresql.service mysql.service redis.service rabbitmq-server.service    

[Service]
EnvironmentFile=/etc/airflow
User=root
Group=root
Type=simple
ExecStart=/anaconda/bin/airflow scheduler
Restart=always
RestartSec=5s    

[Install]
WantedBy=multi-user.target

My environment file:

PATH=$PATH:/anaconda/bin/
AIRFLOW_HOME=/var/lib/airflow
AIRFLOW_CONFIG=$AIRFLOW_HOME/airflow.cfg
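Worth noting (my own reading, not something the original post states): systemd does not perform shell expansion on EnvironmentFile values, so the scheduler's PATH would be the literal string $PATH:/anaconda/bin/ and AIRFLOW_CONFIG literally $AIRFLOW_HOME/airflow.cfg. With such a PATH, Popen(['bash', '-c', ...]) cannot locate bash, which would produce exactly this OSError. A minimal reproduction sketch:

import os
import subprocess

# Simulate the literal, unexpanded PATH a systemd EnvironmentFile produces.
env = dict(os.environ, PATH='$PATH:/anaconda/bin/')
try:
    # Popen with a list and shell=False searches only the entries of the
    # child's PATH for 'bash'; neither bogus entry contains it.
    subprocess.Popen(['bash', '-c', 'echo ok'], env=env)
except OSError as exc:
    print(exc)  # [Errno 2] No such file or directory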

I'm using apache-airflow==1.9.0 and eagerly seeking a solution. Thanks in advance.

airflow.cfg:

[core]
airflow_home = /var/lib/airflow
dags_folder = /var/lib/airflow/dags
base_log_folder = /var/lib/airflow/logs
remote_log_conn_id =
encrypt_s3_logs = False
logging_level = INFO
logging_config_class =
log_format = [%%(asctime)s] {%%(filename)s:%%(lineno)d} %%(levelname)s - %%(message)s
simple_log_format = %%(asctime)s %%(levelname)s - %%(message)s
executor = SequentialExecutor
sql_alchemy_conn = {actual value hidden}
sql_alchemy_pool_size = 5
sql_alchemy_pool_recycle = 3600
parallelism = 4
dag_concurrency = 2
dags_are_paused_at_creation = True
non_pooled_task_slot_count = 16
max_active_runs_per_dag = 1
load_examples = False
plugins_folder = /var/lib/airflow/plugins
fernet_key = {actual value hidden}
donot_pickle = False
dagbag_import_timeout = 30
task_runner = BashTaskRunner
default_impersonation =
security =
unit_test_mode = False
task_log_reader = file.task
enable_xcom_pickling = True
killed_task_cleanup_time = 60
[cli]
api_client = airflow.api.client.local_client
endpoint_url = http://localhost:8080
[api]
auth_backend = airflow.api.auth.backend.default
[operators]
default_owner = root
default_cpus = 1
default_ram = 512
default_disk = 512
default_gpus = 0
[webserver]
base_url = http://localhost:8080
web_server_host = 0.0.0.0
web_server_port = 8080
web_server_ssl_cert =
web_server_ssl_key =
web_server_worker_timeout = 120
worker_refresh_batch_size = 1
worker_refresh_interval = 60
secret_key = temporary_key
workers = 1
worker_class = sync
access_logfile = -
error_logfile = -
expose_config = False
authenticate = False
filter_by_owner = False
owner_mode = user
dag_default_view = tree
dag_orientation = LR
demo_mode = False
log_fetch_timeout_sec = 5
hide_paused_dags_by_default = False
page_size = 100
[email]
email_backend = airflow.utils.email.send_email_smtp
[smtp]
smtp_host = localhost
smtp_starttls = True
smtp_ssl = False
smtp_port = 25
smtp_mail_from = airflow@example.com
[celery]
...
[dask]
cluster_address = 127.0.0.1:8786
[scheduler]
job_heartbeat_sec = 120
scheduler_heartbeat_sec = 120
run_duration = -1
min_file_process_interval = 0
dag_dir_list_interval = 300
print_stats_interval = 300
child_process_log_directory = /var/lib/airflow/logs/scheduler
scheduler_zombie_task_threshold = 900
catchup_by_default = True
max_tis_per_query = 0
statsd_on = False
statsd_host = localhost
statsd_port = 8125
statsd_prefix = airflow
max_threads = 1
authenticate = False
[ldap]
...
[mesos]
...
[kerberos]
...
[github_enterprise]
...
[admin]
hide_sensitive_variable_fields = True

Adding ls -hal output:

root@ubuntu:/var/lib/airflow# ls -hal /var
total 52K
drwxr-xr-x 13 root root   4.0K Jun  3 11:58 .
root@ubuntu:/var/lib/airflow# ls -hal /var/lib
total 164K
drwxr-xr-x 42 root     root     4.0K Jun 10 19:00 .
root@ubuntu:/var/lib/airflow# ls -hal
total 40K
drwxr-xr-x  4 airflow airflow 4.0K Jun 11 06:41 .
drwxr-xr-x 42 root    root    4.0K Jun 10 19:00 ..
-rw-r--r--  1 airflow airflow  13K Jun 11 06:41 airflow.cfg
-rw-r--r--  1 airflow airflow  579 Jun 10 19:00 airflow.conf
drwxr-xr-x  2 airflow airflow 4.0K Jun 10 21:27 dags
drwxr-xr-x  4 airflow airflow 4.0K Jun 10 20:31 logs
-rw-r--r--  1 airflow airflow 1.7K Jun 10 19:00 unittests.cfg
root@ubuntu:/var/lib/airflow# ls -hal dags/
total 16K
drwxr-xr-x 2 airflow airflow 4.0K Jun 10 21:27 .
drwxr-xr-x 4 airflow airflow 4.0K Jun 11 06:41 ..
-rw-r--r-- 1 airflow airflow 3.4K Jun 10 21:26 dag_file.py
-rw-r--r-- 1 airflow airflow 1.7K Jun 10 21:27 dag_file.pyc

And the contents of dag_file.py:

import airflow
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from datetime import datetime, timedelta
default_args = {
    'owner': 'root',
    'run_as': 'root',
    'depends_on_past': True,
    'start_date': datetime(2018, 2, 20),
    'email': ['myemail@gmail.com'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
    'end_date': datetime(2018, 11, 15),
}
env = {
    'PSQL': '{obscured}',
    'PATH': '/anaconda/bin/:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin',
    'PWD': '/home/ubuntu/{obs1}/',
    'HOME': '/home/ubuntu',
    'PYTHONPATH': '/home/ubuntu/{obs1}',
}
dag = DAG(
    'dag_name',
    default_args=default_args,
    description='',
    schedule_interval=timedelta(days=1))
t1 = BashOperator(
    env=env,
    task_id='dag_file',
    bash_command='export PYTHONPATH=/home/ubuntu/{obs1} && /anaconda/bin/ipython $PYTHONPATH/{obs2}/{obs3}.py {{ ds }}',
    dag=dag)
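One detail that may matter here (my observation, not from the original post): when BashOperator is given an env dict, the child process receives only those variables, so the PATH entry above must be complete by itself. A common alternative, sketched below with the post's {obs1} placeholder kept, is to extend the current environment rather than replace it:

import os

# Inherit the scheduler's environment, then add task-specific variables.
env = dict(os.environ)
env.update({
    'PSQL': '{obscured}',
    'PYTHONPATH': '/home/ubuntu/{obs1}',
})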

I remind you that this runs correctly as airflow, root, and ubuntu: airflow run dag_name dag_file 2018-03-17T00:00:00 --job_id 55 --raw -sd DAGS_FOLDER/dag_file.py

Solution:

It looks like a Python version mismatch; edit .bashrc with the appropriate Python version and run:

source .bashrc

This will solve your problem.

In my case, we were using export PATH="/opt/miniconda3/bin":$PATH

Also, to check, this is how I do it:

/opt/miniconda3/bin/python /opt/miniconda3/bin/airflow 

That is how I used to run airflow.
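In the same spirit, a quick sanity check (my addition, not part of the original answer) to confirm which interpreter and binaries the running process actually resolves:

# Note: shutil.which is Python 3.3+; on the post's Python 2.7,
# distutils.spawn.find_executable is the rough equivalent.
import shutil
import sys

print(sys.executable)            # the interpreter running this script
print(shutil.which('airflow'))   # the airflow entry point found via PATH
print(shutil.which('bash'))      # if None, the BashTaskRunner cannot start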

Source: https://codeday.me/bug/20190910/1799817.html
