
Re: Console logging from `airflow test`?


Oops, forgot to mention: I'm on Python 3.6.4 and Airflow 1.10.0.
On Sat, Sep 1, 2018 at 4:01 PM Dan Fuchs <fuchsd@xxxxxxxxx> wrote:
>
> Hi Airflow folks,
>
> First, if this is not the place for usage questions, I apologize. I'm
> glad to ask in the appropriate place if someone would be kind enough
> to point me to it.
>
> I'm not sure whether this is a bug, but it's at least behavior I didn't
> expect. I'm going through the tutorial at
> https://airflow.apache.org/tutorial.html. I've got everything set up
> as it comes out-of-the-box: a sqlite db, the SequentialExecutor, and my
> AIRFLOW_HOME set to the default `~/airflow`. When I try to test the
> print_date task using the `airflow test` command, I see no output from
> the BashOperator logged to the console. This is all I get:
>
> ```
> [dfuchs@dfuchslaptop airflowtest2]$ airflow test tutorial print_date 2016-01-10
> [2018-09-01 15:56:17,574] {__init__.py:51} INFO - Using executor SequentialExecutor
> [2018-09-01 15:56:17,671] {models.py:258} INFO - Filling up the DagBag from /home/dfuchs/airflow/dags
> ```
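One possible explanation for the silence is plain Python logging mechanics: if the task's logger only has a file handler attached and doesn't propagate records up to a console handler, the terminal never sees anything. The following is a minimal, self-contained sketch of that behavior, not Airflow's actual logging setup (all names here are illustrative):

```python
import io
import logging

file_log = io.StringIO()   # stands in for the per-task log file
console = io.StringIO()    # stands in for the terminal

task_logger = logging.getLogger("demo.task")
task_logger.setLevel(logging.INFO)
task_logger.propagate = False              # records never reach the root logger
task_logger.addHandler(logging.StreamHandler(file_log))

# A console handler exists on the root logger, but propagation is off.
logging.getLogger().addHandler(logging.StreamHandler(console))

task_logger.info("Running command: date")
print("file log:", file_log.getvalue().strip())   # the record is here...
print("console :", repr(console.getvalue()))      # ...but the console saw nothing

# Workaround sketch: also attach a console handler to the task logger.
task_logger.addHandler(logging.StreamHandler(console))
task_logger.info("Command exited with return code 0")
```

If Airflow's task logger is configured this way by default, that would explain why `airflow run` produces a complete log file while `airflow test` prints nothing beyond the startup lines.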
>
> If I use `airflow run` instead, I do see all of the expected output in
> the generated log file (pasted at the end of this email).
>
> 1. Should I expect to see the task execution logs on the console when
> using `airflow test`? If so, I'm glad to file a bug.
> 2. If this is the expected default behavior, is there a recommended way
> to get the task execution logs to show up in the console with
> `airflow test`?
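On question 2, one route available in 1.10 is pointing `logging_config_class` in `airflow.cfg` at a modified copy of the default logging dict, with the console handler added to the task logger. A hedged sketch, assuming the 1.10 default config layout (verify the handler and logger names against your installed `airflow_local_settings.py` before relying on them):

```python
# log_config.py -- put this module somewhere on PYTHONPATH, then set
#   logging_config_class = log_config.LOGGING_CONFIG
# in airflow.cfg. Names below assume the Airflow 1.10 defaults.
from copy import deepcopy

from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG

LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)

# Send task records to the console handler in addition to the task file handler.
LOGGING_CONFIG["loggers"]["airflow.task"]["handlers"].append("console")
```

This is a configuration fragment, not a tested fix; whether it is the recommended approach for `airflow test` specifically is exactly what I'm asking.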
>
> Thanks,
> Dan
>
>
> `airflow run` logs:
> ```
> [dfuchs@dfuchslaptop airflowtest2]$ airflow run tutorial print_date 2016-01-10
> [2018-09-01 15:58:11,608] {__init__.py:51} INFO - Using executor SequentialExecutor
> [2018-09-01 15:58:11,709] {models.py:258} INFO - Filling up the DagBag from /home/dfuchs/airflow/dags
> [2018-09-01 15:58:12,098] {cli.py:492} INFO - Running <TaskInstance: tutorial.print_date 2016-01-10T00:00:00+00:00 [None]> on host dfuchslaptop
> [2018-09-01 15:58:12,566] {__init__.py:51} INFO - Using executor SequentialExecutor
> [2018-09-01 15:58:12,667] {models.py:258} INFO - Filling up the DagBag from /home/dfuchs/airflow/dags/tutorial.py
> [2018-09-01 15:58:13,061] {cli.py:492} INFO - Running <TaskInstance: tutorial.print_date 2016-01-10T00:00:00+00:00 [None]> on host dfuchslaptop
> [dfuchs@dfuchslaptop airflowtest2]$
> [dfuchs@dfuchslaptop airflowtest2]$
> [dfuchs@dfuchslaptop airflowtest2]$ cat ~/airflow/logs/tutorial/print_date/2016-01-10T00\:00\:00+00\:00/1.log
> [2018-09-01 15:58:12,098] {logging_mixin.py:95} INFO - Sending to executor.
>
> [2018-09-01 15:58:12,099] {logging_mixin.py:95} INFO - [2018-09-01 15:58:12,099] {base_executor.py:56} INFO - Adding to queue: airflow run tutorial print_date 2016-01-10T00:00:00+00:00 --local -sd DAGS_FOLDER/tutorial.py
>
> [2018-09-01 15:58:12,102] {logging_mixin.py:95} INFO - [2018-09-01 15:58:12,102] {sequential_executor.py:45} INFO - Executing command: airflow run tutorial print_date 2016-01-10T00:00:00+00:00 --local -sd DAGS_FOLDER/tutorial.py
>
> [2018-09-01 15:58:13,184] {models.py:1335} INFO - Dependencies all met for <TaskInstance: tutorial.print_date 2016-01-10T00:00:00+00:00 [None]>
> [2018-09-01 15:58:13,187] {models.py:1335} INFO - Dependencies all met for <TaskInstance: tutorial.print_date 2016-01-10T00:00:00+00:00 [None]>
> [2018-09-01 15:58:13,188] {models.py:1547} INFO -
> --------------------------------------------------------------------------------
> Starting attempt 1 of 2
> --------------------------------------------------------------------------------
>
> [2018-09-01 15:58:13,204] {models.py:1569} INFO - Executing <Task(BashOperator): print_date> on 2016-01-10T00:00:00+00:00
> [2018-09-01 15:58:13,205] {base_task_runner.py:124} INFO - Running: ['bash', '-c', 'airflow run tutorial print_date 2016-01-10T00:00:00+00:00 --job_id 12 --raw -sd DAGS_FOLDER/tutorial.py --cfg_path /tmp/tmp8yiad073']
> [2018-09-01 15:58:13,716] {base_task_runner.py:107} INFO - Job 12: Subtask print_date [2018-09-01 15:58:13,716] {__init__.py:51} INFO - Using executor SequentialExecutor
> [2018-09-01 15:58:13,810] {base_task_runner.py:107} INFO - Job 12: Subtask print_date [2018-09-01 15:58:13,810] {models.py:258} INFO - Filling up the DagBag from /home/dfuchs/airflow/dags/tutorial.py
> [2018-09-01 15:58:14,191] {base_task_runner.py:107} INFO - Job 12: Subtask print_date [2018-09-01 15:58:14,190] {cli.py:492} INFO - Running <TaskInstance: tutorial.print_date 2016-01-10T00:00:00+00:00 [running]> on host dfuchslaptop
> [2018-09-01 15:58:14,243] {bash_operator.py:74} INFO - Tmp dir root location: /tmp
> [2018-09-01 15:58:14,244] {bash_operator.py:87} INFO - Temporary script location: /tmp/airflowtmp13m26f1r/print_daten9v1cptf
> [2018-09-01 15:58:14,244] {bash_operator.py:97} INFO - Running command: date
> [2018-09-01 15:58:14,250] {bash_operator.py:106} INFO - Output:
> [2018-09-01 15:58:14,276] {bash_operator.py:110} INFO - Sat Sep  1 15:58:14 CDT 2018
> [2018-09-01 15:58:14,276] {bash_operator.py:114} INFO - Command exited with return code 0
> [2018-09-01 15:58:18,162] {logging_mixin.py:95} INFO - [2018-09-01 15:58:18,161] {jobs.py:2612} INFO - Task exited with return code 0
> ```