Dataset columns:

| Column | Type | Values / length |
|---|---|---|
| status | string | 1 class |
| repo_name | string | 13 classes |
| repo_url | string | 13 classes |
| issue_id | int64 | 1 to 104k |
| updated_files | string | length 11 to 1.76k |
| title | string | length 4 to 369 |
| body | string (nullable) | length 0 to 254k |
| issue_url | string | length 38 to 55 |
| pull_url | string | length 38 to 53 |
| before_fix_sha | string | length 40 |
| after_fix_sha | string | length 40 |
| report_datetime | timestamp[ns, tz=UTC] | |
| language | string | 5 classes |
| commit_datetime | timestamp[us, tz=UTC] | |

status | repo_name | repo_url | issue_id | updated_files | title | body | issue_url | pull_url | before_fix_sha | after_fix_sha | report_datetime | language | commit_datetime
---|---|---|---|---|---|---|---|---|---|---|---|---|---
closed | apache/airflow | https://github.com/apache/airflow | 12,028 | ["airflow/operators/email.py", "tests/operators/test_email.py"] | Add `files` to templated fields of `EmailOperator` | **Description**
Files are not part of the templated fields in the `EmailOperator` https://airflow.apache.org/docs/stable/_modules/airflow/operators/email_operator.html
While in fact file names should also be templated.
**Use case / motivation**
We want to store files according to the DAG's execution date and then send them to email recipients like so:
```
send_report_email = EmailOperator(
    ....
    files=[
        "/tmp/Report-A-{{ execution_date.strftime('%Y-%m-%d') }}.csv",
        "/tmp/Report-B-{{ execution_date.strftime('%Y-%m-%d') }}.csv",
    ],
)
```
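Until `files` is part of `template_fields` upstream, one possible workaround is a thin subclass. This is only a sketch, assuming the Airflow 1.10 import path (the class lives in `airflow.operators.email` on master):
```python
from airflow.operators.email_operator import EmailOperator


class TemplatedFilesEmailOperator(EmailOperator):
    # Extend the parent's templated fields so the 'files' argument is also
    # rendered with Jinja before the email is sent.
    template_fields = EmailOperator.template_fields + ('files',)
```
Used in place of `EmailOperator`, the Jinja expressions in the file paths above would then be rendered at run time.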
| https://github.com/apache/airflow/issues/12028 | https://github.com/apache/airflow/pull/12435 | 4873d9759dfdec1dd3663074f9e64ad69fa881cc | 9b9fe45f46455bdb7d3702ba4f4524574f11f75c | 2020-11-02T08:00:25Z | python | 2020-11-18T08:01:41Z |
closed | apache/airflow | https://github.com/apache/airflow | 12,001 | ["airflow/providers/snowflake/transfers/s3_to_snowflake.py", "tests/providers/snowflake/transfers/test_s3_to_snowflake.py"] | S3ToSnowflakeTransfer enforces usage of schema parameter | Unlike all other Snowflake operators, S3ToSnowflakeTransfer requires the schema parameter instead of treating it as an optional variable that, when passed, would be used and, when omitted, would fall back to the connection metadata.
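A minimal sketch of the requested behaviour follows; the class below is a hypothetical simplification, not the actual provider code, and only illustrates falling back to the connection's schema when none is passed:
```python
class S3ToSnowflakeTransferSketch:
    """Hypothetical, simplified stand-in for the real operator."""

    def __init__(self, table, stage, file_format, schema=None):
        self.table = table
        self.stage = stage
        self.file_format = file_format
        self.schema = schema  # optional; None means "use the connection default"

    def destination(self):
        # Only qualify the table when a schema was explicitly provided.
        return f"{self.schema}.{self.table}" if self.schema else self.table
```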
I can be assigned on this if needed. Keeping this up here so this doesn't go unnoticed. | https://github.com/apache/airflow/issues/12001 | https://github.com/apache/airflow/pull/15817 | caddbf3aa04096033502d7da7eafaf830737a9b9 | 6f956dc99b6c6393f7b50e9da9f778b5cf0bef88 | 2020-10-31T16:03:14Z | python | 2021-05-13T11:48:17Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,989 | [".pre-commit-config.yaml", "BREEZE.rst", "STATIC_CODE_CHECKS.rst", "breeze-complete", "docs/apache-airflow/index.rst", "docs/apache-airflow/migrations-ref.rst", "scripts/ci/pre_commit/pre_commit_migration_documented.py"] | Add Doc page containing list of Alembic DB Migrations | It would be nice to have a docs page on https://airflow.readthedocs.io/en/latest/ that lists all the alembic migrations in Airflow.
Example:
| Revision ID | Revises ID | Airflow Version | Summary | Description |
|--------------|--------------|-----------------|-------------------------------------------------------------------------|---------------------------------------------------------------------------------|
| a66efa278eea | 952da73b5eff | 1.10.12 | Add Precision to `execution_date` in `RenderedTaskInstanceFields` table | This only affects MySQL. The `execution_date` column was missing time precision |
| da3f683c3a5a | a66efa278eea | 1.10.12 | Add `dag_hash` Column to `serialized_dag` table | Creates a new table `task_reschedule` for Sensors |
This makes it easier when upgrading and helpful in general to check the DB changes.
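Such a reference table could plausibly be generated from the migration files themselves rather than maintained by hand; a rough sketch using Alembic's scripting API (the config path and output format here are assumptions):
```python
from alembic.config import Config
from alembic.script import ScriptDirectory

config = Config("alembic.ini")  # assumes Airflow's migration config is on this path
scripts = ScriptDirectory.from_config(config)

for rev in scripts.walk_revisions():
    # Each revision file carries its ids and a one-line docstring summary.
    summary = rev.doc.splitlines()[0] if rev.doc else ""
    print(f"| {rev.revision} | {rev.down_revision} | {summary} |")
```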
**List of all DB Migrations (until Airflow 1.10.12)**:
```
INFO [alembic.runtime.migration] Context impl SQLiteImpl.
INFO [alembic.runtime.migration] Will assume non-transactional DDL.
INFO [alembic.runtime.migration] Running upgrade -> e3a246e0dc1, current schema
INFO [alembic.runtime.migration] Running upgrade e3a246e0dc1 -> 1507a7289a2f, create is_encrypted
INFO [alembic.runtime.migration] Running upgrade 1507a7289a2f -> 13eb55f81627, maintain history for compatibility with earlier migrations
INFO [alembic.runtime.migration] Running upgrade 13eb55f81627 -> 338e90f54d61, More logging into task_instance
INFO [alembic.runtime.migration] Running upgrade 338e90f54d61 -> 52d714495f0, job_id indices
INFO [alembic.runtime.migration] Running upgrade 52d714495f0 -> 502898887f84, Adding extra to Log
INFO [alembic.runtime.migration] Running upgrade 502898887f84 -> 1b38cef5b76e, add dagrun
INFO [alembic.runtime.migration] Running upgrade 1b38cef5b76e -> 2e541a1dcfed, task_duration
INFO [alembic.runtime.migration] Running upgrade 2e541a1dcfed -> 40e67319e3a9, dagrun_config
INFO [alembic.runtime.migration] Running upgrade 40e67319e3a9 -> 561833c1c74b, add password column to user
INFO [alembic.runtime.migration] Running upgrade 561833c1c74b -> 4446e08588, dagrun start end
INFO [alembic.runtime.migration] Running upgrade 4446e08588 -> bbc73705a13e, Add notification_sent column to sla_miss
INFO [alembic.runtime.migration] Running upgrade bbc73705a13e -> bba5a7cfc896, Add a column to track the encryption state of the 'Extra' field in connection
INFO [alembic.runtime.migration] Running upgrade bba5a7cfc896 -> 1968acfc09e3, add is_encrypted column to variable table
INFO [alembic.runtime.migration] Running upgrade 1968acfc09e3 -> 2e82aab8ef20, rename user table
INFO [alembic.runtime.migration] Running upgrade 2e82aab8ef20 -> 211e584da130, add TI state index
INFO [alembic.runtime.migration] Running upgrade 211e584da130 -> 64de9cddf6c9, add task fails journal table
INFO [alembic.runtime.migration] Running upgrade 64de9cddf6c9 -> f2ca10b85618, add dag_stats table
INFO [alembic.runtime.migration] Running upgrade f2ca10b85618 -> 4addfa1236f1, Add fractional seconds to mysql tables
INFO [alembic.runtime.migration] Running upgrade 4addfa1236f1 -> 8504051e801b, xcom dag task indices
INFO [alembic.runtime.migration] Running upgrade 8504051e801b -> 5e7d17757c7a, add pid field to TaskInstance
INFO [alembic.runtime.migration] Running upgrade 5e7d17757c7a -> 127d2bf2dfa7, Add dag_id/state index on dag_run table
INFO [alembic.runtime.migration] Running upgrade 127d2bf2dfa7 -> cc1e65623dc7, add max tries column to task instance
INFO [alembic.runtime.migration] Running upgrade cc1e65623dc7 -> bdaa763e6c56, Make xcom value column a large binary
INFO [alembic.runtime.migration] Running upgrade bdaa763e6c56 -> 947454bf1dff, add ti job_id index
INFO [alembic.runtime.migration] Running upgrade 947454bf1dff -> d2ae31099d61, Increase text size for MySQL (not relevant for other DBs' text types)
INFO [alembic.runtime.migration] Running upgrade d2ae31099d61 -> 0e2a74e0fc9f, Add time zone awareness
INFO [alembic.runtime.migration] Running upgrade d2ae31099d61 -> 33ae817a1ff4, kubernetes_resource_checkpointing
INFO [alembic.runtime.migration] Running upgrade 33ae817a1ff4 -> 27c6a30d7c24, kubernetes_resource_checkpointing
INFO [alembic.runtime.migration] Running upgrade 27c6a30d7c24 -> 86770d1215c0, add kubernetes scheduler uniqueness
INFO [alembic.runtime.migration] Running upgrade 86770d1215c0, 0e2a74e0fc9f -> 05f30312d566, merge heads
INFO [alembic.runtime.migration] Running upgrade 05f30312d566 -> f23433877c24, fix mysql not null constraint
INFO [alembic.runtime.migration] Running upgrade f23433877c24 -> 856955da8476, fix sqlite foreign key
INFO [alembic.runtime.migration] Running upgrade 856955da8476 -> 9635ae0956e7, index-faskfail
INFO [alembic.runtime.migration] Running upgrade 9635ae0956e7 -> dd25f486b8ea, add idx_log_dag
INFO [alembic.runtime.migration] Running upgrade dd25f486b8ea -> bf00311e1990, add index to taskinstance
INFO [alembic.runtime.migration] Running upgrade 9635ae0956e7 -> 0a2a5b66e19d, add task_reschedule table
INFO [alembic.runtime.migration] Running upgrade 0a2a5b66e19d, bf00311e1990 -> 03bc53e68815, merge_heads_2
INFO [alembic.runtime.migration] Running upgrade 03bc53e68815 -> 41f5f12752f8, add superuser field
INFO [alembic.runtime.migration] Running upgrade 41f5f12752f8 -> c8ffec048a3b, add fields to dag
INFO [alembic.runtime.migration] Running upgrade c8ffec048a3b -> dd4ecb8fbee3, Add schedule interval to dag
INFO [alembic.runtime.migration] Running upgrade dd4ecb8fbee3 -> 939bb1e647c8, task reschedule fk on cascade delete
INFO [alembic.runtime.migration] Running upgrade 939bb1e647c8 -> 6e96a59344a4, Make TaskInstance.pool not nullable
INFO [alembic.runtime.migration] Running upgrade 6e96a59344a4 -> d38e04c12aa2, add serialized_dag table
INFO [alembic.runtime.migration] Running upgrade d38e04c12aa2 -> b3b105409875, add root_dag_id to DAG
INFO [alembic.runtime.migration] Running upgrade 6e96a59344a4 -> 74effc47d867, change datetime to datetime2(6) on MSSQL tables
INFO [alembic.runtime.migration] Running upgrade 939bb1e647c8 -> 004c1210f153, increase queue name size limit
INFO [alembic.runtime.migration] Running upgrade c8ffec048a3b -> a56c9515abdc, Remove dag_stat table
INFO [alembic.runtime.migration] Running upgrade a56c9515abdc, 004c1210f153, 74effc47d867, b3b105409875 -> 08364691d074, Merge the four heads back together
INFO [alembic.runtime.migration] Running upgrade 08364691d074 -> fe461863935f, increase_length_for_connection_password
INFO [alembic.runtime.migration] Running upgrade fe461863935f -> 7939bcff74ba, Add DagTags table
INFO [alembic.runtime.migration] Running upgrade 7939bcff74ba -> a4c2fd67d16b, add pool_slots field to task_instance
INFO [alembic.runtime.migration] Running upgrade a4c2fd67d16b -> 852ae6c715af, Add RenderedTaskInstanceFields table
INFO [alembic.runtime.migration] Running upgrade 852ae6c715af -> 952da73b5eff, add dag_code table
INFO [alembic.runtime.migration] Running upgrade 952da73b5eff -> a66efa278eea, Add Precision to execution_date in RenderedTaskInstanceFields table
INFO [alembic.runtime.migration] Running upgrade a66efa278eea -> da3f683c3a5a, Add dag_hash Column to serialized_dag table
```
| https://github.com/apache/airflow/issues/11989 | https://github.com/apache/airflow/pull/16181 | 04454d512eb3f117154868668ed036d5e7697040 | cefa46a87146f2658cd723e777669a48cdc1391a | 2020-10-30T22:52:44Z | python | 2021-06-03T17:20:26Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,968 | ["BREEZE.rst", "airflow/providers/microsoft/azure/hooks/wasb.py", "breeze", "scripts/docker/install_airflow.sh", "scripts/in_container/_in_container_utils.sh", "setup.py", "tests/providers/microsoft/azure/hooks/test_wasb.py", "tests/providers/microsoft/azure/log/test_wasb_task_handler.py"] | Upgrade azure-storage-blob to >=12 | **Description**
It would be nice if the azure-storage-blob dependency for apache-airflow-providers-microsoft-azure was upgraded to a more recent version than 2.1.0.
**Use case / motivation**
The version 12.4.0 will allow us to perform OAuth refresh token operations for long running tasks. We use this to authenticate against our datalake, and it's currently not possible with the version 2.1.0 that airflow uses.
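For illustration, this is the style of client the provider would have to target after the upgrade; the account URL below is a placeholder:
```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# DefaultAzureCredential refreshes OAuth tokens transparently, which is what
# long-running tasks need; the 2.1.0 SDK offers no comparable hook.
credential = DefaultAzureCredential()
client = BlobServiceClient(
    account_url="https://<storage-account>.blob.core.windows.net",
    credential=credential,
)
```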
**Related Issues**
The 12.x version is 100% incompatible with the 2.x versions. The API has changed drastically between the two versions, so all the airflow azure operators will need to be changed and fixed.
| https://github.com/apache/airflow/issues/11968 | https://github.com/apache/airflow/pull/12188 | a9ac2b040b64de1aa5d9c2b9def33334e36a8d22 | 94b1531230231c57610d720e59563ccd98e7ecb2 | 2020-10-30T13:15:18Z | python | 2021-01-23T12:52:13Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,967 | ["docs/static/exampleinclude.css"] | Docs: Integration documentation gets cut off | There are many wide tables in the documentation, for example here:
https://airflow.readthedocs.io/en/latest/operators-and-hooks-ref.html#id19
On a very wide browser the width is still fixed. It is not obvious that the table is cut off and I was very confused that I never found the operator reference documentation I was looking for.
<img width="1801" alt="Screenshot 2020-10-30 at 13 36 57" src="https://user-images.githubusercontent.com/72079612/97705839-fdd59300-1ab4-11eb-817f-62822d319474.png">
I would suggest either:
* refactoring the table into many separate tables either by provider or by column
* reformatting the tables into nested lists
* truncating the long token names that push the table wide
* choosing a theme that is responsive
I'm new to the project and docs so I don't want to force my formatting decisions, but I'm happy to make a PR if someone can suggest what to do. | https://github.com/apache/airflow/issues/11967 | https://github.com/apache/airflow/pull/12227 | cd85d01e703962e0c9b68b45f832f4b3f4649f1a | 75065ac35503c609b74bdd5085c9be911e7b6ce0 | 2020-10-30T12:42:36Z | python | 2020-11-10T09:33:27Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,965 | ["setup.py"] | Airflow fails to initdb with cattrs 1.1.0 |
**Apache Airflow version**:
1.10.12
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
N/A
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release): Centos 7
- **Kernel** (e.g. `uname -a`): 3.10.0-229.el7.x86_64
- **Install tools**: pip 9.0.3 from /usr/lib/python3.6/site-packages (python 3.6)
- **Others**:
**What happened**:
Following the instructions [here](http://airflow.apache.org/docs/stable/start.html), I encountered an issue at the `airflow initdb` stage:
```
[xxx@xxx ~]# airflow initdb
Traceback (most recent call last):
File "/usr/local/bin/airflow", line 26, in <module>
from airflow.bin.cli import CLIFactory
File "/usr/local/lib/python3.6/site-packages/airflow/bin/cli.py", line 94, in <module>
api_module = import_module(conf.get('cli', 'api_client')) # type: Any
File "/usr/lib64/python3.6/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "/usr/local/lib/python3.6/site-packages/airflow/api/client/local_client.py", line 24, in <module>
from airflow.api.common.experimental import delete_dag
File "/usr/local/lib/python3.6/site-packages/airflow/api/common/experimental/delete_dag.py", line 26, in <module>
from airflow.models.serialized_dag import SerializedDagModel
File "/usr/local/lib/python3.6/site-packages/airflow/models/serialized_dag.py", line 35, in <module>
from airflow.serialization.serialized_objects import SerializedDAG
File "/usr/local/lib/python3.6/site-packages/airflow/serialization/serialized_objects.py", line 28, in <module>
import cattr
File "/usr/local/lib/python3.6/site-packages/cattr/__init__.py", line 1, in <module>
from .converters import Converter, GenConverter, UnstructureStrategy
File "/usr/local/lib/python3.6/site-packages/cattr/converters.py", line 16, in <module>
from attr import fields, resolve_types
ImportError: cannot import name 'resolve_types'
```
**What you expected to happen**:
I expected the Airflow DB to be initialised as per the instructions.
**How to reproduce it**:
On a fresh installation, follow the Quick Start guide until the `airflow initdb` stage.
**Anything else we need to know**:
Investigation suggests that this was caused by a Python dependency; specifically `cattrs==1.1.0`, which was released yesterday (2020-10-29). Downgrading `cattrs` manually to 1.0.0 does fix the issue and allows the Airflow database to be initialised:
```
[xxx@xxx ~]# pip3 install cattrs==1.0.0
Collecting cattrs==1.0.0
Downloading https://files.pythonhosted.org/packages/17/5b/6afbdaeb066ecf8ca28d85851048103ac80bb169491a54a14bd39823c422/cattrs-1.0.0-py2.py3-none-any.whl
Requirement already satisfied: attrs>=17.3 in /usr/local/lib/python3.6/site-packages (from cattrs==1.0.0)
Installing collected packages: cattrs
Found existing installation: cattrs 1.1.0
Uninstalling cattrs-1.1.0:
Successfully uninstalled cattrs-1.1.0
Successfully installed cattrs-1.0.0
[xxx@xxx ~]# airflow initdb
DB: sqlite:////path/to/airflow.db
[2020-10-30 09:36:05,431] {db.py:378} INFO - Creating tables
INFO [alembic.runtime.migration] Context impl SQLiteImpl.
INFO [alembic.runtime.migration] Will assume non-transactional DDL.
INFO [alembic.runtime.migration] Running upgrade -> e3a246e0dc1, current schema
INFO [alembic.runtime.migration] Running upgrade e3a246e0dc1 -> 1507a7289a2f, create is_encrypted
/usr/local/lib/python3.6/site-packages/alembic/ddl/sqlite.py:44: UserWarning: Skipping unsupported ALTER for creation of implicit constraintPlease refer to the batch mode feature which allows for SQLite migrations using a copy-and-move strategy.
...
Done.
```
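Since `setup.py` is the file touched by the linked fix, the underlying change is presumably a version constraint; a hedged sketch of what such a pin could look like (the exact bound is an assumption):
```python
# Sketch of an install_requires entry in setup.py:
INSTALL_REQUIRES = [
    "cattrs>=1.0,<1.1.0",  # 1.1.0 imports attr.resolve_types, which older attrs releases lack
]
```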
| https://github.com/apache/airflow/issues/11965 | https://github.com/apache/airflow/pull/11969 | 9322f3e46c3a06d2e5b891e399cc054e9b76ae72 | 3ad037872e54ec617f1b2734781c61640c7528ca | 2020-10-30T09:48:15Z | python | 2020-10-30T15:14:49Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,943 | ["README.md"] | The README file in this repo has a bad link | The README file in this repo has a bad link
Status code [404:NotFound] - Link: https://cwiki.apache.org/confluence/display/AIRFLOW/Airflow+Links
This was found by a new experimental hobby project that I have just created: https://github.com/MrCull/GitHub-Repo-ReadMe-Dead-Link-Finder
If this has been in any way helpful then please consider giving the above Repo a Star.
| https://github.com/apache/airflow/issues/11943 | https://github.com/apache/airflow/pull/11945 | ba9c044d20ff784630a09eecc0a30029b0f5e199 | 96583e269419676d0d2aa1783a47a99ebdfd70d4 | 2020-10-29T13:31:15Z | python | 2020-10-29T16:26:58Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,937 | ["scripts/ci/libraries/_build_images.sh"] | Build automatically 2.0.0b and futher images in DockerHub automatically | The image should be automatically buld in Dockerhub after version tag for 2.* has been pushed automatically. | https://github.com/apache/airflow/issues/11937 | https://github.com/apache/airflow/pull/12050 | 577a41c203abfb41fb1157163cbac19102f764f5 | 5c199fbddfaf9f83915e84225313169a0486c3a6 | 2020-10-29T11:02:58Z | python | 2020-11-02T21:00:51Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,929 | ["airflow/example_dags/example_datetime_branch_operator.py", "airflow/operators/datetime_branch.py", "docs/apache-airflow/howto/operator/datetime_branch.rst", "docs/apache-airflow/howto/operator/index.rst", "tests/operators/test_datetime_branch.py"] | add DateTimeBranchOperator |
**Use case / motivation**
Airflow has `BranchSQLOperator` and `BranchPythonOperator`.
Airflow also has `DateTimeSensor`.
It would be very useful to have a `DateTimeBranchOperator` that allows branching the workflow based on a specific datetime.
Sometimes in DAGs you want to run a different set of tasks depending on specific dates or times.
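Until such an operator exists, this branching is typically emulated with `BranchPythonOperator`; the sketch below assumes Airflow 2.0-style imports and context injection, and all DAG/task names are illustrative:
```python
from airflow import DAG
from airflow.operators.dummy import DummyOperator
from airflow.operators.python import BranchPythonOperator
from airflow.utils.dates import days_ago

with DAG("datetime_branch_sketch", start_date=days_ago(1), schedule_interval="@daily") as dag:

    def _choose(execution_date, **_):
        # Weekend runs take one path, weekday runs the other.
        return "weekend_tasks" if execution_date.weekday() >= 5 else "weekday_tasks"

    branch = BranchPythonOperator(task_id="branch_on_datetime", python_callable=_choose)
    branch >> [DummyOperator(task_id="weekend_tasks"), DummyOperator(task_id="weekday_tasks")]
```
A dedicated `DateTimeBranchOperator` would make the datetime comparison declarative instead of hiding it in a callable.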
| https://github.com/apache/airflow/issues/11929 | https://github.com/apache/airflow/pull/11964 | 6851677a89294698cbdf9fa559bf9d12983c88e0 | 1e37a11e00c065e2dafa93dec9df5f024d0aabe5 | 2020-10-29T07:30:47Z | python | 2021-03-10T22:44:08Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,921 | ["airflow/providers/snowflake/example_dags/example_snowflake.py", "docs/howto/operator/index.rst", "docs/howto/operator/snowflake.rst", "docs/operators-and-hooks-ref.rst"] | Add how-to Guide for Snowflake operators | **Description**
A guide that describes how to use all the operators for Snowflake (https://github.com/apache/airflow/tree/master/airflow/providers/snowflake/operators) would be useful.
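For reference, the kind of minimal snippet such a guide would typically show (the connection id and SQL below are placeholders):
```python
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

run_query = SnowflakeOperator(
    task_id="run_query",
    snowflake_conn_id="snowflake_default",
    sql="SELECT CURRENT_TIMESTAMP;",
)
```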
Other guides are available:
https://airflow.readthedocs.io/en/latest/howto/operator/index.html
Source code for those guides are at:
https://github.com/apache/airflow/tree/master/docs/howto/operator
Are you wondering how to start contributing to this project? Start by reading our [contributor guide](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst)
Best regards,
Kaxil
| https://github.com/apache/airflow/issues/11921 | https://github.com/apache/airflow/pull/11975 | 21350aa3cf8952b605713257ae94e1ed648dd00b | d363adb6187e9cba1d965f424c95058fa933df1f | 2020-10-28T14:42:23Z | python | 2020-10-31T23:06:34Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,917 | ["airflow/providers/postgres/example_dags/__init__.py", "airflow/providers/postgres/example_dags/example_postgres.py", "airflow/providers/postgres/provider.yaml", "docs/apache-airflow-providers-postgres/index.rst", "docs/apache-airflow-providers-postgres/operators/postgres_operator_howto_guide.rst", "tests/providers/postgres/operators/test_postgres_system.py"] | Add how-to Guide for Postgres operators | **Description**
A guide that describes how to use all the operators for Postgres (https://github.com/apache/airflow/tree/master/airflow/providers/postgres) would be useful.
Other guides are available:
https://airflow.readthedocs.io/en/latest/howto/operator/index.html
Source code for those guides are at:
https://github.com/apache/airflow/tree/master/docs/howto/operator
Are you wondering how to start contributing to this project? Start by reading our [contributor guide](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst)
Best regards,
Kaxil
| https://github.com/apache/airflow/issues/11917 | https://github.com/apache/airflow/pull/13281 | e7dbed2ae701bd0eff228ac0dc4787c36f885e32 | 9c75ea3c14b71d2f96d997aeef68c764c7d2984c | 2020-10-28T14:04:25Z | python | 2021-01-06T21:49:25Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,916 | ["airflow/providers/databricks/example_dags/example_databricks.py", "docs/howto/operator/databricks.rst", "docs/howto/operator/index.rst", "docs/operators-and-hooks-ref.rst"] | Add how-to Guide for Databricks Operators | **Description**
A guide that describes how to use all the operators for Databricks (https://github.com/apache/airflow/tree/master/airflow/providers/databricks/operators) would be useful.
Other guides are available:
https://airflow.readthedocs.io/en/latest/howto/operator/index.html
Source code for those guides are at:
https://github.com/apache/airflow/tree/master/docs/howto/operator
Are you wondering how to start contributing to this project? Start by reading our [contributor guide](https://github.com/apache/airflow/blob/master/CONTRIBUTING.rst)
Best regards,
Kaxil
| https://github.com/apache/airflow/issues/11916 | https://github.com/apache/airflow/pull/12175 | 57b273a0b1b8af30ed017c2b24c498deb9010247 | 7e0d08e1f074871307f0eb9e9ae7a66f7ce67626 | 2020-10-28T14:01:30Z | python | 2020-11-09T11:26:35Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,901 | ["airflow/dag_processing/processor.py", "airflow/models/dag.py", "tests/dag_processing/test_processor.py", "tests/www/views/test_views_home.py"] | DAGs remain in the UI after renaming the dag_id in the same python file |
**Apache Airflow version**: 1.10.12
**What happened**:
When I rename the dag_id, the new dag_id shows up in the UI but the old dag_id does not disappear and remains in the UI.
**What you expected to happen**:
When Airflow [lists out the python files](https://github.com/apache/airflow/blob/1.10.12/airflow/utils/file.py#L105-L161) and tries to [deactivate the deleted dags during dag processing](https://github.com/apache/airflow/blob/1.10.12/airflow/utils/dag_processing.py#L951-L955), the old DAG's file location is still in the list of the alive DAG location because the new DAG is now defined in the old DAG's python file. Even when you manually set is_active to False in the metastore, the dag processing process will set it back to True.
This happens whether or not DAG serialization is enabled.
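A simplified sketch of that deactivation logic (not the actual Airflow source) makes the gap visible: a DAG row is only deactivated when its *file* disappears, so an old dag_id whose file still exists is never touched:
```python
from airflow.models import DagModel


def deactivate_deleted_dags(session, alive_dag_filelocs):
    # Simplified illustration: rows whose source file is still "alive" stay
    # active, even if no DAG with that dag_id is defined in the file any more.
    for dag_model in session.query(DagModel).filter(DagModel.is_active):
        if dag_model.fileloc not in alive_dag_filelocs:
            dag_model.is_active = False
```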
**How to reproduce it**:
I was able to reproduce it by following these steps:
1. Add a python file in the dags folder and initialize the DAG
2. Verify the dag shows up in the UI
3. Change the dag_id in the same file to something else
4. Verify the new dag shows up in the UI
5. Both the old DAG and the new DAG are visible on the UI as well as the metadata db.
```
➜ dags pwd
/Users/alan/projects/astro/dags
➜ dags ls
wow.py
```
```python
from datetime import datetime
from airflow.models import DAG
dag = DAG(
    dag_id='yay',
    schedule_interval='@once',
    start_date=datetime(2020, 1, 1),
    catchup=False
)
```
In the screenshot, both DAGs are marked as active but there is only one DAG defined in `wow.py`.
<img width="1502" alt="Screen Shot 2020-10-27 at 6 31 40 PM" src="https://user-images.githubusercontent.com/5952735/97380092-8cfb6480-1883-11eb-8315-1b386ffb00d8.png">
**Anything else we need to know**:
| https://github.com/apache/airflow/issues/11901 | https://github.com/apache/airflow/pull/17121 | dc94ee26653ee4d3446210520036cc1f0eecfd81 | e81f14b85e2609ce0f40081caa90c2a6af1d2c65 | 2020-10-28T01:41:18Z | python | 2021-09-18T19:52:54Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,874 | ["airflow/providers/google/cloud/transfers/mssql_to_gcs.py", "tests/providers/google/cloud/transfers/test_mssql_to_gcs.py"] | MSSQLToBigQuery incorrect type conversion bit-->integer | **Apache Airflow version**: 1.10.12
**Environment**: Centos 7 Host
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release): Ubuntu 20.04 Docker Container
- **Kernel** (e.g. `uname -a`): Linux c6c6e8230c17 3.10.0-1127.18.2.el7.x86_64 #1 SMP Sun Jul 26 15:27:06 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux
- **Install tools**: Docker Compose
- **Others** ?
**What happened**:
The schema file generated by MSSQLToGCSOperator specifies that a field defined as **bit** in MSSQL is defined as **INTEGER**. However, the JSON file defines the value as **true**/**false**. When attempting to import into BigQuery using the GCSOperatorToBigQuery operator, the job fails:
`Error while reading data, error message: JSON parsing error in row starting at position 0: Could not convert value 'boolean_value: false' to integer. Field: manager; Value: 0`
**What you expected to happen**:
Value should be defined as **BOOLEAN** in the schema.
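In other words, the generated schema entry for the BIT column would need to look roughly like this for the true/false values in the JSON rows to load (the field name matches the error message above and is otherwise illustrative):
```python
expected_field = {"name": "manager", "type": "BOOLEAN", "mode": "NULLABLE"}
```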
**How to reproduce it**:
Ensure `dbo.Customers` table has a **bit** column:
```
export_customers = MsSqlToGoogleCloudStorageOperator(
    task_id='export_customers',
    sql='SELECT * FROM dbo.Customers;',
    bucket='mssql-export',
    filename='data/customers/export.json',
    schema_filename='schemas/export.json',
    mssql_conn_id='mssql_default',
    google_cloud_storage_conn_id='google_cloud_default',
    dag=dag
)
``` | https://github.com/apache/airflow/issues/11874 | https://github.com/apache/airflow/pull/29902 | 5a632f78eb6e3dcd9dc808e73b74581806653a89 | 035ad26d79848c63049307a94c04a9a3916d8a38 | 2020-10-27T03:17:50Z | python | 2023-03-04T22:57:55Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,871 | ["setup.py"] | Airflow 2.0.0alpha2 - Webserver Fails because of werkzeug | With alpha2, without using the constraints https://github.com/apache/airflow/blob/constraints-master/constraints-3.6.txt, accessing the Webserver fails with
```
Something bad has happened.
Please consider letting us know by creating a bug report using GitHub.
Python version: 3.6.12
Airflow version: 2.0.0a2
Node: 599861bd62ac
-------------------------------------------------------------------------------
Traceback (most recent call last):
File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1953, in full_dispatch_request
return self.finalize_request(rv)
File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1970, in finalize_request
response = self.process_response(response)
File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 2269, in process_response
self.session_interface.save_session(self, ctx.session, response)
File "/usr/local/lib/python3.6/site-packages/flask/sessions.py", line 387, in save_session
samesite=samesite,
File "/usr/local/lib/python3.6/site-packages/werkzeug/wrappers/base_response.py", line 479, in set_cookie
samesite=samesite,
File "/usr/local/lib/python3.6/site-packages/werkzeug/http.py", line 1217, in dump_cookie
raise ValueError("SameSite must be 'Strict', 'Lax', or 'None'.")
ValueError: SameSite must be 'Strict', 'Lax', or 'None'.
``` | https://github.com/apache/airflow/issues/11871 | https://github.com/apache/airflow/pull/11872 | 6788428a8636e7ab15e556f8e48bcd82a1a5470e | afbdc422dafa987de31afb71185754a0408da5c7 | 2020-10-27T00:37:14Z | python | 2020-10-27T00:42:26Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,870 | ["airflow/decorators/__init__.py", "airflow/decorators/task_group.py", "airflow/example_dags/example_task_group_decorator.py", "airflow/utils/task_group.py", "docs/apache-airflow/concepts.rst", "docs/spelling_wordlist.txt", "tests/utils/test_task_group.py"] | Add @taskgroup decorator | **Description**
Similarly to `@dag` and `@task` decorators, we can add a decorator to easily generate `TaskGroup` instances.
**Use case / motivation**
This would be used for reusable pieces of DAGs. An example may be hyperparameter tuning of an ML pipeline: you may want to generate several sub-pipelines with different settings/parameters to be executed in parallel, and at the end pull the results and decide on the best parametrization.
```py
@task
def collect_dataset(...):
    pass

@task
def train_model(...):
    pass

@task_group
def movielens_model_pipeline(learning_rate: int, feature_list: list, dataset: XComArg):
    dataset = filter_dataset(dataset, feature_list)
    train_model(dataset, learning_rate)

@dag
def movielens_hpt(dataset_path: str, feature_list: list = ['year', 'director']):
    dataset = load_dataset(dataset_path)
    for i in range(0.1, 0.9, step=0.1):
        movielens_model_pipeline(i, feature_list, dataset)
    decide_best_model(...)
```
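A rough sketch of what such a decorator could do, wrapping the existing `TaskGroup` context manager (this is not the implementation that was eventually merged, and unique group ids per call are glossed over):
```python
import functools

from airflow.utils.task_group import TaskGroup


def task_group(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        # Run the decorated function inside a TaskGroup context so any tasks it
        # creates while a DAG is being built are grouped under the function's name.
        with TaskGroup(group_id=func.__name__):
            return func(*args, **kwargs)

    return wrapper
```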
**Related Issues**
https://github.com/apache/airflow/issues?q=is%3Aopen+is%3Aissue+label%3AAIP-31
| https://github.com/apache/airflow/issues/11870 | https://github.com/apache/airflow/pull/15034 | ee2d6c5c0fa9fdce8fd5163f6d5bf40f46fa4c3f | da897c926112c2bfdc8418e7235c9c7170649ae4 | 2020-10-26T23:51:33Z | python | 2021-04-01T16:45:58Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,858 | ["airflow/models/taskinstance.py", "tests/models/test_taskinstance.py"] | Xcom_pull results order is not deterministic | For XComs pushed from different tasks with the same execution_date and in the same DAG, the result of xcom_pull with the key set to None is not deterministic in its order.
**Apache Airflow version**: 2.0
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release):
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:
**What happened**:
When tasks in the same dag push Xcoms, the result of the xcom_pull comes in any order. The order is not deterministic.
**What you expected to happen**:
I expected the order to be deterministic.
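Until then, a caller-side workaround is to pull each XCom individually so the result order mirrors the requested task ids; this helper is only an illustration, not part of Airflow:
```python
def pull_in_task_order(ti, task_ids):
    # One xcom_pull per task id keeps the output order equal to the input order.
    return [ti.xcom_pull(task_ids=task_id) for task_id in task_ids]
```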
**How to reproduce it**:
Run the provided example_xcom dag in the repository multiple times
| https://github.com/apache/airflow/issues/11858 | https://github.com/apache/airflow/pull/12905 | 85dd092a8ee7c760c972a45c2cc94289d9155f03 | ff25bd6ffed53de651254f7c3e6d254e4191bfc7 | 2020-10-26T16:08:26Z | python | 2020-12-08T13:00:06Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,795 | ["airflow/www/static/css/dags.css", "airflow/www/static/css/main.css", "airflow/www/templates/airflow/dags.html", "airflow/www/webpack.config.js", "docs/img/dags.png"] | links wrapping on standard-width browser | on my macbook pro w/ chrome full width…
<img width="1423" alt="Screen Shot 2020-10-23 at 3 01 04 PM" src="https://user-images.githubusercontent.com/4283/97044480-2ce28680-1542-11eb-9a28-cea71e283678.png">
if i bump down to 90%...
<img width="1435" alt="Screen Shot 2020-10-23 at 3 02 08 PM" src="https://user-images.githubusercontent.com/4283/97044866-bf832580-1542-11eb-9e72-1660a52c49f2.png">
so i wonder if we should:
A) reduce font size, or
B) roll the links up under a gear-icon-dropdown (which would allow us to put words next to the icons, which isn't all bad)
| https://github.com/apache/airflow/issues/11795 | https://github.com/apache/airflow/pull/11866 | 164a7078b89e349cc7d23651542562309983f25e | b4b90da0feef6352555c2bb4cc8f913048090869 | 2020-10-23T19:14:22Z | python | 2020-10-29T21:48:58Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,787 | ["airflow/jobs/scheduler_job.py"] | Database locked issue | **Environment**:
Breeze Sqlite version
**What happened**:
With the current master version of Airflow there is a problem with locking the database when running default Airflow configuration in Sqlite in Breeze.
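For context, "database is locked" is SQLite's response to concurrent writers; a commonly suggested mitigation (shown here only as a sketch, and not the fix that was merged) is to give SQLite a busy timeout so writers wait instead of failing:
```python
from sqlalchemy import create_engine

# Assumed database path; the timeout makes SQLite retry for up to 30 seconds
# before raising "database is locked".
engine = create_engine(
    "sqlite:////root/airflow/airflow.db",
    connect_args={"timeout": 30},
)
```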
## How to reproduce
```./breeze start-airflow --load-example-dags```
Start running/clicking on DAGs; you will soon start getting errors:
```
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1952, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1821, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/usr/local/lib/python3.6/site-packages/flask/_compat.py", line 39, in reraise
raise value
File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1950, in full_dispatch_request
rv = self.dispatch_request()
File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1936, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/opt/airflow/airflow/www/decorators.py", line 117, in wrapper
return f(self, *args, **kwargs)
File "/usr/local/lib/python3.6/site-packages/flask_appbuilder/security/decorators.py", line 109, in wraps
return f(self, *args, **kwargs)
File "/opt/airflow/airflow/www/decorators.py", line 98, in view_func
return f(*args, **kwargs)
File "/opt/airflow/airflow/www/decorators.py", line 59, in wrapper
session.add(log)
File "/usr/local/lib/python3.6/contextlib.py", line 88, in __exit__
next(self.gen)
File "/opt/airflow/airflow/utils/session.py", line 31, in create_session
session.commit()
File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 1042, in commit
self.transaction.commit()
File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 504, in commit
self._prepare_impl()
File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 483, in _prepare_impl
self.session.flush()
File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 2523, in flush
self._flush(objects)
File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 2664, in _flush
transaction.rollback(_capture_exception=True)
File "/usr/local/lib/python3.6/site-packages/sqlalchemy/util/langhelpers.py", line 69, in __exit__
exc_value, with_traceback=exc_tb,
File "/usr/local/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
raise exception
File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/session.py", line 2624, in _flush
flush_context.execute()
File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/unitofwork.py", line 422, in execute
rec.execute(self)
File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/unitofwork.py", line 589, in execute
uow,
File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/persistence.py", line 245, in save_obj
insert,
File "/usr/local/lib/python3.6/site-packages/sqlalchemy/orm/persistence.py", line 1136, in _emit_insert_statements
statement, params
File "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1011, in execute
return meth(self, multiparams, params)
File "/usr/local/lib/python3.6/site-packages/sqlalchemy/sql/elements.py", line 298, in _execute_on_connection
return connection._execute_clauseelement(self, multiparams, params)
File "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1130, in _execute_clauseelement
distilled_params,
File "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1317, in _execute_context
e, statement, parameters, cursor, context
File "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1511, in _handle_dbapi_exception
sqlalchemy_exception, with_traceback=exc_info[2], from_=e
File "/usr/local/lib/python3.6/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
raise exception
File "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
cursor, statement, parameters, context
File "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/default.py", line 593, in do_execute
cursor.execute(statement, parameters)
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) database is locked
[SQL: INSERT INTO log (dttm, dag_id, task_id, event, execution_date, owner, extra) VALUES (?, ?, ?, ?, ?, ?, ?)]
[parameters: ('2020-10-23 16:35:08.015990', 'example_branch_operator', None, 'tree', None, 'admin', "[('dag_id', 'example_branch_operator')]")]
(Background on this error at: http://sqlalche.me/e/13/e3q8)
```
**Anything else we need to know**:
| https://github.com/apache/airflow/issues/11787 | https://github.com/apache/airflow/pull/11797 | 24e21528646e05d2776e01ee48554b4990670a20 | affee2938e413fcd3d73c6f09a00d9b207eae2d4 | 2020-10-23T16:39:12Z | python | 2020-10-23T21:21:03Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,721 | ["airflow/providers/google/cloud/hooks/dataflow.py", "airflow/providers/google/cloud/operators/dataflow.py", "tests/providers/google/cloud/hooks/test_dataflow.py", "tests/providers/google/cloud/operators/test_dataflow.py"] | Dataflow operators - add user possibility to define expected terminal state |
**Description**
Related to the discussion here: https://github.com/apache/airflow/pull/8553#discussion_r416136954
Allow the user to define the expected terminal state, e.g. `JOB_STATE_CANCELLED` could be treated not as a failure but as the expected terminal state.
| https://github.com/apache/airflow/issues/11721 | https://github.com/apache/airflow/pull/34217 | 25d463c3e33f8628e1bcbe4dc6924693ec141dc0 | 050a47add822cde6d9abcd609df59c98caae13b0 | 2020-10-21T14:35:58Z | python | 2023-09-11T10:55:54Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,717 | ["airflow/cli/commands/task_command.py", "tests/cli/commands/test_task_command.py"] | All task logging goes to the log for try_number 1 | **Apache Airflow version**: 2.0.0a1
**What happened**:
When a task fails on the first try, the log output for additional tries goes to the log for the first attempt.
**What you expected to happen**:
The logs should go to the correct log file. For the default configuration, the log filename template is `log_filename_template = {{ ti.dag_id }}/{{ ti.task_id }}/{{ ts }}/{{ try_number }}.log`, so additional numbered `.log` files should be created.
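For reference, the handler renders that template along these lines (a simplified sketch of the behaviour, not the actual `FileTaskHandler` code); if `try_number` is stale and still 1, every attempt's output ends up in `1.log`:
```python
from jinja2 import Template

FILENAME_TEMPLATE = "{{ ti.dag_id }}/{{ ti.task_id }}/{{ ts }}/{{ try_number }}.log"


def render_log_filename(ti, ts, try_number):
    return Template(FILENAME_TEMPLATE).render(ti=ti, ts=ts, try_number=try_number)
```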
**How to reproduce it**:
Create a test dag:
```
from datetime import timedelta
from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.utils.dates import days_ago
with DAG(
    dag_id="trynumber_demo",
    default_args={"start_date": days_ago(2), "retries": 1, "retry_delay": timedelta(0)},
    schedule_interval=None,
) as dag:
    def demo_task(ti=None):
        print("Running demo_task, try_number =", ti.try_number)
        if ti.try_number <= 1:
            raise ValueError("Shan't")

    task = PythonOperator(task_id="demo_task", python_callable=demo_task)
```
and trigger this dag:
```
$ airflow dags trigger trynumber_demo
```
then observe that `trynumber_demo/demo_task/<execution_date>/` only contains 1.log, which contains the full output for 2 runs:
```
[...]
--------------------------------------------------------------------------------
[2020-10-21 13:29:07,958] {taskinstance.py:1020} INFO - Starting attempt 1 of 2
[2020-10-21 13:29:07,959] {taskinstance.py:1021} INFO -
--------------------------------------------------------------------------------
[...]
[2020-10-21 13:29:08,163] {logging_mixin.py:110} INFO - Running demo_task, try_number = 1
[2020-10-21 13:29:08,164] {taskinstance.py:1348} ERROR - Shan't
Traceback (most recent call last):
[...]
ValueError: Shan't
[2020-10-21 13:29:08,168] {taskinstance.py:1392} INFO - Marking task as UP_FOR_RETRY. dag_id=trynumber_demo, task_id=demo_task, execution_date=20201021T122907, start_date=20201021T122907, end_date=20201021T122908
[...]
[2020-10-21 13:29:09,121] {taskinstance.py:1019} INFO -
--------------------------------------------------------------------------------
[2020-10-21 13:29:09,121] {taskinstance.py:1020} INFO - Starting attempt 2 of 2
[2020-10-21 13:29:09,121] {taskinstance.py:1021} INFO -
--------------------------------------------------------------------------------
[...]
[2020-10-21 13:29:09,333] {logging_mixin.py:110} INFO - Running demo_task, try_number = 2
[2020-10-21 13:29:09,334] {python.py:141} INFO - Done. Returned value was: None
[2020-10-21 13:29:09,355] {taskinstance.py:1143} INFO - Marking task as SUCCESS.dag_id=trynumber_demo, task_id=demo_task, execution_date=20201021T122907, start_date=20201021T122909, end_date=20201021T122909
[2020-10-21 13:29:09,404] {local_task_job.py:117} INFO - Task exited with return code 0
```
The `TaskInstance()` created for the run needs to first be refreshed from the database, before setting the logging context. | https://github.com/apache/airflow/issues/11717 | https://github.com/apache/airflow/pull/11723 | eba1d91b35e621c68fa57d41eb1eb069253d90c7 | 0eaa6887967f827c2b05688d56ede0b254254297 | 2020-10-21T12:38:38Z | python | 2020-10-22T10:31:52Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,706 | ["chart/templates/_helpers.yaml", "chart/templates/flower/flower-deployment.yaml", "chart/templates/secrets/flower-secret.yaml", "chart/tests/test_flower_authorization.py", "chart/values.schema.json", "chart/values.yaml"] | Authorization for Flower in the Helm Chart | Hello,
The documentation for Helm Chart does not describe how to set up [Flower authorization](https://airflow.readthedocs.io/en/latest/security/flower.html?highlight=flower#flower-authentication) with a password. It would be useful if such documentation was added. It is possible that this will require changes to the Helm Chart to pass the required secrets.
Best regards,
Kamil Breguła | https://github.com/apache/airflow/issues/11706 | https://github.com/apache/airflow/pull/11836 | e238b882a8568532829be80e96e54856d7a0018d | 644ac1b06019bcd1c1c540373051c31b766efccf | 2020-10-21T01:40:24Z | python | 2020-10-31T17:37:45Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,704 | ["airflow/config_templates/config.yml", "airflow/config_templates/default_airflow.cfg", "chart/templates/webserver/webserver-deployment.yaml", "chart/tests/test_git_sync_webserver.py", "chart/values.schema.json", "chart/values.yaml", "docs/apache-airflow/dag-serialization.rst", "docs/helm-chart/manage-dags-files.rst"] | Disable DAG sync when DAG serialization is enabled in the Helm Chart | Hello,
In order to optimize costs and to improve system security, I think it is worth adding the option to disable DAG synchronization in the web container if the user has enabled DAG serialization.
> With DAG Serialization we aim to decouple the webserver from DAG parsing which would make the Webserver very light-weight.
https://airflow.readthedocs.io/en/latest/dag-serialization.html
Best regards,
Kamil Breguła | https://github.com/apache/airflow/issues/11704 | https://github.com/apache/airflow/pull/15314 | 6f8ab9e6da9b7ccd898e91f0bdf5311c7f1b8336 | 30c6300c6b28554786245ddcd0da969be44979f7 | 2020-10-21T01:31:50Z | python | 2021-04-12T16:59:49Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,697 | ["airflow/cli/commands/celery_command.py", "airflow/executors/celery_executor.py", "tests/cli/commands/test_celery_command.py"] | Celery worker hostname no longer supports %h, %n & %d patterns | The recent [refactoring of the celery Worker process](https://github.com/apache/airflow/commit/02ce45cafec22a0a80257b2144d99ec8bb41c961) dropped the use of `celery.bin.worker.Worker()`, and switched to using the `celery.app.worker.Worker()` class more directly.
Unfortunately, it was the task of the former to format the [Celery hostname patterns](https://docs.celeryproject.org/en/latest/reference/cli.html?highlight=hostname#cmdoption-celery-worker-n), so this refactoring dropped support for hostname patterns (`%h`, `%n`, etc.). Can we re-introduce these? All that is needed is a few `celery.utils.nodenames` calls:
```python
from celery.utils.nodenames import default_nodename, host_format
# ...
'hostname': host_format(default_nodename(args.celery_hostname)),
``` | https://github.com/apache/airflow/issues/11697 | https://github.com/apache/airflow/pull/11698 | 3caa539092d3a4196083d1db829fa1ed7d83fa95 | 7ef0b3c929e9a45f9e0e91aded82c7254201aee2 | 2020-10-20T20:32:30Z | python | 2020-10-21T09:14:10Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,695 | ["Dockerfile"] | Master production docker image does not yet have providers installed. | For some reason despite #11490 the docker image built on DockerHub does not have providers folder. | https://github.com/apache/airflow/issues/11695 | https://github.com/apache/airflow/pull/11738 | 1da8379c913843834353b44861c62f332a461bdf | eba1d91b35e621c68fa57d41eb1eb069253d90c7 | 2020-10-20T16:26:50Z | python | 2020-10-22T09:02:14Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,676 | [".github/workflows/build-images-workflow-run.yml", ".github/workflows/ci.yml", "scripts/ci/tools/ci_free_space_on_ci.sh", "tests/cli/commands/test_jobs_command.py", "tests/jobs/test_scheduler_job.py", "tests/models/test_taskinstance.py", "tests/test_utils/asserts.py"] | Flaky scheduler test | We have flaky scheduler test:
https://github.com/apache/airflow/pull/11659/checks?check_run_id=1279791496#step:6:527
_______________ TestSchedulerJob.test_scheduler_task_start_date ________________
self = <tests.jobs.test_scheduler_job.TestSchedulerJob testMethod=test_scheduler_task_start_date>
def test_scheduler_task_start_date(self):
"""
Test that the scheduler respects task start dates that are different from DAG start dates
"""
dagbag = DagBag(dag_folder=os.path.join(settings.DAGS_FOLDER, "no_dags.py"), include_examples=False)
dag_id = 'test_task_start_date_scheduling'
dag = self.dagbag.get_dag(dag_id)
dag.is_paused_upon_creation = False
dagbag.bag_dag(dag=dag, root_dag=dag)
# Deactivate other dags in this file so the scheduler doesn't waste time processing them
other_dag = self.dagbag.get_dag('test_start_date_scheduling')
other_dag.is_paused_upon_creation = True
dagbag.bag_dag(dag=other_dag, root_dag=other_dag)
dagbag.sync_to_db()
scheduler = SchedulerJob(executor=self.null_exec,
subdir=dag.fileloc,
num_runs=2)
scheduler.run()
session = settings.Session()
tiq = session.query(TaskInstance).filter(TaskInstance.dag_id == dag_id)
ti1s = tiq.filter(TaskInstance.task_id == 'dummy1').all()
ti2s = tiq.filter(TaskInstance.task_id == 'dummy2').all()
self.assertEqual(len(ti1s), 0)
> self.assertEqual(len(ti2s), 2)
E AssertionError: 1 != 2 | https://github.com/apache/airflow/issues/11676 | https://github.com/apache/airflow/pull/14792 | 3f61df11e7e81abc0ac4495325ccb55cc1c88af4 | 45cf89ce51b203bdf4a2545c67449b67ac5e94f1 | 2020-10-20T09:35:46Z | python | 2021-03-18T13:01:10Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,666 | ["MANIFEST.in"] | PythonVirtualenvOperator fails to find python_virtualenv_script.jinja2 template |
**Apache Airflow version**: 2.0
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
**Environment**: Docker
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release): Ubuntu
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:
**What happened**:
The PythonVirtualenvOperator failed because it couldn't find the python_virtualenv_script.jinja2 template in the Jinja2 environment.
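One way to check whether the template actually shipped with the installed package (the loader is assumed to look for it next to `python_virtualenv.py`, as the traceback further below suggests):
```python
import os

import airflow.utils.python_virtualenv as pv

template_dir = os.path.dirname(pv.__file__)
print(sorted(os.listdir(template_dir)))  # python_virtualenv_script.jinja2 should be listed
```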
**What you expected to happen**:
I expected it to run without errors
**How to reproduce it**:
Run the provided example dag:
```
from airflow import DAG
from airflow.operators.python import PythonVirtualenvOperator
from airflow.utils.dates import days_ago
dag = DAG(
    dag_id='virtualenv_python_operator',
    default_args={"owner": "Airflow"},
    schedule_interval="@once",
    start_date=days_ago(2),
    tags=['example']
)

def callable_virtualenv():
    """
    Example function that will be performed in a virtual environment.
    Importing at the module level ensures that it will not attempt to import the
    library before it is installed.
    """
    from time import sleep
    from colorama import Back, Fore, Style

    print(Fore.RED + 'some red text')
    print(Back.GREEN + 'and with a green background')
    print(Style.DIM + 'and in dim text')
    print(Style.RESET_ALL)
    for _ in range(10):
        print(Style.DIM + 'Please wait...', flush=True)
        sleep(10)
    print('Finished')

virtualenv_task = PythonVirtualenvOperator(
    task_id="virtualenv_python",
    python_callable=callable_virtualenv,
    requirements=[
        "colorama==0.4.0"
    ],
    system_site_packages=False,
    dag=dag,
)
```
**Anything else we need to know**:
The above dag produces the error below in log:
```
[2020-10-19 19:15:03,694] {taskinstance.py:1348} ERROR - python_virtualenv_script.jinja2
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1087, in _run_raw_task
self._prepare_and_execute_task_with_callbacks(context, task)
File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1209, in _prepare_and_execute_task_with_callbacks
result = self._execute_task(context, task_copy)
File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1254, in _execute_task
result = task_copy.execute(context=context)
File "/usr/local/lib/python3.7/site-packages/airflow/operators/python.py", line 485, in execute
super().execute(context=serializable_context)
File "/usr/local/lib/python3.7/site-packages/airflow/operators/python.py", line 140, in execute
return_value = self.execute_callable()
File "/usr/local/lib/python3.7/site-packages/airflow/operators/python.py", line 514, in execute_callable
filename=script_filename
File "/usr/local/lib/python3.7/site-packages/airflow/utils/python_virtualenv.py", line 92, in write_python_script
template = template_env.get_template('python_virtualenv_script.jinja2')
File "/usr/local/lib/python3.7/site-packages/jinja2/environment.py", line 883, in get_template
return self._load_template(name, self.make_globals(globals))
File "/usr/local/lib/python3.7/site-packages/jinja2/environment.py", line 857, in _load_template
template = self.loader.load(self, name, globals)
File "/usr/local/lib/python3.7/site-packages/jinja2/loaders.py", line 115, in load
source, filename, uptodate = self.get_source(environment, name)
File "/usr/local/lib/python3.7/site-packages/jinja2/loaders.py", line 197, in get_source
raise TemplateNotFound(template)
jinja2.exceptions.TemplateNotFound: python_virtualenv_script.jinja2
```
| https://github.com/apache/airflow/issues/11666 | https://github.com/apache/airflow/pull/11677 | c568c8886a69e0276b26d0f7b2f13d8fa528d64a | 72b644b89941878eaa8284dacb62744a087e826d | 2020-10-19T20:23:06Z | python | 2020-10-20T12:01:05Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,632 | [".rat-excludes", "docs/exts/docs_build/dev_index_template.html.jinja2", "docs/exts/operators_and_hooks_ref-transfers.rst.jinja2", "docs/exts/operators_and_hooks_ref.rst.jinja2", "pyproject.toml", "scripts/ci/dockerfiles/krb5-kdc-server/Dockerfile", "scripts/ci/dockerfiles/krb5-kdc-server/kadm5.acl", "scripts/ci/dockerfiles/presto/Dockerfile"] | Review and clarify licence/notice information | While reviewig the fix for #11544 I realised we might need to update (or better automate) our LICENSING information.
We currently have a NOTICE file in Airflow and a mixture of a LICENSE file plus a licenses directory.
NOTICE file here:
```
Apache Airflow
Copyright 2016-2019 The Apache Software Foundation
This product includes software developed at The Apache Software
Foundation (http://www.apache.org/).
=======================================================================
hue:
-----
This product contains a modified portion of 'Hue' developed by Cloudera, Inc.
(https://github.com/cloudera/hue/).
* Copyright 2009-2017 Cloudera Inc.
python-slugify:
---------------
* Copyright (c) Val Neekman @ Neekware Inc. http://neekware.com
python-nvd3:
------------
* Copyright (c) 2013 Arezqui Belaid <[email protected]> and other contributors
```
But also, the LICENSE file mentions third-party components and MIT:
```
============================================================================
APACHE AIRFLOW SUBCOMPONENTS:
The Apache Airflow project contains subcomponents with separate copyright
notices and license terms. Your use of the source code for the these
subcomponents is subject to the terms and conditions of the following
licenses.
========================================================================
Third party Apache 2.0 licenses
========================================================================
The following components are provided under the Apache 2.0 License.
See project link for details. The text of each license is also included
at licenses/LICENSE-[project].txt.
(ALv2 License) hue v4.3.0 (https://github.com/cloudera/hue/)
(ALv2 License) jqclock v2.3.0 (https://github.com/JohnRDOrazio/jQuery-Clock-Plugin)
(ALv2 License) bootstrap3-typeahead v4.0.2 (https://github.com/bassjobsen/Bootstrap-3-Typeahead)
========================================================================
MIT licenses
========================================================================
The following components are provided under the MIT License. See project link for details.
The text of each license is also included at licenses/LICENSE-[project].txt.
(MIT License) jquery v3.4.1 (https://jquery.org/license/)
(MIT License) dagre-d3 v0.6.4 (https://github.com/cpettitt/dagre-d3)
(MIT License) bootstrap v3.2 (https://github.com/twbs/bootstrap/)
(MIT License) d3-tip v0.9.1 (https://github.com/Caged/d3-tip)
(MIT License) dataTables v1.10.20 (https://datatables.net)
(MIT License) Bootstrap Toggle v2.2.2 (http://www.bootstraptoggle.com)
(MIT License) normalize.css v3.0.2 (http://necolas.github.io/normalize.css/)
(MIT License) ElasticMock v1.3.2 (https://github.com/vrcmarcos/elasticmock)
(MIT License) MomentJS v2.24.0 (http://momentjs.com/)
(MIT License) moment-strftime v0.5.0 (https://github.com/benjaminoakes/moment-strftime)
(MIT License) python-slugify v4.0.0 (https://github.com/un33k/python-slugify)
(MIT License) python-nvd3 v0.15.0 (https://github.com/areski/python-nvd3)
(MIT License) eonasdan-bootstrap-datetimepicker v4.17.37 (https://github.com/eonasdan/bootstrap-datetimepicker/)
========================================================================
BSD 3-Clause licenses
========================================================================
The following components are provided under the BSD 3-Clause license. See project links for details.
The text of each license is also included at licenses/LICENSE-[project].txt.
(BSD 3 License) d3 v5.15.0 (https://d3js.org)
```
On top of that, we have a separate ``licenses`` directory where a number of license texts are placed:
```
LICENSE-bootstrap.txt
LICENSE-bootstrap3-typeahead.txt
LICENSE-d3-tip.txt
LICENSE-d3js.txt
LICENSE-dagre-d3.txt
LICENSE-datatables.txt
LICENSE-elasticmock.txt
LICENSE-eonasdan-bootstrap-datetimepicker.txt
LICENSE-flask-kerberos.txt
LICENSE-hue.txt
LICENSE-jqclock.txt
LICENSE-jquery.txt
LICENSE-moment.txt
LICENSE-moment-strftime.txt
LICENSE-normalize.txt
LICENSE-python-nvd3.txt
LICENSE-python-slugify.txt
```
I think we should just choose one way of reporting licences and (if possible) automate it.
| https://github.com/apache/airflow/issues/11632 | https://github.com/apache/airflow/pull/12922 | bfbd4bbb706d8c358f310c98470a613090d709ad | 738c9536773c41641f3792b422daeff711111bb4 | 2020-10-18T12:35:42Z | python | 2020-12-08T16:32:29Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,622 | ["airflow/cli/commands/celery_command.py", "airflow/executors/celery_executor.py", "tests/cli/commands/test_celery_command.py"] | airflow celery worker -D does not process tasks | **Apache Airflow version**:
2.0 (master)
**Environment**:
```
./breeze --python 3.7 --db-reset restart --integration redis
```
**What happened**:
Running `airflow celery worker -D` spawns a celery worker (visible in `ps -aux`), but the worker does not consume any tasks 👀
Worker logs show:
```
2020-10-17 22:57:20,845 INFO - Connected to redis://redis:6379/0
2020-10-17 22:57:20,858 INFO - mingle: searching for neighbors
2020-10-17 22:57:20,865 WARNING - consumer: Connection to broker lost. Trying to re-establish the connection...
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/celery/worker/consumer/consumer.py", line 318, in start
blueprint.start(self)
File "/usr/local/lib/python3.7/site-packages/celery/bootsteps.py", line 119, in start
step.start(parent)
File "/usr/local/lib/python3.7/site-packages/celery/worker/consumer/mingle.py", line 40, in start
self.sync(c)
File "/usr/local/lib/python3.7/site-packages/celery/worker/consumer/mingle.py", line 44, in sync
replies = self.send_hello(c)
File "/usr/local/lib/python3.7/site-packages/celery/worker/consumer/mingle.py", line 57, in send_hello
replies = inspect.hello(c.hostname, our_revoked._data) or {}
File "/usr/local/lib/python3.7/site-packages/celery/app/control.py", line 156, in hello
return self._request('hello', from_node=from_node, revoked=revoked)
File "/usr/local/lib/python3.7/site-packages/celery/app/control.py", line 106, in _request
pattern=self.pattern, matcher=self.matcher,
File "/usr/local/lib/python3.7/site-packages/celery/app/control.py", line 480, in broadcast
limit, callback, channel=channel,
File "/usr/local/lib/python3.7/site-packages/kombu/pidbox.py", line 352, in _broadcast
channel=chan)
File "/usr/local/lib/python3.7/site-packages/kombu/pidbox.py", line 391, in _collect
self.connection.drain_events(timeout=timeout)
File "/usr/local/lib/python3.7/site-packages/kombu/connection.py", line 324, in drain_events
return self.transport.drain_events(self.connection, **kwargs)
File "/usr/local/lib/python3.7/site-packages/kombu/transport/virtual/base.py", line 963, in drain_events
get(self._deliver, timeout=timeout)
File "/usr/local/lib/python3.7/site-packages/kombu/transport/redis.py", line 369, in get
self._register_BRPOP(channel)
File "/usr/local/lib/python3.7/site-packages/kombu/transport/redis.py", line 308, in _register_BRPOP
self._register(*ident)
File "/usr/local/lib/python3.7/site-packages/kombu/transport/redis.py", line 292, in _register
self.poller.register(sock, self.eventflags)
File "/usr/local/lib/python3.7/site-packages/kombu/utils/eventio.py", line 67, in register
self._epoll.register(fd, events)
OSError: [Errno 22] Invalid argument
2020-10-17 22:57:20,867 CRITICAL - Frequent restarts detected: RestartFreqExceeded('5 in 1s')
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/celery/worker/consumer/consumer.py", line 312, in start
self._restart_state.step()
File "/usr/local/lib/python3.7/site-packages/billiard/common.py", line 165, in step
raise self.RestartFreqExceeded("%r in %rs" % (R, self.maxT))
billiard.exceptions.RestartFreqExceeded: 5 in 1s
```
**What you expected to happen**:
I expect that the daemonized worker will consume Airflow tasks.
**How to reproduce it**:
```
./breeze --python 3.7 --db-reset restart --integration redis
airflow scheduler -D
airflow webserver -D -w 1
airflow celery worker -c 1 -D
```
Then trigger any DAG from the web UI.
**Anything else we need to know**:
Worked before: #11336
| https://github.com/apache/airflow/issues/11622 | https://github.com/apache/airflow/pull/11698 | 3caa539092d3a4196083d1db829fa1ed7d83fa95 | 7ef0b3c929e9a45f9e0e91aded82c7254201aee2 | 2020-10-17T21:39:21Z | python | 2020-10-21T09:14:10Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,618 | ["airflow/hooks/dbapi.py", "newsfragments/24570.feature.rst", "tests/hooks/test_dbapi.py", "tests/operators/test_sql.py"] | Support to turn off the sql echo at logging in dbapi_hook |
**Description**
A parameter such as `mute=True` could turn off echoing the SQL query in the logs.
**Use case / motivation**
It would be useful to be able to turn the echoing of the SQL query on or off.
First, if the query is very long, the log becomes unnecessarily heavy.
Also, sometimes we don't want the query to be recorded at all, for example because of privacy concerns.
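For reference, a minimal sketch of what such a flag could look like (names are illustrative and assumed, this is not the actual `DbApiHook` API):
```python
import logging

log = logging.getLogger(__name__)


def run_sql(cursor, sql, parameters=None, mute=False):
    """Illustrative wrapper: execute a statement, optionally without echoing it."""
    if mute:
        log.info("Executing SQL statement (query text suppressed)")
    else:
        log.info("Executing: %s", sql)
    cursor.execute(sql, parameters or ())
```
The point is simply that the `log.info(sql)` call becomes conditional on a user-supplied switch.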
| https://github.com/apache/airflow/issues/11618 | https://github.com/apache/airflow/pull/24570 | c1d621c7ce352cb900ff5fb7da214e1fbcf0a15f | 53284cf27260122ff0a56d397e677fb6ad667370 | 2020-10-17T15:33:29Z | python | 2022-07-04T20:46:34Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,591 | ["airflow/www/decorators.py", "tests/www/test_decorators.py"] | Unauthenticated access with RBAC to URL has_dag_access results lose redirection | <!--
**Apache Airflow version**: latest
**Environment**: Linux
**What happened**:
With RBAC enabled, unauthenticated access to a URL protected by the `has_dag_access` decorator loses the login redirection.
**What you expected to happen**:
Redirection should be maintained: the user can log in and then be redirected to the originally requested page.
Currently the `next` URL parameter is missing.
**How to reproduce it**:
Set up Airflow with RBAC enabled.
Have a user who hasn't logged in access a URL that uses the `has_dag_access` decorator, e.g.:
http://localhost/graph?dag_id=dag_id
The user will be sent to the login page (note that the `next=` argument required to redirect back to the original page is gone).
After logging in properly, the user is then sent to the wrong page.
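For context, this is the usual Flask pattern for preserving the originally requested page across a login redirect (a sketch of the expected behaviour, not the actual Airflow decorator code; the auth check is omitted for brevity):
```python
from flask import Flask, redirect, request, url_for

app = Flask(__name__)


@app.route("/login")
def login():
    return "login page"


@app.route("/graph")
def graph():
    # Remember where the user wanted to go, so the login view can send them
    # back after authentication. This is the `next=` argument that gets lost.
    return redirect(url_for("login", next=request.url))
```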
| https://github.com/apache/airflow/issues/11591 | https://github.com/apache/airflow/pull/11592 | 240c7d4d72aac8f6aab98f5913e8f54c4f1372ff | d7da4fca83e55c3d637f5704580d88453fe7fc24 | 2020-10-16T17:49:15Z | python | 2020-10-26T13:01:54Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,585 | [".github/workflows/ci.yml", "BREEZE.rst", "breeze", "dev/README.md", "scripts/in_container/run_prepare_provider_packages.sh"] | Pre-release packages for SVN should have "good" versions | When we prepare pre-release versions, they are not intended to be converted to final release versions, so there is no need to replace version number for them artificially,
For release candidates on the other hand, we should internally use the "final" version because those packages might be simply renamed to the final "production" versions. | https://github.com/apache/airflow/issues/11585 | https://github.com/apache/airflow/pull/11586 | 91898e8421fae65e1c2802ec24a6e0edd50229de | ae06ad01a20c249d98554ba63eb25af71c010cf5 | 2020-10-16T15:41:23Z | python | 2020-10-19T10:32:07Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,577 | ["provider_packages/setup_provider_packages.py"] | Provider dependencies do not work well with alphas. | The dependencies in Alphas are currently >= 2.0.0 and should be >=2.0.0a0 in order to work with Alphas in cases which are not PEP 440-compliant.
According to https://www.python.org/dev/peps/pep-0440/, >= 2.0.0 should also work with alpha/beta releases (a1/a2) but in some cases it does not (https://apache-airflow.slack.com/archives/C0146STM600/p1602774750041800)
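The behaviour can be illustrated with the `packaging` library (a small sketch under standard PEP 440 semantics, the version strings are just examples):
```python
from packaging.specifiers import SpecifierSet

pre_release = "2.0.0a1"

# Pre-releases are excluded by default, so an alpha does not satisfy ">=2.0.0" ...
print(SpecifierSet(">=2.0.0").contains(pre_release))    # False
# ... but a specifier that itself mentions a pre-release opts in to them.
print(SpecifierSet(">=2.0.0a0").contains(pre_release))  # True
```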
Changing to ">=2.0.0a0" should help. | https://github.com/apache/airflow/issues/11577 | https://github.com/apache/airflow/pull/11578 | 8865d14df4d58dd5f1a4d2ff81c77469959f175a | 6f0bc0d81fe7884f1f8a0e2e83d65c768ff2c716 | 2020-10-16T14:09:14Z | python | 2020-10-16T14:29:14Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,563 | ["airflow/www/templates/airflow/dags.html"] | Strange behavior around the edge of the circles | Edge is flickery + sometimes I can't click the number

| https://github.com/apache/airflow/issues/11563 | https://github.com/apache/airflow/pull/11786 | 73743f3421a956f82e77f9221379c1c06a4b40e0 | 36a4d6c1dc231a91d476210d2114703c23a4eafb | 2020-10-15T20:11:25Z | python | 2020-10-23T18:19:14Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,557 | ["airflow/cli/cli_parser.py", "airflow/cli/commands/celery_command.py", "tests/cli/commands/test_celery_command.py"] | Can't pass --without-mingle and --without-gossip to celery worker | <!--
**Apache Airflow version**: ANY
**What happened**:
I've an issue with airflow celery worker performance, and I think there are a place for optimization if airflow enables providing --without-gossip while creating celery worker https://github.com/apache/airflow/blob/87038ae42aff3ff81cba1111e418a1f411a0c7b1/airflow/bin/cli.py#L1271
https://docs.celeryproject.org/en/4.4.3/reference/celery.bin.worker.html?highlight=--without-gossip#cmdoption-celery-worker-without-gossip. In my case, I've a 200 vm and every vm running 2 queues, so 400 **workers exchanging useless message every second** which result in airflow worker is taking about 10% of cpu time (in 4 cores) and to compare, I've a standalone celery app that consume 0.1% of cpu time **after enabling --without-gossip --without-mingle --without-heartbeat**.
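For reference, the standalone benchmark above is started roughly like this (a sketch; `my_celery_app` is a placeholder module name, but the Celery 4.x worker flags are real):
```python
import subprocess

subprocess.run(
    [
        "celery", "worker",
        "-A", "my_celery_app",
        "--concurrency", "1",
        "--without-gossip",     # skip worker-to-worker event exchange
        "--without-mingle",     # skip the startup synchronization with other workers
        "--without-heartbeat",  # skip the broker heartbeat events
    ],
    check=True,
)
```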
**What you expected to happen**:
The celery worker should not take more than 2% of CPU time.
**How to reproduce it**:
Run 200 VMs with 2 celery workers each at the same time; you will see that each celery worker process takes ~10-15% of CPU time.
PLEASE NOTE:
Due to https://github.com/celery/celery/issues/2566, we can't provide --without-mingle and --without-gossip through configuration (config file).
| https://github.com/apache/airflow/issues/11557 | https://github.com/apache/airflow/pull/13880 | 390015d190fc7f6267f7a7c047b6947c5009b5a3 | b5aac82e1d6b4fb69bfc80dc867aa1ce193f03ed | 2020-10-15T18:11:50Z | python | 2021-03-30T21:13:17Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,547 | ["airflow/providers/google/cloud/transfers/postgres_to_gcs.py", "tests/providers/google/cloud/transfers/test_postgres_to_gcs.py"] | PostgresToGoogleCloudStorageOperator - Custom schema mapping | Version : 1.10.12
I used PostgresToGoogleCloudStorageOperator to export the data and the schema file as well. But I saw a column on Postgres was `TIMESTAMP without time zone` but in BigQuery the auto-create table (via `GoogleCloudStorageToBigQueryOperator`) used the JSON schema file and created the table. When I checked the BQ table the data type was `TIMESTAMP`.
For without timezone data, **`DATETIME`** would be the right choice. So can we manually MAP the data types during the schema file export? | https://github.com/apache/airflow/issues/11547 | https://github.com/apache/airflow/pull/22536 | 0c30564992a9250e294106e809123f0d5b1c2b78 | 388723950de9ca519108e0a8f6818f0fc0dd91d4 | 2020-10-15T14:09:15Z | python | 2022-03-27T09:10:52Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,543 | ["airflow/jobs/scheduler_job.py", "airflow/models/dagbag.py", "tests/jobs/test_scheduler_job.py", "tests/models/test_dagbag.py"] | Scheduler Deadlock in tests for MySQL 5.7 | We have the first example of scheduler deadlock in the willd for MySQL for the new scheduler:
https://github.com/apache/airflow/pull/8550/checks?check_run_id=1258193958#step:6:1439
Would be great to track it down and fix before the prod release:
**NOTE** For now the issue has been mitigated (following the MySQL documentation) by applying transaction restarts with exponential back-off in #12046. This should help with beta-testing of Airflow by the users and let us see whether the retries on deadlock have any negative effects (according to the documentation, they might if deadlocks happen frequently enough).
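The mitigation is essentially a retry loop of this shape (a simplified sketch, not the actual scheduler code):
```python
import random
import time

from sqlalchemy.exc import OperationalError


def run_with_db_retries(transaction, max_retries=3):
    """Simplified sketch: re-run a DB transaction when MySQL reports a deadlock."""
    for attempt in range(1, max_retries + 1):
        try:
            return transaction()
        except OperationalError as err:
            # MySQL deadlocks surface as error 1213 "Deadlock found when trying to get lock".
            if "Deadlock found" not in str(err) or attempt == max_retries:
                raise
            # Exponential back-off with a little jitter before restarting the transaction.
            time.sleep(0.1 * (2 ** attempt) + random.uniform(0, 0.1))
```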
Some more analysis might be performed between now and 2.0.0. It might result in deciding that this issue does not need anything beyond the current mitigation, or in some actual code changes that prevent the deadlocks.
| https://github.com/apache/airflow/issues/11543 | https://github.com/apache/airflow/pull/12046 | a1a1fc9f32940a8abbfc4a12d32321d75ac8268c | 2192010ee3acda045e266981b8d5e58e6ec6fb13 | 2020-10-15T10:25:14Z | python | 2020-11-02T18:15:17Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,513 | ["airflow/www/utils.py"] | WebUI: Action on selection in task instance list yields an error | **Apache Airflow version**: v2.0.0.dev0 (latest master)
**Environment**:
- **OS**: Ubuntu 18.04.4 LTS
- **Others**: Python 3.6.9
**What happened**:
Selecting a task in the **task instance list** (*http://localhost:8080/taskinstance/list/*) and **performing an Action** on it (e.g. *Set state to 'failed'*) yields an error.
Error message:
```
Something bad has happened.
Please consider letting us know by creating a bug report using Github.
Python version: 3.6.12
Airflow version: 2.0.0.dev0
Node: emma
-------------------------------------------------------------------------------
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.6/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/home/airflow/.local/lib/python3.6/site-packages/flask/app.py", line 1952, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/home/airflow/.local/lib/python3.6/site-packages/flask/app.py", line 1821, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/home/airflow/.local/lib/python3.6/site-packages/flask/_compat.py", line 39, in reraise
raise value
File "/home/airflow/.local/lib/python3.6/site-packages/flask/app.py", line 1950, in full_dispatch_request
rv = self.dispatch_request()
File "/home/airflow/.local/lib/python3.6/site-packages/flask/app.py", line 1936, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/home/airflow/.local/lib/python3.6/site-packages/flask_appbuilder/views.py", line 686, in action_post
self.datamodel.get(self._deserialize_pk_if_composite(pk)) for pk in pks
File "/home/airflow/.local/lib/python3.6/site-packages/flask_appbuilder/views.py", line 686, in <listcomp>
self.datamodel.get(self._deserialize_pk_if_composite(pk)) for pk in pks
File "/home/airflow/.local/lib/python3.6/site-packages/flask_appbuilder/models/sqla/interface.py", line 870, in get
query, _filters, select_columns=select_columns
File "/home/airflow/.local/lib/python3.6/site-packages/flask_appbuilder/models/sqla/interface.py", line 324, in apply_all
select_columns,
File "/home/airflow/.local/lib/python3.6/site-packages/flask_appbuilder/models/sqla/interface.py", line 272, in _apply_inner_all
query = self.apply_filters(query, inner_filters)
File "/home/airflow/.local/lib/python3.6/site-packages/flask_appbuilder/models/sqla/interface.py", line 162, in apply_filters
return filters.apply_all(query)
File "/home/airflow/.local/lib/python3.6/site-packages/flask_appbuilder/models/filters.py", line 295, in apply_all
query = flt.apply(query, value)
File "/home/airflow/.local/lib/python3.6/site-packages/flask_appbuilder/models/sqla/filters.py", line 137, in apply
query, field = get_field_setup_query(query, self.model, self.column_name)
File "/home/airflow/.local/lib/python3.6/site-packages/flask_appbuilder/models/sqla/filters.py", line 40, in get_field_setup_query
if not hasattr(model, column_name):
TypeError: hasattr(): attribute name must be string
```
**How to reproduce it**:
I wanted to take an Action on a task instance of a DAG with `schedule_interval=None`. I am attaching a minimal DAG file used for reproducing this error.
<details>
<summary>DAG file</summary>
```
from airflow import DAG
from datetime import timedelta, datetime
from airflow.operators.bash_operator import BashOperator

dag = DAG(
    'simple_dag',
    default_args={
        'owner': 'airflow',
        'depends_on_past': False,
        'retries': 0,
        'start_date': datetime(1970, 1, 1),
        'retry_delay': timedelta(seconds=30),
    },
    description='',
    schedule_interval=None,
    catchup=False,
)

t1 = BashOperator(
    task_id='task1',
    bash_command='echo 1',
    dag=dag
)
```
</details>
**Anything else we need to know**:
Taking the same Action on the DagRun list *(http://localhost:8080/dagrun/list)* works.
Great project btw 🙌. Really enjoying using it. | https://github.com/apache/airflow/issues/11513 | https://github.com/apache/airflow/pull/11753 | f603b36aa4a07bf98ebe3b1c81676748173b8b57 | 28229e990894531d0aaa3f29fe68682c8b01430a | 2020-10-13T18:56:46Z | python | 2020-10-23T10:05:25Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,506 | ["airflow/api_connexion/openapi/v1.yaml", "tests/api_connexion/endpoints/test_dag_endpoint.py"] | REST API: DAGs schema does not allow "None" for schedule_interval | **Apache Airflow version**: v2.0.0.dev0 (latest master)
**What happened**:
Running Breeze on my local machine with sample DAGs.
**Request made with Insomnia desktop client** (same result using a web application client):
> GET /api/v1/dags?limit=100 HTTP/1.1
> Host: 127.0.0.1:28080
> Authorization: Basic YWRtaW46YWRtaW4=
> User-Agent: insomnia/2020.4.1
> Cookie: session=.eJw1zjkKwzAQQNG7qE6hGS1j-TJmNpEUMUG2q5C7RxDSffjNe4etDz_uYT3H5bewPSysoSJEVWo9UW4sGNFN-iwVMozdXdViEa1VGjp2BUCuzA1JFmNyoCypYMXezBM4gs1ZkJVAeok1ZaSoZiQ1c5RlUbGUihdWljAhLx9P3n0__7Tr8PHjQfh8Ac-rNx4.X4WlQA.kuA0c8M57RIBSTP0iyebJAYkY8M
> Accept: */*
**500 error response:**
```json
{
"detail": "None is not valid under any of the given schemas\n\nFailed validating 'oneOf' in schema['allOf'][0]['properties']['dags']['items']['properties']['schedule_interval']:\n {'description': 'Schedule interval. Defines how often DAG runs, this '\n \"object gets added to your latest task instance's\\n\"\n 'execution_date to figure out the next schedule.\\n',\n 'discriminator': {'propertyName': '__type'},\n 'oneOf': [{'description': 'Time delta',\n 'properties': {'__type': {'type': 'string'},\n 'days': {'type': 'integer'},\n 'microseconds': {'type': 'integer'},\n 'seconds': {'type': 'integer'}},\n 'required': ['__type', 'days', 'seconds', 'microseconds'],\n 'type': 'object',\n 'x-scope': ['',\n '#/components/schemas/DAGCollection',\n '#/components/schemas/DAG',\n '#/components/schemas/ScheduleInterval']},\n {'description': 'Relative delta',\n 'properties': {'__type': {'type': 'string'},\n 'day': {'type': 'integer'},\n 'days': {'type': 'integer'},\n 'hour': {'type': 'integer'},\n 'hours': {'type': 'integer'},\n 'leapdays': {'type': 'integer'},\n 'microsecond': {'type': 'integer'},\n 'microseconds': {'type': 'integer'},\n 'minute': {'type': 'integer'},\n 'minutes': {'type': 'integer'},\n 'month': {'type': 'integer'},\n 'months': {'type': 'integer'},\n 'second': {'type': 'integer'},\n 'seconds': {'type': 'integer'},\n 'year': {'type': 'integer'},\n 'years': {'type': 'integer'}},\n 'required': ['__type',\n 'years',\n 'months',\n 'days',\n 'leapdays',\n 'hours',\n 'minutes',\n 'seconds',\n 'microseconds',\n 'year',\n 'month',\n 'day',\n 'hour',\n 'minute',\n 'second',\n 'microsecond'],\n 'type': 'object',\n 'x-scope': ['',\n '#/components/schemas/DAGCollection',\n '#/components/schemas/DAG',\n '#/components/schemas/ScheduleInterval']},\n {'description': 'Cron expression',\n 'properties': {'__type': {'type': 'string'},\n 'value': {'type': 'string'}},\n 'required': ['__type', 'value'],\n 'type': 'object',\n 'x-scope': ['',\n '#/components/schemas/DAGCollection',\n '#/components/schemas/DAG',\n '#/components/schemas/ScheduleInterval']}],\n 'readOnly': True,\n 'x-scope': ['',\n '#/components/schemas/DAGCollection',\n '#/components/schemas/DAG']}\n\nOn instance['dags'][3]['schedule_interval']:\n None",
"status": 500,
"title": "Response body does not conform to specification",
"type": "https://airflow.readthedocs.io/en/latest/stable-rest-api-ref.html#section/Errors/Unknown"
}
```
**What you expected to happen**:
I expected not to receive an error response, and instead to receive an array of DAG objects.
E.G.
```json
{
"dags": [
{
"dag_id": "example_bash_operator",
"description": null,
"fileloc": "/opt/airflow/airflow/example_dags/example_bash_operator.py",
"is_paused": false,
"is_subdag": false,
"owners": [
"airflow"
],
"root_dag_id": null,
"schedule_interval": {
"__type": "CronExpression",
"value": "0 0 * * *"
},
"tags": [
{
"name": "example"
}
]
},
{
"dag_id": "example_branch_dop_operator_v3",
"description": null,
"fileloc": "/opt/airflow/airflow/example_dags/example_branch_python_dop_operator_3.py",
"is_paused": true,
"is_subdag": false,
"owners": [
"airflow"
],
"root_dag_id": null,
"schedule_interval": {
"__type": "CronExpression",
"value": "*/1 * * * *"
},
"tags": [
{
"name": "example"
}
]
}
],
"total_entries": 26
}
```
**Anything else we need to know**:
Deleting the sample DAGs with a schedule of "None" or reducing the query limit to a smaller value (that omits any DAGs w/ "None" schedule) prevents the error response from occurring.
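A stripped-down illustration of the underlying schema problem, using `jsonschema` directly (the real definition lives in the OpenAPI spec; the schemas here are heavily simplified stand-ins):
```python
from jsonschema import ValidationError, validate

strict_schema = {"oneOf": [{"type": "object"}]}
nullable_schema = {"oneOf": [{"type": "object"}, {"type": "null"}]}

try:
    validate(None, strict_schema)       # what blows up for unscheduled DAGs today
except ValidationError as err:
    print("rejected:", err.message)

validate(None, nullable_schema)         # allowing null avoids the 500 response
print("accepted: None is a valid schedule_interval")
```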
| https://github.com/apache/airflow/issues/11506 | https://github.com/apache/airflow/pull/11532 | 86ed7dcd687627412e93620ec5da7f05e2552ae7 | 7206fd7d542450c6239947ba007a91534f0a57b9 | 2020-10-13T15:19:02Z | python | 2020-10-19T10:03:17Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,500 | ["airflow/www/templates/airflow/dag.html"] | Selecting DAG ID also selects the scheduler interval | Selecting DAG ID also selects the scheduler interval, check the GIF below

| https://github.com/apache/airflow/issues/11500 | https://github.com/apache/airflow/pull/11503 | 760cd1409c5f2116a6b16edab87c134ad27e5702 | f43d8559fec91e473aa4f67ea262325462de0b5f | 2020-10-13T13:40:22Z | python | 2020-10-13T15:05:38Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,493 | ["airflow/migrations/versions/52d53670a240_fix_mssql_exec_date_rendered_task_instance.py"] | MSSQL backend broken when DAG serialization enabled | Version is 1.10.12
Back-end is MSSQL 12
The following SQL query is broken:
result = session.query(cls.rendered_fields).filter(
    cls.dag_id == ti.dag_id,
    cls.task_id == ti.task_id,
    cls.execution_date == ti.execution_date
).one_or_none()
This query fails because the rendered_task_instance_fields table stores the execution date as TIMESTAMP, while the other tables have already had the execution date updated to DATETIME2 for MSSQL.
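The fix boils down to an Alembic migration of roughly this shape (a simplified sketch: the real migration also has to deal with the primary-key constraint on that column):
```python
import sqlalchemy as sa
from alembic import op
from sqlalchemy.dialects import mssql


def upgrade():
    # Align rendered_task_instance_fields.execution_date with the other tables,
    # which already use DATETIME2 on MSSQL. Belongs inside an Alembic revision.
    op.alter_column(
        "rendered_task_instance_fields",
        "execution_date",
        existing_type=sa.TIMESTAMP(),
        type_=mssql.DATETIME2(precision=6),
        existing_nullable=False,
    )
```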
Here's the stack trace:
[2020-10-12 14:23:49,025] {taskinstance.py:1150} ERROR - (pyodbc.DataError) ('22018', '[22018] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Operand type clash: datetime2 is incompatible with timestamp (206) (SQLExecDirectW)')
[SQL: SELECT rendered_task_instance_fields.dag_id AS rendered_task_instance_fields_dag_id, rendered_task_instance_fields.task_id AS rendered_task_instance_fields_task_id, rendered_task_instance_fields.execution_date AS rendered_task_instance_fields_execution_date, rendered_task_instance_fields.rendered_fields AS rendered_task_instance_fields_rendered_fields
FROM rendered_task_instance_fields
WHERE rendered_task_instance_fields.dag_id = ? AND rendered_task_instance_fields.task_id = ? AND rendered_task_instance_fields.execution_date = ?]
[parameters: ('xxx-20200901', 'DATA_FEED_AVAILABLE_CHECK', <Pendulum [2020-10-09T00:00:00+00:00]>)]
(Background on this error at: http://sqlalche.me/e/13/9h9h)
Traceback (most recent call last):
File "/local/scratch/khkaas/conda/envs/airflow-final/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
cursor, statement, parameters, context
File "/local/scratch/khkaas/conda/envs/airflow-final/lib/python3.7/site-packages/sqlalchemy/engine/default.py", line 596, in do_execute
cursor.execute(statement, parameters)
pyodbc.DataError: ('22018', '[22018] [Microsoft][ODBC Driver 17 for SQL Server][SQL Server]Operand type clash: datetime2 is incompatible with timestamp (206) (SQLExecDirectW)')
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/local/scratch/khkaas/conda/envs/airflow-final/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 967, in _run_raw_task
RTIF.write(RTIF(ti=self, render_templates=False), session=session)
File "/local/scratch/khkaas/conda/envs/airflow-final/lib/python3.7/site-packages/airflow/utils/db.py", line 70, in wrapper
return func(*args, **kwargs)
File "/local/scratch/khkaas/conda/envs/airflow-final/lib/python3.7/site-packages/airflow/models/renderedtifields.py", line 94, in write
session.merge(self)
File "/local/scratch/khkaas/conda/envs/airflow-final/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 2155, in merge
_resolve_conflict_map=_resolve_conflict_map,
File "/local/scratch/khkaas/conda/envs/airflow-final/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 2228, in _merge
merged = self.query(mapper.class_).get(key[1])
File "/local/scratch/khkaas/conda/envs/airflow-final/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 1018, in get
return self._get_impl(ident, loading.load_on_pk_identity)
File "/local/scratch/khkaas/conda/envs/airflow-final/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 1135, in _get_impl
return db_load_fn(self, primary_key_identity)
File "/local/scratch/khkaas/conda/envs/airflow-final/lib/python3.7/site-packages/sqlalchemy/orm/loading.py", line 286, in load_on_pk_identity
return q.one()
File "/local/scratch/khkaas/conda/envs/airflow-final/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3463, in one
ret = self.one_or_none()
File "/local/scratch/khkaas/conda/envs/airflow-final/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3432, in one_or_none
ret = list(self)
File "/local/scratch/khkaas/conda/envs/airflow-final/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3508, in __iter__
return self._execute_and_instances(context)
File "/local/scratch/khkaas/conda/envs/airflow-final/lib/python3.7/site-packages/sqlalchemy/orm/query.py", line 3533, in _execute_and_instances
result = conn.execute(querycontext.statement, self._params)
File "/local/scratch/khkaas/conda/envs/airflow-final/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1011, in execute
return meth(self, multiparams, params)
File "/local/scratch/khkaas/conda/envs/airflow-final/lib/python3.7/site-packages/sqlalchemy/sql/elements.py", line 298, in _execute_on_connection
return connection._execute_clauseelement(self, multiparams, params)
File "/local/scratch/khkaas/conda/envs/airflow-final/lib/python3.7/site-packages/sqlalchemy/engine/base.py", line 1130, in _execute_clauseelement
distilled_params, | https://github.com/apache/airflow/issues/11493 | https://github.com/apache/airflow/pull/11512 | 3447b55ba57a06c3820d1f754835e7d7f9a1fc68 | 03a632e0adc320541b5fccf66739ab465b229fcd | 2020-10-13T10:32:27Z | python | 2020-10-15T12:33:30Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,479 | ["provider_packages/refactor_provider_packages.py", "scripts/ci/provider_packages/ci_prepare_provider_readme.sh"] | Elasticsearch Backport Provider Incompatible with Airflow 1.10.12 | **Apache Airflow version**: 1.10.12
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): 1.16.9
**Environment**:
- **Cloud provider or hardware configuration**: AWS
- **OS** (e.g. from /etc/os-release):
- **Kernel** (e.g. `uname -a`):
- **Install tools**: Docker image running in k8s Pods
- **Others**: Rancher-provisioned k8s clusters
**What happened**:
Configuring the latest version of the Elasticsearch backport provider as my log handler via `config/airflow_local_settings.py` resulted in an error on the webserver when trying to read logs from Elasticsearch.
```
[2020-10-12 21:02:00,487] {app.py:1892} ERROR - Exception on /get_logs_with_metadata [GET]
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1952, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1821, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/usr/local/lib/python3.7/site-packages/flask/_compat.py", line 39, in reraise
raise value
File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1950, in full_dispatch_request
rv = self.dispatch_request()
File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1936, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/usr/local/lib/python3.7/site-packages/airflow/www_rbac/decorators.py", line 121, in wrapper
return f(self, *args, **kwargs)
File "/usr/local/lib/python3.7/site-packages/flask_appbuilder/security/decorators.py", line 109, in wraps
return f(self, *args, **kwargs)
File "/usr/local/lib/python3.7/site-packages/airflow/www_rbac/decorators.py", line 56, in wrapper
return f(*args, **kwargs)
File "/usr/local/lib/python3.7/site-packages/airflow/utils/db.py", line 74, in wrapper
return func(*args, **kwargs)
File "/usr/local/lib/python3.7/site-packages/airflow/www_rbac/views.py", line 733, in get_logs_with_metadata
logs, metadata = _get_logs_with_metadata(try_number, metadata)
File "/usr/local/lib/python3.7/site-packages/airflow/www_rbac/views.py", line 724, in _get_logs_with_metadata
logs, metadatas = handler.read(ti, try_number, metadata=metadata)
File "/usr/local/lib/python3.7/site-packages/airflow/utils/log/file_task_handler.py", line 194, in read
logs[i] += log
TypeError: can only concatenate str (not "list") to str
```
Here is the relevant section of my customized `airflow_local_settings.py` file with the updated Elasticsearch handler from the backport provider:
```
...
elif ELASTICSEARCH_HOST:
ELASTICSEARCH_LOG_ID_TEMPLATE: str = conf.get('elasticsearch', 'LOG_ID_TEMPLATE')
ELASTICSEARCH_END_OF_LOG_MARK: str = conf.get('elasticsearch', 'END_OF_LOG_MARK')
ELASTICSEARCH_FRONTEND: str = conf.get('elasticsearch', 'frontend')
ELASTICSEARCH_WRITE_STDOUT: bool = conf.getboolean('elasticsearch', 'WRITE_STDOUT')
ELASTICSEARCH_JSON_FORMAT: bool = conf.getboolean('elasticsearch', 'JSON_FORMAT')
ELASTICSEARCH_JSON_FIELDS: str = conf.get('elasticsearch', 'JSON_FIELDS')
ELASTIC_REMOTE_HANDLERS: Dict[str, Dict[str, Union[str, bool]]] = {
'task': {
'class': 'airflow.providers.elasticsearch.log.es_task_handler.ElasticsearchTaskHandler',
'formatter': 'airflow',
'base_log_folder': str(os.path.expanduser(BASE_LOG_FOLDER)),
'log_id_template': ELASTICSEARCH_LOG_ID_TEMPLATE,
'filename_template': FILENAME_TEMPLATE,
'end_of_log_mark': ELASTICSEARCH_END_OF_LOG_MARK,
'host': ELASTICSEARCH_HOST,
'frontend': ELASTICSEARCH_FRONTEND,
'write_stdout': ELASTICSEARCH_WRITE_STDOUT,
'json_format': ELASTICSEARCH_JSON_FORMAT,
'json_fields': ELASTICSEARCH_JSON_FIELDS
},
}
LOGGING_CONFIG['handlers'].update(ELASTIC_REMOTE_HANDLERS)
...
```
**What you expected to happen**:
Airflow's web UI properly displays the logs from Elasticsearch
**How to reproduce it**:
Configure custom logging via `config/airflow_local_settings.py` to `airflow.providers.elasticsearch.log.es_task_handler.ElasticsearchTaskHandler` and set the `logging_config_class` in `airflow.cfg`
When a task has been run, try to view its logs in the web UI and check the webserver logs to see the error above
| https://github.com/apache/airflow/issues/11479 | https://github.com/apache/airflow/pull/11509 | 8dcc744631d3ae921e439360e69f23112bcbdaf4 | cb070e928b73368898a79a6cd41517e680dc3834 | 2020-10-12T21:05:56Z | python | 2020-11-05T22:20:36Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,464 | ["CONTRIBUTING.rst", "INSTALL", "scripts/ci/pre_commit/pre_commit_check_order_setup.py", "scripts/in_container/run_prepare_provider_packages.sh", "setup.py"] | Make extras install corresponding provider packages. | When you install Aiflow with an extra, the required provider packages should be installed automatically.
For example:
`pip instal apache-airflow[google]` should install apache-airlfow-providers-google package. | https://github.com/apache/airflow/issues/11464 | https://github.com/apache/airflow/pull/11526 | a7272f47f67bb607198731fa69efe3f42f256ce0 | ea27f90d299b9585e3d59c2ce4c98054545b34cc | 2020-10-12T11:28:26Z | python | 2020-11-09T22:01:19Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,463 | ["airflow/config_templates/config.yml", "airflow/config_templates/default_airflow.cfg", "airflow/dag_processing/manager.py", "airflow/jobs/scheduler_job.py", "airflow/models/dagrun.py", "airflow/stats.py", "tests/core/test_stats.py", "tests/dag_processing/test_manager.py", "tests/jobs/test_scheduler_job.py", "tests/models/test_dagrun.py"] | Tags rather than names in variable parts of the metrics | **Description**
It would be great if the metrics we publish in Airflow changed to include tags for the "variable" parts (such as dag_id, task_id, pool name) rather than embedding them in the metric names.
See here: https://airflow.apache.org/docs/1.10.12/metrics.html, https://airflow.readthedocs.io/en/latest/logging-monitoring/metrics.html - those variable parts should be tags.
We might consider changing those metrics by default in 2.0 and possibly introduce a backport in 1.10.13 to allow people to migrate.
**Use case / motivation**
Having the variable parts inside the metric names makes it really difficult to aggregate the metrics. It seems that the statsd Python library does not support tags, but there are possible solutions by installing extensions:
https://stackoverflow.com/questions/49852654/sending-statsd-metrics-with-tags-from-python-client
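For example, with the plain `statsd` client the identifiers end up inside the metric name, while a tag-capable client such as DogStatsd keeps the name stable (a sketch with made-up metric names, not current Airflow behaviour; requires the `statsd` and `datadog` packages):
```python
import statsd
from datadog.dogstatsd import DogStatsd

# Current style: dag_id/task_id are baked into the metric name, so every DAG
# produces a brand new metric series.
plain = statsd.StatsClient("localhost", 8125)
plain.incr("ti_successes.my_dag.my_task")

# Tagged style: one metric name, with the variable parts attached as tags that
# the backend can aggregate or filter on.
tagged = DogStatsd(host="localhost", port=8125)
tagged.increment("ti_successes", tags=["dag_id:my_dag", "task_id:my_task"])
```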
| https://github.com/apache/airflow/issues/11463 | https://github.com/apache/airflow/pull/29093 | 81f07274b9cd9369a1024eb8b0ad5ee6058202f0 | 289ae47f43674ae10b6a9948665a59274826e2a5 | 2020-10-12T10:09:16Z | python | 2023-02-14T21:35:50Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,443 | ["tests/sensors/test_external_task_sensor.py"] | [Quarantine] test_clear_multiple_external_task_marker timing out | It would be great to fix it before 2.0
Example failure here: https://github.com/apache/airflow/runs/1239113557?check_suite_focus=true#step:9:742
| https://github.com/apache/airflow/issues/11443 | https://github.com/apache/airflow/pull/21343 | 1884f2227d1e41d7bb37246ece4da5d871036c1f | 0873ee7e847e67cf045d9fcc3da6f6422b1b7701 | 2020-10-11T21:08:03Z | python | 2022-02-15T21:21:58Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,442 | [".github/workflows/build-images-workflow-run.yml", ".github/workflows/ci.yml", "scripts/ci/tools/ci_free_space_on_ci.sh", "tests/cli/commands/test_jobs_command.py", "tests/jobs/test_scheduler_job.py", "tests/models/test_taskinstance.py", "tests/test_utils/asserts.py"] | [QUARANTINE] TestSchedulerJob.test_retry_handling_job failing | I believe this is just a test failing that was in quarantine @ashb - it fails consistently and the error seems to be release to the HA scheduler changes.
E AttributeError: 'NoneType' object has no attribute 'try_number'
Example failure: https://github.com/apache/airflow/runs/1239113557?check_suite_focus=true#step:9:646
| https://github.com/apache/airflow/issues/11442 | https://github.com/apache/airflow/pull/14792 | 3f61df11e7e81abc0ac4495325ccb55cc1c88af4 | 45cf89ce51b203bdf4a2545c67449b67ac5e94f1 | 2020-10-11T21:04:36Z | python | 2021-03-18T13:01:10Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,441 | [".github/workflows/build-images-workflow-run.yml", ".github/workflows/ci.yml", "scripts/ci/tools/ci_free_space_on_ci.sh", "tests/cli/commands/test_jobs_command.py", "tests/jobs/test_scheduler_job.py", "tests/models/test_taskinstance.py", "tests/test_utils/asserts.py"] | [QUARANTINE] TestSchedulerJob.test_change_state_for_tis_without_dagrun failures | Seems that the TestSchedulerJob.test_change_state_for_tis_without_dagrun is failing often.
Example here: https://github.com/apache/airflow/runs/1239113557?check_suite_focus=true#step:9:444
Would be great to get it resolved before 2.0
| https://github.com/apache/airflow/issues/11441 | https://github.com/apache/airflow/pull/14792 | 3f61df11e7e81abc0ac4495325ccb55cc1c88af4 | 45cf89ce51b203bdf4a2545c67449b67ac5e94f1 | 2020-10-11T21:00:41Z | python | 2021-03-18T13:01:10Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,431 | ["airflow/cli/cli_parser.py", "airflow/cli/commands/provider_command.py", "airflow/provider.yaml.schema.json", "airflow/providers/google/provider.yaml", "airflow/providers/qubole/provider.yaml", "airflow/providers_manager.py", "airflow/serialization/serialized_objects.py", "scripts/in_container/run_install_and_test_provider_packages.sh", "tests/core/test_providers_manager.py"] | Implement Extra Links support for Provider packages. | Extra links supported for operators coming from third-party packages:
airflow.serialization.serialized_objects.BUILTIN_OPERATOR_EXTRA_LINKS
| https://github.com/apache/airflow/issues/11431 | https://github.com/apache/airflow/pull/12472 | 6150e265a01f3fb8bb498bf3933a2c36a9774cf6 | 1dcd3e13fd0a078fc9440e91b77f6f87aa60dd3b | 2020-10-11T18:26:28Z | python | 2020-12-05T15:24:38Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,423 | ["dev/README_RELEASE_AIRFLOW.md", "dev/README_RELEASE_PROVIDER_PACKAGES.md", "docs/build_docs.py", "docs/exts/airflow_intersphinx.py", "docs/exts/docs_build/code_utils.py", "docs/exts/docs_build/docs_builder.py", "docs/exts/docs_build/github_action_utils.py", "docs/publish_docs.py"] | Separate out documentation building and publishing per provider | https://github.com/apache/airflow/issues/11423 | https://github.com/apache/airflow/pull/12892 | 73843d05f753b9fb8d6d3928e5c5695ad955e15a | e595d35bf4b57865f930938df12a673c3792e35e | 2020-10-11T17:50:21Z | python | 2020-12-09T00:03:22Z |
|
closed | apache/airflow | https://github.com/apache/airflow | 11,394 | ["airflow/settings.py", "airflow/www/app.py", "tests/www/test_app.py"] | SQLAlchemy engine configuration is not passed to FAB based UI | SQLAlchemy engine configuration is not passed to FAB based UI. Faced this issue when running Airflow with MySQL metadata store with wait_timeout = 120. Webserver is failing with Internal Server Error due to "[MySQL Server has gone away](https://docs.sqlalchemy.org/en/13/faq/connections.html#mysql-server-has-gone-away)" error. Settings like `sql_alchemy_pool_recycle` and `sql_alchemy_pool_pre_ping` have no impact on FAB internal communication with MySQL.
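For illustration, these are the kind of engine arguments that never reach the FAB-managed session (a sketch with a placeholder connection string):
```python
from sqlalchemy import create_engine

# What sql_alchemy_pool_recycle / sql_alchemy_pool_pre_ping translate into for
# Airflow's own session; the FAB UI session never receives these options.
engine = create_engine(
    "mysql+mysqldb://user:pass@mysql/airflow",  # placeholder connection string
    pool_recycle=60,        # recycle connections well below wait_timeout=120
    pool_pre_ping=True,     # probe connections before use to avoid "server has gone away"
)
```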
**Apache Airflow version**: 1.10.12 | https://github.com/apache/airflow/issues/11394 | https://github.com/apache/airflow/pull/11395 | 0823d46a7f267f2e45195a175021825367938add | 91484b938f0b6f943404f1aeb3e63b61b808cfe9 | 2020-10-10T10:20:33Z | python | 2020-10-16T17:55:41Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,393 | ["airflow/models/baseoperator.py", "airflow/models/dagrun.py", "airflow/operators/dummy_operator.py", "airflow/serialization/schema.json", "airflow/serialization/serialized_objects.py", "tests/dags/test_only_dummy_tasks.py", "tests/serialization/test_dag_serialization.py"] | Optimize subclasses of DummyOperator be set straight to success instead of being run | **Apache Airflow version**:
v2.0.0
**What happened**:
Custom operators inheriting from DummyOperator go into a scheduled state, whereas DummyOperator itself is set straight to success.
**What you expected to happen**:
All operators should be successfully executed.
**How to reproduce it**:
```py
from datetime import datetime

from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator


class MyJob(DummyOperator):
    template_fields_renderers = {
        "body": "json"
    }
    template_fields = ("body",)

    def __init__(self, body, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.body = body


with DAG(
    dag_id="test",
    schedule_interval=None,
    start_date=datetime(2020, 8, 13),
) as dag:
    MyJob(task_id="aaaa", body={"aaa": "bbb"})
```
**Anything else we need to know**:
This is a known problem that we will fix, see: https://github.com/apache/airflow/pull/10956#discussion_r502492775 | https://github.com/apache/airflow/issues/11393 | https://github.com/apache/airflow/pull/12745 | dab783fcdcd6e18ee4d46c6daad0d43a0b075ada | 101da213c527d443fa03e80769c0dff68f0a9aeb | 2020-10-10T10:16:36Z | python | 2020-12-02T12:49:17Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,370 | ["airflow/providers/google/cloud/transfers/mssql_to_gcs.py", "tests/providers/google/cloud/transfers/test_mssql_to_gcs.py"] | MSSQLToGCSOperator fails: datetime is not JSON Serializable | **Apache Airflow version**: 1.10.12
**Environment**: Centos 7 Host
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release): Ubuntu 20.04 Docker Container
- **Kernel** (e.g. `uname -a`): Linux c6c6e8230c17 3.10.0-1127.18.2.el7.x86_64 #1 SMP Sun Jul 26 15:27:06 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux
- **Install tools**: Docker Compose
- **Others** ?
**What happened**:
MSSQLToGCSOperator fails due to a datetime JSON serialization issue. Converting the datetime values to strings removes the problem.
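The workaround amounts to converting non-serializable values before `json.dumps` is called (a standalone sketch of the idea, not the operator's actual code; the row data is made up):
```python
import datetime
import decimal
import json


def convert_value(value):
    """Hypothetical converter: make MSSQL values JSON-serializable."""
    if isinstance(value, (datetime.datetime, datetime.date, datetime.time)):
        return value.isoformat()
    if isinstance(value, decimal.Decimal):
        return float(value)
    return value


row = {"customer_id": 1, "created_at": datetime.datetime(2020, 10, 9, 12, 30)}
print(json.dumps({k: convert_value(v) for k, v in row.items()}))
```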
**What you expected to happen**:
Data transfer without error.
**How to reproduce it**:
Ensure `dbo.Customers` table has a **timestamp**/**datetime** column:
```
export_customers = MsSqlToGoogleCloudStorageOperator(
    task_id='export_customers',
    sql='SELECT * FROM dbo.Customers;',
    bucket='mssql-export',
    filename='data/customers/export.json',
    schema_filename='schemas/export.json',
    mssql_conn_id='mssql_default',
    google_cloud_storage_conn_id='google_cloud_default',
    dag=dag
)
```
**Solution**:
A solution has already been submitted here:
https://gist.github.com/Tomme/af6908e10ed969039d83e3bde2443648#gistcomment-3242915 | https://github.com/apache/airflow/issues/11370 | https://github.com/apache/airflow/pull/22882 | 3f63e9d685c14e19ae3e089cee1fd858122f6109 | 03e1c9b1521fea46ad3c7e15690810e4548f52c9 | 2020-10-09T05:33:48Z | python | 2022-04-11T06:36:44Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,301 | ["airflow/cli/commands/celery_command.py", "setup.py", "tests/cli/commands/test_celery_command.py", "tests/executors/test_celery_executor.py"] | Celery 5.0 | Hello,,
The [new Celery 5.0](https://docs.celeryproject.org/en/stable/whatsnew-5.0.html) has been released and we may consider migrating to it. I don't know whether it has to happen now or whether we should wait for version 5.1. I am creating this ticket to track this version.
To process this issue, we should consider the consequences of this update, update the CLI calls, update API calls, and test these versions thoroughly.
@auvipy Can you tell a little more about this release? Is this release stable enough to update our project to these versions? What consequences could this update have?
| https://github.com/apache/airflow/issues/11301 | https://github.com/apache/airflow/pull/17397 | 41632e03b8caf71de308414c48e9cb211a083761 | a2fd67dc5e52b54def97ea9bb61c8ba3557179c6 | 2020-10-06T11:30:05Z | python | 2021-08-27T12:49:51Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,286 | ["chart/files/pod-template-file.yaml"] | Configs under `config` in values.yaml aren't applying to worker pods | **Apache Airflow version**:
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
```
Client Version: version.Info{Major:"1", Minor:"19", GitVersion:"v1.19.2", GitCommit:"f5743093fd1c663cb0cbc89748f730662345d44d", GitTreeState:"clean", BuildDate:"2020-09-16T21:51:49Z", GoVersion:"go1.15.2", Compiler:"gc", Platform:"darwin/amd64"}
Server Version: version.Info{Major:"1", Minor:"18", GitVersion:"v1.18.0", GitCommit:"9e991415386e4cf155a24b1da15becaa390438d8", GitTreeState:"clean", BuildDate:"2020-03-25T14:50:46Z", GoVersion:"go1.13.8", Compiler:"gc", Platform:"linux/amd64"}
```
**Environment**:
- **Cloud provider or hardware configuration**: N/A
- **OS** (e.g. from /etc/os-release): minikube
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:
**What happened**:
The settings under the `config:` section aren't applied to the worker pods. For example, I have the following values set up.
```
config:
  core:
    dags_folder: '{{ include "airflow_dags" . }}'
    load_examples: "True"
    colored_console_log: "False"
    executor: "{{ .Values.executor }}"
    remote_log_conn_id: "s3_conn"
    remote_logging: "True"
    remote_base_log_folder: "s3://prakshal-test-bucket/"
```
The worker pods don't recognize the remote logging values. I have to either pass them again under ENV or add them to the docker image being used for the workers.
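The ENV workaround amounts to setting the standard `AIRFLOW__<SECTION>__<KEY>` environment variables on the worker pods, e.g. (a sketch reusing the values from above):
```python
# Equivalent environment variables passed to the worker pods as a workaround
# (connection id and bucket copied from the values.yaml snippet above).
worker_extra_env = {
    "AIRFLOW__CORE__REMOTE_LOGGING": "True",
    "AIRFLOW__CORE__REMOTE_LOG_CONN_ID": "s3_conn",
    "AIRFLOW__CORE__REMOTE_BASE_LOG_FOLDER": "s3://prakshal-test-bucket/",
}
```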
CC: @dimberman
| https://github.com/apache/airflow/issues/11286 | https://github.com/apache/airflow/pull/11480 | 52b4733b8297c8a08210aead18c661a9d58f3f6c | 3970bfad4c878d99adce80a4bfd824a15132a161 | 2020-10-05T16:55:58Z | python | 2020-10-19T21:22:19Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,269 | ["CONTRIBUTING.rst"] | Incorrect Redirect in documentation | This issues is related to a documentation issue in Contributing.rst. The static checks link is redirected wrongly to an error page.
Correction of this will help new users and Airflow enthusiasts to have a coherent and error free reading experience in the future. | https://github.com/apache/airflow/issues/11269 | https://github.com/apache/airflow/pull/11271 | a4478f5665688afb3e112357f55b90f9838f83ab | 1180f1ba4284cfa167b8e4dcb21f90f251812d68 | 2020-10-05T02:24:17Z | python | 2020-10-05T07:08:54Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,231 | ["airflow/providers/google/cloud/hooks/bigquery.py", "airflow/providers/google/cloud/operators/bigquery.py", "airflow/providers/google/cloud/transfers/gcs_to_bigquery.py"] | docs: update BigQuery clustering field docstrings | BigQuery supports clustering for both partitioned and non-partitioned tables.
Reference: [https://cloud.google.com/bigquery/docs/clustered-tables](https://cloud.google.com/bigquery/docs/clustered-tables) | https://github.com/apache/airflow/issues/11231 | https://github.com/apache/airflow/pull/11232 | 49c58147fed8a52869d0b0ecc00c102c11972ad0 | 46a121fb7b77c0964e053b58750e2d8bc2bd0b2a | 2020-10-02T08:13:41Z | python | 2020-10-18T16:13:28Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,175 | ["docs/img/airflow.gif"] | Update the GIF on the documentation main page | Hello,
The GIF on the main page of the documentation is slightly outdated. I think it's worth updating.
https://airflow.readthedocs.io/en/latest/

The interface has changed a lot so now this gif doesn't show the current state of the interface.
Best regards,
Kamil Breguła | https://github.com/apache/airflow/issues/11175 | https://github.com/apache/airflow/pull/12044 | 644791989eed2fe15dff6456cecc85e09068f4dd | e4c86a51a796ac0dfbab489ffa0f2f2cf932471e | 2020-09-27T15:39:02Z | python | 2020-11-02T15:48:42Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,164 | ["BREEZE.rst", "README.md", "breeze-complete", "scripts/ci/libraries/_initialization.sh", "scripts/ci/mysql/conf.d/airflow.cnf", "scripts/docker/install_mysql.sh", "tests/providers/mysql/hooks/test_mysql.py"] | Enable MySQL 8 CI jobs | **Description**
We've never run any CI jobs for MySQL 8 in our CI system. All the jobs are running on 5.7. With the recent refactor, it is as easy as changing CURRENT_MYSQL_VERSIONS to include MySQL 8 and changing Dockerfile.ci to use the MySQL 8 client as well (currently Dockerfile.ci uses the MySQL 5.7 client).
I think this is a hard prerequisite for the HA Scheduler PR (#9630) to be merged. The HA scheduler will only work in single-scheduler mode for MySQL 5.7, and without running the whole test suite for MySQL 8 we might be giving users who are likely to try MySQL 8 a version of Airflow that does not really work well with it.
This has some consequences: our build will run quite a bit longer (we might limit the Python versions we test with, but it is still at least 20% longer). So it would be best if we implement #10507, and the prerequisite for that is #11163.
Otherwise, our PRs will be blocking each other and we will slow down our own development.
Note that implementing it might be as easy as adding the version and changing the client, and it might **just work**, but since we have never run it, it is likely there will be some problems to solve.
**Use case / motivation**
When the HA scheduler gets into the hands of our users as the first alpha release, we have to be quite sure that it also works for MySQL 8, because only with MySQL 8 will users be able to test the HA setup for their MySQL installations.
**Related Issues**
#10507 #11163
| https://github.com/apache/airflow/issues/11164 | https://github.com/apache/airflow/pull/11247 | 1b9e59c31afcf3482aae8c0037ef7f41ff0cf31e | 6dce7a6c26bfaf991aa24482afd7fee1e263d6c9 | 2020-09-26T08:36:40Z | python | 2020-10-04T11:45:05Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,152 | ["dev/README_RELEASE_AIRFLOW.md", "dev/README_RELEASE_PROVIDER_PACKAGES.md", "docs/build_docs.py", "docs/exts/airflow_intersphinx.py", "docs/exts/docs_build/code_utils.py", "docs/exts/docs_build/docs_builder.py", "docs/exts/docs_build/github_action_utils.py", "docs/publish_docs.py"] | Documentation for our many products | Hi.
I have fantastic news. We are growing! 🐈 Together with this, our documentation grows and we have new needs and requirements.
We currently only publish documentation for our app releases.
- https://airflow.apache.org/docs/stable/
But our community is also working on other products that have different release cycles, use cases, docs build tools and needs:
- Production Docker image: https://hub.docker.com/r/apache/airflow
- REST API: https://airflow.readthedocs.io/en/latest/stable-rest-api-ref.html (it has a separate version number, so a version index could be helpful. We can also set the version to the application version)
- Backport/provider packages: https://github.com/apache/airflow/tree/master/backport_packages
- Helm Chart: https://github.com/apache/airflow/tree/master/chart
- [API Clients for Apache Airflow](https://github.com/apache/airflow/issues/10176)
- Go: https://github.com/apache/airflow-client-go
- Python: https://github.com/apache/airflow-client-python
- Java: https://github.com/apache/airflow-client-java
- Javascript: https://github.com/apache/airflow-client-javascript
- Apache Airflow on K8S Operator: https://github.com/apache/airflow-on-k8s-operator
and other: https://github.com/apache/airflow/issues/10550, https://github.com/apache/airflow/issues/10552
I think it is worth considering how we want to inform users about these products on our website.
One solution is to create an index at: `https://airflow.apache.org/docs/` (currently there is a redirection to the latest stable version of the documentation), which will refer you to separate documentation for each product.
It will look similar to the examples:
https://mc-stan.org/
https://cloud.google.com/sql/docs/
https://www.terraform.io/docs/
An alternative solution is to move all the documentation into our app documentation:
https://github.com/apache/airflow/pull/11136
https://github.com/apache/airflow/pull/10998
What is our plan for this documentation? Do we need to standardize our documentation building tools? What requirements does each documentation have, e.g. do we always need versioning?
CC: @kaxil @potiuk
| https://github.com/apache/airflow/issues/11152 | https://github.com/apache/airflow/pull/12892 | 73843d05f753b9fb8d6d3928e5c5695ad955e15a | e595d35bf4b57865f930938df12a673c3792e35e | 2020-09-25T17:34:39Z | python | 2020-12-09T00:03:22Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,146 | ["airflow/cli/cli_parser.py", "airflow/cli/commands/kubernetes_command.py", "chart/templates/cleanup/cleanup-cronjob.yaml", "tests/cli/commands/test_kubernetes_command.py"] | Cleanup CronJob fails | **Apache Airflow version**: 1.10.12
**Kubernetes version** : 1.17.9
**Environment**:
- **Cloud provider or hardware configuration**: Azure Kubernetes Service
- **OS** (e.g. from /etc/os-release): Linux
- **Install tools**: Helm chart (/chart in repo)
- **Others**:
**What happened**:
Pod cannot run since `airflow-cleanup-pods` is not in $PATH
```
Error: failed to start container "airflow-cleanup-pods": Error response from daemon: OCI runtime create failed: container_linux.go:349: starting container process caused "exec: \"airflow-cleanup-pods\": executable file not found in $PATH": unknown
```
**What you expected to happen**:
Cleanup job pod runs
**How to reproduce it**:
* Deploy Helm Chart
* Manually trigger `airflow-cleanup ` CronJob
* See it cannot run since `airflow-cleanup-pods` is not in $PATH | https://github.com/apache/airflow/issues/11146 | https://github.com/apache/airflow/pull/11802 | 2ebe623312e16e47aa4fece2a216700fbfb2d168 | 980c7252c0f28c251e9f87d736cd88d6027f3da3 | 2020-09-25T09:59:17Z | python | 2020-11-03T15:28:51Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,145 | ["chart/dockerfiles/pgbouncer-exporter/build_and_push.sh"] | airflow-pgbouncer-exporter image : standard_init_linux.go:211: exec user process caused "exec format error" | <!--
-->
**Apache Airflow version**: I'm using the helm chart that is in master branch
**What happened**:
I'm having trouble with the airflow helm chart: when pgbouncer is deployed (`pgbouncer.enabled=true`), the pgbouncer deployment fails because one of the containers in the pod is continuously crashing.
I've narrowed down the problem to:
docker run -it --rm apache/airflow:airflow-pgbouncer-exporter-2020.09.05-0.5.0
standard_init_linux.go:211: exec user process caused "exec format error"
**What you expected to happen**:
It should run the `pgbouncer_exporter` server.
**How to reproduce it**:
docker run -it --rm apache/airflow:airflow-pgbouncer-exporter-2020.09.05-0.5.0
standard_init_linux.go:211: exec user process caused "exec format error"
**Anything else we need to know**:
| https://github.com/apache/airflow/issues/11145 | https://github.com/apache/airflow/pull/11148 | edf803374dc603d6b8ea960c9c56d66797d06ba9 | 33fe9a52cd6fff092f4de977415666d7f3c64d6e | 2020-09-25T09:20:41Z | python | 2020-09-25T15:26:44Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,128 | ["CONTRIBUTING.rst", "airflow/providers/amazon/aws/transfers/mysql_to_s3.py", "airflow/providers/amazon/aws/transfers/sql_to_s3.py", "airflow/providers/amazon/provider.yaml", "airflow/providers/dependencies.json", "dev/provider_packages/prepare_provider_packages.py", "tests/providers/amazon/aws/transfers/test_mysql_to_s3.py", "tests/providers/amazon/aws/transfers/test_sql_to_s3.py"] | SqlToS3Operator | Recently, all the Sql-related (mysql, postgres...) operators were merged into one. Applying the same, would be great to have a single SqlToS3Operator, so that you don't have to use MySqlToS3Operator, RedshiftToS3Operator, SnowflakeToS3Operator...
| https://github.com/apache/airflow/issues/11128 | https://github.com/apache/airflow/pull/20807 | 515ea84335fc440fe022db2a0e3b158e0d7702da | bad070f7f484a9b4065a0d86195a1e8002d9bfef | 2020-09-24T16:35:03Z | python | 2022-01-24T00:03:33Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,114 | [".github/workflows/ci.yml"] | Unexpected super long period CI tasks. | <!--
-->
**Description**
**Use case / motivation**
Hi, Airflow team
This issue is not related to an Airflow feature, but to GitHub Actions usage control.
I am talking with the GitHub Actions team because I noticed that, sometimes, the ASF GitHub Actions setup has very limited slots for Apache SkyWalking.
GitHub did some research, and at least found one task from the Airflow project,
https://github.com/apache/airflow/actions/runs/259069303
This task ran for over 6 hours until the GitHub Actions platform terminated it. Could you help by rechecking why it ran for so long?
**Related Issues**
| https://github.com/apache/airflow/issues/11114 | https://github.com/apache/airflow/pull/11117 | bcdd3bb7bb0e73ec957fa4077b025eb5c1fef90d | e252a6064fc0a33e12894b1cdaa414e178226af4 | 2020-09-24T00:53:51Z | python | 2020-09-24T08:46:43Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,086 | ["airflow/jobs/local_task_job.py", "airflow/models/taskinstance.py", "tests/jobs/test_local_task_job.py", "tests/models/test_taskinstance.py"] | on_failure_callback not called when task receives termination signal | **Apache Airflow version**: 1.10.7, 1.10.10, 1.10.12
**Environment**:
- **Cloud provider or hardware configuration**: AWS EC2
- **OS** (e.g. from /etc/os-release): Linux
- **Kernel** (e.g. `uname -a`): Debian
- **Install tools**:
- **Others**:
**What happened**:
For the last several versions of Airflow, we've noticed that when a task receives a `SIGTERM` signal (currently represented as `Task exited with return code Negsignal.SIGKILL`, though previously represented as `Task exited with return code -9`), the failure email would be sent, but the `on_failure_callback` would not be called.
This happened fairly frequently in the past for us as we had tasks that would consume high amounts of memory and occasionally we would have too many running on the same worker and the tasks would be OOM killed. In these instances, we would receive failure emails with the contents `detected as zombie` and the `on_failure_callback` would **not** be called. We were hoping #7025 would resolve this with the most recent upgrade (and we've also taken steps to reduce our memory footprint), but we just had this happen again recently.
**What you expected to happen**:
If a task fails (even if the cause of the failure is a lack of resources), I would hope the `on_failure_callback` would still be called.
**How to reproduce it**:
Example DAG setup:
<details><summary>CODE</summary>
```python
# -*- coding: utf-8 -*-
"""
# POC: On Failure Callback for SIGKILL
"""
from datetime import datetime
import numpy as np
from airflow import DAG
from airflow.api.common.experimental.trigger_dag import trigger_dag
from airflow.operators.python_operator import PythonOperator
def on_failure_callback(**context):
print("===IN ON FAILURE CALLBACK===")
print("Triggering another run of the task")
trigger_dag("OOM_test_follower")
def high_memory_task():
l = []
iteration = 0
while True:
print(f"Iteration: {iteration}")
l.append(np.random.randint(1_000_000, size=(1000, 1000, 100)))
iteration += 1
def failure_task():
raise ValueError("whoops")
def print_context(**context):
print("This DAG was launched by the failure callback")
print(context)
dag = DAG(
dag_id="OOM_test",
schedule_interval=None,
catchup=False,
default_args={
"owner": "madison.bowden",
"start_date": datetime(year=2019, month=7, day=1),
"email": "your-email",
},
)
with dag:
PythonOperator(
task_id="oom_task",
python_callable=high_memory_task,
on_failure_callback=on_failure_callback,
)
failure_dag = DAG(
dag_id="Failure_test",
schedule_interval=None,
catchup=False,
default_args={
"owner": "madison.bowden",
"start_date": datetime(year=2019, month=7, day=1),
"email": "your-email",
},
)
with failure_dag:
PythonOperator(
task_id="failure_task",
python_callable=failure_task,
on_failure_callback=on_failure_callback,
)
dag_follower = DAG(
dag_id="OOM_test_follower",
schedule_interval=None,
catchup=False,
default_args={
"owner": "madison.bowden",
"start_date": datetime(year=2019, month=7, day=1),
"email": "your-email",
},
)
with dag_follower:
PythonOperator(
task_id="oom_task_failure", python_callable=print_context, provide_context=True
)
```
</details>
With the above example, the `Failure_test` should trigger a run of the `OOM_test_follower` DAG when it fails. The `OOM_test` DAG when triggered should quickly run out of memory and then **not** trigger a run of the `OOM_test_follower` DAG.
**Anything else we need to know**:
| https://github.com/apache/airflow/issues/11086 | https://github.com/apache/airflow/pull/15537 | 13faa6912f7cd927737a1dc15630d3bbaf2f5d4d | 817b599234dca050438ee04bc6944d32bc032694 | 2020-09-22T18:08:55Z | python | 2021-05-05T20:02:25Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,076 | ["airflow/www/views.py", "tests/www/test_views.py"] | Add bulk "clear" to Browse -> DAG Runs UI | **Description**
Airflow DAG Run UI (Browse -> DAG Runs) is really useful for managing many DAG Runs, creating specific DAG Runs and so on.
It would be great to have an additional option to "With Selected DAG Runs" -> "Clear" to reset all task instances under those DAG Runs.
**Use case / motivation**
When rerunning DAGs especially during development cycles it is tedious to go into the Tree view and clear many DAG Runs manually. It would be great to have some way to do this in bulk / over a range of DAG Runs.
The DAG Run UI seems like a good place to add this since it is already a kind of administrative area over all the DAG Runs and has "with selected" options like "delete", "mark failed", etc. Discussion welcome :)
**Related Issues**
None
| https://github.com/apache/airflow/issues/11076 | https://github.com/apache/airflow/pull/11226 | 0a0e1af80038ef89974c3c8444461fe867945daa | e4125666b580f5fa14adf87eb4146eefc8446d88 | 2020-09-22T08:13:21Z | python | 2020-10-03T09:30:08Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,074 | ["airflow/api_connexion/endpoints/dag_run_endpoint.py", "tests/api_connexion/endpoints/test_dag_run_endpoint.py"] | Stable API dag run query should count total results after filtering | Context:
When querying dag runs in the stable API, it is possible to filter by date range and to receive paginated results. Pagination returns a subset of results, but a count of all available results is returned.
Expected behavior:
The total count should take place after filtering by date, and before paginating results.
Actual behavior:
The count is made prior to filtering by date. This means that the total number of accessible dags is returned, rather than the number of dags the user filtered for.
https://github.com/apache/airflow/pull/10594/files/1fe1da978763c9919cee0d2c7eccd96438631993#diff-3239f89a092764597348cabc5e74c2f5R125
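For clarity, the intended order of operations is roughly the following (a hedged sketch with assumed parameter names, not the actual endpoint code):

```python
from airflow.models import DagRun

def list_dag_runs(session, dag_id, start_date_gte=None, end_date_lte=None, offset=0, limit=100):
    """Illustrative only: filter first, then count, then paginate."""
    query = session.query(DagRun).filter(DagRun.dag_id == dag_id)
    if start_date_gte:
        query = query.filter(DagRun.start_date >= start_date_gte)
    if end_date_lte:
        query = query.filter(DagRun.end_date <= end_date_lte)
    total_entries = query.count()                       # count only the filtered rows
    dag_runs = query.offset(offset).limit(limit).all()  # paginate afterwards
    return dag_runs, total_entries
```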
I will make this change after #10594 is merged. | https://github.com/apache/airflow/issues/11074 | https://github.com/apache/airflow/pull/11832 | 872b1566a11cb73297e657ff325161721b296574 | c62a49af0793247a614c6f25da59908ab5ab5e30 | 2020-09-22T03:06:45Z | python | 2020-10-25T21:35:30Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,073 | ["tests/api_connexion/endpoints/test_xcom_endpoint.py"] | Test that the xcom API endpoint handles joins properly | The XCom endpoint joins the XCom and DagRun tables. Currently, the output of the join isn't getting stored as a new `query` variable, as seen in the link to the following PR. We should add a new test to check that these tables are getting joined correctly.
https://github.com/apache/airflow/pull/10594/files/1fe1da978763c9919cee0d2c7eccd96438631993#diff-aa0c2f68c6177fb1587a6de6b5cfb28cR59
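In essence, the behaviour the new test should cover is the difference between the two patterns below (a hedged sketch; the real endpoint uses its own aliases and join condition):

```python
from airflow.models import DagRun, XCom

def get_xcom_entries(session, dag_id):
    """Illustrative only: a SQLAlchemy join must be assigned back to take effect."""
    query = session.query(XCom).filter(XCom.dag_id == dag_id)

    # Buggy pattern: .join() returns a *new* query object, so this line has no effect.
    query.join(DagRun, DagRun.execution_date == XCom.execution_date)

    # Correct pattern: keep the joined query.
    query = query.join(DagRun, DagRun.execution_date == XCom.execution_date)
    return query.all()
```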
I will make these changes after #10594 | https://github.com/apache/airflow/issues/11073 | https://github.com/apache/airflow/pull/11859 | 088b98e71fc512c8f19dbe71a82f2b48100e718c | 577a41c203abfb41fb1157163cbac19102f764f5 | 2020-09-22T02:58:40Z | python | 2020-11-02T20:53:48Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,054 | ["airflow/example_dags/example_params_trigger_ui.py", "airflow/example_dags/example_params_ui_tutorial.py", "airflow/www/static/js/main.js", "airflow/www/static/js/trigger.js", "airflow/www/templates/airflow/trigger.html", "airflow/www/views.py", "docs/apache-airflow/core-concepts/params.rst", "docs/apache-airflow/img/trigger-dag-tutorial-form.png", "tests/www/views/test_views_trigger_dag.py"] | Improvement: Replace 'Configuration JSON' free-text area with key-value form in Trigger DAG UI | **Description**
Replace the Configuration JSON text-area with a form that has individual input fields for each key & value. This will remove the need for the user to provide a json object. See hashicorp vault's key-value secrets UI for an example of what this might look like:

- If present these keys and values should be serialised as json client-side and posted to the trigger-dag API endpoint as the `conf` request-parameter.
- By default no conf request-parameter is sent. Perhaps a checkbox should be used to determine whether conf is sent at all?
- The user may want to add new keys, or remove keys & values that they have previously added.
- [PR-10839](https://github.com/apache/airflow/pull/10839) is pre-populating the form with the DAG's default params. The new form should be pre-populated with the keys & values from `DAG.params`
**Use case / motivation**
Currently when triggering a manual DAG run with user-provided runtime parameters, the user has to provide a correctly formatted JSON object. Filling in a form with keys & values will be a significantly better user experience, and will be less error prone & laborious than handling json in a free-text field.
This builds on the great work already done on enhancing the Trigger-DAG functionality
**Related Issues**
Originating PR: https://github.com/apache/airflow/pull/10839 | https://github.com/apache/airflow/issues/11054 | https://github.com/apache/airflow/pull/27063 | b141fbcb08ff59425769f013173c695298981b9f | 5e470c1fc422a05815a595cdf2544ebc586405cc | 2020-09-21T11:15:57Z | python | 2023-01-30T16:23:57Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,025 | ["airflow/www/templates/airflow/circles.html", "airflow/www/templates/airflow/graph.html", "airflow/www/templates/airflow/tree.html"] | Airflow Logo hides half of the page | Looks like a side-effect of https://github.com/apache/airflow/pull/11018
Airflow Logo almost takes up half of the page on current master:

cc @ryanahamilton
| https://github.com/apache/airflow/issues/11025 | https://github.com/apache/airflow/pull/11028 | 980aa39049b8dfd75de8b438c2f3cb0b407538e9 | d7f166606e3425bb3e1fc380623d428badcd159e | 2020-09-19T01:47:30Z | python | 2020-09-19T13:29:31Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,014 | ["airflow/www/templates/airflow/dag.html"] | RBAC UI: Editing dag runs gives forbidden page when admin | <!--
-->
**Apache Airflow version**: 1.10.12
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): 1.17
**Environment**: Running on Kubernetes
- **Cloud provider or hardware configuration**: Aws EKS on kubernetes
- **OS** (e.g. from /etc/os-release): NA
- **Kernel** (e.g. `uname -a`): NA
- **Install tools**: NA
- **Others**: NA
**What happened**:
When editing a dag run as admin in the RBAC ui I always get a forbidden page.
**What you expected to happen**:
To be able to edit the dag run.
**How to reproduce it**:
Just enable the RBAC UI and edit a dag run.
Example video:
https://youtu.be/nJPor3nUFrk
**Anything else we need to know**:
On the webserver the following log can be found:
```
[2020-09-18 14:42:15,160] {decorators.py:111} WARNING - Access is Denied for: can_edit on: DagModelView
```
| https://github.com/apache/airflow/issues/11014 | https://github.com/apache/airflow/pull/11026 | 9edfcb7ac46917836ec956264da8876e58d92392 | 78c342451c302dc2d2e796d025efedaf57ccb57a | 2020-09-18T14:46:48Z | python | 2020-09-19T08:43:14Z |
closed | apache/airflow | https://github.com/apache/airflow | 11,011 | ["airflow/providers/amazon/CHANGELOG.rst", "airflow/providers/amazon/aws/hooks/datasync.py", "airflow/providers/amazon/aws/operators/datasync.py", "tests/providers/amazon/aws/operators/test_datasync.py"] | AWS DataSync Operator does not cancel task on Exception | **Apache Airflow version**: 1.10.8
**Environment**:
- **Cloud provider or hardware configuration**: 4 VCPU 8GB RAM VM
- **OS** (e.g. from /etc/os-release): RHEL 7.7
- **Kernel** (e.g. `uname -a`): Linux 3.10.0-957.el7.x86_64 #1 SMP Thu Oct 4 20:48:51 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux
- **Install tools**:
- **Others**:
Note that the AWS DataSync Operator is not available in this version, we manually added it via Plugins.
**What happened**:
AWS DataSync service had a problem resulting in the Task Execution being stuck in LAUNCHING for a long period of time.
The DataSync Operator encountered a timeout exception (not an Airflow timeout exception, but one from token expiry of the underlying boto3 service).
This exception caused the operator to terminate, but the Task Execution on AWS was still stuck in LAUNCHING.
Other Airflow Datasync Operator tasks started to pile up in QUEUED status and eventually timed out, also leaving their Task Executions in QUEUED state in AWS, blocked by the LAUNCHING task execution.
**What you expected to happen**:
The DataSync operator should by default cancel a task execution which is in progress - if the operator terminates for any reason.
The AWS DataSync service can only run 1 DataSync task at a time (even when a task uses multiple DataSync agents). So there is a risk to all other DataSync tasks if one task gets stuck, then any tasks submitted in future will not run.
So the operator should catch exceptions from the wait_for_task_execution and cancel the task before re-raising the exception.
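A minimal sketch of that behaviour, assuming the hook exposes `wait_for_task_execution` and `cancel_task_execution` (method names may differ slightly):

```python
def wait_or_cancel(hook, task_execution_arn):
    """Sketch: wait for the DataSync execution, but cancel it if anything goes wrong."""
    try:
        return hook.wait_for_task_execution(task_execution_arn=task_execution_arn)
    except Exception:
        # Cancel the stuck execution so it cannot block later DataSync tasks,
        # then re-raise so the Airflow task is still marked as failed.
        hook.cancel_task_execution(task_execution_arn=task_execution_arn)
        raise
```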
**How to reproduce it**:
Very difficult to reproduce without an AWS account and DataSync appliance, and the uncommon error conditions which cause a task to get irrecoverably stuck.
**Anything else we need to know**:
I authored the DataSync operator and have a working AWS Account to test in. This issue can be assigned to me. | https://github.com/apache/airflow/issues/11011 | https://github.com/apache/airflow/pull/16589 | 962c5f4397045fa34d07539cc95b2805fd5b55e5 | 2543c74c1927b751e7492df81d762e61d2a4d5f6 | 2020-09-18T11:22:59Z | python | 2021-06-24T12:59:27Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,988 | ["airflow/utils/file.py", "docs/concepts.rst", "tests/plugins/test_plugin_ignore.py"] | The Test_find_not_should_ignore_path is flaky | It seems that the `test_find_not_should_ignore_path` test has some dependency on side-effects from other tests.
For example it failed here:
https://github.com/apache/airflow/pull/10983/checks?check_run_id=1127304894#step:6:1043
```
> self.assertEqual(detected_files, should_not_ignore_files)
E AssertionError: Items in the second set but not the first:
E 'test_load.py'
E 'test_load_sub1.py'
```
And re-running it in a clean system works just fine in the same combination of python/db.
**What you expected to happen**:
The test should clean up / restore everything it needs before running, or the test that causes the side effects should be found and fixed.
For now I am moving the test to "heisentests" with an appropriate note.
| https://github.com/apache/airflow/issues/10988 | https://github.com/apache/airflow/pull/11993 | adbf764ade6916b505c3238697bac10f98bfa6eb | 644791989eed2fe15dff6456cecc85e09068f4dd | 2020-09-17T10:12:11Z | python | 2020-11-02T14:35:14Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,974 | ["Dockerfile", "IMAGES.rst", "scripts/ci/libraries/_build_images.sh"] | GPL dependency error when trying to build image wiith breeze (airflow 1.10.2) | **Apache Airflow version**: 1.10.2
- **OS** (e.g. from /etc/os-release): Mac OS X | 10.15.6
**What happened**: I can't build an image using the following breeze command:
./breeze build-image --production-image --install-airflow-version="1.10.2"
Removing intermediate container f0ac3df80323
---> 0ac6851388bc
Step 50/65 : ENV AIRFLOW_INSTALL_VERSION=${AIRFLOW_INSTALL_VERSION}
---> Running in 057c99021846
Removing intermediate container 057c99021846
---> 55e48234a411
Step 51/65 : WORKDIR /opt/airflow
---> Running in 969b54f7df4c
Removing intermediate container 969b54f7df4c
---> 3fb9c6b83054
Step 52/65 : RUN pip install --user "${AIRFLOW_INSTALL_SOURCES}[${AIRFLOW_EXTRAS}]${AIRFLOW_INSTALL_VERSION}" --constraint "${AIRFLOW_CONSTRAINTS_URL}" && if [ -n "${ADDITIONAL_PYTHON_DEPS}" ]; then pip install --user ${ADDITIONAL_PYTHON_DEPS} --constraint "${AIRFLOW_CONSTRAINTS_URL}"; fi && find /root/.local/ -name '*.pyc' -print0 | xargs -0 rm -r && find /root/.local/ -type d -name '__pycache__' -print0 | xargs -0 rm -r
---> Running in 8891a38dcfed
pip install --user 'apache-airflow[async,aws,azure,celery,dask,elasticsearch,gcp,kubernetes,mysql,postgres,redis,slack,ssh,statsd,virtualenv]==1.10.2' --constraint https://raw.githubusercontent.com/apache/airflow/constraints-1.10.2/constraints-3.7.txt
Collecting apache-airflow[async,aws,azure,celery,dask,elasticsearch,gcp,kubernetes,mysql,postgres,redis,slack,ssh,statsd,virtualenv]==1.10.2
Downloading apache-airflow-1.10.2.tar.gz (5.2 MB)
ERROR: Command errored out with exit status 1:
command: /usr/local/bin/python -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-f0xtaqnq/apache-airflow/setup.py'"'"'; __file__='"'"'/tmp/pip-install-f0xtaqnq/apache-airflow/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base /tmp/pip-pip-egg-info-64dcdo1n
cwd: /tmp/pip-install-f0xtaqnq/apache-airflow/
Complete output (11 lines):
/tmp/pip-install-f0xtaqnq/apache-airflow/setup.py:23: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
import imp
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/tmp/pip-install-f0xtaqnq/apache-airflow/setup.py", line 429, in <module>
do_setup()
File "/tmp/pip-install-f0xtaqnq/apache-airflow/setup.py", line 287, in do_setup
verify_gpl_dependency()
File "/tmp/pip-install-f0xtaqnq/apache-airflow/setup.py", line 53, in verify_gpl_dependency
raise RuntimeError("By default one of Airflow's dependencies installs a GPL "
RuntimeError: By default one of Airflow's dependencies installs a GPL dependency (unidecode). To avoid this dependency set SLUGIFY_USES_TEXT_UNIDECODE=yes in your environment when you install or upgrade Airflow. To force installing the GPL version set AIRFLOW_GPL_UNIDECODE
----------------------------------------
ERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.
The command '/bin/bash -o pipefail -e -u -x -c pip install --user "${AIRFLOW_INSTALL_SOURCES}[${AIRFLOW_EXTRAS}]${AIRFLOW_INSTALL_VERSION}" --constraint "${AIRFLOW_CONSTRAINTS_URL}" && if [ -n "${ADDITIONAL_PYTHON_DEPS}" ]; then pip install --user ${ADDITIONAL_PYTHON_DEPS} --constraint "${AIRFLOW_CONSTRAINTS_URL}"; fi && find /root/.local/ -name '*.pyc' -print0 | xargs -0 rm -r && find /root/.local/ -type d -name '__pycache__' -print0 | xargs -0 rm -r' returned a non-zero code: 1
I don't understand this error, I ran the same command with another version of airflow (./breeze build-image --production-image --install-airflow-version="1.10.2") and it worked as well.
I have to use the version 1.10.2 and for me breeze is a good way to build env easily.
Best regards,
| https://github.com/apache/airflow/issues/10974 | https://github.com/apache/airflow/pull/10983 | 6e5cc4c6a2607a782810ee4650482c89d020eec6 | 4a46f4368b948403fee1c360aee802234ce35908 | 2020-09-16T14:41:55Z | python | 2020-09-17T12:25:34Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,954 | ["docs/apache-airflow/howto/set-up-database.rst"] | Serialized Dags with MySQL Latin1 encoded airflow db | **What Happened**
When enabling airflow serialized dags in airflow 1.10.12:
relevant airflow.cfg:
```
store_serialized_dags = True
store_dag_code = True
# You can also update the following default configurations based on your needs
min_serialized_dag_update_interval = 30
min_serialized_dag_fetch_interval = 10
```
with a Mysql Server running 5.7.12, and the airflow database encoded in latin-1 (ugh bad mysql default).
I get core dumps that are filling up my container.
Looking at the serialized DAG:
```
==> scheduler-stderr.log <==
Fatal Python error: Cannot recover from stack overflow.
Current thread 0x00007f6565221400 (most recent call first):
File "/usr/local/lib/python3.6/site-packages/pendulum/pendulum.py", line 129 in __init__
File "/usr/local/lib/python3.6/site-packages/pendulum/pendulum.py", line 219 in instance
File "/usr/local/src/apache-airflow/airflow/serialization/serialized_objects.py", line 205 in _serialize
File "/usr/local/src/apache-airflow/airflow/serialization/serialized_objects.py", line 196 in <dictcomp>
File "/usr/local/src/apache-airflow/airflow/serialization/serialized_objects.py", line 196 in _serialize
File "/usr/local/src/apache-airflow/airflow/serialization/serialized_objects.py", line 163 in serialize_to_json
File "/usr/local/src/apache-airflow/airflow/serialization/serialized_objects.py", line 555 in serialize_dag
File "/usr/local/src/apache-airflow/airflow/serialization/serialized_objects.py", line 201 in _serialize
File "/usr/local/src/apache-airflow/airflow/serialization/serialized_objects.py", line 165 in serialize_to_json
File "/usr/local/src/apache-airflow/airflow/serialization/serialized_objects.py", line 352 in serialize_operator
File "/usr/local/src/apache-airflow/airflow/serialization/serialized_objects.py", line 203 in _serialize
File "/usr/local/src/apache-airflow/airflow/serialization/serialized_objects.py", line 557 in <listcomp>
File "/usr/local/src/apache-airflow/airflow/serialization/serialized_objects.py", line 557 in serialize_dag
File "/usr/local/src/apache-airflow/airflow/serialization/serialized_objects.py", line 201 in _serialize
File "/usr/local/src/apache-airflow/airflow/serialization/serialized_objects.py", line 165 in serialize_to_json
File "/usr/local/src/apache-....
```
```
ile "/usr/local/lib/python3.6/site-packages/MySQLdb/cursors.py", line 387, in _fetch_row
return self._result.fetch_row(size, self._fetch_type)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xf1 in position 2686: invalid continuation byte
called from:
Process DagFileProcessor2121-Process:
Traceback (most recent call last):
File "/usr/local/lib/python3.6/multiprocessing/process.py", line 258, in _bootstrap
self.run()
File "/usr/local/lib/python3.6/multiprocessing/process.py", line 93, in run
self._target(*self._args, **self._kwargs)
File "/usr/local/src/apache-airflow/airflow/jobs/scheduler_job.py", line 159, in _run_file_processor
pickle_dags)
File "/usr/local/src/apache-airflow/airflow/utils/db.py", line 74, in wrapper
return func(*args, **kwargs)
File "/usr/local/src/apache-airflow/airflow/jobs/scheduler_job.py", line 1609, in process_file
dag.sync_to_db()
File "/usr/local/src/apache-airflow/airflow/utils/db.py", line 74, in wrapper
return func(*args, **kwargs)
File "/usr/local/src/apache-airflow/airflow/models/dag.py", line 1552, in sync_to_db
session=session
File "/usr/local/src/apache-airflow/airflow/utils/db.py", line 70, in wrapper
return func(*args, **kwargs)
File "/usr/local/src/apache-airflow/airflow/models/serialized_dag.py", line 120, in write_dag
session.merge(new_serialized_dag)
```
**What you expected to happen**
Should have worked with the airflow db without causing a coredump.
@kaxil | https://github.com/apache/airflow/issues/10954 | https://github.com/apache/airflow/pull/14742 | cdfa4ee8bf752ee14ae84868aa9ad4b6bc618102 | b40beb3036b8221053fdb7ab537a45afccf0bd8e | 2020-09-15T15:21:40Z | python | 2021-03-12T11:37:00Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,935 | ["airflow/www/templates/airflow/dags.html"] | Sensing stats hidden in the Webserver on the Home Page |
**Apache Airflow version**: Master (2.0.0dev)
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release): Macos
- **Kernel** (e.g. `uname -a`): Macbook Pro
**What happened**:
The sensing state is the last circle in recent task which is half-hidden


| https://github.com/apache/airflow/issues/10935 | https://github.com/apache/airflow/pull/10939 | d24b8f69e77466b2f7eb06a747e8f04978e4baa4 | aab95994b82bdde7a06d449471b2eaac4de7330f | 2020-09-14T14:29:30Z | python | 2020-09-14T18:49:57Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,929 | ["airflow/providers/qubole/example_dags/example_qubole.py", "airflow/providers/qubole/operators/qubole.py", "airflow/providers/qubole/operators/qubole_check.py", "airflow/providers/qubole/provider.yaml", "airflow/providers/qubole/sensors/qubole.py", "docs/apache-airflow-providers-qubole/index.rst", "docs/apache-airflow-providers-qubole/operators/index.rst", "docs/apache-airflow-providers-qubole/operators/qubole.rst", "docs/apache-airflow-providers-qubole/operators/qubole_check.rst"] | Add how to documentation for Qubole Operator | We are trying to improve our documentation, and I see a guides that explains how to use these operators is missing. Here are guides for other operators: https://airflow.readthedocs.io/en/latest/howto/operator/index.html
Wouldn't you like to write a guide for this operator? In practice, the guide is an example DAG file + description.
_Originally posted by @mik-laj in https://github.com/apache/airflow/issues/9347#issuecomment-691988305_ | https://github.com/apache/airflow/issues/10929 | https://github.com/apache/airflow/pull/20058 | 42f133c5f63011399eb46ee6f046c401103cf546 | 2fbfbef17b2f3c683a9d9de8ced190a13f06712a | 2020-09-14T11:30:03Z | python | 2021-12-06T21:06:06Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,926 | ["airflow/example_dags/example_xcomargs.py", "airflow/models/baseoperator.py", "airflow/models/taskmixin.py", "airflow/models/xcom_arg.py", "pylintrc"] | Introduce TaskMixin | **Description**
Currently `BaseOperator` and `XComArgs` implements the same logic of `__lshift__`, `__rshift__` and others "chain" operations (`>>`, `<<`). It seems resonable to abstract this logic into `TaskMixin` to introduce some DRYness especially because the new concept of `TaskGroup` will also implement the same methods.
**Use case / motivation**
Limit duplication of logic.
**Related Issues**
https://github.com/apache/airflow/pull/10827#issuecomment-691927501
#10153
| https://github.com/apache/airflow/issues/10926 | https://github.com/apache/airflow/pull/10930 | c9f006b5409b77085cd140e0954501352feb096f | 0779688f448042dc6f1873051545f0317a74dc5a | 2020-09-14T09:55:31Z | python | 2020-09-16T11:56:36Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,921 | ["airflow/providers/databricks/operators/databricks.py", "tests/providers/databricks/operators/test_databricks.py"] | Asynchronous run for DatabricksRunNowOperator | **Description**
Ability to let `DatabricksRunNowOperator` execute asynchronously based on an optional argument `async=False`.
**Use case / motivation**
Sometimes a databricks job would want to perform actions without letting the Dag operator wait for its completion. This would be very useful when the child tasks would periodically want to interact with that running job such as for message queue.
If this is valid, would like to be assigned the same. | https://github.com/apache/airflow/issues/10921 | https://github.com/apache/airflow/pull/20536 | a63753764bce26fd2d13c79fc60df7387b98d424 | 58afc193776a8e811e9a210a18f93dabebc904d4 | 2020-09-14T05:08:47Z | python | 2021-12-28T17:13:17Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,882 | ["airflow/utils/log/logging_mixin.py", "tests/utils/test_logging_mixin.py"] | StreamLogWriter has no close method, clashes with abseil logging (Tensorflow) | **Apache Airflow version**: 1.10.12
**What happened**:
If any Python code run in an operator, has imported `absl.logging` (directly or indirectly), `airflow run` on that task ends with `AttributeError: 'StreamLogWriter' object has no attribute 'close'`. This is not normally seen, as this only happens in the `--raw` inner stage. The task is marked as successful, but the task exited with exit code 1.
The full traceback is:
```python
Traceback (most recent call last):
File "/.../bin/airflow", line 37, in <module>
args.func(args)
File "/.../lib/python3.6/site-packages/airflow/utils/cli.py", line 76, in wrapper
return f(*args, **kwargs)
File "/.../lib/python3.6/site-packages/airflow/bin/cli.py", line 588, in run
logging.shutdown()
File "/.../lib/python3.6/logging/__init__.py", line 1946, in shutdown
h.close()
File "/.../lib/python3.6/site-packages/absl/logging/__init__.py", line 864, in close
self.stream.close()
AttributeError: 'StreamLogWriter' object has no attribute 'close'
Error in atexit._run_exitfuncs:
Traceback (most recent call last):
File "/.../lib/python3.6/logging/__init__.py", line 1946, in shutdown
h.close()
File "/.../lib/python3.6/site-packages/absl/logging/__init__.py", line 864, in close
self.stream.close()
AttributeError: 'StreamLogWriter' object has no attribute 'close'
```
Abseil is Google's utility package, and is used in Tensorflow. The same issue would be seen if you used `import tensorflow`.
**What you expected to happen**:
I expected the task to exit with exit code 0, without issues at exit.
**How to reproduce it**:
- Install abseil: `pip install absl-py`
- Create a test dag with a single operator that only uses `import absl.logging`
- Trigger a dagrun (no scheduler needs to be running)
- execute `airflow run [dagid] [taskid] [execution-date] --raw`
**Anything else we need to know**:
What happens is that `absl.logging` sets up a logger with a custom handler (`absl.logging.ABSLHandler()`, which is a proxy for either `absl.logging.PythonHandler()`), and that proxy will call `.close()` on its `stream`. By default that stream is `sys.stderr`. However, airflow has swapped out `sys.stderr` for a `StreamLogWriter` object when it runs the task under the `airflow.utils.log.logging_mixin.redirect_stderr` context manager. When the context manager exits, the logger is still holding on to the surrogate stderr object.
Normally, the abseil handler would handle such cases, it deliberately won't close a stream that is the same object as `sys.stderr` or `sys.__stderr__` (see [their source code](https://github.com/abseil/abseil-py/blob/06edd9c20592cec39178b94240b5e86f32e19768/absl/logging/__init__.py#L852-L870)). But *at exit time* that's no longer the case. At that time `logging.shutdown()` is called, and that leads to the above exception.
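A no-op `close()` would be enough to satisfy such handlers at shutdown — a minimal sketch, assuming the surrogate stream holds no resources of its own:

```python
class StreamLogWriter:
    """Only the relevant part is sketched; the real class forwards writes to a logger."""

    def close(self):
        """
        Provided so that handlers which call stream.close() at logging shutdown
        (e.g. absl's PythonHandler) do not fail with AttributeError.
        Nothing needs releasing: messages are forwarded to the logger as they are written.
        """
```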
Since `sys.stderr` has a close method, the best fix would be for `StreamLogWriter` to also have one. | https://github.com/apache/airflow/issues/10882 | https://github.com/apache/airflow/pull/10884 | 3ee618623be6079ed177da793b490cb7436d5cb6 | 26ae8e93e8b8075105faec18dc2e6348fa9efc72 | 2020-09-11T14:28:17Z | python | 2020-10-20T08:20:39Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,874 | ["airflow/providers/ssh/operators/ssh.py", "tests/providers/ssh/operators/test_ssh.py"] | SSHHook get_conn() does not re-use client | **Apache Airflow version**: 1.10.8
**Environment**:
- **Cloud provider or hardware configuration**: 4 VCPU 8GB RAM VM
- **OS** (e.g. from /etc/os-release): RHEL 7.7
- **Kernel** (e.g. `uname -a`): `Linux 3.10.0-957.el7.x86_64 #1 SMP Thu Oct 4 20:48:51 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux`
- **Install tools**:
- **Others**:
**What happened**:
Sub-classing the SSHOperator and calling its execute repeatedly will create a new SSH connection each time, to run the command.
Not sure if this is a bug or an enhancement / feature. I can re-log as a feature request if needed.
**What you expected to happen**:
SSH client / connection should be re-used if it was already established.
**How to reproduce it**:
Sub-class the SSHOperator.
In your sub class execute method, call super().execute() a few times.
Observe in the logs how an SSH Connection is created each time.
**Anything else we need to know**:
The SSHHook.get_conn() method creates a new Paramiko SSH client each time. Despite storing the client on self.client before returning, the hook get_conn() method does not actually use the self.client next time. A new connection is therefore created.
I think this is because the SSH Operator uses a context manager to operate on the Paramiko client, so the Hook needs to create a new client if a previous context manager had closed the last one.
Fixing this would mean changing the SSH Operator execute() to not use the ssh_hook.get_conn() as a context manager since this will open and close the session each time. Perhaps the conn can be closed with the operator's post_execute method rather than in the execute.
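For illustration, the kind of reuse being asked for might look roughly like this (a hedged sketch; the attribute and helper names are assumptions, not the actual hook code):

```python
import paramiko

class CachedSSHHook:
    """Illustrative only: reuse a single paramiko client until its transport drops."""

    def __init__(self, build_client):
        # build_client: a callable returning a freshly connected paramiko.SSHClient
        self._build_client = build_client
        self.client = None

    def get_conn(self):
        transport = self.client.get_transport() if self.client else None
        if transport is None or not transport.is_active():
            self.client = self._build_client()  # reconnect only when needed
        return self.client
```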
***Example logs***
```
[2020-09-11 07:04:37,960] {ssh_operator.py:89} INFO - ssh_conn_id is ignored when ssh_hook is provided.
[2020-09-11 07:04:37,960] {logging_mixin.py:112} INFO - [2020-09-11 07:04:37,960] {ssh_hook.py:166} WARNING - Remote Identification Change is not verified. This wont protect against Man-In-The-Middle attacks
[2020-09-11 07:04:37,961] {logging_mixin.py:112} INFO - [2020-09-11 07:04:37,961] {ssh_hook.py:170} WARNING - No Host Key Verification. This wont protect against Man-In-The-Middle attacks
[2020-09-11 07:04:37,976] {logging_mixin.py:112} INFO - [2020-09-11 07:04:37,975] {transport.py:1819} INFO - Connected (version 2.0, client OpenSSH_7.4)
[2020-09-11 07:04:38,161] {logging_mixin.py:112} INFO - [2020-09-11 07:04:38,161] {transport.py:1819} INFO - Auth banner: b'Authorized uses only. All activity may be monitored and reported.\n'
[2020-09-11 07:04:38,161] {logging_mixin.py:112} INFO - [2020-09-11 07:04:38,161] {transport.py:1819} INFO - Authentication (publickey) successful!
[2020-09-11 07:04:38,161] {ssh_operator.py:109} INFO - Running command: [REDACTED COMMAND 1]
...
[2020-09-11 07:04:38,383] {ssh_operator.py:89} INFO - ssh_conn_id is ignored when ssh_hook is provided.
[2020-09-11 07:04:38,383] {logging_mixin.py:112} INFO - [2020-09-11 07:04:38,383] {ssh_hook.py:166} WARNING - Remote Identification Change is not verified. This wont protect against Man-In-The-Middle attacks
[2020-09-11 07:04:38,383] {logging_mixin.py:112} INFO - [2020-09-11 07:04:38,383] {ssh_hook.py:170} WARNING - No Host Key Verification. This wont protect against Man-In-The-Middle attacks
[2020-09-11 07:04:38,399] {logging_mixin.py:112} INFO - [2020-09-11 07:04:38,399] {transport.py:1819} INFO - Connected (version 2.0, client OpenSSH_7.4)
[2020-09-11 07:04:38,545] {logging_mixin.py:112} INFO - [2020-09-11 07:04:38,545] {transport.py:1819} INFO - Auth banner: b'Authorized uses only. All activity may be monitored and reported.\n'
[2020-09-11 07:04:38,546] {logging_mixin.py:112} INFO - [2020-09-11 07:04:38,546] {transport.py:1819} INFO - Authentication (publickey) successful!
[2020-09-11 07:04:38,546] {ssh_operator.py:109} INFO - Running command: [REDACTED COMMAND 2]
....
[2020-09-11 07:04:38,722] {ssh_operator.py:89} INFO - ssh_conn_id is ignored when ssh_hook is provided.
[2020-09-11 07:04:38,722] {logging_mixin.py:112} INFO - [2020-09-11 07:04:38,722] {ssh_hook.py:166} WARNING - Remote Identification Change is not verified. This wont protect against Man-In-The-Middle attacks
[2020-09-11 07:04:38,723] {logging_mixin.py:112} INFO - [2020-09-11 07:04:38,723] {ssh_hook.py:170} WARNING - No Host Key Verification. This wont protect against Man-In-The-Middle attacks
[2020-09-11 07:04:38,734] {logging_mixin.py:112} INFO - [2020-09-11 07:04:38,734] {transport.py:1819} INFO - Connected (version 2.0, client OpenSSH_7.4)
[2020-09-11 07:04:38,867] {logging_mixin.py:112} INFO - [2020-09-11 07:04:38,867] {transport.py:1819} INFO - Auth banner: b'Authorized uses only. All activity may be monitored and reported.\n'
[2020-09-11 07:04:38,868] {logging_mixin.py:112} INFO - [2020-09-11 07:04:38,867] {transport.py:1819} INFO - Authentication (publickey) successful!
[2020-09-11 07:04:38,868] {ssh_operator.py:109} INFO - Running command: [REDACTED COMMAND 3]
``` | https://github.com/apache/airflow/issues/10874 | https://github.com/apache/airflow/pull/17378 | 306d0601246b43a4fcf1f21c6e30a917e6d18c28 | 73fcbb0e4e151c9965fd69ba08de59462bbbe6dc | 2020-09-11T05:31:27Z | python | 2021-10-13T20:14:54Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,856 | ["BREEZE.rst", "Dockerfile", "Dockerfile.ci", "IMAGES.rst", "breeze", "breeze-complete", "docs/production-deployment.rst", "scripts/ci/libraries/_build_images.sh", "scripts/ci/libraries/_initialization.sh"] | Add better "extensions" model for "build image" part | There should be an easy way to add various build time steps and dependencies in the "build image" segment and it should be quite obvious how to do it.
Example of what kind of extensions should be supported is described here: https://github.com/apache/airflow/issues/8605#issuecomment-690065621 | https://github.com/apache/airflow/issues/10856 | https://github.com/apache/airflow/pull/11176 | 17c810ec36a61ca2e285ccf44de27a598cca15f5 | ebd71508627e68f6c35f1aff2d03b4569de80f4b | 2020-09-10T08:30:58Z | python | 2020-09-29T13:30:00Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,816 | ["BREEZE.rst", "breeze", "docs/apache-airflow/concepts.rst"] | Docs around different execution modes for Sensor | **Description**
It would be good to add docs to explain different modes for Sensors:
1. Poke mode
1. Reschedule mode
1. Smart Sensor
and to explain the advantages of one over the other | https://github.com/apache/airflow/issues/10816 | https://github.com/apache/airflow/pull/12803 | e9b2ff57b81b12cfbf559d957a370d497015acc2 | df9493c288f33c8798d9b02331f01b3a285c03a9 | 2020-09-08T22:51:29Z | python | 2020-12-05T20:08:12Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,815 | ["docs/smart-sensor.rst"] | Mark "Smart Sensor" as an early-access feature | **Description**
Based on our discussion during Airflow 2.0 dev call, the consensus was that **Smart Sensors** ([PR](https://github.com/apache/airflow/pull/5499)) will be included in Airflow 2.0 as an **early-access** feature with a clear note that this feature might potentially change in future Airflow version with breaking changes.
Also, make it clear that Airbnb is running it in PROD since ~6-7 months to give confidence to our users
| https://github.com/apache/airflow/issues/10815 | https://github.com/apache/airflow/pull/11499 | f43d8559fec91e473aa4f67ea262325462de0b5f | e3e8fd896bb28c7902fda917d5b5ceda93d6ac0b | 2020-09-08T22:42:53Z | python | 2020-10-13T15:11:32Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,804 | ["airflow/providers/amazon/aws/transfers/gcs_to_s3.py", "tests/providers/amazon/aws/transfers/test_gcs_to_s3.py"] | Add acl_policy into GCSToS3Operator | **Description**
The goal of this feature is to add the `acl_policy` field to the `GCSToS3Operator`.
**Use case / motivation**
The `acl_policy` field has been added to the `S3Hook` but not in the `GCSToS3Operator`
| https://github.com/apache/airflow/issues/10804 | https://github.com/apache/airflow/pull/10829 | 03ff067152ed3202b7d4beb0fe9b371a0ef51058 | dd98b21494ff6036242b63268140abe1294b3657 | 2020-09-08T15:28:23Z | python | 2020-10-06T11:09:01Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,794 | ["airflow/models/taskinstance.py", "airflow/sensors/external_task_sensor.py", "tests/models/test_taskinstance.py", "tests/sensors/test_external_task_sensor.py"] | ExternalTaskMarker don't work with store_serialized_dags | **Apache Airflow version**: 1.10.10
**Kubernetes version (if you are using kubernetes)**: 1.14.10
**Environment**:
- **Cloud provider or hardware configuration**: Any, GCP
- **OS** (e.g. from /etc/os-release): Ubuntu 1.10.10
- **Kernel** (e.g. `uname -a`): Any
- **Install tools**: Any
- **Others**: NA
**What happened**:
DAGs with an ExternalTaskMarker don't clear the external task after the second use of clear on the whole DAG.
**What you expected to happen**:
All external task should be cleaned
**How to reproduce it**:
enable serialization `store_serialized_dags = True`
create example DAGs:
```
default_args = {'owner': 'airflow',
'start_date': datetime(2018, 1, 1)}
def hello_world_py(*args):
print('Hello World')
print('This is DAG is dep}')
schedule = '@daily'
dag_id = 'dep_dag'
with DAG(dag_id=dag_id,
schedule_interval=schedule,
default_args=default_args) as dag:
t1 = PythonOperator(task_id='hello_world',
python_callable=hello_world_py,)
dep_1 = ExternalTaskSensor(task_id='child_task1',
external_dag_id='hello_world_2',
external_task_id='parent_task',
mode='reschedule')
dep_1 >> t1
def create_dag(dag_id, schedule, dag_number, default_args):
dag = DAG(dag_id, schedule_interval=schedule,
default_args=default_args)
with dag:
t1 = PythonOperator(task_id='hello_world',
python_callable=hello_world_py,
dag_number=dag_number)
parent_task = SerializableExternalTaskMarker(task_id='parent_task',
external_dag_id='dep_dag',
external_task_id='child_task1')
t1 >> parent_task
return dag
for n in range(1, 4):
dag_id = 'hello_world_{}'.format(str(n))
default_args = {'owner': 'airflow',
'start_date': datetime(2018, 1, 1)}
schedule = '@daily'
dag_number = n
globals()[dag_id] = create_dag(dag_id, schedule, dag_number, default_args)
```
1. Run both DAGs
2. Wait until the first few dagruns have completed
3. Clear the first dagrun in the DAG with the marker
4. Check that the external DAG was cleared on this date
5. Mark success this date in each DAGs or wait until complete
6. Clean DAG with marker second time on same date
7. Observe that the ExternalTaskMarker doesn't work
**Anything else we need to know**:
I think the ExternalTaskMarker doesn't work because of serialization: after serialization each task instance gets an operator field equal to 'SerializedBaseOperator', so the marker logic doesn't work [here](https://github.com/apache/airflow/blob/master/airflow/models/dag.py#L1072)
To test ExternalTaskMarker with serialization you can use:
<details>
```
from airflow.sensors.external_task_sensor import ExternalTaskMarker
class FakeName(type):
def __new__(metacls, name, bases, namespace, **kw):
name = namespace.get("__name__", name)
return super().__new__(metacls, name, bases, namespace, **kw)
class SerializableExternalTaskMarker(ExternalTaskMarker, metaclass=FakeName):
# The _serialized_fields are lazily loaded when get_serialized_fields() method is called
__serialized_fields = None # type: Optional[FrozenSet[str]]
__name__ = "ExternalTaskMarker"
@classmethod
def get_serialized_fields(cls):
"""Serialized BigQueryOperator contain exactly these fields."""
if not cls.__serialized_fields:
cls.__serialized_fields = frozenset(
ExternalTaskMarker.get_serialized_fields() | {
"recursion_depth", "external_dag_id", "external_taskid", "execution_date"
}
)
return cls.__serialized_fields
```
</details>
@kaxil | https://github.com/apache/airflow/issues/10794 | https://github.com/apache/airflow/pull/10924 | ce19657ec685abff5871df80c8d47f8585eeed99 | f7da7d94b4ac6dc59fb50a4f4abba69776aac798 | 2020-09-08T07:46:31Z | python | 2020-09-15T22:40:41Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,793 | ["chart/files/pod-template-file.kubernetes-helm-yaml"] | Mounting DAGS from an externally populated PVC doesn't work in K8 Executor | **Apache Airflow version**: 1.10.12
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): 1.18
**What happened**: Mounting DAGS from an externally populated PVC doesn't work:
```
--set dags.persistence.enabled=true \
--set dags.persistence.existingClaim=my-volume-claim
--set dags.gitSync.enabled=false
```
Environment variables from a K8s Executor worker:
> │ Environment:
> ││ AIRFLOW_HOME: /opt/airflow ││ AIRFLOW__CORE__DAGS_FOLDER: /opt/airflow/dags/repo/
> ││ AIRFLOW__CORE__DAG_CONCURRENCY: 5 │
> │ AIRFLOW__CORE__EXECUTOR: LocalExecutor │
> │ AIRFLOW__CORE__FERNET_KEY: <set to the key 'fernet-key' in secret 'airflow-fernet-key'> Optional: false │
> │ AIRFLOW__CORE__PARALLELISM: 5 │
> │ AIRFLOW__CORE__SQL_ALCHEMY_CONN: <set to the key 'connection' in secret 'airflow-airflow-metadata'> Optional: false │
> │ AIRFLOW__KUBERNETES__DAGS_VOLUME_SUBPATH: repo/
**What you expected to happen**: Dags mounted in workers from PVC
**How to reproduce it**: Use chart from master and set variables as above
| https://github.com/apache/airflow/issues/10793 | https://github.com/apache/airflow/pull/13686 | 7ec858c4523b24e7a3d6dd1d49e3813e6eee7dff | 8af5a33950cfe59a38931a8a605394ef0cbc3c08 | 2020-09-08T07:39:29Z | python | 2021-01-17T12:53:11Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,792 | ["airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py"] | Allow labels in KubernetesPodOperator to be templated | **Description**
It would be useful to have labels being a templated field in KubernetesPodOperator, in order for example, to be able to identify pods by run_id when there are multiple concurrent dag runs. | https://github.com/apache/airflow/issues/10792 | https://github.com/apache/airflow/pull/10796 | fd682fd70a97a1f937786a1a136f0fa929c8fb80 | b93b6c5be3ab60960f650d0d4ee6c91271ac7909 | 2020-09-08T07:35:49Z | python | 2020-10-05T08:05:00Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,788 | ["airflow/www/extensions/init_views.py", "docs/apache-airflow/plugins.rst", "tests/plugins/test_plugin.py", "tests/plugins/test_plugins_manager.py"] | Airflow Plugins should have support for views without menu | <!--
-->
**Description**
This feature requests a way to support custom views in plugins without any menu entry. Right now, all the views listed in `AirflowPlugin.appbuilder_views` are added to the appbuilder with a menu.
**Use case / motivation**
In a custom plugin I built, I need to add a distinct view for the details of a custom operator. I then use `BaseOperator.operator_extra_links` to link this new UI view from the task links.
However, this view has no need to appear in the Airflow menu; rather, it should be shown in the UI similarly to `views.DagModelView`. That is, the view should be added to the Flask appbuilder with an `appbuilder.add_view_no_menu` call, but right now all the views in `AirflowPlugin.appbuilder_views` are added by calling `appbuilder.add_view`.
**What do you want to happen?**
Maybe if "name" is missing in the dict in `AirflowPlugin.appbuilder_views` list, then when integrating plugin to flask app context, it'll just call:
```python
appbuilder.add_view_no_menu(v["view"])
```
otherwise, keep the default behavior.
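A hedged sketch of what that branching could look like in the integration code (this is an assumption, not the actual Airflow implementation; `appbuilder_views` stands for the plugin entry dicts described above):
```python
# Hypothetical sketch of the proposed behavior, not the real plugins-manager code.
def integrate_plugin_views(appbuilder, appbuilder_views):
    for entry in appbuilder_views:
        if "name" in entry:
            # current behavior: register the view together with a menu item
            appbuilder.add_view(entry["view"], entry["name"], category=entry.get("category", ""))
        else:
            # proposed behavior: register the view without any menu entry
            appbuilder.add_view_no_menu(entry["view"])
```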
| https://github.com/apache/airflow/issues/10788 | https://github.com/apache/airflow/pull/11742 | 6ef23aff802032e85ec42dabda83907bfd812b2c | 429e54c19217f9e78cba2297b3ab25fa098eb819 | 2020-09-08T06:23:57Z | python | 2021-01-04T06:49:26Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,786 | ["airflow/providers/databricks/operators/databricks.py", "docs/apache-airflow-providers-databricks/operators.rst", "tests/providers/databricks/operators/test_databricks.py"] | DatabricksRunNowOperator missing jar_params as a kwarg | <!--
-->
**Description**
DatabricksRunNowOperator is missing the option to take in the keyword argument _jar_params_; it can already take the other ones: notebook_params, python_params and spark_submit_params. https://docs.databricks.com/dev-tools/api/latest/jobs.html#run-now
**Use case / motivation**
Provide parity with the other options
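A hedged sketch of how the proposed kwarg could look from a DAG author's point of view (`jar_params` does not exist on the operator today; the job id and values are placeholders):
```python
from airflow.contrib.operators.databricks_operator import DatabricksRunNowOperator

run_jar_job = DatabricksRunNowOperator(
    task_id="run_jar_job",
    job_id=42,  # placeholder job id
    # proposed kwarg, mirroring notebook_params / python_params / spark_submit_params
    jar_params=["--run-date", "{{ ds }}"],
)
```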
| https://github.com/apache/airflow/issues/10786 | https://github.com/apache/airflow/pull/19443 | 854b70b9048c4bbe97abde2252b3992892a4aab0 | 3a0c4558558689d7498fe2fc171ad9a8e132119e | 2020-09-08T00:22:32Z | python | 2021-11-07T19:10:45Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,656 | ["airflow/providers/ssh/operators/ssh.py", "tests/providers/ssh/operators/test_ssh.py"] | Error in SSHOperator " 'NoneType' object has no attribute 'startswith' " | <!--
-->
**Apache Airflow version**: 1.10.10
**What happened**:
I wrote the following piece of code:
```
from datetime import datetime
from airflow import DAG
from airflow.contrib.operators.ssh_operator import SSHOperator
args = {
'owner': 'airflow',
'start_date': datetime(year=2020, month=7, day=21,
hour=3, minute=0, second=0),
'provide_context': True,
}
dag = DAG(
dag_id='test_ssh_operator',
default_args=args,
schedule_interval='@daily',
)
ssh_command = """
echo 'hello work'
"""
task = SSHOperator(
task_id="check_ssh_perator",
ssh_conn_id='ssh_default',
command=ssh_command,
do_xcom_push=True,
dag=dag,
)
task
```
**What you expected to happen**:
And I got the following error:
`Broken DAG: [/etc/airflow/dags/dag_test_sshoperator.py] 'NoneType' object has no attribute 'startswith'`
**Anything else we need to know**:
I added a new ssh connection in Admin --> Connections, and in the Extra field I put the following JSON:
```
{"key_file":"/root/.ssh/airflow-connector/id_ed25519",
"timeout": "10",
"compress": "false",
"no_host_key_check": "false",
"allow_host_key_change": "false"}
```
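One hedged way to narrow this down could be to build the hook directly and see whether parsing the connection's Extra field raises the same error (this assumes the `ssh_default` connection above is in place; the snippet is a debugging sketch, not part of the failing DAG):
```python
# Hypothetical debugging snippet: instantiate the hook outside of any DAG so the
# AttributeError, if it comes from parsing the connection's Extra field, is easy to trace.
from airflow.contrib.hooks.ssh_hook import SSHHook

hook = SSHHook(ssh_conn_id="ssh_default")
client = hook.get_conn()  # expected to raise the same error if Extra parsing is the culprit
print(client.get_transport())
```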
| https://github.com/apache/airflow/issues/10656 | https://github.com/apache/airflow/pull/11361 | 11eb649d4acdbd3582fb0a77b5f5af3b75e2262c | 27e637fbe3f17737e898774ff151448f4f0aa129 | 2020-08-31T07:49:46Z | python | 2020-10-09T07:35:39Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,646 | ["chart/files/pod-template-file.kubernetes-helm-yaml", "chart/templates/workers/worker-deployment.yaml", "chart/tests/test_pod_template_file.py"] | Kubernetes config dags_volume_subpath breaks PVC in helm chart | **Apache Airflow version**: 1.10.12, master
**Kubernetes version**: v1.17.9-eks-4c6976 (server) / v1.18.6 (client)
**Environment**:
- **Cloud provider or hardware configuration**: EKS
- **OS** (e.g. from /etc/os-release):
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:
**What happened**:
Current logic of setting `dags_volume_subpath` is broken for the following use case:
> Dag loaded from PVC but gitSync disabled.
I am using the chart from apache/airflow master thusly:
```
helm install airflow chart --namespace airflow-dev \
--set dags.persistence.enabled=true \
--set dags.persistence.existingClaim=airflow-dag-pvc \
--set dags.gitSync.enabled=false
```
For the longest time, even with a vanilla install, the workers kept dying. Tailing the logs clued me in that workers were not able to find the dag. I verified from the scheduler that dags were present (it showed up in the UI etc)
Further debugging (looking at the worker pod config) was the main clue ... here is the volume mount:
```yaml
- mountPath: /opt/airflow/dags
name: airflow-dags
readOnly: true
subPath: repo/tests/dags
```
Why/who would add `repo/tests/dags` as a subpath?? 🤦♂️
Finally found the problem logic here:
https://github.com/apache/airflow/blob/9b2efc6dcc298e3df4d1365fe809ea1dc0697b3b/chart/values.yaml#L556
Note the implied connection between `dags.persistence.enabled` and `dags.gitSync`! This looks like some leftover code from when gitsync and external dag went hand in hand.
Ideally, a user should be able to use a PVC _without_ using git sync.
**What you expected to happen**:
I should be able to use a PVC without gitsync logic messing up my mount path
See above.
**How to reproduce it**:
I am kinda surprised that this has not bitten anyone yet. I'd like to think my example is essentially a `hello world` of the helm chart with a dag from a PVC.
**Anything else we need to know**:
This is the patch that worked for me. Seems fairly reasonable -- only muck with dags_volume_subpath if gitSync is enabled.
Even better would be to audit the other code and clearly separate out the different competing use cases:
1. use PVC but no gitsync
2. use PVC with gitsync
3. use gitsync without PVC
```
diff --git a/chart/values.yaml b/chart/values.yaml
index 00832b435..8f6506cd4 100644
--- a/chart/values.yaml
+++ b/chart/values.yaml
@@ -550,10 +550,10 @@ config:
delete_worker_pods: 'True'
run_as_user: '{{ .Values.uid }}'
fs_group: '{{ .Values.gid }}'
dags_volume_claim: '{{- if .Values.dags.persistence.enabled }}{{ include "airflow_dags_volume_claim" . }}{{ end }}'
- dags_volume_subpath: '{{- if .Values.dags.persistence.enabled }}{{.Values.dags.gitSync.dest }}/{{ .Values.dags.gitSync.subPath }}{{ end }}'
+ dags_volume_subpath: '{{- if .Values.dags.gitSync.enabled }}{{.Values.dags.gitSync.dest }}/{{ .Values.dags.gitSync.subPath }}{{ end }}'
git_repo: '{{- if and .Values.dags.gitSync.enabled (not .Values.dags.persistence.enabled) }}{{ .Values.dags.gitSync.repo }}{{ end }}'
git_branch: '{{ .Values.dags.gitSync.branch }}'
git_sync_rev: '{{ .Values.dags.gitSync.rev }}'
```
| https://github.com/apache/airflow/issues/10646 | https://github.com/apache/airflow/pull/15657 | b1bd59440baa839eccdb2770145d0713ade4f82a | 367d64befbf2f61532cf70ab69e32f596e1ed06e | 2020-08-30T06:24:38Z | python | 2021-05-04T18:40:21Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,636 | ["airflow/config_templates/config.yml", "airflow/config_templates/default_airflow.cfg", "airflow/kubernetes/kube_client.py", "docs/spelling_wordlist.txt"] | Kubernetes executors hangs on pod submission | **Apache Airflow version**: 1.10.10, 1.10.11, 1.10.12
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): 1.15.11, 1.17.7
**Environment**:
- **Cloud provider or hardware configuration**: AKS
- **Others**: Python 3.6, Python 3.7, Python 3.8
The Kubernetes executor hangs from time to time on worker pod submission. The pod creation request times out after 15 minutes and the scheduler loop continues. I have recreated the problem in several Kubernetes / Python / Airflow version configurations.
py-spy dump:
```
Process 6: /usr/local/bin/python /usr/local/bin/airflow scheduler
Python v3.7.9 (/usr/local/bin/python3.7)
Thread 6 (idle): "MainThread"
read (ssl.py:929)
recv_into (ssl.py:1071)
readinto (socket.py:589)
_read_status (http/client.py:271)
begin (http/client.py:310)
getresponse (http/client.py:1369)
_make_request (urllib3/connectionpool.py:421)
urlopen (urllib3/connectionpool.py:677)
urlopen (urllib3/poolmanager.py:336)
request_encode_body (urllib3/request.py:171)
request (urllib3/request.py:80)
request (kubernetes/client/rest.py:170)
POST (kubernetes/client/rest.py:278)
request (kubernetes/client/api_client.py:388)
__call_api (kubernetes/client/api_client.py:176)
call_api (kubernetes/client/api_client.py:345)
create_namespaced_pod_with_http_info (kubernetes/client/api/core_v1_api.py:6265)
create_namespaced_pod (kubernetes/client/api/core_v1_api.py:6174)
run_pod_async (airflow/contrib/kubernetes/pod_launcher.py:81)
run_next (airflow/contrib/executors/kubernetes_executor.py:486)
sync (airflow/contrib/executors/kubernetes_executor.py:878)
heartbeat (airflow/executors/base_executor.py:134)
_validate_and_run_task_instances (airflow/jobs/scheduler_job.py:1505)
_execute_helper (airflow/jobs/scheduler_job.py:1443)
_execute (airflow/jobs/scheduler_job.py:1382)
run (airflow/jobs/base_job.py:221)
scheduler (airflow/bin/cli.py:1040)
wrapper (airflow/utils/cli.py:75)
<module> (airflow:37)
```
logs:
```
[2020-08-26 18:26:25,721] {base_executor.py:122} DEBUG - 0 running task instances
[2020-08-26 18:26:25,722] {base_executor.py:123} DEBUG - 1 in queue
[2020-08-26 18:26:25,722] {base_executor.py:124} DEBUG - 32 open slots
[2020-08-26 18:26:25,722] {kubernetes_executor.py:840} INFO - Add task ('ProcessingTask', 'exec_spark_notebook', datetime.datetime(2020, 8, 26, 18, 26, 21, 61159, tzinfo=<
Timezone [UTC]>), 1) with command ['airflow', 'run', 'ProcessingTask', 'exec_spark_notebook', '2020-08-26T18:26:21.061159+00:00', '--local', '--pool', 'default_pool', '-sd
', '/usr/local/airflow/dags/qubole_processing.py'] with executor_config {}
[2020-08-26 18:26:25,723] {base_executor.py:133} DEBUG - Calling the <class 'airflow.contrib.executors.kubernetes_executor.KubernetesExecutor'> sync method
[2020-08-26 18:26:25,723] {kubernetes_executor.py:848} DEBUG - self.running: {('ProcessingTask', 'exec_spark_notebook', datetime.datetime(2020, 8, 26, 18, 26, 21, 61159, t
zinfo=<Timezone [UTC]>), 1): ['airflow', 'run', 'ProcessingTask', 'exec_spark_notebook', '2020-08-26T18:26:21.061159+00:00', '--local', '--pool', 'default_pool', '-sd', '/
usr/local/airflow/dags/qubole_processing.py']}
[2020-08-26 18:26:25,725] {kubernetes_executor.py:471} INFO - Kubernetes job is (('ProcessingTask', 'exec_spark_notebook', datetime.datetime(2020, 8, 26, 18, 26, 21, 61159
, tzinfo=<Timezone [UTC]>), 1), ['airflow', 'run', 'ProcessingTask', 'exec_spark_notebook', '2020-08-26T18:26:21.061159+00:00', '--local', '--pool', 'default_pool', '-sd',
'/usr/local/airflow/dags/qubole_processing.py'], KubernetesExecutorConfig(image=None, image_pull_policy=None, request_memory=None, request_cpu=None, limit_memory=None, limi
t_cpu=None, limit_gpu=None, gcp_service_account_key=None, node_selectors=None, affinity=None, annotations={}, volumes=[], volume_mounts=[], tolerations=None, labels={}))
[2020-08-26 18:26:25,725] {kubernetes_executor.py:474} DEBUG - Kubernetes running for command ['airflow', 'run', 'ProcessingTask', 'exec_spark_notebook', '2020-08-26T18:26
:21.061159+00:00', '--local', '--pool', 'default_pool', '-sd', '/usr/local/airflow/dags/qubole_processing.py']
[2020-08-26 18:26:25,726] {kubernetes_executor.py:475} DEBUG - Kubernetes launching image dpadevairflowacr01.azurecr.io/airflow:1.10.10-20200826-v2
[2020-08-26 18:26:25,729] {pod_launcher.py:79} DEBUG - Pod Creation Request:
{{ POD JSON }}
[2020-08-26 18:26:26,003] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5546)
[2020-08-26 18:26:26,612] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor779-Process, stopped)>
[2020-08-26 18:26:28,148] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5553)
[2020-08-26 18:26:28,628] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor780-Process, stopped)>
[2020-08-26 18:26:31,473] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5560)
[2020-08-26 18:26:32,005] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor781-Process, stopped)>
[2020-08-26 18:26:32,441] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5567)
[2020-08-26 18:26:33,017] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor782-Process, stopped)>
[2020-08-26 18:26:37,501] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5578)
[2020-08-26 18:26:38,044] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor784-Process, stopped)>
[2020-08-26 18:26:38,510] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5585)
[2020-08-26 18:26:39,054] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor785-Process, stopped)>
[2020-08-26 18:26:39,481] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5574)
[2020-08-26 18:26:40,057] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor783-Process, stopped)>
[2020-08-26 18:26:44,549] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5599)
[2020-08-26 18:26:45,108] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor787-Process, stopped)>
[2020-08-26 18:26:45,613] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5606)
[2020-08-26 18:26:46,118] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor788-Process, stopped)>
[2020-08-26 18:26:48,742] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5595)
[2020-08-26 18:26:49,127] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor786-Process, stopped)>
[2020-08-26 18:26:50,596] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5616)
[2020-08-26 18:26:51,151] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor789-Process, stopped)>
[2020-08-26 18:26:51,653] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5623)
[2020-08-26 18:26:52,161] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor790-Process, stopped)>
[2020-08-26 18:26:54,664] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5630)
[2020-08-26 18:26:55,179] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor791-Process, stopped)>
[2020-08-26 18:26:56,645] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5637)
[2020-08-26 18:26:57,194] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor792-Process, stopped)>
[2020-08-26 18:26:57,592] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5644)
[2020-08-26 18:26:58,207] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor793-Process, stopped)>
[2020-08-26 18:27:00,809] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5651)
[2020-08-26 18:27:01,599] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor794-Process, stopped)>
[2020-08-26 18:27:03,124] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5658)
[2020-08-26 18:27:03,615] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor795-Process, stopped)>
[2020-08-26 18:27:04,120] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5665)
[2020-08-26 18:27:04,627] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor796-Process, stopped)>
[2020-08-26 18:27:07,167] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5672)
[2020-08-26 18:27:07,642] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor797-Process, stopped)>
[2020-08-26 18:27:09,125] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5679)
[2020-08-26 18:27:09,654] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor798-Process, stopped)>
[2020-08-26 18:27:10,201] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5686)
[2020-08-26 18:27:10,664] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor799-Process, stopped)>
[2020-08-26 18:27:15,149] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5697)
[2020-08-26 18:27:15,706] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor801-Process, stopped)>
[2020-08-26 18:27:16,128] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5704)
[2020-08-26 18:27:16,717] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor802-Process, stopped)>
[2020-08-26 18:27:18,167] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5693)
[2020-08-26 18:27:18,725] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor800-Process, stopped)>
[2020-08-26 18:27:21,200] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5714)
[2020-08-26 18:27:21,751] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor803-Process, stopped)>
[2020-08-26 18:27:22,221] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5721)
[2020-08-26 18:27:22,760] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor804-Process, stopped)>
[2020-08-26 18:27:24,192] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5728)
[2020-08-26 18:27:24,773] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor805-Process, stopped)>
[2020-08-26 18:27:27,249] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5735)
[2020-08-26 18:27:27,787] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor806-Process, stopped)>
[2020-08-26 18:27:28,246] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5742)
[2020-08-26 18:27:28,798] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor807-Process, stopped)>
[2020-08-26 18:27:30,318] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5749)
[2020-08-26 18:27:30,810] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor808-Process, stopped)>
[2020-08-26 18:27:33,747] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5756)
[2020-08-26 18:27:34,260] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor809-Process, stopped)>
[2020-08-26 18:27:34,670] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5763)
[2020-08-26 18:27:35,271] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor810-Process, stopped)>
[2020-08-26 18:27:36,765] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5770)
[2020-08-26 18:27:37,286] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor811-Process, stopped)>
[2020-08-26 18:27:39,802] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5777)
[2020-08-26 18:27:40,304] {scheduler_job.py:268} DEBUG - Waiting for <Process(DagFileProcessor812-Process, stopped)>
[2020-08-26 18:27:45,820] {settings.py:278} DEBUG - Disposing DB connection pool (PID 5784)
[2020-08-26 18:42:04,209] {kubernetes_executor.py:885} WARNING - HTTPError when attempting to run task, re-queueing. Exception: HTTPSConnectionPool(host='10.2.0.1', port=443
): Read timed out. (read timeout=None)
[2020-08-26 18:42:04,211] {scheduler_job.py:1450} DEBUG - Heartbeating the scheduler
[2020-08-26 18:42:04,352] {base_job.py:200} DEBUG - [heartbeat]
[2020-08-26 18:42:04,353] {scheduler_job.py:1459} DEBUG - Ran scheduling loop in 938.71 seconds
[2020-08-26 18:42:04,353] {scheduler_job.py:1462} DEBUG - Sleeping for 1.00 seconds
[2020-08-26 18:42:05,354] {scheduler_job.py:1425} DEBUG - Starting Loop...
[2020-08-26 18:42:05,355] {scheduler_job.py:1436} DEBUG - Harvesting DAG parsing results
[2020-08-26 18:42:05,355] {dag_processing.py:648} DEBUG - Received message of type SimpleDag
[2020-08-26 18:42:05,355] {dag_processing.py:648} DEBUG - Received message of type DagParsingStat
[2020-08-26 18:42:05,356] {dag_processing.py:648} DEBUG - Received message of type DagParsingStat
```
How can I configure Airflow to emit logs from imported packages? I would like to check the `urllib3` and `http.client` logs in order to understand the problem. The Airflow scheduler logs show only Airflow codebase logs.
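One hedged possibility (an assumption, not a verified answer) is to turn up the standard-library loggers directly, e.g. from a local settings module loaded by the scheduler:
```python
# Hypothetical snippet: surface urllib3 and http.client activity next to the scheduler logs.
import http.client
import logging

logging.basicConfig(level=logging.DEBUG)              # root handler so DEBUG records get printed
logging.getLogger("urllib3").setLevel(logging.DEBUG)  # urllib3 uses standard logging
http.client.HTTPConnection.debuglevel = 1             # prints raw request/response lines to stdout
```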
| https://github.com/apache/airflow/issues/10636 | https://github.com/apache/airflow/pull/11406 | 32f2a458198f50b85075d72a25d7de8a55109e44 | da565c9019c72e5c2646741e3b73f6c03cb3b485 | 2020-08-28T18:21:52Z | python | 2020-10-12T15:19:20Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,620 | ["airflow/providers/amazon/aws/operators/ecs.py", "tests/providers/amazon/aws/operators/test_ecs.py"] | Reattach ECS Task when Airflow restarts | **Description**
In similar fashion to https://github.com/apache/airflow/pull/4083, it would be helpful for Airflow to reattach itself to the ECS Task rather than letting another instance start. However, instead of making this the default behavior, it would be better to use a `reattach` flag.
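A hedged sketch of how the flag might look to a DAG author (`reattach` is hypothetical here; the other arguments are placeholders):
```python
from airflow.contrib.operators.ecs_operator import ECSOperator

run_ecs_task = ECSOperator(
    task_id="run_ecs_task",
    task_definition="my-task-definition",  # placeholder
    cluster="my-cluster",                  # placeholder
    overrides={},
    launch_type="FARGATE",
    reattach=True,  # proposed: pick up an already-running task instead of starting a new one
)
```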
**Use case / motivation**
Allow Airflow the option to reattach to an existing ECS task when a restart happens, which would avoid having "rogue" tasks. | https://github.com/apache/airflow/issues/10620 | https://github.com/apache/airflow/pull/10643 | e4c239fc98d4b13608b0bbb55c503b4563249300 | 0df60b773671ecf8d4e5f582ac2be200cf2a2edd | 2020-08-28T02:19:18Z | python | 2020-10-23T07:10:07Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,611 | ["airflow/www/views.py"] | Graph View shows other relations than in DAG | **Apache Airflow version**:
master
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
**Environment**: breeze
**What happened**:
This DAG
```python
from airflow import models
from airflow.operators.dummy_operator import DummyOperator
from airflow.utils.dates import days_ago
with models.DAG("test", start_date=days_ago(1), schedule_interval=None,) as dag:
t1 = DummyOperator(task_id="t1")
t2 = DummyOperator(task_id="t2")
t1 >> t2
```
renders like this:
<img width="1374" alt="Screenshot 2020-08-27 at 19 59 41" src="https://user-images.githubusercontent.com/9528307/91478403-11d7fb00-e8a0-11ea-91d0-d7d578bcb5a2.png">
**What you expected to happen**:
I expect to see the same relations as defined in the DAG file.
**How to reproduce it**:
Render the example DAG from above.
**Anything else we need to know**:
I'm surprised by this bug 👀
| https://github.com/apache/airflow/issues/10611 | https://github.com/apache/airflow/pull/10612 | 775c22091e61e605f9572caabe160baa237cfbbd | 479d6220b7d0c93d5ad6a7d53d875e777287342b | 2020-08-27T18:02:43Z | python | 2020-08-27T19:14:15Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,605 | ["airflow/providers/cncf/kubernetes/hooks/kubernetes.py", "airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py", "airflow/providers/cncf/kubernetes/utils/xcom_sidecar.py", "docs/apache-airflow-providers-cncf-kubernetes/connections/kubernetes.rst", "kubernetes_tests/test_kubernetes_pod_operator.py", "tests/providers/cncf/kubernetes/hooks/test_kubernetes.py", "tests/providers/cncf/kubernetes/operators/test_kubernetes_pod.py"] | Use private docker repository with K8S operator and XCOM sidecar container | **Use private docker repository with K8S operator and XCOM sidecar container**
An extra parameter to KubernetesPodOperator, `docker_repository`, would allow specifying the repository from which the XCom sidecar container image is pulled.
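A hedged sketch of how the proposed parameter could be used (`docker_repository` is hypothetical; the registry and image names are placeholders):
```python
from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator

task = KubernetesPodOperator(
    task_id="xcom_task",
    name="xcom-task",
    namespace="default",
    image="registry.mycompany.example/proxy/python:3.8-slim",  # proxied image (placeholder)
    cmds=["python", "-c", "print('hello')"],
    do_xcom_push=True,
    # proposed parameter: registry to pull the XCom sidecar image from instead of Docker Hub
    docker_repository="registry.mycompany.example/proxy",
)
```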
**My company forces docker proxy usage for K8S**
I need to use my company's docker repository; images that are not proxied through the company docker repository are not allowed.
| https://github.com/apache/airflow/issues/10605 | https://github.com/apache/airflow/pull/26766 | 409a4de858385c14d0ea4f32b8c4ad1fcfb9d130 | aefadb8c5b9272613d5806b054a1b46edf29d82e | 2020-08-27T15:35:58Z | python | 2022-11-09T06:16:53Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,586 | ["airflow/kubernetes/pod_launcher.py", "kubernetes_tests/test_kubernetes_pod_operator.py", "tests/kubernetes/test_pod_launcher.py"] | KubernetesPodOperator truncates logs | **Apache Airflow version**: 1.10.10
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): v1.15.11
KubernetesPodOperator truncates logs when the container produces more than 10 lines of logs before execution of the `read_pod_logs` function. Is there any reason to make 10 the default value for the `tail_lines` argument?
```python
def read_pod_logs(self, pod: V1Pod, tail_lines: int = 10):
"""Reads log from the POD"""
try:
return self._client.read_namespaced_pod_log(
name=pod.metadata.name,
namespace=pod.metadata.namespace,
container='base',
follow=True,
tail_lines=tail_lines,
_preload_content=False
)
except BaseHTTPError as e:
raise AirflowException(
'There was an error reading the kubernetes API: {}'.format(e)
)
``` | https://github.com/apache/airflow/issues/10586 | https://github.com/apache/airflow/pull/11325 | 6fe020e105531dd5a7097d8875eac0f317045298 | b7404b079ab57b6493d8ddd319bccdb40ff3ddc5 | 2020-08-26T16:43:38Z | python | 2020-10-09T22:59:47Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,555 | ["BREEZE.rst", "Dockerfile", "Dockerfile.ci", "IMAGES.rst", "breeze", "breeze-complete", "docs/production-deployment.rst", "scripts/ci/libraries/_build_images.sh", "scripts/ci/libraries/_initialization.sh"] | Allow installation of apt and other packages from different servers | **Description**
By default we are installing apt deps and PIP deps from different repositories, but there should be an option (via build-arg) to install them from elsewhere.
**Use case / motivation**
Corporate customers often use mirrors of registries to install packages and firewall outgoing connections. We should be able to support such scenarios.
| https://github.com/apache/airflow/issues/10555 | https://github.com/apache/airflow/pull/11176 | 17c810ec36a61ca2e285ccf44de27a598cca15f5 | ebd71508627e68f6c35f1aff2d03b4569de80f4b | 2020-08-25T18:24:07Z | python | 2020-09-29T13:30:00Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,549 | ["airflow/www/extensions/init_views.py", "airflow/www/static/js/circles.js", "airflow/www/templates/airflow/not_found.html", "airflow/www/views.py", "airflow/www/webpack.config.js"] | option to disable "lots of circles" in error page | **Description**
The "lots of circles" error page is very rough via Remote Desktop connections. In the current global pandemic many people are working remotely via already constrained connections. Needless redraws caused by the highly animated circles can cause frustrating slowdowns and sometimes lost connections.
**Use case / motivation**
It should be trivially simple to disable the animated portion of the error page and instead use a standard error page. Ideally this would be something easily achievable via configuration options and exposed in the Helm chart.
**Related Issues**
N/A
| https://github.com/apache/airflow/issues/10549 | https://github.com/apache/airflow/pull/17501 | 7b4ce7b73746466133a9c93e3a68bee1e0f7dd27 | 2092988c68030b91c79a9631f0482ab01abdba4d | 2020-08-25T13:25:33Z | python | 2021-08-13T00:54:09Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,523 | ["docs/apache-airflow/kubernetes.rst", "docs/apache-airflow/production-deployment.rst"] | Host Airflow-managed Helm repo | **Description**
@mik-laj Hi 👋 I was just coming to this repo to see if you're interested in help getting the Helm chart repo set up to replace the chart in `stable`.
Stable repo is nearing the end of its deprecation period, and I'm glad to see a version of the Airflow chart is already here. See https://github.com/helm/charts/issues/21103 for the meta issue tracking moving all the `stable` repo charts to new homes.
### To-do
- [ ] Decide on hosting options below (self-host/leverage Bitnami)
- [ ] If self-host, set up CI/CD for chart-testing and chart-releasing
- [ ] if self-host, list in Helm Hub (http://hub.helm.sh/) and Artifact Hub (https://artifacthub.io/)
**Use case / motivation**
### Set up Helm repo hosting for your chart
Set up a Helm repo, either as a separate git repo in your org, or keeping the same setup you have now. We have created Helm chart repo actions for chart testing (CI) and releasing chart packages as your own GitHub-hosted Helm repo (CD).
#### Self-hosted options:
1. If we either move the chart to a separate git repo in the artifacthub gh org, or even move the hub github pages setting to a branch other than the main one, we can use the [@helm/chart-releaser-action](https://github.com/helm/chart-releaser-action) GitHub Action to automate the helm repo.
2. If we keep structure as-is, we can still use the [helm/chart-releaser](https://github.com/helm/chart-releaser) project, just with a custom script.
For either option we can also use the [@helm/chart-testing-action](https://github.com/helm/chart-testing-action) to wrap the chart-testing project @mattfarina mentioned above. Here's a demo repo to see how they work together: https://github.com/helm/charts-repo-actions-demo
Whichever option you decide, I'm happy to make a PR if it helps.
If you do decide to host your own Helm repo, you will also want to list it in Helm Hub (http://hub.helm.sh/) and Artifact Hub (https://artifacthub.io/).
#### Alternatively leverage existing Bitnami Helm repo
There is also a version of the chart maintained by Bitnami, who have been very involved in the `stable` repo for years: https://github.com/bitnami/charts/tree/master/bitnami/airflow. You could instead decide to leverage that chart as the canonical source, and not host your own. It is also fine to have multiple instances of a chart to install the same app.
**Related Issues**
- https://github.com/helm/charts/issues/21103
- https://github.com/apache/airflow/issues/10486
- https://github.com/apache/airflow/issues/10379 | https://github.com/apache/airflow/issues/10523 | https://github.com/apache/airflow/pull/16014 | ce358b21533eeb7a237e6b0833872bf2daab7e30 | 74821c8d999fad129b905a8698df7c897941e069 | 2020-08-24T19:57:06Z | python | 2021-05-23T19:10:51Z |
closed | apache/airflow | https://github.com/apache/airflow | 10,519 | ["airflow/www/utils.py", "airflow/www/views.py", "tests/www/test_views.py"] | Trigger Dag requires a JSON conf but Dag Run view display a python dict | <!--
-->
**Apache Airflow version**: 1.10.11
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release): Windows 10 WSL (Ubuntu18.04.4)
- **Kernel** (e.g. `uname -a`): Linux DESKTOP-8IVSCHM 4.4.0-18362-Microsoft #836-Microsoft Mon May 05 16:04:00 PST 2020 x86_64 x86_64 x86_64 GNU/Linux
- **Install tools**:
- **Others**:
**What happened**:
In the __Trigger Dag__ view for a specific Dag, the view asks for a JSON-formatted object as input.
In the __Dag Runs__ view (list), it shows a Python-formatted object in the __Conf__ column.
Despite JSON and Python formatting being quite similar, they differ with respect to string quotation marks: JSON uses double quotes (") and Python uses single quotes (').
This makes it annoying to copy a previously used config to a new trigger.
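A small illustration of the mismatch (plain Python, nothing Airflow-specific):
```python
import json

conf = {"test": "this is a test"}
print(conf)              # {'test': 'this is a test'}  -> what the Dag Runs list shows (Python repr)
print(json.dumps(conf))  # {"test": "this is a test"}  -> what the Trigger DAG form accepts
```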
**What you expected to happen**:
I would expect a consistent read/write format for the Dag Run configuration.
In particular, require JSON in the Trigger Dag view, and display JSON in the Dag Runs view.
**How to reproduce it**:
1. Trigger a DAG manually passing a Json dict `{"test": "this is a test"}`.
2. Go to Dag Runs view, it shows as `{'test': 'this is a test'}`
**Anything else we need to know**:
This is not an application-breaking issue, more a quality-of-life/usability one.
I imagine it would be a minor change and could be tagged as a good first issue.
| https://github.com/apache/airflow/issues/10519 | https://github.com/apache/airflow/pull/10644 | 596bc1337988f9377571295ddb748ef8703c19c0 | e6a0a5374dabc431542113633148445e4c5159b9 | 2020-08-24T18:22:17Z | python | 2020-08-31T13:31:58Z |