(#21741), Change the default auth backend to session (#21640), Don't check if py DAG files are zipped during parsing (#21538), Switch XCom implementation to use run_id (#20975), Implement multiple API auth backends (#21472), Change logging level details of connection info in get_connection() (#21162), Support mssql in airflow db shell (#21511), Support config worker_enable_remote_control for celery (#21507), Log memory usage in CgroupTaskRunner (#21481), Modernize DAG-related URL routes and rename tree to grid (#20730), Move Zombie detection to SchedulerJob (#21181), Improve speed to run airflow by 6x (#21438), Add more SQL template fields renderers (#21237), Log context only for default method (#21244), Log trigger status only if at least one is running (#21191), Add optional features in providers. The change in GCP operators implies that GCP Hooks for those operators now require keyword parameters rather than positional ones. No change is needed if only the default trigger rule all_success is being used. The py_interpreter argument for DataFlow Hooks/Operators has been changed from python2 to python3. The BaseOperator class uses BaseOperatorMeta as a metaclass. The current webserver UI uses the Flask-Admin extension.
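The note above mentions that BaseOperator uses BaseOperatorMeta as a metaclass; one thing such a metaclass enables is applying default_args before an operator is constructed. The sketch below illustrates that idea only — the class names and merge logic here are simplified stand-ins, not Airflow's actual implementation:

```python
# Illustrative sketch: a metaclass that merges default_args into the
# keyword arguments before an operator is instantiated. This mirrors
# the idea behind Airflow's BaseOperatorMeta, not its real code.


class OperatorMeta(type):
    def __call__(cls, *args, **kwargs):
        # Pull default_args out and use its entries as fallbacks for
        # any keyword argument the caller did not supply explicitly.
        defaults = kwargs.pop("default_args", None) or {}
        for key, value in defaults.items():
            kwargs.setdefault(key, value)
        return super().__call__(*args, **kwargs)


class BaseOperator(metaclass=OperatorMeta):
    def __init__(self, task_id, owner="airflow", retries=0):
        self.task_id = task_id
        self.owner = owner
        self.retries = retries


op = BaseOperator(task_id="t1", default_args={"owner": "data-team", "retries": 3})
print(op.owner, op.retries)  # values came from default_args
```

Note that an explicitly passed keyword argument still wins over default_args, since `setdefault` only fills in missing keys.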
(#4685), [AIRFLOW-251] Add option SQL_ALCHEMY_SCHEMA parameter to specify schema for metadata (#4199), [AIRFLOW-1814] Template PythonOperator {op_args,op_kwargs} fields (#4691), [AIRFLOW-3730] Standardize use of logging mechanisms (#4556), [AIRFLOW-3770] Validation of documentation on CI (#4593), [AIRFLOW-3866] Run docker-compose pull silently in CI (#4688), [AIRFLOW-3685] Move licence header check (#4497), [AIRFLOW-3670] Add stages to Travis build (#4477), [AIRFLOW-3937] KubernetesPodOperator support for envFrom configMapRef and secretRef (#4772), [AIRFLOW-3408] Remove outdated info from Systemd Instructions (#4269), [AIRFLOW-3202] add missing documentation for AWS hooks/operator (#4048), [AIRFLOW-3908] Add more Google Cloud Vision operators (#4791), [AIRFLOW-2915] Add example DAG for GoogleCloudStorageToBigQueryOperator (#3763), [AIRFLOW-3062] Add Qubole in integration docs (#3946), [AIRFLOW-3288] Add SNS integration (#4123), [AIRFLOW-3148] Remove unnecessary arg parameters in RedshiftToS3Transfer (#3995), [AIRFLOW-3049] Add extra operations for Mongo hook (#3890), [AIRFLOW-3559] Add missing options to DatadogHook. Use the SSHOperator class in place of SSHExecuteOperator, which has now been removed. The task is eligible for retry without going into the FAILED state.
extra still work), [AIRFLOW-5487] Fix unused warning var (#6111), [AIRFLOW-5925] Relax funcsigs and psutil version requirements (#6580), [AIRFLOW-5740] Fix Transient failure in Slack test (#6407), [AIRFLOW-6058] Running tests with pytest (#6472), [AIRFLOW-6066] Added pre-commit checks for accidental debug stmts (#6662), [AIRFLOW-6060] Improve conf_vars context manager (#6658), [AIRFLOW-6044] Standardize the Code Structure in kube_pod_operator.py (#6639), [AIRFLOW-4940] Simplify tests of DynamoDBToS3Operator (#6836), [AIRFLOW-XXX] Update airflow-jira release management script (#6772), [AIRFLOW-XXX] Add simple guidelines to unit test writing (#6846), [AIRFLOW-6309] Fix stable build on Travis, [AIRFLOW-6211] Doc how to use conda for local virtualenv (#6766), [AIRFLOW-5855] Fix broken reference in custom operator doc (#6508), [AIRFLOW-5875] Fix typo in example_qubole_operator.py (#6525), [AIRFLOW-5702] Fix common docstring issues (#6372), [AIRFLOW-5640] Document and test email parameters of BaseOperator (#6315), [AIRFLOW-XXX] Improve description OpenFaaS Hook (#6187), [AIRFLOW-XXX] GSoD: How to make DAGs production ready (#6515), [AIRFLOW-XXX] Use full command in examples (#5973), [AIRFLOW-XXX] Update docs to accurately describe the precedence of remote and local logs (#5607), [AIRFLOW-XXX] Fix example extras field in mysql connect doc (#5285), [AIRFLOW-XXX] Fix wrong inline code highlighting in docs (#5309), [AIRFLOW-XXX] Group executors in one section (#5834), [AIRFLOW-XXX] Add task lifecycle diagram to documentation (#6762), [AIRFLOW-XXX] Highlight code blocks (#6243), [AIRFLOW-XXX] Documents about task_concurrency and pool (#5262), [AIRFLOW-XXX] Fix incorrect docstring parameter (#6649), [AIRFLOW-XXX] Add link to XCom section in concepts.rst (#6791), [AIRFLOW-XXX] Update kubernetes doc with correct path (#6774), [AIRFLOW-XXX] Add information how to configure pytest runner (#6736), [AIRFLOW-XXX] More GSOD improvements (#6585), [AIRFLOW-XXX] Clarified a grammatically 
incorrect sentence (#6667), [AIRFLOW-XXX] Add notice for Mesos Executor deprecation in docs (#6712), [AIRFLOW-XXX] Update list of pre-commits (#6603), [AIRFLOW-XXX] Updates to Breeze documentation from GSOD (#6285), [AIRFLOW-XXX] Clarify daylight savings time behavior (#6324), [AIRFLOW-XXX] GSoD: Adding Create a custom operator doc (#6348), [AIRFLOW-XXX] Add resources & links to CONTRIBUTING.rst (#6405), [AIRFLOW-XXX] Update chat channel details from gitter to slack (#4149), [AIRFLOW-XXX] Add logo info to readme (#6349), [AIRFLOW-XXX] Fixed case problem with CONTRIBUTING.rst (#6329), [AIRFLOW-XXX] Google Season of Docs updates to CONTRIBUTING doc (#6283). (#15999), Add memory usage warning in quick-start documentation (#15967), Update example KubernetesExecutor git-sync pod template file (#15904), Added new pipeline example for the tutorial docs (#16084), Updating the DAG docstring to include render_template_as_native_obj (#16534), Docs: Fix API verb from POST to PATCH (#16511), Renaming variables to be consistent with code logic (#18685), Simplify strings previously split across lines (#18679), fix exception string of BranchPythonOperator (#18623), Add multiple roles when creating users (#18617), Move FABs base Security Manager into Airflow. 
), [AIRFLOW-1256] Add United Airlines to readme, [AIRFLOW-1251] Add eRevalue to Airflow users, [AIRFLOW-908] Print hostname at the start of cli run, [AIRFLOW-1237] Fix IN-predicate sqlalchemy warning, [AIRFLOW-1243] DAGs table has no default entries to show, [AIRFLOW-1245] Fix random failure in test_trigger_dag_for_date, [AIRFLOW-1248] Fix wrong conf name for worker timeout, [AIRFLOW-1197] : SparkSubmitHook on_kill error, [AIRFLOW-1191] : SparkSubmitHook custom cmd, [AIRFLOW-1234] Cover utils.operator_helpers with UTs, [AIRFLOW-645] Support HTTPS connections in HttpHook, [AIRFLOW-1232] Remove deprecated readfp warning, [AIRFLOW-1233] Cover utils.json with unit tests, [AIRFLOW-1227] Remove empty column on the Logs view, [AIRFLOW-1226] Remove empty column on the Jobs view, [AIRFLOW-1221] Fix templating bug with DatabricksSubmitRunOperator, [AIRFLOW-1210] Enable DbApiHook unit tests, [AIRFLOW-1200] Forbid creation of a variable with an empty key, [AIRFLOW-1207] Enable utils.helpers unit tests, [AIRFLOW-1213] Add hcatalog parameters to sqoop, [AIRFLOW-1201] Update deprecated nose-parameterized, [AIRFLOW-1186] Sort dag.get_task_instances by execution_date, [AIRFLOW-1203] Pin Google API client version to fix OAuth issue. The old configuration still works but can be abandoned at any time. If your config contains the old default values they will be upgraded in place. The REMOTE_BASE_LOG_FOLDER configuration key in your airflow config has been removed, therefore you will need to take the following steps: copy the logging configuration from airflow/config_templates/airflow_logging_settings.py. (#4760), [AIRFLOW-3932] Optionally skip dag discovery heuristic. Instead, it now accepts: table - will render the output in a predefined table. This behavior can be overridden by sending replace_microseconds=true along with an explicit execution_date. The above code returned None previously; now it will return ''.
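The replace_microseconds behavior mentioned above can be sketched as a small normalization step: the trigger endpoint zeroes the microseconds of the supplied execution_date unless the flag says otherwise. The function name here is illustrative, not Airflow's actual handler:

```python
from datetime import datetime

# Sketch (not Airflow's real code): normalize an execution_date the
# way the trigger API's replace_microseconds flag describes. When the
# flag is true, microseconds are zeroed; when false, the timestamp is
# passed through unchanged.


def normalize_execution_date(execution_date: datetime,
                             replace_microseconds: bool = True) -> datetime:
    if replace_microseconds:
        return execution_date.replace(microsecond=0)
    return execution_date


dt = datetime(2022, 5, 1, 12, 30, 45, 123456)
print(normalize_execution_date(dt))                              # microseconds zeroed
print(normalize_execution_date(dt, replace_microseconds=False))  # kept as-is
```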
The ASF licenses this file to you under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. Meanwhile, a task instance with depends_on_past=True waits on its previous run. When installing Airflow, use the constraints file matching your Python and Airflow versions, for example for Python 3.8 and Airflow 2.1.2. TriggerRule.NONE_FAILED_OR_SKIPPED is replaced by TriggerRule.NONE_FAILED_MIN_ONE_SUCCESS. Each dag now has two permissions (one for write, one for read) associated with it: can_dag_edit and can_dag_read. (#15210), Make task ID on legend have enough width and width of line chart to be 100%. If a hotfix release becomes available after you've already built an Astronomer Certified image for the first time, subsequent code pushes do not automatically pull the latest corresponding hotfix. To take advantage of both features as well as bug and security fixes, we recommend regularly upgrading to the latest version of AC. Previously, BaseSensorOperator with soft_fail=True skipped itself when it failed. Check https://airflow.apache.org/docs/1.10.13/howto/custom-operator.html to see how you can create and import custom operators. Previously, clearing a running task set its state to SHUTDOWN. For example, if you used the defaults in 2.2.5: in v2.2 we deprecated passing an execution date to XCom.get methods, but there was no other option for operator links as they were only passed an execution_date.
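The trigger-rule rename noted above (NONE_FAILED_OR_SKIPPED becoming NONE_FAILED_MIN_ONE_SUCCESS) is the kind of deprecation a small compatibility shim can express. Airflow handles this mapping internally; the helper below is only an illustration of the rename:

```python
# Sketch of a compatibility shim for the deprecated trigger-rule name
# mentioned above. The mapping reflects the documented rename; the
# function itself is illustrative, not part of Airflow's API.

DEPRECATED_TRIGGER_RULES = {
    "none_failed_or_skipped": "none_failed_min_one_success",
}


def resolve_trigger_rule(rule: str) -> str:
    # Return the replacement name for a deprecated rule, or the rule
    # unchanged if it is not deprecated.
    return DEPRECATED_TRIGGER_RULES.get(rule, rule)


print(resolve_trigger_rule("none_failed_or_skipped"))
print(resolve_trigger_rule("all_success"))
```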
Built-in operator classes that use this dep class (including sensors and all subclasses) already have this attribute and are not affected.
library and installed it if it does not exist. (#17078), Update chain() and cross_downstream() to support XComArgs (#16732), When a task instance fails with exception, log it (#16805), Set process title for serve-logs and LocalExecutor (#16644), Rename test_cycle to check_cycle (#16617), Add schema as DbApiHook instance attribute (#16521, #17423), Add transparency for unsupported connection type (#16220), Replace deprecated dag.sub_dag with dag.partial_subset (#16179), Treat AirflowSensorTimeout as immediate failure without retrying (#12058), Marking success/failed automatically clears failed downstream tasks (#13037), Add close/open indicator for import dag errors (#16073), Always return a response in TIs action_clear view (#15980), Add cli command to delete user by email (#15873), Use resource and action names for FAB permissions (#16410), Rename DAG concurrency ([core] dag_concurrency) settings for easier understanding (#16267, #18730), Refactor: SKIPPED should not be logged again as SUCCESS (#14822), Remove version limits for dnspython (#18046, #18162), Accept custom run ID in TriggerDagRunOperator (#18788), Make REST API patch user endpoint work the same way as the UI (#18757), Properly set start_date for cleared tasks (#18708), Ensure task_instance exists before running update on its state(REST API) (#18642), Make AirflowDateTimePickerWidget a required field (#18602), Retry deadlocked transactions on deleting old rendered task fields (#18616), Fix retry_exponential_backoff divide by zero error when retry delay is zero (#17003), Improve how UI handles datetimes (#18611, #18700), Bugfix: dag_bag.get_dag should return None, not raise exception (#18554), Only show the task modal if it is a valid instance (#18570), Fix accessing rendered {{ task.x }} attributes from within templates (#18516), Add missing email type of connection (#18502), Dont use flash for same-page UI messages. 
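The retry_exponential_backoff divide-by-zero fix mentioned above (#17003) concerned a zero retry delay feeding the backoff math. A guarded version of that calculation can be sketched as follows — the exact formula and cap here are illustrative, not Airflow's implementation:

```python
from datetime import timedelta

# Sketch of exponential retry backoff that guards against a zero (or
# negative) base delay, the class of bug fixed in #17003. Formula and
# one-hour cap are illustrative assumptions.


def next_retry_delay(base_delay: timedelta, try_number: int,
                     max_delay: timedelta = timedelta(hours=1)) -> timedelta:
    seconds = base_delay.total_seconds()
    if seconds <= 0:
        # A zero delay would make the backoff math degenerate;
        # fall back to a one-second floor instead.
        seconds = 1.0
    backoff = seconds * (2 ** (try_number - 1))
    return min(timedelta(seconds=backoff), max_delay)


print(next_retry_delay(timedelta(seconds=0), try_number=3))   # guarded fallback
print(next_retry_delay(timedelta(seconds=10), try_number=3))  # doubled twice
```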
You should still pay attention to the changes that are detailed in the SQLAlchemy Changelog. update_dataset now requires a new fields argument (breaking change); delete_dataset has a new signature (dataset_id, project_id, ...). You have to run the help command: airflow celery --help. The region now needs to be set manually, either in the connection screens in Airflow or via the AWS_DEFAULT_REGION environment variable. If your dags folder was /var/dags/ and your airflowignore contained /var/dag/excluded/, you should change it accordingly. Airflow should construct dagruns using run_type and execution_date. If you have this issue please report it on the mailing list. If you previously had a plugins/my_plugin.py and you used its name under airflow.operators in a DAG, you should now import it from the plugin module directly. Its function has been unified under a common name (do_xcom_push) on BaseOperator.
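The do_xcom_push unification mentioned above means one BaseOperator-level flag decides whether execute()'s return value is pushed to XCom. The classes below are simplified stand-ins that illustrate the behavior, not Airflow's real task-instance machinery:

```python
# Illustrative sketch of the unified do_xcom_push flag: the runner
# pushes execute()'s return value to XCom only when the flag is set.
# XComStore and MiniOperator are stand-ins, not Airflow classes.


class XComStore(dict):
    def push(self, key, value):
        self[key] = value


class MiniOperator:
    def __init__(self, task_id, do_xcom_push=True):
        self.task_id = task_id
        self.do_xcom_push = do_xcom_push

    def execute(self):
        return "result"

    def run(self, xcom: XComStore):
        result = self.execute()
        # Single, operator-level switch replaces the per-operator
        # xcom_push-style flags that existed before.
        if self.do_xcom_push and result is not None:
            xcom.push(self.task_id, result)
        return result


store = XComStore()
MiniOperator("t1").run(store)
MiniOperator("t2", do_xcom_push=False).run(store)
print(store)  # only t1's result was pushed
```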
variant. It was enforced by application logic, but was not enforced by the database schema. In order to increase the robustness of the scheduler, DAGs are now processed in their own process. The Software UI and CLI only provide Airflow versions that are later than the version currently running in your Dockerfile. Since 1.10.12, when such skipped tasks are cleared, the behavior has changed. This resulted in unfortunate characteristics. [AIRFLOW-925] Revert airflow.hooks change that cherry-pick picked, [AIRFLOW-919] Running tasks with no start date should not break a DAG's UI, [AIRFLOW-802][AIRFLOW-1] Add spark-submit operator/hook, [AIRFLOW-725] Use keyring to store credentials for JIRA, [AIRFLOW-916] Remove deprecated readfp function, [AIRFLOW-911] Add coloring and timing to tests, [AIRFLOW-906] Update Code icon from lightning bolt to file, [AIRFLOW-897] Prevent dagruns from failing with unfinished tasks, [AIRFLOW-896] Remove unicode to 8-bit conversion in BigQueryOperator, [AIRFLOW-899] Tasks in SCHEDULED state should be white in the UI instead of black, [AIRFLOW-895] Address Apache release incompliancies, [AIRFLOW-893][AIRFLOW-510] Fix crashing webservers when a dagrun has no start date, [AIRFLOW-880] Make webserver serve logs in a sane way for remote logs, [AIRFLOW-889] Fix minor error in the docstrings for BaseOperator, [AIRFLOW-809][AIRFLOW-1] Use __eq__ ColumnOperator When Testing Booleans, [AIRFLOW-875] Add template to HttpSensor params, [AIRFLOW-881] Check if SubDagOperator is in DAG context manager, [AIRFLOW-885] Add change.org to the users list, [AIRFLOW-836] Use POST and CSRF for state changing endpoints, [AIRFLOW-862] Fix Unit Tests for DaskExecutor, [AIRFLOW-886] Pass result to post_execute() hook, [AIRFLOW-871] change logging.warn() into warning(), [AIRFLOW-882] Remove unnecessary dag>>op assignment in docs, [AIRFLOW-861] Make pickle_info endpoint be login_required, [AIRFLOW-869] Refactor mark success functionality, [AIRFLOW-877] Remove .sql template extension from GCS
download operator, [AIRFLOW-842] Do not query the DB with an empty IN clause, [AIRFLOW-834] Change raise StopIteration into return, [AIRFLOW-832] Let debug server run without SSL, [AIRFLOW-858] Configurable database name for DB operators, [AIRFLOW-863] Example DAGs should have recent start dates, [AIRFLOW-853] Use utf8 encoding for stdout line decode, [AIRFLOW-857] Use library assert statements instead of conditionals, [AIRFLOW-856] Make sure execution date is set for local client, [AIRFLOW-830][AIRFLOW-829][AIRFLOW-88] Reduce Travis log verbosity, [AIRFLOW-814] Fix Presto*CheckOperator.__init__, [AIRFLOW-793] Enable compressed loading in S3ToHiveTransfer, [AIRFLOW-844] Fix cgroups directory creation, [AIRFLOW-831] Restore import to fix broken tests, [AIRFLOW-794] Access DAGS_FOLDER and SQL_ALCHEMY_CONN exclusively from settings, [AIRFLOW-694] Fix config behaviour for empty envvar, [AIRFLOW-365] Set dag.fileloc explicitly and use for Code view, [AIRFLOW-781] Allow DataFlowOperators to accept jobs stored in GCS, Pin Hive and Hadoop to a specific version and create writable warehouse dir, [AIRFLOW-1179] Fix pandas 0.2x breaking Google BigQuery change. 
and skips all its downstream tasks unconditionally when it fails, i.e. the trigger_rule of downstream tasks is not respected. Fix module path of send_email_smtp in configuration, Fix SSHExecuteOperator crash when using a custom ssh port, Add note about Airflow components to template, Make SchedulerJob not run EVERY queued task, Improve BackfillJob handling of queued/deadlocked tasks, Introduce ignore_depends_on_past parameters, Rename user table to users to avoid conflict with postgres, Add support for calling_format from boto to S3_Hook, Add PyPI meta data and sync version number, Set dags_are_paused_at_creations default value to True, Resurface S3Log class eaten by rebase/push -f, Add missing session.commit() at end of initdb, Validate that subdag tasks have pool slots available, and test, Use urlparse for remote GCS logs, and add unit tests, Make webserver worker timeout configurable, Use psycopg2s API for serializing postgres cell values, Make the provide_session decorator more robust, use num_shards instead of partitions to be consistent with batch ingestion, Update docs with separate configuration section, Fix airflow.utils deprecation warning code being Python 3 incompatible, Extract dbapi cell serialization into its own method, Set Postgres autocommit as supported only if server version is < 7.4, Use refactored utils module in unit test imports, remove unused logging,errno, MiniHiveCluster imports, Refactoring utils into smaller submodules, Properly measure number of task retry attempts, Add function to get configuration as dict, plus unit tests, [hotfix] make email.Utils > email.utils for py3, Add the missing Date header to the warning e-mails, Check name of SubDag class instead of class itself, [hotfix] removing repo_token from .coveralls.yml, Add unit tests for trapping Executor errors, Fix HttpOpSensorTest to use fake request session, Add an example on pool usage in the documentation.
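The provide_session decorator mentioned in the list above supplies a database session to any function that was not given one, and cleans it up afterwards. The sketch below shows that pattern with a stand-in Session class; Airflow's real decorator wraps a SQLAlchemy session:

```python
import functools

# Sketch of a provide_session-style decorator: if the caller does not
# pass a session, create one and close it when the call finishes. A
# caller-provided session is left open for the caller to manage.
# Session here is a stand-in, not SQLAlchemy.


class Session:
    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True


def provide_session(func):
    @functools.wraps(func)
    def wrapper(*args, session=None, **kwargs):
        if session is not None:
            return func(*args, session=session, **kwargs)
        session = Session()
        try:
            return func(*args, session=session, **kwargs)
        finally:
            session.close()
    return wrapper


@provide_session
def count_rows(session=None):
    # A real implementation would query through the session.
    return 0 if session else None


print(count_rows())  # a session was created and closed automatically
```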
It will also now be possible to have the execution_date generated, but (#14577), Don't create unittest.cfg when not running in unit test mode (#14420), Webserver: Allow Filtering TaskInstances by queued_dttm (#14708), Update Flask-AppBuilder dependency to allow 3.2 (and all 3.x series) (#14665), Remember expanded task groups in browser local storage (#14661), Add plain format output to cli tables (#14546), Make airflow dags show command display TaskGroups (#14269), Increase maximum size of extra connection field. The deprecated extras will be removed in 3.0. If you are logging to Google cloud storage, please see the Google cloud platform documentation for logging instructions. This guide provides a reference of all required tools and versions for running Astronomer Software. These two flags are close siblings. The extra connection field is a JSON-encoded Python dict. NULL has been treated depending on the value of the allow_null parameter. FROM quay.io/astronomer/astro-runtime:5.0.4. In previous versions of Airflow it was possible to use plugins to load custom executors. The FAB 4. There is no need to explicitly provide or not provide the context anymore. This patch changes the User.superuser field from a hard-coded boolean to a Boolean() database column.
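The connection extra field referenced above holds a JSON-encoded Python dict. A defensive parse of such a field might look like the following — the helper name is illustrative, not an Airflow API:

```python
import json

# Illustrative helper (not Airflow's API): parse a connection "extra"
# string, which is documented as a JSON-encoded dict, falling back to
# an empty dict for missing or malformed values.


def parse_extra(extra: str) -> dict:
    if not extra:
        return {}
    try:
        value = json.loads(extra)
    except json.JSONDecodeError:
        return {}
    # Guard against valid JSON that is not a dict (e.g. a bare list).
    return value if isinstance(value, dict) else {}


print(parse_extra('{"region": "eu-west-1", "use_ssl": true}'))
print(parse_extra("not json"))  # falls back to an empty dict
```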
(#4732), [AIRFLOW-XXX] Update kubernetes.rst docs (#3875), [AIRFLOW-XXX] Improvements to formatted content in documentation (#4835), [AIRFLOW-XXX] Add Daniel to committer list (#4961), [AIRFLOW-XXX] Add Xiaodong Deng to committers list, [AIRFLOW-XXX] Add history become ASF top level project (#4757), [AIRFLOW-XXX] Move out the examples from integration.rst (#4672), [AIRFLOW-XXX] Extract reverse proxy info to a separate file (#4657), [AIRFLOW-XXX] Reduction of the number of warnings in the documentation (#4585), [AIRFLOW-XXX] Fix GCS Operator docstrings (#4054), [AIRFLOW-XXX] Fix Docstrings in Hooks, Sensors & Operators (#4137), [AIRFLOW-XXX] Split guide for operators to multiple files (#4814), [AIRFLOW-XXX] Split connection guide to multiple files (#4824), [AIRFLOW-XXX] Remove almost all warnings from building docs (#4588), [AIRFLOW-XXX] Add backreference in docs between operator and integration (#4671), [AIRFLOW-XXX] Improve linking to classes (#4655), [AIRFLOW-XXX] Mock optional modules when building docs (#4586), [AIRFLOW-XXX] Update plugin macros documentation (#4971), [AIRFLOW-XXX] Add missing docstring for autodetect in GCS to BQ Operator (#4979), [AIRFLOW-XXX] Add missing GCP operators to Docs (#4260), [AIRFLOW-XXX] Fixing the issue in Documentation (#3756), [AIRFLOW-XXX] Add Hint at user defined macros (#4885), [AIRFLOW-XXX] Correct schedule_interval in Scheduler docs (#4157), [AIRFLOW-XXX] Improve airflow-jira script to make RelManagers life easier (#4857), [AIRFLOW-XXX] Add missing class references to docs (#4644), [AIRFLOW-XXX] Add a doc about fab security (#4595), [AIRFLOW-XXX] Speed up DagBagTest cases (#3974). The create_empty_table method now accepts a table_resource parameter. This is to if you use core operators or any other. Set the logging_config_class to the filename and dict. For example, if the airflowignore file contained x, and the dags folder was /var/x/dags, then all dags in that folder would be ignored. In previous versions, it was possible to pass DAG parsing.
How many seconds to wait between file-parsing loops to prevent the logs from being spammed. Apache Airflow, Apache, Airflow, the Airflow logo, and the Apache feather logo are either registered trademarks or trademarks of The Apache Software Foundation. 'airflow.utils.log.file_task_handler.FileTaskHandler', 'airflow.utils.log.file_processor_handler.FileProcessorHandler', # When using s3 or gcs, provide a customized LOGGING_CONFIG, # in airflow_local_settings within your PYTHONPATH, see UPDATING.md. To use the airflow-version label, please upgrade your google-cloud-dataflow or apache-beam version. The argument has been renamed to driver_class_path and the option it generates has been fixed. [AIRFLOW-1840] Support back-compat on old celery config, [AIRFLOW-2612][AIRFLOW-2534] Clean up Hive-related tests, [AIRFLOW-2608] Implements/Standardize custom exceptions for experimental APIs, [AIRFLOW-2607] Fix failing TestLocalClient, [AIRFLOW-2638] dbapi_hook: support REPLACE INTO, [AIRFLOW-2542][AIRFLOW-1790] Rename AWS Batch Operator queue to job_queue, [AIRFLOW-2567] Extract result from the kubernetes pod as Xcom, [AIRFLOW-2601] Allow user to specify k8s config, [AIRFLOW-1786] Enforce correct behavior for soft-fail sensors, [AIRFLOW-2355] Airflow trigger tag parameters in subdag, [AIRFLOW-2613] Fix Airflow searching .zip bug, [AIRFLOW-2627] Add a sensor for Cassandra, [AIRFLOW-2634][AIRFLOW-2534] Remove dependency for impyla, [AIRFLOW-2611] Fix wrong dag volume mount path for kubernetes executor, [AIRFLOW-2562] Add Google Kubernetes Engine Operators, [AIRFLOW-2630] Fix classname in test_sql_sensor.py, [AIRFLOW-2534] Fix bug in HiveServer2Hook, [AIRFLOW-2586] Stop getting AIRFLOW_HOME value from config file in bash operator, [AIRFLOW-2605] Fix autocommit for MySqlHook, [AIRFLOW-2539][AIRFLOW-2359] Move remaining log config to configuration file, [AIRFLOW-1656] Tree view dags query changed, [AIRFLOW-2617] add imagePullPolicy config for kubernetes executor, [AIRFLOW-2429] Fix security/task/sensors/ti_deps
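The handler class paths and LOGGING_CONFIG comments above come from Airflow's dict-based logging setup: logging_config_class points at a Python dict consumed by the standard library's dictConfig. The sketch below uses only stdlib handlers so it runs anywhere; in a real deployment the handlers section would reference Airflow's FileTaskHandler and friends:

```python
import logging
import logging.config

# Sketch of a LOGGING_CONFIG dict of the kind referenced above. The
# structure (version, formatters, handlers, loggers) is the standard
# dictConfig schema; the concrete handler here is a stdlib
# StreamHandler so the example is self-contained.

LOGGING_CONFIG = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "airflow": {"format": "[%(asctime)s] %(levelname)s - %(message)s"},
    },
    "handlers": {
        "console": {
            "class": "logging.StreamHandler",
            "formatter": "airflow",
        },
    },
    "loggers": {
        "airflow.task": {"handlers": ["console"], "level": "INFO"},
    },
}

logging.config.dictConfig(LOGGING_CONFIG)
logger = logging.getLogger("airflow.task")
logger.info("task log line goes through the configured handler")
```

In Airflow you would place such a dict on your PYTHONPATH (e.g. in airflow_local_settings) and set logging_config_class to its dotted path.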
folders flake8 error, [AIRFLOW-2550] Implements API endpoint to list DAG runs, [AIRFLOW-2512][AIRFLOW-2522] Use google-auth instead of oauth2client, [AIRFLOW-2429] Fix operators folder flake8 error, [AIRFLOW-2585] Fix several bugs in CassandraHook and CassandraToGCSOperator, [AIRFLOW-2597] Restore original dbapi.run() behavior, [AIRFLOW-2590] Fix commit in DbApiHook.run() for no-autocommit DB, [AIRFLOW-2587] Add TIMESTAMP type mapping to MySqlToHiveTransfer, [AIRFLOW-2591][AIRFLOW-2581] Set default value of autocommit to False in DbApiHook.run(), [AIRFLOW-59] Implement bulk_dump and bulk_load for the Postgres hook, [AIRFLOW-2533] Fix path to DAGs on kubernetes executor workers, [AIRFLOW-2581] Fix DbApiHook autocommit, [AIRFLOW-2578] Add option to use proxies in JiraHook, [AIRFLOW-2575] Make gcs to gcs operator work with large files, [AIRFLOW-437] Send TI context in kill zombies, [AIRFLOW-2566] Change backfill to rerun failed tasks, [AIRFLOW-1021] Fix double login for new users with LDAP, [AIRFLOW-2573] Cast BigQuery TIMESTAMP field to float, [AIRFLOW-2560] Adding support for internalIpOnly to DataprocClusterCreateOperator, [AIRFLOW-2558] Clear task/dag is clearing all executions, [AIRFLOW-2513] Change bql to sql for BigQuery Hooks & Ops, [AIRFLOW-2545] Eliminate DeprecationWarning, [AIRFLOW-2500] Fix MySqlToHiveTransfer to transfer unsigned type properly, [AIRFLOW-2462] Change PasswordUser setter to correct syntax, [AIRFLOW-2525] Fix a bug introduced by commit dabf1b9, [AIRFLOW-2553] Add webserver.pid to .gitignore, [AIRFLOW-1863][AIRFLOW-2529] Add dag run selection widgets to gantt view, [AIRFLOW-2504] Log username correctly and add extra to search columns, [AIRFLOW-2551] Encode binary data with base64 standard rather than base64 url, [AIRFLOW-2537] Add reset-dagrun option to backfill command, [AIRFLOW-2526] dag_run.conf can override params, [AIRFLOW-2544][AIRFLOW-1967] Guard against next major release of Celery, Flower, [AIRFLOW-XXX] Add Yieldr to who
is using airflow, [AIRFLOW-2547] Describe how to run tests using Docker, [AIRFLOW-2538] Update faq doc on how to reduce Airflow scheduler latency, [AIRFLOW-2529] Improve graph view performance and usability, [AIRFLOW-2517] backfill support passing key values through CLI, [AIRFLOW-2532] Support logs_volume_subpath for KubernetesExecutor, [AIRFLOW-2466] consider task_id in _change_state_for_tis_without_dagrun, [AIRFLOW-2519] Fix CeleryExecutor with SQLAlchemy, [AIRFLOW-2536] docs about how to deal with Airflow initdb failure, [AIRFLOW-2530] KubernetesOperator supports multiple clusters, [AIRFLOW-1499] Eliminate duplicate and unneeded code, [AIRFLOW-2521] backfill - make variable name and logging messages more accurate, [AIRFLOW-2429] Fix hook, macros folder flake8 error, [AIRFLOW-2525] Fix PostgresHook.copy_expert to work with COPY FROM, [AIRFLOW-2515] Add dependency on thrift_sasl to hive extra, [AIRFLOW-2523] Add how-to for managing GCP connections, [AIRFLOW-2510] Introduce new macros: prev_ds and next_ds, [AIRFLOW-1730] Unpickle value of XCom queried from DB, [AIRFLOW-2518] Fix broken ToC links in integration.rst.
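The prev_ds and next_ds macros introduced by AIRFLOW-2510 expose the execution date shifted by one schedule interval, formatted as YYYY-MM-DD. The helper below illustrates what those values look like, assuming a daily interval for the sake of the example; it is not Airflow's macro implementation:

```python
from datetime import date, timedelta

# Illustrative sketch of the prev_ds / next_ds macro values
# (AIRFLOW-2510): the execution date shifted one interval back or
# forward, rendered as YYYY-MM-DD. A daily interval is assumed here.


def ds_macros(execution_date: date, interval: timedelta = timedelta(days=1)):
    return {
        "ds": execution_date.isoformat(),
        "prev_ds": (execution_date - interval).isoformat(),
        "next_ds": (execution_date + interval).isoformat(),
    }


print(ds_macros(date(2018, 5, 20)))
# {'ds': '2018-05-20', 'prev_ds': '2018-05-19', 'next_ds': '2018-05-21'}
```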