[airflow] Extract AIR311 from AIR301 rules (AIR301, AIR311) (#17310)

<!--
Thank you for contributing to Ruff! To help us out with reviewing,
please consider the following:

- Does this pull request include a summary of the change? (See below.)
- Does this pull request include a descriptive title?
- Does this pull request include references to any relevant issues?
-->

## Summary

<!-- What's the purpose of the change? What does it do, and why? -->

As discussed in
https://github.com/astral-sh/ruff/issues/14626#issuecomment-2766146129,
we want to separate suggested changes from required changes.

The following symbols have been moved from AIR301 to AIR311. They still
work in Airflow 3.0, but users are encouraged to update them, as they are
expected to be removed in a future version.

* arguments
    * `airflow...DAG` | `@dag`
        * `sla_miss_callback`
    * operators
        * `sla`
* names
    * `airflow.Dataset` | `airflow.datasets.Dataset` → `airflow.sdk.Asset`
    * `airflow.datasets`
        * `DatasetAlias` → `airflow.sdk.AssetAlias`
        * `DatasetAll` → `airflow.sdk.AssetAll`
        * `DatasetAny` → `airflow.sdk.AssetAny`
        * `expand_alias_to_datasets` → `airflow.sdk.expand_alias_to_assets`
        * `metadata.Metadata` → `airflow.sdk.Metadata`
    * `airflow.models.baseoperator.chain` → `airflow.sdk.chain`
    * `airflow.models.baseoperator.chain_linear` → `airflow.sdk.chain_linear`
    * `airflow.models.baseoperator.cross_downstream` → `airflow.sdk.cross_downstream`
    * `airflow.models.baseoperatorlink.BaseOperatorLink` →
      `airflow.sdk.definitions.baseoperatorlink.BaseOperatorLink`
    * `airflow.timetables.datasets.DatasetOrTimeSchedule` →
      `airflow.timetables.assets.AssetOrTimeSchedule`
    * `airflow.utils.dag_parsing_context.get_parsing_context` →
      `airflow.sdk.get_parsing_context`
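
For a compact overview, the renames above can also be written as a plain mapping (illustrative only; the rule itself is implemented in Rust, and the deprecated arguments `sla_miss_callback` and `sla` carry no replacement):

```python
# Deprecated qualified name -> suggested replacement under AIR311.
AIR311_NAME_RENAMES = {
    "airflow.Dataset": "airflow.sdk.Asset",
    "airflow.datasets.Dataset": "airflow.sdk.Asset",
    "airflow.datasets.DatasetAlias": "airflow.sdk.AssetAlias",
    "airflow.datasets.DatasetAll": "airflow.sdk.AssetAll",
    "airflow.datasets.DatasetAny": "airflow.sdk.AssetAny",
    "airflow.datasets.expand_alias_to_datasets": "airflow.sdk.expand_alias_to_assets",
    "airflow.datasets.metadata.Metadata": "airflow.sdk.Metadata",
    "airflow.models.baseoperator.chain": "airflow.sdk.chain",
    "airflow.models.baseoperator.chain_linear": "airflow.sdk.chain_linear",
    "airflow.models.baseoperator.cross_downstream": "airflow.sdk.cross_downstream",
    "airflow.models.baseoperatorlink.BaseOperatorLink": "airflow.sdk.definitions.baseoperatorlink.BaseOperatorLink",
    "airflow.timetables.datasets.DatasetOrTimeSchedule": "airflow.timetables.assets.AssetOrTimeSchedule",
    "airflow.utils.dag_parsing_context.get_parsing_context": "airflow.sdk.get_parsing_context",
}
```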

## Test Plan

<!-- How was it tested? -->

The test fixture has been updated accordingly.
Commit e6a2de3ac6 (parent c7b5067ef8), GPG key ID: B5690EEEBB952194
Authored by Wei Lee on 2025-04-16 23:06:57 +08:00; committed by GitHub.
17 changed files with 1157 additions and 967 deletions.

View file

@ -22,12 +22,6 @@ DAG(dag_id="class_schedule_interval", schedule_interval="@hourly")
DAG(dag_id="class_timetable", timetable=NullTimetable())
def sla_callback(*arg, **kwargs):
pass
DAG(dag_id="class_sla_callback", sla_miss_callback=sla_callback)
DAG(dag_id="class_fail_stop", fail_stop=True)
DAG(dag_id="class_default_view", default_view="dag_default_view")
@ -53,11 +47,6 @@ def decorator_timetable():
pass
@dag(sla_miss_callback=sla_callback)
def decorator_sla_callback():
pass
@dag()
def decorator_deprecated_operator_args():
trigger_dagrun_op = trigger_dagrun.TriggerDagRunOperator(

View file

@ -9,9 +9,6 @@ from airflow import (
PY311,
PY312,
)
from airflow import (
Dataset as DatasetFromRoot,
)
from airflow.api_connexion.security import requires_access, requires_access_dataset
from airflow.auth.managers.base_auth_manager import is_authorized_dataset
from airflow.auth.managers.models.resource_details import DatasetDetails
@ -26,25 +23,16 @@ from airflow.configuration import (
set,
)
from airflow.contrib.aws_athena_hook import AWSAthenaHook
from airflow.datasets import (
Dataset,
DatasetAlias,
DatasetAliasEvent,
DatasetAll,
DatasetAny,
expand_alias_to_datasets,
)
from airflow.datasets import DatasetAliasEvent
from airflow.datasets.manager import (
DatasetManager,
dataset_manager,
resolve_dataset_manager,
)
from airflow.datasets.metadata import Metadata
from airflow.hooks.base_hook import BaseHook
from airflow.lineage.hook import DatasetLineageInfo
from airflow.listeners.spec.dataset import on_dataset_changed, on_dataset_created
from airflow.metrics.validators import AllowListValidator, BlockListValidator
from airflow.models.baseoperator import chain, chain_linear, cross_downstream
from airflow.operators.subdag import SubDagOperator
from airflow.providers.amazon.aws.auth_manager.avp.entities import AvpEntities
from airflow.providers.amazon.aws.datasets import s3
@ -61,12 +49,10 @@ from airflow.providers.trino.datasets import trino
from airflow.secrets.local_filesystem import LocalFilesystemBackend, load_connections
from airflow.security.permissions import RESOURCE_DATASET
from airflow.sensors.base_sensor_operator import BaseSensorOperator
from airflow.timetables.datasets import DatasetOrTimeSchedule
from airflow.timetables.simple import DatasetTriggeredTimetable
from airflow.triggers.external_task import TaskStateTrigger
from airflow.utils import dates
from airflow.utils.dag_cycle_tester import test_cycle
from airflow.utils.dag_parsing_context import get_parsing_context
from airflow.utils.dates import (
date_range,
datetime_to_nano,
@ -105,14 +91,9 @@ get, getboolean, getfloat, getint, has_option, remove_option, as_dict, set
# airflow.contrib.*
AWSAthenaHook()
# airflow.datasets
Dataset()
DatasetAlias()
DatasetAliasEvent()
DatasetAll()
DatasetAny()
Metadata()
expand_alias_to_datasets
# airflow.datasets.manager
DatasetManager()
@ -134,10 +115,6 @@ AllowListValidator()
BlockListValidator()
# airflow.models.baseoperator
chain, chain_linear, cross_downstream
# airflow.operators.branch_operator
BaseBranchOperator()
@ -207,7 +184,6 @@ BaseSensorOperator()
# airflow.timetables
DatasetOrTimeSchedule()
DatasetTriggeredTimetable()
# airflow.triggers.external_task
@ -231,8 +207,6 @@ dates.datetime_to_nano
# airflow.utils.dag_cycle_tester
test_cycle
# airflow.utils.dag_parsing_context
get_parsing_context
# airflow.utils.db
create_session

View file

@ -0,0 +1,28 @@
from __future__ import annotations
from datetime import timedelta
from airflow import DAG, dag
from airflow.operators.datetime import BranchDateTimeOperator
def sla_callback(*arg, **kwargs):
pass
DAG(dag_id="class_sla_callback", sla_miss_callback=sla_callback)
@dag(sla_miss_callback=sla_callback)
def decorator_sla_callback():
pass
@dag()
def decorator_deprecated_operator_args():
branch_dt_op2 = BranchDateTimeOperator(
task_id="branch_dt_op2",
sla=timedelta(seconds=10),
)
branch_dt_op2

View file

@ -0,0 +1,39 @@
from __future__ import annotations
from airflow import Dataset as DatasetFromRoot
from airflow.datasets import (
Dataset,
DatasetAlias,
DatasetAll,
DatasetAny,
expand_alias_to_datasets,
)
from airflow.datasets.metadata import Metadata
from airflow.models.baseoperator import chain, chain_linear, cross_downstream
from airflow.models.baseoperatorlink import BaseOperatorLink
from airflow.timetables.datasets import DatasetOrTimeSchedule
from airflow.utils.dag_parsing_context import get_parsing_context
DatasetFromRoot()
# airflow.datasets
Dataset()
DatasetAlias()
DatasetAll()
DatasetAny()
Metadata()
expand_alias_to_datasets()
# airflow.models.baseoperator
chain()
chain_linear()
cross_downstream()
# airflow.models.baseoperatorlink
BaseOperatorLink()
# airflow.timetables.datasets
DatasetOrTimeSchedule()
# airflow.utils.dag_parsing_context
get_parsing_context()

View file

@ -229,6 +229,9 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
if checker.enabled(Rule::Airflow3Removal) {
airflow::rules::airflow_3_removal_expr(checker, expr);
}
if checker.enabled(Rule::Airflow3SuggestedUpdate) {
airflow::rules::airflow_3_0_suggested_update_expr(checker, expr);
}
if checker.enabled(Rule::Airflow3MovedToProvider) {
airflow::rules::moved_to_provider_in_3(checker, expr);
}
@ -451,6 +454,9 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
if checker.enabled(Rule::Airflow3Removal) {
airflow::rules::airflow_3_removal_expr(checker, expr);
}
if checker.enabled(Rule::Airflow3SuggestedUpdate) {
airflow::rules::airflow_3_0_suggested_update_expr(checker, expr);
}
if checker.enabled(Rule::Airflow3MovedToProvider) {
airflow::rules::moved_to_provider_in_3(checker, expr);
}
@ -1152,6 +1158,9 @@ pub(crate) fn expression(expr: &Expr, checker: &Checker) {
if checker.enabled(Rule::Airflow3Removal) {
airflow::rules::airflow_3_removal_expr(checker, expr);
}
if checker.enabled(Rule::Airflow3SuggestedUpdate) {
airflow::rules::airflow_3_0_suggested_update_expr(checker, expr);
}
if checker.enabled(Rule::UnnecessaryCastToInt) {
ruff::rules::unnecessary_cast_to_int(checker, call);
}

View file

@ -1072,6 +1072,7 @@ pub fn code_to_rule(linter: Linter, code: &str) -> Option<(RuleGroup, Rule)> {
(Airflow, "002") => (RuleGroup::Preview, rules::airflow::rules::AirflowDagNoScheduleArgument),
(Airflow, "301") => (RuleGroup::Preview, rules::airflow::rules::Airflow3Removal),
(Airflow, "302") => (RuleGroup::Preview, rules::airflow::rules::Airflow3MovedToProvider),
(Airflow, "311") => (RuleGroup::Preview, rules::airflow::rules::Airflow3SuggestedUpdate),
(Airflow, "312") => (RuleGroup::Preview, rules::airflow::rules::Airflow3SuggestedToMoveToProvider),
// perflint

View file

@ -81,3 +81,45 @@ fn try_block_contains_undeprecated_import(try_node: &StmtTry, replacement: &Repl
import_searcher.visit_body(&try_node.body);
import_searcher.found_import
}
/// Check whether the segments corresponding to the fully qualified name points to a symbol that's
/// either a builtin or coming from one of the providers in Airflow.
///
/// The pattern it looks for are:
/// - `airflow.providers.**.<module>.**.*<symbol_suffix>` for providers
/// - `airflow.<module>.**.*<symbol_suffix>` for builtins
///
/// where `**` is one or more segments separated by a dot, and `*` is one or more characters.
///
/// Examples for the above patterns:
/// - `airflow.providers.google.cloud.secrets.secret_manager.CloudSecretManagerBackend` (provider)
/// - `airflow.secrets.base_secrets.BaseSecretsBackend` (builtin)
pub(crate) fn is_airflow_builtin_or_provider(
segments: &[&str],
module: &str,
symbol_suffix: &str,
) -> bool {
match segments {
["airflow", "providers", rest @ ..] => {
if let (Some(pos), Some(last_element)) =
(rest.iter().position(|&s| s == module), rest.last())
{
// Check that the module is not the last element i.e., there's a symbol that's
// being used from the `module` that ends with `symbol_suffix`.
pos + 1 < rest.len() && last_element.ends_with(symbol_suffix)
} else {
false
}
}
["airflow", first, rest @ ..] => {
if let Some(last) = rest.last() {
*first == module && last.ends_with(symbol_suffix)
} else {
false
}
}
_ => false,
}
}
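
The segment-matching helper can be paraphrased in Python to make the two patterns concrete; this is an illustrative re-implementation of the Rust function above, not code from the PR:

```python
def is_airflow_builtin_or_provider(segments: list[str], module: str, symbol_suffix: str) -> bool:
    """Mirror of the Rust helper: match provider and builtin qualified-name patterns."""
    # Provider pattern: airflow.providers.**.<module>.**.*<symbol_suffix>
    if segments[:2] == ["airflow", "providers"]:
        rest = segments[2:]
        if module not in rest:
            return False
        pos = rest.index(module)
        # The module must not be the last segment, i.e. a symbol is used from it,
        # and that symbol must end with the expected suffix.
        return pos + 1 < len(rest) and rest[-1].endswith(symbol_suffix)
    # Builtin pattern: airflow.<module>.**.*<symbol_suffix>
    if len(segments) >= 3 and segments[0] == "airflow":
        return segments[1] == module and segments[-1].endswith(symbol_suffix)
    return False
```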

View file

@ -22,6 +22,8 @@ mod tests {
#[test_case(Rule::Airflow3Removal, Path::new("AIR301_airflow_plugin.py"))]
#[test_case(Rule::Airflow3Removal, Path::new("AIR301_context.py"))]
#[test_case(Rule::Airflow3MovedToProvider, Path::new("AIR302.py"))]
#[test_case(Rule::Airflow3SuggestedUpdate, Path::new("AIR311_args.py"))]
#[test_case(Rule::Airflow3SuggestedUpdate, Path::new("AIR311_names.py"))]
#[test_case(Rule::Airflow3SuggestedToMoveToProvider, Path::new("AIR312.py"))]
fn rules(rule_code: Rule, path: &Path) -> Result<()> {
let snapshot = format!("{}_{}", rule_code.noqa_code(), path.to_string_lossy());

View file

@ -2,10 +2,12 @@ pub(crate) use dag_schedule_argument::*;
pub(crate) use moved_to_provider_in_3::*;
pub(crate) use removal_in_3::*;
pub(crate) use suggested_to_move_to_provider_in_3::*;
pub(crate) use suggested_to_update_3_0::*;
pub(crate) use task_variable_name::*;
mod dag_schedule_argument;
mod moved_to_provider_in_3;
mod removal_in_3;
mod suggested_to_move_to_provider_in_3;
mod suggested_to_update_3_0;
mod task_variable_name;

View file

@ -1,6 +1,8 @@
use crate::checkers::ast::Checker;
use crate::importer::ImportRequest;
use crate::rules::airflow::helpers::{is_guarded_by_try_except, Replacement};
use crate::rules::airflow::helpers::{
is_airflow_builtin_or_provider, is_guarded_by_try_except, Replacement,
};
use ruff_diagnostics::{Diagnostic, Edit, Fix, FixAvailability, Violation};
use ruff_macros::{derive_message_formats, ViolationMetadata};
use ruff_python_ast::helpers::map_callable;
@ -207,14 +209,9 @@ fn check_call_arguments(checker: &Checker, qualified_name: &QualifiedName, argum
// without replacement
checker.report_diagnostics(diagnostic_for_argument(arguments, "default_view", None));
checker.report_diagnostics(diagnostic_for_argument(arguments, "orientation", None));
checker.report_diagnostics(diagnostic_for_argument(
arguments,
"sla_miss_callback",
None,
));
}
_ => {
if is_airflow_auth_manager(qualified_name.segments()) {
segments => {
if is_airflow_auth_manager(segments) {
if !arguments.is_empty() {
checker.report_diagnostic(Diagnostic::new(
Airflow3Removal {
@ -226,20 +223,19 @@ fn check_call_arguments(checker: &Checker, qualified_name: &QualifiedName, argum
arguments.range(),
));
}
} else if is_airflow_task_handler(qualified_name.segments()) {
} else if is_airflow_task_handler(segments) {
checker.report_diagnostics(diagnostic_for_argument(
arguments,
"filename_template",
None,
));
} else if is_airflow_operator(qualified_name.segments()) {
checker.report_diagnostics(diagnostic_for_argument(arguments, "sla", None));
} else if is_airflow_builtin_or_provider(segments, "operators", "Operator") {
checker.report_diagnostics(diagnostic_for_argument(
arguments,
"task_concurrency",
Some("max_active_tis_per_dag"),
));
match qualified_name.segments() {
match segments {
["airflow", .., "operators", "trigger_dagrun", "TriggerDagRunOperator"] => {
checker.report_diagnostics(diagnostic_for_argument(
arguments,
@ -615,22 +611,8 @@ fn check_name(checker: &Checker, expr: &Expr, range: TextRange) {
"resolve_dataset_manager" => Replacement::Name("airflow.assets.resolve_asset_manager"),
_ => return,
},
["airflow", "datasets", "metadata", "Metadata"] => {
Replacement::Name("airflow.sdk.Metadata")
}
// airflow.datasets
["airflow", "Dataset"] | ["airflow", "datasets", "Dataset"] => Replacement::AutoImport {
module: "airflow.sdk",
name: "Asset",
},
["airflow", "datasets", rest] => match *rest {
"DatasetAliasEvent" => Replacement::None,
"DatasetAlias" => Replacement::Name("airflow.sdk.AssetAlias"),
"DatasetAll" => Replacement::Name("airflow.sdk.AssetAll"),
"DatasetAny" => Replacement::Name("airflow.sdk.AssetAny"),
"expand_alias_to_datasets" => Replacement::Name("airflow.sdk.expand_alias_to_assets"),
_ => return,
},
["airflow", "datasets", "DatasetAliasEvent"] => Replacement::None,
// airflow.hooks
["airflow", "hooks", "base_hook", "BaseHook"] => {
@ -665,18 +647,6 @@ fn check_name(checker: &Checker, expr: &Expr, range: TextRange) {
_ => return,
},
// airflow.models.baseoperator
["airflow", "models", "baseoperator", rest] => match *rest {
"chain" | "chain_linear" | "cross_downstream" => Replacement::SourceModuleMoved {
module: "airflow.sdk",
name: (*rest).to_string(),
},
"BaseOperatorLink" => {
Replacement::Name("airflow.sdk.definitions.baseoperatorlink.BaseOperatorLink")
}
_ => return,
},
// airflow.notifications
["airflow", "notifications", "basenotifier", "BaseNotifier"] => {
Replacement::Name("airflow.sdk.BaseNotifier")
@ -703,15 +673,9 @@ fn check_name(checker: &Checker, expr: &Expr, range: TextRange) {
}
// airflow.timetables
["airflow", "timetables", rest @ ..] => match &rest {
["datasets", "DatasetOrTimeSchedule"] => {
Replacement::Name("airflow.timetables.assets.AssetOrTimeSchedule")
}
["simple", "DatasetTriggeredTimetable"] => {
Replacement::Name("airflow.timetables.simple.AssetTriggeredTimetable")
}
_ => return,
},
["airflow", "timetables", "simple", "DatasetTriggeredTimetable"] => {
Replacement::Name("airflow.timetables.simple.AssetTriggeredTimetable")
}
// airflow.triggers
["airflow", "triggers", "external_task", "TaskStateTrigger"] => Replacement::None,
@ -980,12 +944,6 @@ fn is_airflow_hook(segments: &[&str]) -> bool {
is_airflow_builtin_or_provider(segments, "hooks", "Hook")
}
/// Check whether the symbol is coming from the `operators` builtin or provider module which ends
/// with `Operator`.
fn is_airflow_operator(segments: &[&str]) -> bool {
is_airflow_builtin_or_provider(segments, "operators", "Operator")
}
/// Check whether the symbol is coming from the `log` builtin or provider module which ends
/// with `TaskHandler`.
fn is_airflow_task_handler(segments: &[&str]) -> bool {
@ -1018,44 +976,6 @@ fn is_airflow_auth_manager(segments: &[&str]) -> bool {
}
}
/// Check whether the segments corresponding to the fully qualified name points to a symbol that's
/// either a builtin or coming from one of the providers in Airflow.
///
/// The pattern it looks for are:
/// - `airflow.providers.**.<module>.**.*<symbol_suffix>` for providers
/// - `airflow.<module>.**.*<symbol_suffix>` for builtins
///
/// where `**` is one or more segments separated by a dot, and `*` is one or more characters.
///
/// Examples for the above patterns:
/// - `airflow.providers.google.cloud.secrets.secret_manager.CloudSecretManagerBackend` (provider)
/// - `airflow.secrets.base_secrets.BaseSecretsBackend` (builtin)
fn is_airflow_builtin_or_provider(segments: &[&str], module: &str, symbol_suffix: &str) -> bool {
match segments {
["airflow", "providers", rest @ ..] => {
if let (Some(pos), Some(last_element)) =
(rest.iter().position(|&s| s == module), rest.last())
{
// Check that the module is not the last element i.e., there's a symbol that's
// being used from the `module` that ends with `symbol_suffix`.
pos + 1 < rest.len() && last_element.ends_with(symbol_suffix)
} else {
false
}
}
["airflow", first, rest @ ..] => {
if let Some(last) = rest.last() {
*first == module && last.ends_with(symbol_suffix)
} else {
false
}
}
_ => false,
}
}
/// Returns `true` if the current statement hierarchy has a function that's decorated with
/// `@airflow.decorators.task`.
fn in_airflow_task_function(semantic: &SemanticModel) -> bool {

View file

@ -0,0 +1,261 @@
use crate::checkers::ast::Checker;
use crate::importer::ImportRequest;
use crate::rules::airflow::helpers::{
is_airflow_builtin_or_provider, is_guarded_by_try_except, Replacement,
};
use ruff_diagnostics::{Diagnostic, Edit, Fix, FixAvailability, Violation};
use ruff_macros::{derive_message_formats, ViolationMetadata};
use ruff_python_ast::{name::QualifiedName, Arguments, Expr, ExprAttribute, ExprCall, ExprName};
use ruff_python_semantic::Modules;
use ruff_text_size::Ranged;
use ruff_text_size::TextRange;
/// ## What it does
/// Checks for uses of deprecated Airflow functions and values that still have
/// a compatibility layer.
///
/// ## Why is this bad?
/// Airflow 3.0 deprecated various functions, members, and other values. Some
/// have more modern replacements; others are considered too niche and not
/// worth maintaining in Airflow. Even though these symbols still work in
/// Airflow 3.0, they are expected to be removed in a future version, so it is
/// recommended to update them to their new equivalents.
///
/// ## Example
/// ```python
/// from airflow import Dataset
///
///
/// Dataset(uri="test://test/")
/// ```
///
/// Use instead:
/// ```python
/// from airflow.sdk import Asset
///
///
/// Asset(uri="test://test/")
/// ```
#[derive(ViolationMetadata)]
pub(crate) struct Airflow3SuggestedUpdate {
deprecated: String,
replacement: Replacement,
}
impl Violation for Airflow3SuggestedUpdate {
const FIX_AVAILABILITY: FixAvailability = FixAvailability::Sometimes;
#[derive_message_formats]
fn message(&self) -> String {
let Airflow3SuggestedUpdate {
deprecated,
replacement,
} = self;
match replacement {
Replacement::None
| Replacement::Name(_)
| Replacement::AutoImport { module: _, name: _ }
| Replacement::SourceModuleMoved { module: _, name: _ } => {
format!(
"`{deprecated}` is removed in Airflow 3.0; \
It still works in Airflow 3.0 but is expected to be removed in a future version."
)
}
Replacement::Message(message) => {
format!(
"`{deprecated}` is removed in Airflow 3.0; \
It still works in Airflow 3.0 but is expected to be removed in a future version.; \
{message}"
)
}
}
}
fn fix_title(&self) -> Option<String> {
let Airflow3SuggestedUpdate { replacement, .. } = self;
match replacement {
Replacement::Name(name) => Some(format!("Use `{name}` instead")),
Replacement::AutoImport { module, name } => {
Some(format!("Use `{module}.{name}` instead"))
}
Replacement::SourceModuleMoved { module, name } => {
Some(format!("Use `{module}.{name}` instead"))
}
_ => None,
}
}
}
/// AIR311
pub(crate) fn airflow_3_0_suggested_update_expr(checker: &Checker, expr: &Expr) {
if !checker.semantic().seen_module(Modules::AIRFLOW) {
return;
}
match expr {
Expr::Call(ExprCall {
func, arguments, ..
}) => {
if let Some(qualified_name) = checker.semantic().resolve_qualified_name(func) {
check_call_arguments(checker, &qualified_name, arguments);
}
}
Expr::Attribute(ExprAttribute { attr, .. }) => {
check_name(checker, expr, attr.range());
}
Expr::Name(ExprName {
id: _,
ctx: _,
range,
}) => {
check_name(checker, expr, *range);
}
_ => {}
}
}
/// Check if the `deprecated` keyword argument is being used and create a diagnostic if so along
/// with a possible `replacement`.
fn diagnostic_for_argument(
arguments: &Arguments,
deprecated: &str,
replacement: Option<&'static str>,
) -> Option<Diagnostic> {
let keyword = arguments.find_keyword(deprecated)?;
let mut diagnostic = Diagnostic::new(
Airflow3SuggestedUpdate {
deprecated: deprecated.to_string(),
replacement: match replacement {
Some(name) => Replacement::Name(name),
None => Replacement::None,
},
},
keyword
.arg
.as_ref()
.map_or_else(|| keyword.range(), Ranged::range),
);
if let Some(replacement) = replacement {
diagnostic.set_fix(Fix::safe_edit(Edit::range_replacement(
replacement.to_string(),
diagnostic.range,
)));
}
Some(diagnostic)
}
/// Check whether a removed Airflow argument is passed.
///
/// For example:
///
/// ```python
/// from airflow import DAG
///
/// DAG(sla="@daily")
/// ```
fn check_call_arguments(checker: &Checker, qualified_name: &QualifiedName, arguments: &Arguments) {
match qualified_name.segments() {
["airflow", .., "DAG" | "dag"] => {
checker.report_diagnostics(diagnostic_for_argument(
arguments,
"sla_miss_callback",
None,
));
}
segments => {
if is_airflow_builtin_or_provider(segments, "operators", "Operator") {
checker.report_diagnostics(diagnostic_for_argument(arguments, "sla", None));
}
}
}
}
/// Check whether a removed Airflow name is used.
///
/// For example:
///
/// ```python
/// from airflow import Dataset
/// from airflow import datasets
///
/// # Accessing via attribute
/// datasets.Dataset()
///
/// # Or, directly
/// Dataset()
/// ```
fn check_name(checker: &Checker, expr: &Expr, range: TextRange) {
let semantic = checker.semantic();
let Some(qualified_name) = semantic.resolve_qualified_name(expr) else {
return;
};
let replacement = match qualified_name.segments() {
// airflow.datasets.metadata
["airflow", "datasets", "metadata", "Metadata"] => {
Replacement::Name("airflow.sdk.Metadata")
}
// airflow.datasets
["airflow", "Dataset"] | ["airflow", "datasets", "Dataset"] => Replacement::AutoImport {
module: "airflow.sdk",
name: "Asset",
},
["airflow", "datasets", rest] => match *rest {
"DatasetAliasEvent" => Replacement::None,
"DatasetAlias" => Replacement::Name("airflow.sdk.AssetAlias"),
"DatasetAll" => Replacement::Name("airflow.sdk.AssetAll"),
"DatasetAny" => Replacement::Name("airflow.sdk.AssetAny"),
"expand_alias_to_datasets" => Replacement::Name("airflow.sdk.expand_alias_to_assets"),
_ => return,
},
// airflow.models.baseoperator
["airflow", "models", "baseoperator", rest] => match *rest {
"chain" | "chain_linear" | "cross_downstream" => Replacement::SourceModuleMoved {
module: "airflow.sdk",
name: (*rest).to_string(),
},
"BaseOperatorLink" => {
Replacement::Name("airflow.sdk.definitions.baseoperatorlink.BaseOperatorLink")
}
_ => return,
},
// airflow.timetables
["airflow", "timetables", "datasets", "DatasetOrTimeSchedule"] => {
Replacement::Name("airflow.timetables.assets.AssetOrTimeSchedule")
}
// airflow.utils
["airflow", "utils", "dag_parsing_context", "get_parsing_context"] => {
Replacement::Name("airflow.sdk.get_parsing_context")
}
_ => return,
};
if is_guarded_by_try_except(expr, &replacement, semantic) {
return;
}
let mut diagnostic = Diagnostic::new(
Airflow3SuggestedUpdate {
deprecated: qualified_name.to_string(),
replacement: replacement.clone(),
},
range,
);
if let Replacement::AutoImport { module, name } = replacement {
diagnostic.try_set_fix(|| {
let (import_edit, binding) = checker.importer().get_or_import_symbol(
&ImportRequest::import_from(module, name),
expr.start(),
checker.semantic(),
)?;
let replacement_edit = Edit::range_replacement(binding, range);
Ok(Fix::safe_edits(import_edit, [replacement_edit]))
});
}
checker.report_diagnostic(diagnostic);
}
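
The keyword-argument check in this rule (flagging `sla_miss_callback` on `DAG()`/`@dag`) can be illustrated with a small Python sketch using the stdlib `ast` module; `find_deprecated_dag_kwargs` is a hypothetical helper for illustration, not part of the Ruff codebase:

```python
import ast

# Arguments flagged by AIR311 on DAG()/@dag (None = no suggested replacement).
DEPRECATED_DAG_KWARGS = {"sla_miss_callback": None}

def find_deprecated_dag_kwargs(source: str) -> list[str]:
    """Return deprecated keyword arguments passed to DAG() or dag() in `source`."""
    found = []
    for node in ast.walk(ast.parse(source)):
        if not isinstance(node, ast.Call):
            continue
        func = node.func
        # Match both bare names (`DAG(...)`) and attribute forms (`airflow.DAG(...)`).
        name = func.id if isinstance(func, ast.Name) else getattr(func, "attr", None)
        if name in ("DAG", "dag"):
            found.extend(kw.arg for kw in node.keywords if kw.arg in DEPRECATED_DAG_KWARGS)
    return found
```

The real rule works on resolved qualified names rather than bare identifiers, so it avoids false positives on unrelated `DAG` symbols; this sketch only shows the shape of the keyword scan.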

View file

@ -39,256 +39,229 @@ AIR301_args.py:22:31: AIR301 [*] `timetable` is removed in Airflow 3.0
22 |+DAG(dag_id="class_timetable", schedule=NullTimetable())
23 23 |
24 24 |
25 25 | def sla_callback(*arg, **kwargs):
25 25 | DAG(dag_id="class_fail_stop", fail_stop=True)
AIR301_args.py:29:34: AIR301 `sla_miss_callback` is removed in Airflow 3.0
AIR301_args.py:25:31: AIR301 [*] `fail_stop` is removed in Airflow 3.0
|
29 | DAG(dag_id="class_sla_callback", sla_miss_callback=sla_callback)
| ^^^^^^^^^^^^^^^^^ AIR301
30 |
31 | DAG(dag_id="class_fail_stop", fail_stop=True)
|
AIR301_args.py:31:31: AIR301 [*] `fail_stop` is removed in Airflow 3.0
|
29 | DAG(dag_id="class_sla_callback", sla_miss_callback=sla_callback)
30 |
31 | DAG(dag_id="class_fail_stop", fail_stop=True)
25 | DAG(dag_id="class_fail_stop", fail_stop=True)
| ^^^^^^^^^ AIR301
32 |
33 | DAG(dag_id="class_default_view", default_view="dag_default_view")
26 |
27 | DAG(dag_id="class_default_view", default_view="dag_default_view")
|
= help: Use `fail_fast` instead
Safe fix
22 22 | DAG(dag_id="class_timetable", timetable=NullTimetable())
23 23 |
24 24 |
25 |-DAG(dag_id="class_fail_stop", fail_stop=True)
25 |+DAG(dag_id="class_fail_stop", fail_fast=True)
26 26 |
27 27 | DAG(dag_id="class_default_view", default_view="dag_default_view")
28 28 |
29 29 | DAG(dag_id="class_sla_callback", sla_miss_callback=sla_callback)
30 30 |
31 |-DAG(dag_id="class_fail_stop", fail_stop=True)
31 |+DAG(dag_id="class_fail_stop", fail_fast=True)
32 32 |
33 33 | DAG(dag_id="class_default_view", default_view="dag_default_view")
34 34 |
AIR301_args.py:33:34: AIR301 `default_view` is removed in Airflow 3.0
AIR301_args.py:27:34: AIR301 `default_view` is removed in Airflow 3.0
|
31 | DAG(dag_id="class_fail_stop", fail_stop=True)
32 |
33 | DAG(dag_id="class_default_view", default_view="dag_default_view")
25 | DAG(dag_id="class_fail_stop", fail_stop=True)
26 |
27 | DAG(dag_id="class_default_view", default_view="dag_default_view")
| ^^^^^^^^^^^^ AIR301
34 |
35 | DAG(dag_id="class_orientation", orientation="BT")
28 |
29 | DAG(dag_id="class_orientation", orientation="BT")
|
AIR301_args.py:35:33: AIR301 `orientation` is removed in Airflow 3.0
AIR301_args.py:29:33: AIR301 `orientation` is removed in Airflow 3.0
|
33 | DAG(dag_id="class_default_view", default_view="dag_default_view")
34 |
35 | DAG(dag_id="class_orientation", orientation="BT")
27 | DAG(dag_id="class_default_view", default_view="dag_default_view")
28 |
29 | DAG(dag_id="class_orientation", orientation="BT")
| ^^^^^^^^^^^ AIR301
36 |
37 | allow_future_exec_dates_dag = DAG(dag_id="class_allow_future_exec_dates")
30 |
31 | allow_future_exec_dates_dag = DAG(dag_id="class_allow_future_exec_dates")
|
AIR301_args.py:46:6: AIR301 [*] `schedule_interval` is removed in Airflow 3.0
AIR301_args.py:40:6: AIR301 [*] `schedule_interval` is removed in Airflow 3.0
|
46 | @dag(schedule_interval="0 * * * *")
40 | @dag(schedule_interval="0 * * * *")
| ^^^^^^^^^^^^^^^^^ AIR301
47 | def decorator_schedule_interval():
48 | pass
41 | def decorator_schedule_interval():
42 | pass
|
= help: Use `schedule` instead
Safe fix
43 43 | pass
44 44 |
45 45 |
46 |-@dag(schedule_interval="0 * * * *")
46 |+@dag(schedule="0 * * * *")
47 47 | def decorator_schedule_interval():
48 48 | pass
49 49 |
37 37 | pass
38 38 |
39 39 |
40 |-@dag(schedule_interval="0 * * * *")
40 |+@dag(schedule="0 * * * *")
41 41 | def decorator_schedule_interval():
42 42 | pass
43 43 |
AIR301_args.py:51:6: AIR301 [*] `timetable` is removed in Airflow 3.0
AIR301_args.py:45:6: AIR301 [*] `timetable` is removed in Airflow 3.0
|
51 | @dag(timetable=NullTimetable())
45 | @dag(timetable=NullTimetable())
| ^^^^^^^^^ AIR301
52 | def decorator_timetable():
53 | pass
46 | def decorator_timetable():
47 | pass
|
= help: Use `schedule` instead
Safe fix
48 48 | pass
49 49 |
50 50 |
51 |-@dag(timetable=NullTimetable())
51 |+@dag(schedule=NullTimetable())
52 52 | def decorator_timetable():
53 53 | pass
54 54 |
42 42 | pass
43 43 |
44 44 |
45 |-@dag(timetable=NullTimetable())
45 |+@dag(schedule=NullTimetable())
46 46 | def decorator_timetable():
47 47 | pass
48 48 |
AIR301_args.py:56:6: AIR301 `sla_miss_callback` is removed in Airflow 3.0
AIR301_args.py:53:39: AIR301 [*] `execution_date` is removed in Airflow 3.0
|
56 | @dag(sla_miss_callback=sla_callback)
| ^^^^^^^^^^^^^^^^^ AIR301
57 | def decorator_sla_callback():
58 | pass
|
AIR301_args.py:64:39: AIR301 [*] `execution_date` is removed in Airflow 3.0
|
62 | def decorator_deprecated_operator_args():
63 | trigger_dagrun_op = trigger_dagrun.TriggerDagRunOperator(
64 | task_id="trigger_dagrun_op1", execution_date="2024-12-04"
51 | def decorator_deprecated_operator_args():
52 | trigger_dagrun_op = trigger_dagrun.TriggerDagRunOperator(
53 | task_id="trigger_dagrun_op1", execution_date="2024-12-04"
| ^^^^^^^^^^^^^^ AIR301
65 | )
66 | trigger_dagrun_op2 = TriggerDagRunOperator(
54 | )
55 | trigger_dagrun_op2 = TriggerDagRunOperator(
|
= help: Use `logical_date` instead
Safe fix
61 61 | @dag()
62 62 | def decorator_deprecated_operator_args():
63 63 | trigger_dagrun_op = trigger_dagrun.TriggerDagRunOperator(
64 |- task_id="trigger_dagrun_op1", execution_date="2024-12-04"
64 |+ task_id="trigger_dagrun_op1", logical_date="2024-12-04"
65 65 | )
66 66 | trigger_dagrun_op2 = TriggerDagRunOperator(
67 67 | task_id="trigger_dagrun_op2", execution_date="2024-12-04"
50 50 | @dag()
51 51 | def decorator_deprecated_operator_args():
52 52 | trigger_dagrun_op = trigger_dagrun.TriggerDagRunOperator(
53 |- task_id="trigger_dagrun_op1", execution_date="2024-12-04"
53 |+ task_id="trigger_dagrun_op1", logical_date="2024-12-04"
54 54 | )
55 55 | trigger_dagrun_op2 = TriggerDagRunOperator(
56 56 | task_id="trigger_dagrun_op2", execution_date="2024-12-04"
AIR301_args.py:67:39: AIR301 [*] `execution_date` is removed in Airflow 3.0
AIR301_args.py:56:39: AIR301 [*] `execution_date` is removed in Airflow 3.0
|
65 | )
66 | trigger_dagrun_op2 = TriggerDagRunOperator(
67 | task_id="trigger_dagrun_op2", execution_date="2024-12-04"
54 | )
55 | trigger_dagrun_op2 = TriggerDagRunOperator(
56 | task_id="trigger_dagrun_op2", execution_date="2024-12-04"
| ^^^^^^^^^^^^^^ AIR301
68 | )
57 | )
|
= help: Use `logical_date` instead
Safe fix
53 53 | task_id="trigger_dagrun_op1", execution_date="2024-12-04"
54 54 | )
55 55 | trigger_dagrun_op2 = TriggerDagRunOperator(
56 |- task_id="trigger_dagrun_op2", execution_date="2024-12-04"
56 |+ task_id="trigger_dagrun_op2", logical_date="2024-12-04"
57 57 | )
58 58 |
59 59 | branch_dt_op = datetime.BranchDateTimeOperator(
AIR301_args.py:60:33: AIR301 [*] `use_task_execution_day` is removed in Airflow 3.0
|
59 | branch_dt_op = datetime.BranchDateTimeOperator(
60 | task_id="branch_dt_op", use_task_execution_day=True, task_concurrency=5
| ^^^^^^^^^^^^^^^^^^^^^^ AIR301
61 | )
62 | branch_dt_op2 = BranchDateTimeOperator(
|
= help: Use `use_task_logical_date` instead
Safe fix
57 57 | )
58 58 |
59 59 | branch_dt_op = datetime.BranchDateTimeOperator(
60 |- task_id="branch_dt_op", use_task_execution_day=True, task_concurrency=5
60 |+ task_id="branch_dt_op", use_task_logical_date=True, task_concurrency=5
61 61 | )
62 62 | branch_dt_op2 = BranchDateTimeOperator(
63 63 | task_id="branch_dt_op2",
AIR301_args.py:60:62: AIR301 [*] `task_concurrency` is removed in Airflow 3.0
|
59 | branch_dt_op = datetime.BranchDateTimeOperator(
60 | task_id="branch_dt_op", use_task_execution_day=True, task_concurrency=5
| ^^^^^^^^^^^^^^^^ AIR301
61 | )
62 | branch_dt_op2 = BranchDateTimeOperator(
|
= help: Use `max_active_tis_per_dag` instead
Safe fix
57 57 | )
58 58 |
59 59 | branch_dt_op = datetime.BranchDateTimeOperator(
60 |- task_id="branch_dt_op", use_task_execution_day=True, task_concurrency=5
60 |+ task_id="branch_dt_op", use_task_execution_day=True, max_active_tis_per_dag=5
61 61 | )
62 62 | branch_dt_op2 = BranchDateTimeOperator(
63 63 | task_id="branch_dt_op2",
AIR301_args.py:64:9: AIR301 [*] `use_task_execution_day` is removed in Airflow 3.0
|
62 | branch_dt_op2 = BranchDateTimeOperator(
63 | task_id="branch_dt_op2",
64 | use_task_execution_day=True,
| ^^^^^^^^^^^^^^^^^^^^^^ AIR301
65 | sla=timedelta(seconds=10),
66 | )
|
= help: Use `use_task_logical_date` instead
Safe fix
61 61 | )
62 62 | branch_dt_op2 = BranchDateTimeOperator(
63 63 | task_id="branch_dt_op2",
64 |- use_task_execution_day=True,
64 |+ use_task_logical_date=True,
65 65 | sla=timedelta(seconds=10),
66 66 | )
67 67 |
AIR301_args.py:87:15: AIR301 `filename_template` is removed in Airflow 3.0
|
86 | # deprecated filename_template argument in FileTaskHandler
87 | S3TaskHandler(filename_template="/tmp/test")
| ^^^^^^^^^^^^^^^^^ AIR301
88 | HdfsTaskHandler(filename_template="/tmp/test")
89 | ElasticsearchTaskHandler(filename_template="/tmp/test")
|
AIR301_args.py:88:17: AIR301 `filename_template` is removed in Airflow 3.0
|
86 | # deprecated filename_template argument in FileTaskHandler
87 | S3TaskHandler(filename_template="/tmp/test")
88 | HdfsTaskHandler(filename_template="/tmp/test")
| ^^^^^^^^^^^^^^^^^ AIR301
89 | ElasticsearchTaskHandler(filename_template="/tmp/test")
90 | GCSTaskHandler(filename_template="/tmp/test")
|
AIR301_args.py:89:26: AIR301 `filename_template` is removed in Airflow 3.0
|
87 | S3TaskHandler(filename_template="/tmp/test")
88 | HdfsTaskHandler(filename_template="/tmp/test")
89 | ElasticsearchTaskHandler(filename_template="/tmp/test")
| ^^^^^^^^^^^^^^^^^ AIR301
90 | GCSTaskHandler(filename_template="/tmp/test")
|
AIR301_args.py:90:16: AIR301 `filename_template` is removed in Airflow 3.0
|
88 | HdfsTaskHandler(filename_template="/tmp/test")
89 | ElasticsearchTaskHandler(filename_template="/tmp/test")
90 | GCSTaskHandler(filename_template="/tmp/test")
| ^^^^^^^^^^^^^^^^^ AIR301
91 |
92 | FabAuthManager(None)
|
AIR301_args.py:92:15: AIR301 `appbuilder` is removed in Airflow 3.0; The constructor takes no parameter now
|
90 | GCSTaskHandler(filename_template="/tmp/test")
91 |
92 | FabAuthManager(None)
| ^^^^^^ AIR301
|
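All of the safe fixes shown above are plain keyword renames. As a minimal, purely illustrative sketch (the mapping is taken verbatim from the diagnostics; the helper is not part of ruff or Airflow), a call site could be migrated like so:

```python
# Deprecated operator arguments flagged by AIR301 and their Airflow 3.0
# replacements, as reported in the diagnostics above.
AIR301_ARG_RENAMES = {
    "execution_date": "logical_date",
    "use_task_execution_day": "use_task_logical_date",
    "task_concurrency": "max_active_tis_per_dag",
}


def migrate_kwargs(kwargs: dict) -> dict:
    """Rename deprecated keyword arguments, mirroring the rule's safe fix."""
    return {AIR301_ARG_RENAMES.get(name, name): value for name, value in kwargs.items()}


# The fix rewrites `execution_date=` to `logical_date=`, leaving other
# arguments untouched.
migrated = migrate_kwargs({"task_id": "trigger_dagrun_op1", "execution_date": "2024-12-04"})
```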


@@ -1,29 +1,6 @@
---
source: crates/ruff_linter/src/rules/airflow/mod.rs
---
AIR301_class_attribute.py:24:21: AIR301 [*] `airflow.Dataset` is removed in Airflow 3.0
|
23 | # airflow.Dataset
24 | dataset_from_root = DatasetFromRoot()
| ^^^^^^^^^^^^^^^ AIR301
25 | dataset_from_root.iter_datasets()
26 | dataset_from_root.iter_dataset_aliases()
|
= help: Use `airflow.sdk.Asset` instead
Safe fix
19 19 | from airflow.providers_manager import ProvidersManager
20 20 | from airflow.secrets.base_secrets import BaseSecretsBackend
21 21 | from airflow.secrets.local_filesystem import LocalFilesystemBackend
22 |+from airflow.sdk import Asset
22 23 |
23 24 | # airflow.Dataset
24 |-dataset_from_root = DatasetFromRoot()
25 |+dataset_from_root = Asset()
25 26 | dataset_from_root.iter_datasets()
26 27 | dataset_from_root.iter_dataset_aliases()
27 28 |
AIR301_class_attribute.py:25:19: AIR301 [*] `iter_datasets` is removed in Airflow 3.0
|
23 | # airflow.Dataset
@@ -65,34 +42,6 @@ AIR301_class_attribute.py:26:19: AIR301 [*] `iter_dataset_aliases` is removed in
28 28 | # airflow.datasets
29 29 | dataset_to_test_method_call = Dataset()
AIR301_class_attribute.py:29:31: AIR301 [*] `airflow.datasets.Dataset` is removed in Airflow 3.0
|
28 | # airflow.datasets
29 | dataset_to_test_method_call = Dataset()
| ^^^^^^^ AIR301
30 | dataset_to_test_method_call.iter_datasets()
31 | dataset_to_test_method_call.iter_dataset_aliases()
|
= help: Use `airflow.sdk.Asset` instead
Safe fix
19 19 | from airflow.providers_manager import ProvidersManager
20 20 | from airflow.secrets.base_secrets import BaseSecretsBackend
21 21 | from airflow.secrets.local_filesystem import LocalFilesystemBackend
22 |+from airflow.sdk import Asset
22 23 |
23 24 | # airflow.Dataset
24 25 | dataset_from_root = DatasetFromRoot()
--------------------------------------------------------------------------------
26 27 | dataset_from_root.iter_dataset_aliases()
27 28 |
28 29 | # airflow.datasets
29 |-dataset_to_test_method_call = Dataset()
30 |+dataset_to_test_method_call = Asset()
30 31 | dataset_to_test_method_call.iter_datasets()
31 32 | dataset_to_test_method_call.iter_dataset_aliases()
32 33 |
AIR301_class_attribute.py:30:29: AIR301 [*] `iter_datasets` is removed in Airflow 3.0
|
28 | # airflow.datasets
@@ -134,17 +83,6 @@ AIR301_class_attribute.py:31:29: AIR301 [*] `iter_dataset_aliases` is removed in
33 33 | alias_to_test_method_call = DatasetAlias()
34 34 | alias_to_test_method_call.iter_datasets()
AIR301_class_attribute.py:33:29: AIR301 `airflow.datasets.DatasetAlias` is removed in Airflow 3.0
|
31 | dataset_to_test_method_call.iter_dataset_aliases()
32 |
33 | alias_to_test_method_call = DatasetAlias()
| ^^^^^^^^^^^^ AIR301
34 | alias_to_test_method_call.iter_datasets()
35 | alias_to_test_method_call.iter_dataset_aliases()
|
= help: Use `airflow.sdk.AssetAlias` instead
AIR301_class_attribute.py:34:27: AIR301 [*] `iter_datasets` is removed in Airflow 3.0
|
33 | alias_to_test_method_call = DatasetAlias()
@@ -185,17 +123,6 @@ AIR301_class_attribute.py:35:27: AIR301 [*] `iter_dataset_aliases` is removed in
37 37 | any_to_test_method_call = DatasetAny()
38 38 | any_to_test_method_call.iter_datasets()
AIR301_class_attribute.py:37:27: AIR301 `airflow.datasets.DatasetAny` is removed in Airflow 3.0
|
35 | alias_to_test_method_call.iter_dataset_aliases()
36 |
37 | any_to_test_method_call = DatasetAny()
| ^^^^^^^^^^ AIR301
38 | any_to_test_method_call.iter_datasets()
39 | any_to_test_method_call.iter_dataset_aliases()
|
= help: Use `airflow.sdk.AssetAny` instead
AIR301_class_attribute.py:38:25: AIR301 [*] `iter_datasets` is removed in Airflow 3.0
|
37 | any_to_test_method_call = DatasetAny()


@@ -0,0 +1,25 @@
---
source: crates/ruff_linter/src/rules/airflow/mod.rs
---
AIR311_args.py:13:34: AIR311 `sla_miss_callback` is removed in Airflow 3.0; It still works in Airflow 3.0 but is expected to be removed in a future version.
|
13 | DAG(dag_id="class_sla_callback", sla_miss_callback=sla_callback)
| ^^^^^^^^^^^^^^^^^ AIR311
|
AIR311_args.py:16:6: AIR311 `sla_miss_callback` is removed in Airflow 3.0; It still works in Airflow 3.0 but is expected to be removed in a future version.
|
16 | @dag(sla_miss_callback=sla_callback)
| ^^^^^^^^^^^^^^^^^ AIR311
17 | def decorator_sla_callback():
18 | pass
|
AIR311_args.py:25:9: AIR311 `sla` is removed in Airflow 3.0; It still works in Airflow 3.0 but is expected to be removed in a future version.
|
23 | branch_dt_op2 = BranchDateTimeOperator(
24 | task_id="branch_dt_op2",
25 | sla=timedelta(seconds=10),
| ^^^ AIR311
26 | )
|
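Unlike the renamed operator arguments, the parameters AIR311 flags in this file (`sla`, `sla_miss_callback`) have no direct replacement. A small sketch, independent of Airflow itself and not part of the rule's implementation, of how such arguments could be spotted at a call site:

```python
# Parameters reported by the AIR311 diagnostics above that are deprecated
# without a direct rename; the helper below is illustrative only.
DEPRECATED_WITHOUT_REPLACEMENT = {"sla", "sla_miss_callback"}


def deprecated_args_used(kwargs: dict) -> set:
    """Return which deprecated, replacement-less arguments a call passes."""
    return DEPRECATED_WITHOUT_REPLACEMENT & kwargs.keys()


# Mirrors the first diagnostic: DAG(dag_id=..., sla_miss_callback=...) is flagged.
flagged = deprecated_args_used({"dag_id": "class_sla_callback", "sla_miss_callback": print})
```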


@@ -0,0 +1,153 @@
---
source: crates/ruff_linter/src/rules/airflow/mod.rs
---
AIR311_names.py:17:1: AIR311 [*] `airflow.Dataset` is removed in Airflow 3.0; It still works in Airflow 3.0 but is expected to be removed in a future version.
|
15 | from airflow.utils.dag_parsing_context import get_parsing_context
16 |
17 | DatasetFromRoot()
| ^^^^^^^^^^^^^^^ AIR311
18 |
19 | # airflow.datasets
|
= help: Use `airflow.sdk.Asset` instead
Safe fix
13 13 | from airflow.models.baseoperatorlink import BaseOperatorLink
14 14 | from airflow.timetables.datasets import DatasetOrTimeSchedule
15 15 | from airflow.utils.dag_parsing_context import get_parsing_context
16 |+from airflow.sdk import Asset
16 17 |
17 |-DatasetFromRoot()
18 |+Asset()
18 19 |
19 20 | # airflow.datasets
20 21 | Dataset()
AIR311_names.py:20:1: AIR311 [*] `airflow.datasets.Dataset` is removed in Airflow 3.0; It still works in Airflow 3.0 but is expected to be removed in a future version.
|
19 | # airflow.datasets
20 | Dataset()
| ^^^^^^^ AIR311
21 | DatasetAlias()
22 | DatasetAll()
|
= help: Use `airflow.sdk.Asset` instead
Safe fix
13 13 | from airflow.models.baseoperatorlink import BaseOperatorLink
14 14 | from airflow.timetables.datasets import DatasetOrTimeSchedule
15 15 | from airflow.utils.dag_parsing_context import get_parsing_context
16 |+from airflow.sdk import Asset
16 17 |
17 18 | DatasetFromRoot()
18 19 |
19 20 | # airflow.datasets
20 |-Dataset()
21 |+Asset()
21 22 | DatasetAlias()
22 23 | DatasetAll()
23 24 | DatasetAny()
AIR311_names.py:21:1: AIR311 `airflow.datasets.DatasetAlias` is removed in Airflow 3.0; It still works in Airflow 3.0 but is expected to be removed in a future version.
|
19 | # airflow.datasets
20 | Dataset()
21 | DatasetAlias()
| ^^^^^^^^^^^^ AIR311
22 | DatasetAll()
23 | DatasetAny()
|
= help: Use `airflow.sdk.AssetAlias` instead
AIR311_names.py:22:1: AIR311 `airflow.datasets.DatasetAll` is removed in Airflow 3.0; It still works in Airflow 3.0 but is expected to be removed in a future version.
|
20 | Dataset()
21 | DatasetAlias()
22 | DatasetAll()
| ^^^^^^^^^^ AIR311
23 | DatasetAny()
24 | Metadata()
|
= help: Use `airflow.sdk.AssetAll` instead
AIR311_names.py:23:1: AIR311 `airflow.datasets.DatasetAny` is removed in Airflow 3.0; It still works in Airflow 3.0 but is expected to be removed in a future version.
|
21 | DatasetAlias()
22 | DatasetAll()
23 | DatasetAny()
| ^^^^^^^^^^ AIR311
24 | Metadata()
25 | expand_alias_to_datasets()
|
= help: Use `airflow.sdk.AssetAny` instead
AIR311_names.py:24:1: AIR311 `airflow.datasets.metadata.Metadata` is removed in Airflow 3.0; It still works in Airflow 3.0 but is expected to be removed in a future version.
|
22 | DatasetAll()
23 | DatasetAny()
24 | Metadata()
| ^^^^^^^^ AIR311
25 | expand_alias_to_datasets()
|
= help: Use `airflow.sdk.Metadata` instead
AIR311_names.py:25:1: AIR311 `airflow.datasets.expand_alias_to_datasets` is removed in Airflow 3.0; It still works in Airflow 3.0 but is expected to be removed in a future version.
|
23 | DatasetAny()
24 | Metadata()
25 | expand_alias_to_datasets()
| ^^^^^^^^^^^^^^^^^^^^^^^^ AIR311
26 |
27 | # airflow.models.baseoperator
|
= help: Use `airflow.sdk.expand_alias_to_assets` instead
AIR311_names.py:28:1: AIR311 `airflow.models.baseoperator.chain` is removed in Airflow 3.0; It still works in Airflow 3.0 but is expected to be removed in a future version.
|
27 | # airflow.models.baseoperator
28 | chain()
| ^^^^^ AIR311
29 | chain_linear()
30 | cross_downstream()
|
= help: Use `airflow.sdk.chain` instead
AIR311_names.py:29:1: AIR311 `airflow.models.baseoperator.chain_linear` is removed in Airflow 3.0; It still works in Airflow 3.0 but is expected to be removed in a future version.
|
27 | # airflow.models.baseoperator
28 | chain()
29 | chain_linear()
| ^^^^^^^^^^^^ AIR311
30 | cross_downstream()
|
= help: Use `airflow.sdk.chain_linear` instead
AIR311_names.py:30:1: AIR311 `airflow.models.baseoperator.cross_downstream` is removed in Airflow 3.0; It still works in Airflow 3.0 but is expected to be removed in a future version.
|
28 | chain()
29 | chain_linear()
30 | cross_downstream()
| ^^^^^^^^^^^^^^^^ AIR311
31 |
32 | # airflow.models.baseoperatolinker
|
= help: Use `airflow.sdk.cross_downstream` instead
AIR311_names.py:36:1: AIR311 `airflow.timetables.datasets.DatasetOrTimeSchedule` is removed in Airflow 3.0; It still works in Airflow 3.0 but is expected to be removed in a future version.
|
35 | # airflow.timetables.datasets
36 | DatasetOrTimeSchedule()
| ^^^^^^^^^^^^^^^^^^^^^ AIR311
37 |
38 | # airflow.utils.dag_parsing_context
|
= help: Use `airflow.timetables.assets.AssetOrTimeSchedule` instead
AIR311_names.py:39:1: AIR311 `airflow.utils.dag_parsing_context.get_parsing_context` is removed in Airflow 3.0; It still works in Airflow 3.0 but is expected to be removed in a future version.
|
38 | # airflow.utils.dag_parsing_context
39 | get_parsing_context()
| ^^^^^^^^^^^^^^^^^^^ AIR311
|
= help: Use `airflow.sdk.get_parsing_context` instead
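For reference, the `Dataset`-to-`Asset` moves suggested by the AIR311 help messages above can be collected into one table. A hedged, illustrative sketch (every entry is taken from the help messages; the lookup helper itself is an assumption, not ruff's implementation):

```python
# Old import path -> suggested replacement, per the AIR311 help messages above.
AIR311_SUGGESTED_MOVES = {
    "airflow.Dataset": "airflow.sdk.Asset",
    "airflow.datasets.Dataset": "airflow.sdk.Asset",
    "airflow.datasets.DatasetAlias": "airflow.sdk.AssetAlias",
    "airflow.datasets.DatasetAll": "airflow.sdk.AssetAll",
    "airflow.datasets.DatasetAny": "airflow.sdk.AssetAny",
    "airflow.datasets.metadata.Metadata": "airflow.sdk.Metadata",
    "airflow.datasets.expand_alias_to_datasets": "airflow.sdk.expand_alias_to_assets",
    "airflow.models.baseoperator.chain": "airflow.sdk.chain",
    "airflow.models.baseoperator.chain_linear": "airflow.sdk.chain_linear",
    "airflow.models.baseoperator.cross_downstream": "airflow.sdk.cross_downstream",
    "airflow.timetables.datasets.DatasetOrTimeSchedule": "airflow.timetables.assets.AssetOrTimeSchedule",
    "airflow.utils.dag_parsing_context.get_parsing_context": "airflow.sdk.get_parsing_context",
}


def suggested_replacement(qualified_name: str) -> str:
    """Look up the suggested new path, falling back to the original name."""
    return AIR311_SUGGESTED_MOVES.get(qualified_name, qualified_name)
```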