Maintain synchronicity between the lexer and the parser (#11457)

## Summary

This PR updates the entire parser stack in multiple ways:

### Make the lexer lazy

* https://github.com/astral-sh/ruff/pull/11244
* https://github.com/astral-sh/ruff/pull/11473

Previously, Ruff's lexer would act as an iterator. The parser would
collect all the tokens in a vector first and then process the tokens to
create the syntax tree.

The first task in this project is to update the entire parsing flow to
make the lexer lazy. This includes the `Lexer`, `TokenSource`, and
`Parser`. For context, the `TokenSource` is a wrapper around the `Lexer`
that filters out the trivia tokens[^1]. Now, the parser asks the token
source for the next token, and only then does the lexer continue and
emit it. This means that the lexer needs to be aware of the "current"
token: when `next_token` is called, the current token is updated with
the newly lexed token.
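The lazy flow can be sketched as follows. This is a minimal standalone model, not ruff's actual types: the real lexer works over source text rather than a pre-built token list, and the token set here is invented for illustration.

```rust
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum TokenKind {
    Name,
    Comment, // trivia
    Newline,
    EndOfFile,
}

/// A minimal model of the lazy lexer: tokens are produced on demand and
/// the lexer tracks the "current" token.
struct Lexer {
    tokens: Vec<TokenKind>,
    pos: usize,
    current: TokenKind,
}

impl Lexer {
    fn new(tokens: Vec<TokenKind>) -> Self {
        let mut lexer = Lexer { tokens, pos: 0, current: TokenKind::EndOfFile };
        lexer.next_token(); // prime the current token
        lexer
    }

    /// Lex one more token and make it the current one.
    fn next_token(&mut self) -> TokenKind {
        self.current = self.tokens.get(self.pos).copied().unwrap_or(TokenKind::EndOfFile);
        self.pos += 1;
        self.current
    }
}

/// Wrapper around the lexer that filters out trivia tokens.
struct TokenSource {
    lexer: Lexer,
}

impl TokenSource {
    fn new(lexer: Lexer) -> Self {
        let mut source = TokenSource { lexer };
        source.skip_trivia();
        source
    }

    fn current(&self) -> TokenKind {
        self.lexer.current
    }

    /// Ask the lexer for the next token, transparently skipping trivia.
    fn bump(&mut self) {
        self.lexer.next_token();
        self.skip_trivia();
    }

    fn skip_trivia(&mut self) {
        while self.lexer.current == TokenKind::Comment {
            self.lexer.next_token();
        }
    }
}
```

The key property is that no token is lexed until the parser asks for it, which is what later allows the parser to influence how upcoming tokens are lexed.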

The main motivation for making the lexer lazy is to allow re-lexing a
token in a different context. This is going to be really useful for
making the parser error resilient. For example, currently the emitted
tokens remain the same even if the parser can recover from an unclosed
parenthesis. This matters because the lexer emits a
`NonLogicalNewline` in a parenthesized context but a normal `Newline` in
a non-parenthesized context. These different kinds of newlines are also
used to emit the indentation tokens, which are important for the parser
as they're used to determine the start and end of a block.

Additionally, this allows us to implement the following functionalities:
1. Checkpoint-rewind infrastructure: the idea here is to create a
checkpoint and continue lexing. At a later point, this checkpoint can be
used to rewind the lexer back to that state.
2. Remove the `SoftKeywordTransformer` and instead use lookahead or
speculative parsing to determine whether a soft keyword is a keyword or
an identifier.
3. Remove the `Tok` enum. The `Tok` enum represents the tokens emitted
by the lexer, but it contains owned data, which makes it expensive to
clone. The new `TokenKind` enum represents just the type of the token,
which is very cheap to copy.
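The checkpoint-rewind idea from point 1 can be sketched like this. The names and the word-based "lexing" are hypothetical; the real implementation also has to snapshot state such as the indentation stack and f-string context.

```rust
/// Minimal model of a lexer whose position can be captured and restored.
struct Lexer<'src> {
    source: &'src str,
    offset: usize,
}

/// An opaque snapshot of the lexer state.
#[derive(Clone, Copy)]
struct LexerCheckpoint {
    offset: usize,
}

impl<'src> Lexer<'src> {
    fn new(source: &'src str) -> Self {
        Lexer { source, offset: 0 }
    }

    /// Consume and return the next whitespace-separated word, if any.
    fn next_word(&mut self) -> Option<&'src str> {
        let rest = &self.source[self.offset..];
        let start = rest.find(|c: char| !c.is_whitespace())?;
        let rest = &rest[start..];
        let len = rest.find(char::is_whitespace).unwrap_or(rest.len());
        self.offset += start + len;
        Some(&rest[..len])
    }

    /// Capture the current lexer state.
    fn checkpoint(&self) -> LexerCheckpoint {
        LexerCheckpoint { offset: self.offset }
    }

    /// Rewind the lexer back to a previously taken checkpoint.
    fn rewind(&mut self, checkpoint: LexerCheckpoint) {
        self.offset = checkpoint.offset;
    }
}
```

Speculative parsing then becomes: take a checkpoint, try a parse, and rewind on failure.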

This raises the question of how the parser will get the owned value
that was stored on `Tok`. This is solved by introducing a new
`TokenValue` enum which contains only the subset of token kinds that
carry an owned value. The value is stored on the lexer and is requested
by the parser when it wants to process the data. For example:
8196720f80/crates/ruff_python_parser/src/parser/expression.rs (L1260-L1262)

[^1]: Trivia tokens are `NonLogicalNewline` and `Comment`
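The `TokenValue` mechanism can be modeled roughly as follows (a simplified sketch with invented variants; the point is that the cheap `TokenKind` and the owned value live separately, and the value is moved out of the lexer only when the parser needs it):

```rust
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum TokenKind {
    Name,
    Int,
    Plus,
}

/// Owned data for the subset of tokens that carry a value.
#[derive(Debug, Default, PartialEq)]
enum TokenValue {
    #[default]
    None,
    Name(String),
    Int(i64),
}

struct Lexer {
    current_kind: TokenKind,
    current_value: TokenValue,
}

impl Lexer {
    /// The parser calls this only when it actually needs the owned data,
    /// moving it out of the lexer and leaving `TokenValue::None` behind.
    fn take_value(&mut self) -> TokenValue {
        std::mem::take(&mut self.current_value)
    }
}
```

Cloning or comparing `TokenKind` stays a trivial copy, while the allocation-carrying `TokenValue` is touched at most once per token.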

### Remove `SoftKeywordTransformer`

* https://github.com/astral-sh/ruff/pull/11441
* https://github.com/astral-sh/ruff/pull/11459
* https://github.com/astral-sh/ruff/pull/11442
* https://github.com/astral-sh/ruff/pull/11443
* https://github.com/astral-sh/ruff/pull/11474

For context,
https://github.com/RustPython/RustPython/pull/4519/files#diff-5de40045e78e794aa5ab0b8aacf531aa477daf826d31ca129467703855408220
added support for soft keywords in the parser, using infinite
lookahead to classify a soft keyword as a keyword or an identifier. This
is a brilliant idea, as it basically wraps the existing `Lexer` and works
on top of it, which means that the logic for lexing and re-lexing a soft
keyword remains separate. The change here is to remove
`SoftKeywordTransformer` and let the parser determine this based on
context, lookahead, and speculative parsing.

* **Context:** The transformer needs to know whether the lexer is at a
statement position or a simple-statement position. This is because a
`match` token starts a compound statement while a `type` token starts a
simple statement. **The parser already knows this.**
* **Lookahead:** Now that the parser knows the context, it can perform a
lookahead of up to two tokens to classify the soft keyword. The logic
for this is described in the PRs implementing it for the `type` and
`match` soft keywords.
* **Speculative parsing:** This is where the checkpoint-rewind
infrastructure helps. For the `match` soft keyword, there are certain
cases which we can't classify based on lookahead alone. The idea here is
to create a checkpoint and keep parsing. Based on whether the parsing
was successful and which tokens lie ahead, we can classify the remaining
cases. Refer to #11443 for more details.

If the soft keyword is being parsed in an identifier context, it'll be
converted to an identifier and the emitted token will be updated as
well. Refer to
8196720f80/crates/ruff_python_parser/src/parser/expression.rs (L487-L491).

The `case` soft keyword doesn't require any special handling because
it'll be a keyword only in the context of a match statement.
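The lookahead classification can be illustrated with a simplified rule for `type` (an illustration only; the real rule lives in the linked PRs): at a simple-statement position, `type` starts a type alias statement only if it is followed by a name and then `=` or `[`.

```rust
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum TokenKind {
    Name,
    Equal,
    Lsqb, // `[`
    Newline,
}

/// Classify the soft keyword `type` at a simple-statement position using
/// two tokens of lookahead: `type X = ...` and `type X[T] = ...` start a
/// type alias statement; anything else means `type` is an identifier.
fn type_is_keyword(lookahead: &[TokenKind]) -> bool {
    matches!(
        lookahead,
        [TokenKind::Name, TokenKind::Equal, ..] | [TokenKind::Name, TokenKind::Lsqb, ..]
    )
}
```

For `match`, a bounded lookahead like this is not always enough, which is where speculative parsing takes over.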

### Update the parser API

* https://github.com/astral-sh/ruff/pull/11494
* https://github.com/astral-sh/ruff/pull/11505

Now that the lexer is in sync with the parser, and the parser helps
determine whether a soft keyword is a keyword or an identifier, the
lexer cannot be used on its own. The reason is that the lexer by itself
is not sensitive to the context (which is by design). This means that
the parser API needs to be updated to not allow any access to the lexer.

Previously, there were multiple ways to parse the source code:
1. Passing the source code itself
2. Or, passing the tokens

Now that the lexer and parser work together, the API
corresponding to (2) cannot exist. The final API is described in this
PR's description: https://github.com/astral-sh/ruff/pull/11494.
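The shape of the resulting API can be sketched as follows. These are hypothetical, heavily simplified signatures (refer to #11494 for the actual API); the point is that parsing is the only entry point, and the tokens come back as part of the parsed output instead of being supplied by the caller.

```rust
/// Stand-ins for the real AST and token types.
#[derive(Debug, PartialEq)]
struct ModModule {
    statement_count: usize,
}

#[derive(Debug, PartialEq)]
struct Tokens(Vec<String>);

/// The parsed output owns both the syntax tree and the tokens, so the
/// caller never drives the lexer directly.
struct Parsed {
    syntax: ModModule,
    tokens: Tokens,
}

impl Parsed {
    fn tokens(&self) -> &Tokens {
        &self.tokens
    }
}

/// The single entry point: source code in, parsed output (AST + tokens) out.
fn parse_module(source: &str) -> Parsed {
    // Toy "parser": one statement and one token per line.
    let tokens = Tokens(source.lines().map(str::to_string).collect());
    Parsed {
        syntax: ModModule { statement_count: tokens.0.len() },
        tokens,
    }
}
```

Callers that previously lexed first and handed tokens to the parser now just parse and read the tokens off the result.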

### Refactor the downstream tools (linter and formatter)

* https://github.com/astral-sh/ruff/pull/11511
* https://github.com/astral-sh/ruff/pull/11515
* https://github.com/astral-sh/ruff/pull/11529
* https://github.com/astral-sh/ruff/pull/11562
* https://github.com/astral-sh/ruff/pull/11592

And, the final set of changes involves updating all references to the
lexer and the `Tok` enum. This was done in two parts:
1. Update all the references in a way that doesn't require any changes
from this PR, i.e., it can be done independently
	* https://github.com/astral-sh/ruff/pull/11402
	* https://github.com/astral-sh/ruff/pull/11406
	* https://github.com/astral-sh/ruff/pull/11418
	* https://github.com/astral-sh/ruff/pull/11419
	* https://github.com/astral-sh/ruff/pull/11420
	* https://github.com/astral-sh/ruff/pull/11424
2. Update all the remaining references to use the changes made in this
PR

For (2), various strategies were used:
1. Introduce a new `Tokens` struct which wraps the token vector and adds
methods to query certain subsets of tokens. These include:
	1. `up_to_first_unknown`, which replaces the `tokenize` function
	2. `in_range` and `after`, which replace the `lex_starts_at` function;
the former returns the tokens within the given range while the latter
returns all the tokens after the given offset
2. Introduce a new `TokenFlags`, a set of flags to query certain
information from a token. Currently, this information is limited to
string-type tokens but can be expanded to include other information
in the future as needed. https://github.com/astral-sh/ruff/pull/11578
3. Move the `CommentRanges` to the parsed output because this
information is common to both the linter and the formatter. This removes
the need for the `tokens_and_ranges` function.
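The `Tokens` queries can be modeled over (kind, range) pairs. This sketch uses plain `usize` ranges where ruff uses `TextRange`, and exploits the fact that token ranges are sorted to answer the range queries with binary search (`partition_point`).

```rust
use std::ops::Range;

#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum TokenKind {
    Name,
    Int,
    Newline,
    Unknown, // emitted on a lexical error
}

/// Wrapper around the token vector with range-based query methods.
struct Tokens(Vec<(TokenKind, Range<usize>)>);

impl Tokens {
    /// All tokens up to (but not including) the first `Unknown` token.
    fn up_to_first_unknown(&self) -> &[(TokenKind, Range<usize>)] {
        let end = self
            .0
            .iter()
            .position(|(kind, _)| *kind == TokenKind::Unknown)
            .unwrap_or(self.0.len());
        &self.0[..end]
    }

    /// Tokens fully contained in the given range.
    fn in_range(&self, range: Range<usize>) -> &[(TokenKind, Range<usize>)] {
        let start = self.0.partition_point(|(_, r)| r.start < range.start);
        let end = self.0.partition_point(|(_, r)| r.end <= range.end);
        &self.0[start..end]
    }

    /// All tokens starting at or after the given offset.
    fn after(&self, offset: usize) -> &[(TokenKind, Range<usize>)] {
        let start = self.0.partition_point(|(_, r)| r.start < offset);
        &self.0[start..]
    }
}
```

Because the queries return slices, downstream rules can iterate a sub-region of the token stream without re-lexing it.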

## Test Plan

- [x] Update and verify the test snapshots
- [x] Make sure the entire test suite is passing
- [x] Make sure there are no changes in the ecosystem checks
- [x] Run the fuzzer on the parser
- [x] Run this change on dozens of open-source projects

### Running this change on dozens of open-source projects

Refer to the PR description to get the list of open source projects used
for testing.

Now, the following tests were done between `main` and this branch:
1. Compare the output of `--select=E999` (syntax errors)
2. Compare the output of default rule selection
3. Compare the output of `--select=ALL`

**Conclusion: all outputs were the same.**

## What's next?

The next step is to introduce re-lexing logic and update the parser to
feed the recovery information to the lexer so that it can emit the
correct tokens. This moves us one step closer to having error resilience
in the parser and makes it possible for Ruff to lint even if the
source code contains syntax errors.
---
Commit bf5b62edac (parent c69a789aa5) by Dhruv Manilawala, 2024-06-03 18:23:50 +05:30, committed by GitHub.
262 changed files with 8174 additions and 6132 deletions.

The commit also updates every lexer snapshot in
`crates/ruff_python_parser/src/lexer.rs`. The snapshot diffs all follow
the same two patterns, repeated across the `MAC_EOL`/`UNIX_EOL`/
`WINDOWS_EOL` variants of each test. First, each snapshot gains a fenced
`## Tokens` section (and, for invalid input, an `## Errors` section),
and `Tok` variants with named fields become tuple-style `TokenKind`
output, with `Comment` dropping its text entirely:

```diff
 (
-    Name {
-        name: "a_variable",
-    },
+    Name(
+        "a_variable",
+    ),
     0..10,
 ),
```

Second, string-like tokens (`String`, `FStringStart`, `FStringMiddle`,
`FStringEnd`) no longer embed `AnyStringFlags`; the flags are reported
separately as `TokenFlags` next to the range:

```diff
 (
-    FStringStart(
-        AnyStringFlags {
-            prefix: Format(
-                Regular,
-            ),
-            triple_quoted: false,
-            quote_style: Double,
-        },
-    ),
+    FStringStart,
     0..2,
+    TokenFlags(
+        DOUBLE_QUOTES | F_STRING,
+    ),
 ),
```
),
),
(
FStringMiddle(
"foo ",
),
37..41,
TokenFlags(
F_STRING,
),
),
(
Lbrace,
41..42,
),
(
FStringStart(
AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Single,
},
),
FStringStart,
42..44,
TokenFlags(
F_STRING,
),
),
(
FStringMiddle {
value: "bar",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Single,
},
},
FStringMiddle(
"bar",
),
44..47,
TokenFlags(
F_STRING,
),
),
(
FStringEnd,
47..48,
TokenFlags(
F_STRING,
),
),
(
Rbrace,
48..49,
),
(
FStringMiddle {
value: " some ",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Single,
},
},
FStringMiddle(
" some ",
),
49..55,
TokenFlags(
F_STRING,
),
),
(
Lbrace,
55..56,
),
(
FStringStart(
AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
),
FStringStart,
56..58,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
FStringMiddle {
value: "another",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
},
FStringMiddle(
"another",
),
58..65,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
FStringEnd,
65..66,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Rbrace,
@ -245,9 +204,13 @@ expression: lex_source(source)
(
FStringEnd,
67..68,
TokenFlags(
F_STRING,
),
),
(
Newline,
68..68,
),
]
```

@ -2,18 +2,15 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: lex_source(source)
---
## Tokens
```
[
(
FStringStart(
AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
),
FStringStart,
0..2,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Lbrace,
@ -26,60 +23,48 @@ expression: lex_source(source)
(
FStringEnd,
4..5,
),
(
FStringStart(
AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
6..8,
),
(
FStringMiddle {
value: "{}",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
},
FStringStart,
6..8,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
FStringMiddle(
"{}",
),
8..12,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
FStringEnd,
12..13,
),
(
FStringStart(
AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
14..16,
),
(
FStringMiddle {
value: " ",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
},
FStringStart,
14..16,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
FStringMiddle(
" ",
),
16..17,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Lbrace,
@ -92,31 +77,25 @@ expression: lex_source(source)
(
FStringEnd,
19..20,
),
(
FStringStart(
AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
21..23,
),
(
FStringMiddle {
value: "{",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
},
FStringStart,
21..23,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
FStringMiddle(
"{",
),
23..25,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Lbrace,
@ -127,75 +106,59 @@ expression: lex_source(source)
26..27,
),
(
FStringMiddle {
value: "}",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
},
FStringMiddle(
"}",
),
27..29,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
FStringEnd,
29..30,
),
(
FStringStart(
AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
31..33,
),
(
FStringMiddle {
value: "{{}}",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
},
FStringStart,
31..33,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
FStringMiddle(
"{{}}",
),
33..41,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
FStringEnd,
41..42,
),
(
FStringStart(
AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
43..45,
),
(
FStringMiddle {
value: " ",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
},
FStringStart,
43..45,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
FStringMiddle(
" ",
),
45..46,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Lbrace,
@ -206,17 +169,13 @@ expression: lex_source(source)
47..48,
),
(
FStringMiddle {
value: " {} {",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
},
FStringMiddle(
" {} {",
),
48..56,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Lbrace,
@ -227,24 +186,24 @@ expression: lex_source(source)
57..58,
),
(
FStringMiddle {
value: "} {{}} ",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
},
FStringMiddle(
"} {{}} ",
),
58..71,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
FStringEnd,
71..72,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Newline,
72..72,
),
]
```

@ -2,185 +2,152 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: lex_source(source)
---
## Tokens
```
[
(
FStringStart(
AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
),
FStringStart,
0..2,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
FStringEnd,
2..3,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
FStringStart(
AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
),
FStringStart,
4..6,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
FStringEnd,
6..7,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
FStringStart(
AnyStringFlags {
prefix: Format(
Raw {
uppercase_r: false,
},
),
triple_quoted: false,
quote_style: Double,
},
),
FStringStart,
8..11,
TokenFlags(
DOUBLE_QUOTES | F_STRING | RAW_STRING_LOWERCASE,
),
),
(
FStringEnd,
11..12,
TokenFlags(
DOUBLE_QUOTES | F_STRING | RAW_STRING_LOWERCASE,
),
),
(
FStringStart(
AnyStringFlags {
prefix: Format(
Raw {
uppercase_r: false,
},
),
triple_quoted: false,
quote_style: Double,
},
),
FStringStart,
13..16,
TokenFlags(
DOUBLE_QUOTES | F_STRING | RAW_STRING_LOWERCASE,
),
),
(
FStringEnd,
16..17,
TokenFlags(
DOUBLE_QUOTES | F_STRING | RAW_STRING_LOWERCASE,
),
),
(
FStringStart(
AnyStringFlags {
prefix: Format(
Raw {
uppercase_r: true,
},
),
triple_quoted: false,
quote_style: Double,
},
),
FStringStart,
18..21,
TokenFlags(
DOUBLE_QUOTES | F_STRING | RAW_STRING_UPPERCASE,
),
),
(
FStringEnd,
21..22,
TokenFlags(
DOUBLE_QUOTES | F_STRING | RAW_STRING_UPPERCASE,
),
),
(
FStringStart(
AnyStringFlags {
prefix: Format(
Raw {
uppercase_r: true,
},
),
triple_quoted: false,
quote_style: Double,
},
),
FStringStart,
23..26,
TokenFlags(
DOUBLE_QUOTES | F_STRING | RAW_STRING_UPPERCASE,
),
),
(
FStringEnd,
26..27,
TokenFlags(
DOUBLE_QUOTES | F_STRING | RAW_STRING_UPPERCASE,
),
),
(
FStringStart(
AnyStringFlags {
prefix: Format(
Raw {
uppercase_r: false,
},
),
triple_quoted: false,
quote_style: Double,
},
),
FStringStart,
28..31,
TokenFlags(
DOUBLE_QUOTES | F_STRING | RAW_STRING_LOWERCASE,
),
),
(
FStringEnd,
31..32,
TokenFlags(
DOUBLE_QUOTES | F_STRING | RAW_STRING_LOWERCASE,
),
),
(
FStringStart(
AnyStringFlags {
prefix: Format(
Raw {
uppercase_r: false,
},
),
triple_quoted: false,
quote_style: Double,
},
),
FStringStart,
33..36,
TokenFlags(
DOUBLE_QUOTES | F_STRING | RAW_STRING_LOWERCASE,
),
),
(
FStringEnd,
36..37,
TokenFlags(
DOUBLE_QUOTES | F_STRING | RAW_STRING_LOWERCASE,
),
),
(
FStringStart(
AnyStringFlags {
prefix: Format(
Raw {
uppercase_r: true,
},
),
triple_quoted: false,
quote_style: Double,
},
),
FStringStart,
38..41,
TokenFlags(
DOUBLE_QUOTES | F_STRING | RAW_STRING_UPPERCASE,
),
),
(
FStringEnd,
41..42,
TokenFlags(
DOUBLE_QUOTES | F_STRING | RAW_STRING_UPPERCASE,
),
),
(
FStringStart(
AnyStringFlags {
prefix: Format(
Raw {
uppercase_r: true,
},
),
triple_quoted: false,
quote_style: Double,
},
),
FStringStart,
43..46,
TokenFlags(
DOUBLE_QUOTES | F_STRING | RAW_STRING_UPPERCASE,
),
),
(
FStringEnd,
46..47,
TokenFlags(
DOUBLE_QUOTES | F_STRING | RAW_STRING_UPPERCASE,
),
),
(
Newline,
47..47,
),
]
```

@ -2,38 +2,35 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: fstring_single_quote_escape_eol(MAC_EOL)
---
## Tokens
```
[
(
FStringStart(
AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Single,
},
),
FStringStart,
0..2,
TokenFlags(
F_STRING,
),
),
(
FStringMiddle {
value: "text \\\r more text",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Single,
},
},
FStringMiddle(
"text \\\r more text",
),
2..19,
TokenFlags(
F_STRING,
),
),
(
FStringEnd,
19..20,
TokenFlags(
F_STRING,
),
),
(
Newline,
20..20,
),
]
```

@ -2,38 +2,35 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: fstring_single_quote_escape_eol(UNIX_EOL)
---
## Tokens
```
[
(
FStringStart(
AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Single,
},
),
FStringStart,
0..2,
TokenFlags(
F_STRING,
),
),
(
FStringMiddle {
value: "text \\\n more text",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Single,
},
},
FStringMiddle(
"text \\\n more text",
),
2..19,
TokenFlags(
F_STRING,
),
),
(
FStringEnd,
19..20,
TokenFlags(
F_STRING,
),
),
(
Newline,
20..20,
),
]
```

@ -2,38 +2,35 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: fstring_single_quote_escape_eol(WINDOWS_EOL)
---
## Tokens
```
[
(
FStringStart(
AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Single,
},
),
FStringStart,
0..2,
TokenFlags(
F_STRING,
),
),
(
FStringMiddle {
value: "text \\\r\n more text",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Single,
},
},
FStringMiddle(
"text \\\r\n more text",
),
2..20,
TokenFlags(
F_STRING,
),
),
(
FStringEnd,
20..21,
TokenFlags(
F_STRING,
),
),
(
Newline,
21..21,
),
]
```

@ -2,27 +2,24 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: lex_source(source)
---
## Tokens
```
[
(
FStringStart(
AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
),
FStringStart,
0..2,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Lbrace,
2..3,
),
(
Name {
name: "foo",
},
Name(
"foo",
),
3..6,
),
(
@ -34,26 +31,22 @@ expression: lex_source(source)
7..8,
),
(
FStringMiddle {
value: " ",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
},
FStringMiddle(
" ",
),
8..9,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Lbrace,
9..10,
),
(
Name {
name: "x",
},
Name(
"x",
),
10..11,
),
(
@ -65,9 +58,9 @@ expression: lex_source(source)
12..13,
),
(
Name {
name: "s",
},
Name(
"s",
),
13..14,
),
(
@ -75,43 +68,35 @@ expression: lex_source(source)
14..15,
),
(
FStringMiddle {
value: ".3f",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
},
FStringMiddle(
".3f",
),
15..18,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Rbrace,
18..19,
),
(
FStringMiddle {
value: " ",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
},
FStringMiddle(
" ",
),
19..20,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Lbrace,
20..21,
),
(
Name {
name: "x",
},
Name(
"x",
),
21..22,
),
(
@ -119,26 +104,22 @@ expression: lex_source(source)
22..23,
),
(
FStringMiddle {
value: ".",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
},
FStringMiddle(
".",
),
23..24,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Lbrace,
24..25,
),
(
Name {
name: "y",
},
Name(
"y",
),
25..26,
),
(
@ -146,50 +127,35 @@ expression: lex_source(source)
26..27,
),
(
FStringMiddle {
value: "f",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
},
FStringMiddle(
"f",
),
27..28,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Rbrace,
28..29,
),
(
FStringMiddle {
value: " ",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
},
FStringMiddle(
" ",
),
29..30,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Lbrace,
30..31,
),
(
String {
value: "",
flags: AnyStringFlags {
prefix: Regular(
Empty,
),
triple_quoted: false,
quote_style: Single,
},
},
String(
"",
),
31..33,
),
(
@ -197,26 +163,22 @@ expression: lex_source(source)
33..34,
),
(
FStringMiddle {
value: "*^",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
},
FStringMiddle(
"*^",
),
34..36,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Lbrace,
36..37,
),
(
Int {
value: 1,
},
Int(
1,
),
37..38,
),
(
@ -228,9 +190,9 @@ expression: lex_source(source)
39..40,
),
(
Int {
value: 1,
},
Int(
1,
),
40..41,
),
(
@ -246,26 +208,22 @@ expression: lex_source(source)
43..44,
),
(
FStringMiddle {
value: " ",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
},
FStringMiddle(
" ",
),
44..45,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Lbrace,
45..46,
),
(
Name {
name: "x",
},
Name(
"x",
),
46..47,
),
(
@ -281,9 +239,9 @@ expression: lex_source(source)
49..50,
),
(
Int {
value: 1,
},
Int(
1,
),
50..51,
),
(
@ -295,9 +253,9 @@ expression: lex_source(source)
52..53,
),
(
Name {
name: "pop",
},
Name(
"pop",
),
53..56,
),
(
@ -319,9 +277,13 @@ expression: lex_source(source)
(
FStringEnd,
60..61,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Newline,
61..61,
),
]
```

@ -2,31 +2,24 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: lex_source(source)
---
## Tokens
```
[
(
FStringStart(
AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
),
FStringStart,
0..2,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
FStringMiddle {
value: "foo ",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
},
FStringMiddle(
"foo ",
),
2..6,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Lbrace,
@ -37,9 +30,9 @@ expression: lex_source(source)
7..8,
),
(
Name {
name: "pwd",
},
Name(
"pwd",
),
8..11,
),
(
@ -47,24 +40,24 @@ expression: lex_source(source)
11..12,
),
(
FStringMiddle {
value: " bar",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
},
FStringMiddle(
" bar",
),
12..16,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
FStringEnd,
16..17,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Newline,
17..17,
),
]
```

@ -2,18 +2,15 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: lex_source(source)
---
## Tokens
```
[
(
FStringStart(
AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
),
FStringStart,
0..2,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Lbrace,
@ -24,9 +21,9 @@ expression: lex_source(source)
3..9,
),
(
Name {
name: "x",
},
Name(
"x",
),
10..11,
),
(
@ -38,9 +35,9 @@ expression: lex_source(source)
12..13,
),
(
Name {
name: "x",
},
Name(
"x",
),
13..14,
),
(
@ -54,22 +51,20 @@ expression: lex_source(source)
(
FStringEnd,
16..17,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Newline,
17..18,
),
(
FStringStart(
AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
),
FStringStart,
18..20,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Lbrace,
@ -84,9 +79,9 @@ expression: lex_source(source)
22..28,
),
(
Name {
name: "x",
},
Name(
"x",
),
29..30,
),
(
@ -98,9 +93,9 @@ expression: lex_source(source)
31..32,
),
(
Name {
name: "x",
},
Name(
"x",
),
32..33,
),
(
@ -118,9 +113,13 @@ expression: lex_source(source)
(
FStringEnd,
36..37,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Newline,
37..37,
),
]
```

@ -2,31 +2,24 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: lex_source(source)
---
## Tokens
```
[
(
FStringStart(
AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: true,
quote_style: Single,
},
),
FStringStart,
0..4,
TokenFlags(
TRIPLE_QUOTED_STRING | F_STRING,
),
),
(
FStringMiddle {
value: "__",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: true,
quote_style: Single,
},
},
FStringMiddle(
"__",
),
4..6,
TokenFlags(
TRIPLE_QUOTED_STRING | F_STRING,
),
),
(
Lbrace,
@ -37,9 +30,9 @@ expression: lex_source(source)
7..8,
),
(
Name {
name: "x",
},
Name(
"x",
),
12..13,
),
(
@ -47,67 +40,53 @@ expression: lex_source(source)
13..14,
),
(
FStringMiddle {
value: "d\n",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: true,
quote_style: Single,
},
},
FStringMiddle(
"d\n",
),
14..16,
TokenFlags(
TRIPLE_QUOTED_STRING | F_STRING,
),
),
(
Rbrace,
16..17,
),
(
FStringMiddle {
value: "__",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: true,
quote_style: Single,
},
},
FStringMiddle(
"__",
),
17..19,
TokenFlags(
TRIPLE_QUOTED_STRING | F_STRING,
),
),
(
FStringEnd,
19..22,
TokenFlags(
TRIPLE_QUOTED_STRING | F_STRING,
),
),
(
Newline,
22..23,
),
(
FStringStart(
AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: true,
quote_style: Single,
},
),
FStringStart,
23..27,
TokenFlags(
TRIPLE_QUOTED_STRING | F_STRING,
),
),
(
FStringMiddle {
value: "__",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: true,
quote_style: Single,
},
},
FStringMiddle(
"__",
),
27..29,
TokenFlags(
TRIPLE_QUOTED_STRING | F_STRING,
),
),
(
Lbrace,
@ -118,9 +97,9 @@ expression: lex_source(source)
30..31,
),
(
Name {
name: "x",
},
Name(
"x",
),
35..36,
),
(
@ -128,67 +107,53 @@ expression: lex_source(source)
36..37,
),
(
FStringMiddle {
value: "a\n b\n c\n",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: true,
quote_style: Single,
},
},
FStringMiddle(
"a\n b\n c\n",
),
37..61,
TokenFlags(
TRIPLE_QUOTED_STRING | F_STRING,
),
),
(
Rbrace,
61..62,
),
(
FStringMiddle {
value: "__",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: true,
quote_style: Single,
},
},
FStringMiddle(
"__",
),
62..64,
TokenFlags(
TRIPLE_QUOTED_STRING | F_STRING,
),
),
(
FStringEnd,
64..67,
TokenFlags(
TRIPLE_QUOTED_STRING | F_STRING,
),
),
(
Newline,
67..68,
),
(
FStringStart(
AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Single,
},
),
FStringStart,
68..70,
TokenFlags(
F_STRING,
),
),
(
FStringMiddle {
value: "__",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Single,
},
},
FStringMiddle(
"__",
),
70..72,
TokenFlags(
F_STRING,
),
),
(
Lbrace,
@ -199,9 +164,9 @@ expression: lex_source(source)
73..74,
),
(
Name {
name: "x",
},
Name(
"x",
),
78..79,
),
(
@ -209,17 +174,13 @@ expression: lex_source(source)
79..80,
),
(
FStringMiddle {
value: "d",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Single,
},
},
FStringMiddle(
"d",
),
80..81,
TokenFlags(
F_STRING,
),
),
(
NonLogicalNewline,
@ -230,50 +191,40 @@ expression: lex_source(source)
82..83,
),
(
FStringMiddle {
value: "__",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Single,
},
},
FStringMiddle(
"__",
),
83..85,
TokenFlags(
F_STRING,
),
),
(
FStringEnd,
85..86,
TokenFlags(
F_STRING,
),
),
(
Newline,
86..87,
),
(
FStringStart(
AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Single,
},
),
FStringStart,
87..89,
TokenFlags(
F_STRING,
),
),
(
FStringMiddle {
value: "__",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Single,
},
},
FStringMiddle(
"__",
),
89..91,
TokenFlags(
F_STRING,
),
),
(
Lbrace,
@ -284,9 +235,9 @@ expression: lex_source(source)
92..93,
),
(
Name {
name: "x",
},
Name(
"x",
),
97..98,
),
(
@ -294,26 +245,22 @@ expression: lex_source(source)
98..99,
),
(
FStringMiddle {
value: "a",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Single,
},
},
FStringMiddle(
"a",
),
99..100,
TokenFlags(
F_STRING,
),
),
(
NonLogicalNewline,
100..101,
),
(
Name {
name: "b",
},
Name(
"b",
),
109..110,
),
(
@ -325,24 +272,24 @@ expression: lex_source(source)
111..112,
),
(
FStringMiddle {
value: "__",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Single,
},
},
FStringMiddle(
"__",
),
112..114,
TokenFlags(
F_STRING,
),
),
(
FStringEnd,
114..115,
TokenFlags(
F_STRING,
),
),
(
Newline,
115..116,
),
]
```

@ -2,27 +2,24 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: lex_source(source)
---
## Tokens
```
[
(
FStringStart(
AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
),
FStringStart,
0..2,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Lbrace,
2..3,
),
(
Name {
name: "x",
},
Name(
"x",
),
3..4,
),
(
@ -30,34 +27,26 @@ expression: lex_source(source)
4..5,
),
(
FStringMiddle {
value: "=10",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
},
FStringMiddle(
"=10",
),
5..8,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Rbrace,
8..9,
),
(
FStringMiddle {
value: " ",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
},
FStringMiddle(
" ",
),
9..10,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Lbrace,
@ -68,9 +57,9 @@ expression: lex_source(source)
11..12,
),
(
Name {
name: "x",
},
Name(
"x",
),
12..13,
),
(
@ -78,9 +67,9 @@ expression: lex_source(source)
13..15,
),
(
Int {
value: 10,
},
Int(
10,
),
15..17,
),
(
@ -92,26 +81,22 @@ expression: lex_source(source)
18..19,
),
(
FStringMiddle {
value: " ",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
},
FStringMiddle(
" ",
),
19..20,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Lbrace,
20..21,
),
(
Name {
name: "x",
},
Name(
"x",
),
21..22,
),
(
@ -123,9 +108,9 @@ expression: lex_source(source)
23..24,
),
(
Name {
name: "y",
},
Name(
"y",
),
24..25,
),
(
@ -133,9 +118,9 @@ expression: lex_source(source)
25..27,
),
(
Int {
value: 10,
},
Int(
10,
),
27..29,
),
(
@ -147,17 +132,13 @@ expression: lex_source(source)
30..31,
),
(
FStringMiddle {
value: " ",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Double,
},
},
FStringMiddle(
" ",
),
31..32,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Lbrace,
@ -168,9 +149,9 @@ expression: lex_source(source)
33..34,
),
(
Name {
name: "x",
},
Name(
"x",
),
34..35,
),
(
@ -178,9 +159,9 @@ expression: lex_source(source)
35..37,
),
(
Int {
value: 10,
},
Int(
10,
),
37..39,
),
(
@ -194,9 +175,13 @@ expression: lex_source(source)
(
FStringEnd,
41..42,
TokenFlags(
DOUBLE_QUOTES | F_STRING,
),
),
(
Newline,
42..42,
),
]
```

@ -2,38 +2,35 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: lex_source(source)
---
## Tokens
```
[
(
FStringStart(
AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Single,
},
),
FStringStart,
0..2,
TokenFlags(
F_STRING,
),
),
(
FStringMiddle {
value: "\\0",
flags: AnyStringFlags {
prefix: Format(
Regular,
),
triple_quoted: false,
quote_style: Single,
},
},
FStringMiddle(
"\\0",
),
2..4,
TokenFlags(
F_STRING,
),
),
(
FStringEnd,
4..5,
TokenFlags(
F_STRING,
),
),
(
Newline,
5..5,
),
]
```

@ -2,15 +2,17 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: indentation_with_eol(MAC_EOL)
---
## Tokens
```
[
(
Def,
0..3,
),
(
Name {
name: "foo",
},
Name(
"foo",
),
4..7,
),
(
@ -38,9 +40,9 @@ expression: indentation_with_eol(MAC_EOL)
15..21,
),
(
Int {
value: 99,
},
Int(
99,
),
22..24,
),
(
@ -56,3 +58,4 @@ expression: indentation_with_eol(MAC_EOL)
26..26,
),
]
```

@ -2,15 +2,17 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: indentation_with_eol(UNIX_EOL)
---
## Tokens
```
[
(
Def,
0..3,
),
(
Name {
name: "foo",
},
Name(
"foo",
),
4..7,
),
(
@ -38,9 +40,9 @@ expression: indentation_with_eol(UNIX_EOL)
15..21,
),
(
Int {
value: 99,
},
Int(
99,
),
22..24,
),
(
@ -56,3 +58,4 @@ expression: indentation_with_eol(UNIX_EOL)
26..26,
),
]
```

@ -2,15 +2,17 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: indentation_with_eol(WINDOWS_EOL)
---
## Tokens
```
[
(
Def,
0..3,
),
(
Name {
name: "foo",
},
Name(
"foo",
),
4..7,
),
(
@ -38,9 +40,9 @@ expression: indentation_with_eol(WINDOWS_EOL)
16..22,
),
(
Int {
value: 99,
},
Int(
99,
),
23..25,
),
(
@ -56,3 +58,4 @@ expression: indentation_with_eol(WINDOWS_EOL)
29..29,
),
]
```

@ -1,12 +1,28 @@
---
source: crates/ruff_python_parser/src/lexer.rs
expression: tokens
expression: "lex_invalid(source, Mode::Module)"
---
Err(
## Tokens
```
[
(
Unknown,
0..85,
),
(
Newline,
85..85,
),
]
```
## Errors
```
[
LexicalError {
error: OtherError(
"Invalid decimal integer literal",
),
location: 0..85,
},
)
]
```

@ -1,12 +1,28 @@
---
source: crates/ruff_python_parser/src/lexer.rs
expression: tokens
expression: "lex_invalid(source, Mode::Module)"
---
Err(
## Tokens
```
[
(
Unknown,
0..3,
),
(
Newline,
3..3,
),
]
```
## Errors
```
[
LexicalError {
error: OtherError(
"Invalid decimal integer literal",
),
location: 0..3,
},
)
]
```

@ -2,6 +2,8 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: lex_jupyter_source(source)
---
## Tokens
```
[
(
IpyEscapeCommand {
@ -125,3 +127,4 @@ expression: lex_jupyter_source(source)
180..180,
),
]
```

@ -2,11 +2,13 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: lex_jupyter_source(source)
---
## Tokens
```
[
(
Name {
name: "pwd",
},
Name(
"pwd",
),
0..3,
),
(
@ -25,9 +27,9 @@ expression: lex_jupyter_source(source)
10..11,
),
(
Name {
name: "foo",
},
Name(
"foo",
),
11..14,
),
(
@ -46,9 +48,9 @@ expression: lex_jupyter_source(source)
30..31,
),
(
Name {
name: "bar",
},
Name(
"bar",
),
31..34,
),
(
@ -67,9 +69,9 @@ expression: lex_jupyter_source(source)
50..51,
),
(
Name {
name: "baz",
},
Name(
"baz",
),
51..54,
),
(
@ -88,3 +90,4 @@ expression: lex_jupyter_source(source)
85..85,
),
]
```

@ -2,6 +2,8 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: lex_jupyter_source(source)
---
## Tokens
```
[
(
If,
@ -39,3 +41,4 @@ expression: lex_jupyter_source(source)
43..43,
),
]
```

@ -2,6 +2,8 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: ipython_escape_command_line_continuation_eol(MAC_EOL)
---
## Tokens
```
[
(
IpyEscapeCommand {
@ -15,3 +17,4 @@ expression: ipython_escape_command_line_continuation_eol(MAC_EOL)
24..24,
),
]
```

@ -2,6 +2,8 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: ipython_escape_command_line_continuation_eol(UNIX_EOL)
---
## Tokens
```
[
(
IpyEscapeCommand {
@ -15,3 +17,4 @@ expression: ipython_escape_command_line_continuation_eol(UNIX_EOL)
24..24,
),
]
```

@ -2,6 +2,8 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: ipython_escape_command_line_continuation_eol(WINDOWS_EOL)
---
## Tokens
```
[
(
IpyEscapeCommand {
@ -15,3 +17,4 @@ expression: ipython_escape_command_line_continuation_eol(WINDOWS_EOL)
25..25,
),
]
```

@ -2,6 +2,8 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: ipython_escape_command_line_continuation_with_eol_and_eof(MAC_EOL)
---
## Tokens
```
[
(
IpyEscapeCommand {
@ -15,3 +17,4 @@ expression: ipython_escape_command_line_continuation_with_eol_and_eof(MAC_EOL)
14..14,
),
]
```

@ -2,6 +2,8 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: ipython_escape_command_line_continuation_with_eol_and_eof(UNIX_EOL)
---
## Tokens
```
[
(
IpyEscapeCommand {
@ -15,3 +17,4 @@ expression: ipython_escape_command_line_continuation_with_eol_and_eof(UNIX_EOL)
14..14,
),
]
```

@ -2,6 +2,8 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: ipython_escape_command_line_continuation_with_eol_and_eof(WINDOWS_EOL)
---
## Tokens
```
[
(
IpyEscapeCommand {
@ -15,3 +17,4 @@ expression: ipython_escape_command_line_continuation_with_eol_and_eof(WINDOWS_EO
15..15,
),
]
```

@ -2,6 +2,8 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: lex_jupyter_source(source)
---
## Tokens
```
[
(
IpyEscapeCommand {
@ -180,3 +182,4 @@ expression: lex_jupyter_source(source)
132..132,
),
]
```

@ -2,17 +2,17 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: lex_source(&source)
---
## Tokens
```
[
(
Int {
value: 99232,
},
Int(
99232,
),
0..5,
),
(
Comment(
"#",
),
Comment,
7..8,
),
(
@ -20,3 +20,4 @@ expression: lex_source(&source)
8..8,
),
]
```

@ -2,17 +2,17 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: lex_source(&source)
---
## Tokens
```
[
(
Int {
value: 99232,
},
Int(
99232,
),
0..5,
),
(
Comment(
"# foo",
),
Comment,
7..12,
),
(
@ -20,3 +20,4 @@ expression: lex_source(&source)
12..12,
),
]
```

@ -2,17 +2,17 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: lex_source(&source)
---
## Tokens
```
[
(
Int {
value: 99232,
},
Int(
99232,
),
0..5,
),
(
Comment(
"# ",
),
Comment,
7..9,
),
(
@ -20,3 +20,4 @@ expression: lex_source(&source)
9..9,
),
]
```

@ -2,17 +2,17 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: lex_source(&source)
---
## Tokens
```
[
(
Int {
value: 99232,
},
Int(
99232,
),
0..5,
),
(
Comment(
"# ",
),
Comment,
7..10,
),
(
@ -20,3 +20,4 @@ expression: lex_source(&source)
10..10,
),
]
```

@ -2,11 +2,11 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: lex_source(source)
---
## Tokens
```
[
(
Comment(
"#Hello",
),
Comment,
0..6,
),
(
@ -14,9 +14,7 @@ expression: lex_source(source)
6..7,
),
(
Comment(
"#World",
),
Comment,
7..13,
),
(
@ -24,3 +22,4 @@ expression: lex_source(source)
13..14,
),
]
```

@ -2,15 +2,17 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: lex_jupyter_source(source)
---
## Tokens
```
[
(
Match,
0..5,
),
(
Name {
name: "foo",
},
Name(
"foo",
),
6..9,
),
(
@ -30,9 +32,9 @@ expression: lex_jupyter_source(source)
15..19,
),
(
Name {
name: "bar",
},
Name(
"bar",
),
20..23,
),
(
@ -64,3 +66,4 @@ expression: lex_jupyter_source(source)
37..37,
),
]
```

@ -2,11 +2,13 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: newline_in_brackets_eol(MAC_EOL)
---
## Tokens
```
[
(
Name {
name: "x",
},
Name(
"x",
),
0..1,
),
(
@@ -26,9 +28,9 @@ expression: newline_in_brackets_eol(MAC_EOL)
6..7,
),
(
Int {
value: 1,
},
Int(
1,
),
11..12,
),
(
@@ -36,9 +38,9 @@ expression: newline_in_brackets_eol(MAC_EOL)
12..13,
),
(
Int {
value: 2,
},
Int(
2,
),
13..14,
),
(
@@ -54,9 +56,9 @@ expression: newline_in_brackets_eol(MAC_EOL)
16..17,
),
(
Int {
value: 3,
},
Int(
3,
),
17..18,
),
(
@@ -68,9 +70,9 @@ expression: newline_in_brackets_eol(MAC_EOL)
19..20,
),
(
Int {
value: 4,
},
Int(
4,
),
20..21,
),
(
@@ -98,9 +100,9 @@ expression: newline_in_brackets_eol(MAC_EOL)
27..28,
),
(
Int {
value: 5,
},
Int(
5,
),
28..29,
),
(
@@ -112,9 +114,9 @@ expression: newline_in_brackets_eol(MAC_EOL)
30..31,
),
(
Int {
value: 6,
},
Int(
6,
),
31..32,
),
(
@@ -122,9 +124,9 @@ expression: newline_in_brackets_eol(MAC_EOL)
32..33,
),
(
Int {
value: 7,
},
Int(
7,
),
35..36,
),
(
@@ -140,3 +142,4 @@ expression: newline_in_brackets_eol(MAC_EOL)
38..39,
),
]
```


@@ -2,11 +2,13 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: newline_in_brackets_eol(UNIX_EOL)
---
## Tokens
```
[
(
Name {
name: "x",
},
Name(
"x",
),
0..1,
),
(
@@ -26,9 +28,9 @@ expression: newline_in_brackets_eol(UNIX_EOL)
6..7,
),
(
Int {
value: 1,
},
Int(
1,
),
11..12,
),
(
@@ -36,9 +38,9 @@ expression: newline_in_brackets_eol(UNIX_EOL)
12..13,
),
(
Int {
value: 2,
},
Int(
2,
),
13..14,
),
(
@@ -54,9 +56,9 @@ expression: newline_in_brackets_eol(UNIX_EOL)
16..17,
),
(
Int {
value: 3,
},
Int(
3,
),
17..18,
),
(
@@ -68,9 +70,9 @@ expression: newline_in_brackets_eol(UNIX_EOL)
19..20,
),
(
Int {
value: 4,
},
Int(
4,
),
20..21,
),
(
@@ -98,9 +100,9 @@ expression: newline_in_brackets_eol(UNIX_EOL)
27..28,
),
(
Int {
value: 5,
},
Int(
5,
),
28..29,
),
(
@@ -112,9 +114,9 @@ expression: newline_in_brackets_eol(UNIX_EOL)
30..31,
),
(
Int {
value: 6,
},
Int(
6,
),
31..32,
),
(
@@ -122,9 +124,9 @@ expression: newline_in_brackets_eol(UNIX_EOL)
32..33,
),
(
Int {
value: 7,
},
Int(
7,
),
35..36,
),
(
@@ -140,3 +142,4 @@ expression: newline_in_brackets_eol(UNIX_EOL)
38..39,
),
]
```


@@ -2,11 +2,13 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: newline_in_brackets_eol(WINDOWS_EOL)
---
## Tokens
```
[
(
Name {
name: "x",
},
Name(
"x",
),
0..1,
),
(
@@ -26,9 +28,9 @@ expression: newline_in_brackets_eol(WINDOWS_EOL)
7..9,
),
(
Int {
value: 1,
},
Int(
1,
),
13..14,
),
(
@@ -36,9 +38,9 @@ expression: newline_in_brackets_eol(WINDOWS_EOL)
14..15,
),
(
Int {
value: 2,
},
Int(
2,
),
15..16,
),
(
@@ -54,9 +56,9 @@ expression: newline_in_brackets_eol(WINDOWS_EOL)
19..20,
),
(
Int {
value: 3,
},
Int(
3,
),
20..21,
),
(
@@ -68,9 +70,9 @@ expression: newline_in_brackets_eol(WINDOWS_EOL)
22..24,
),
(
Int {
value: 4,
},
Int(
4,
),
24..25,
),
(
@@ -98,9 +100,9 @@ expression: newline_in_brackets_eol(WINDOWS_EOL)
32..34,
),
(
Int {
value: 5,
},
Int(
5,
),
34..35,
),
(
@@ -112,9 +114,9 @@ expression: newline_in_brackets_eol(WINDOWS_EOL)
36..38,
),
(
Int {
value: 6,
},
Int(
6,
),
38..39,
),
(
@@ -122,9 +124,9 @@ expression: newline_in_brackets_eol(WINDOWS_EOL)
39..40,
),
(
Int {
value: 7,
},
Int(
7,
),
43..44,
),
(
@@ -140,3 +142,4 @@ expression: newline_in_brackets_eol(WINDOWS_EOL)
46..48,
),
]
```


@@ -2,6 +2,8 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: lex_source(source)
---
## Tokens
```
[
(
Lpar,
@@ -12,16 +14,9 @@ expression: lex_source(source)
1..2,
),
(
String {
value: "a",
flags: AnyStringFlags {
prefix: Regular(
Empty,
),
triple_quoted: false,
quote_style: Single,
},
},
String(
"a",
),
6..9,
),
(
@@ -29,16 +24,9 @@ expression: lex_source(source)
9..10,
),
(
String {
value: "b",
flags: AnyStringFlags {
prefix: Regular(
Empty,
),
triple_quoted: false,
quote_style: Single,
},
},
String(
"b",
),
14..17,
),
(
@@ -50,29 +38,15 @@ expression: lex_source(source)
18..19,
),
(
String {
value: "c",
flags: AnyStringFlags {
prefix: Regular(
Empty,
),
triple_quoted: false,
quote_style: Single,
},
},
String(
"c",
),
23..26,
),
(
String {
value: "d",
flags: AnyStringFlags {
prefix: Regular(
Empty,
),
triple_quoted: false,
quote_style: Single,
},
},
String(
"d",
),
33..36,
),
(
@@ -88,3 +62,4 @@ expression: lex_source(source)
38..38,
),
]
```


@@ -2,59 +2,61 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: lex_source(source)
---
## Tokens
```
[
(
Int {
value: 47,
},
Int(
47,
),
0..4,
),
(
Int {
value: 10,
},
Int(
10,
),
5..9,
),
(
Int {
value: 13,
},
Int(
13,
),
10..16,
),
(
Int {
value: 0,
},
Int(
0,
),
17..18,
),
(
Int {
value: 123,
},
Int(
123,
),
19..22,
),
(
Int {
value: 1234567890,
},
Int(
1234567890,
),
23..36,
),
(
Float {
value: 0.2,
},
Float(
0.2,
),
37..40,
),
(
Float {
value: 100.0,
},
Float(
100.0,
),
41..45,
),
(
Float {
value: 2100.0,
},
Float(
2100.0,
),
46..51,
),
(
@@ -72,21 +74,21 @@ expression: lex_source(source)
55..59,
),
(
Int {
value: 0,
},
Int(
0,
),
60..63,
),
(
Int {
value: 11051210869376104954,
},
Int(
11051210869376104954,
),
64..82,
),
(
Int {
value: 0x995DC9BBDF1939FA995DC9BBDF1939FA,
},
Int(
0x995DC9BBDF1939FA995DC9BBDF1939FA,
),
83..117,
),
(
@@ -94,3 +96,4 @@ expression: lex_source(source)
117..117,
),
]
```


@@ -2,6 +2,8 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: lex_source(source)
---
## Tokens
```
[
(
DoubleSlash,
@@ -28,3 +30,4 @@ expression: lex_source(source)
10..10,
),
]
```


@@ -2,124 +2,70 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: lex_source(source)
---
## Tokens
```
[
(
String {
value: "double",
flags: AnyStringFlags {
prefix: Regular(
Empty,
),
triple_quoted: false,
quote_style: Double,
},
},
String(
"double",
),
0..8,
TokenFlags(
DOUBLE_QUOTES,
),
),
(
String {
value: "single",
flags: AnyStringFlags {
prefix: Regular(
Empty,
),
triple_quoted: false,
quote_style: Single,
},
},
String(
"single",
),
9..17,
),
(
String {
value: "can\\'t",
flags: AnyStringFlags {
prefix: Regular(
Empty,
),
triple_quoted: false,
quote_style: Single,
},
},
String(
"can\\'t",
),
18..26,
),
(
String {
value: "\\\\\\\"",
flags: AnyStringFlags {
prefix: Regular(
Empty,
),
triple_quoted: false,
quote_style: Double,
},
},
String(
"\\\\\\\"",
),
27..33,
TokenFlags(
DOUBLE_QUOTES,
),
),
(
String {
value: "\\t\\r\\n",
flags: AnyStringFlags {
prefix: Regular(
Empty,
),
triple_quoted: false,
quote_style: Single,
},
},
String(
"\\t\\r\\n",
),
34..42,
),
(
String {
value: "\\g",
flags: AnyStringFlags {
prefix: Regular(
Empty,
),
triple_quoted: false,
quote_style: Single,
},
},
String(
"\\g",
),
43..47,
),
(
String {
value: "raw\\'",
flags: AnyStringFlags {
prefix: Regular(
Raw {
uppercase: false,
},
),
triple_quoted: false,
quote_style: Single,
},
},
String(
"raw\\'",
),
48..56,
TokenFlags(
RAW_STRING_LOWERCASE,
),
),
(
String {
value: "\\420",
flags: AnyStringFlags {
prefix: Regular(
Empty,
),
triple_quoted: false,
quote_style: Single,
},
},
String(
"\\420",
),
57..63,
),
(
String {
value: "\\200\\0a",
flags: AnyStringFlags {
prefix: Regular(
Empty,
),
triple_quoted: false,
quote_style: Single,
},
},
String(
"\\200\\0a",
),
64..73,
),
(
@@ -127,3 +73,4 @@ expression: lex_source(source)
73..73,
),
]
```


@@ -2,22 +2,21 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: string_continuation_with_eol(MAC_EOL)
---
## Tokens
```
[
(
String {
value: "abc\\\rdef",
flags: AnyStringFlags {
prefix: Regular(
Empty,
),
triple_quoted: false,
quote_style: Double,
},
},
String(
"abc\\\rdef",
),
0..10,
TokenFlags(
DOUBLE_QUOTES,
),
),
(
Newline,
10..10,
),
]
```


@@ -2,22 +2,21 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: string_continuation_with_eol(UNIX_EOL)
---
## Tokens
```
[
(
String {
value: "abc\\\ndef",
flags: AnyStringFlags {
prefix: Regular(
Empty,
),
triple_quoted: false,
quote_style: Double,
},
},
String(
"abc\\\ndef",
),
0..10,
TokenFlags(
DOUBLE_QUOTES,
),
),
(
Newline,
10..10,
),
]
```


@@ -2,22 +2,21 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: string_continuation_with_eol(WINDOWS_EOL)
---
## Tokens
```
[
(
String {
value: "abc\\\r\ndef",
flags: AnyStringFlags {
prefix: Regular(
Empty,
),
triple_quoted: false,
quote_style: Double,
},
},
String(
"abc\\\r\ndef",
),
0..11,
TokenFlags(
DOUBLE_QUOTES,
),
),
(
Newline,
11..11,
),
]
```


@@ -1,66 +1,58 @@
---
source: crates/ruff_python_parser/src/lexer.rs
expression: tokens
expression: "lex_invalid(source, Mode::Module)"
---
## Tokens
```
[
Ok(
(
If,
0..2,
),
(
If,
0..2,
),
Ok(
(
True,
3..7,
),
(
True,
3..7,
),
Ok(
(
Colon,
7..8,
),
(
Colon,
7..8,
),
Ok(
(
Newline,
8..9,
),
(
Newline,
8..9,
),
Ok(
(
Indent,
9..13,
),
(
Indent,
9..13,
),
Ok(
(
Pass,
13..17,
),
(
Pass,
13..17,
),
Ok(
(
Newline,
17..18,
),
(
Newline,
17..18,
),
Err(
LexicalError {
error: IndentationError,
location: 18..20,
},
(
Unknown,
18..20,
),
Ok(
(
Pass,
20..24,
),
(
Pass,
20..24,
),
Ok(
(
Newline,
24..24,
),
(
Newline,
24..24,
),
]
```
## Errors
```
[
LexicalError {
error: IndentationError,
location: 18..20,
},
]
```


@@ -2,22 +2,21 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: triple_quoted_eol(MAC_EOL)
---
## Tokens
```
[
(
String {
value: "\r test string\r ",
flags: AnyStringFlags {
prefix: Regular(
Empty,
),
triple_quoted: true,
quote_style: Double,
},
},
String(
"\r test string\r ",
),
0..21,
TokenFlags(
DOUBLE_QUOTES | TRIPLE_QUOTED_STRING,
),
),
(
Newline,
21..21,
),
]
```


@@ -2,22 +2,21 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: triple_quoted_eol(UNIX_EOL)
---
## Tokens
```
[
(
String {
value: "\n test string\n ",
flags: AnyStringFlags {
prefix: Regular(
Empty,
),
triple_quoted: true,
quote_style: Double,
},
},
String(
"\n test string\n ",
),
0..21,
TokenFlags(
DOUBLE_QUOTES | TRIPLE_QUOTED_STRING,
),
),
(
Newline,
21..21,
),
]
```


@@ -2,22 +2,21 @@
source: crates/ruff_python_parser/src/lexer.rs
expression: triple_quoted_eol(WINDOWS_EOL)
---
## Tokens
```
[
(
String {
value: "\r\n test string\r\n ",
flags: AnyStringFlags {
prefix: Regular(
Empty,
),
triple_quoted: true,
quote_style: Double,
},
},
String(
"\r\n test string\r\n ",
),
0..23,
TokenFlags(
DOUBLE_QUOTES | TRIPLE_QUOTED_STRING,
),
),
(
Newline,
23..23,
),
]
```