Add Tokens newtype wrapper, TokenKind iterator (#11361)

## Summary

Alternative to #11237 

This PR adds a new `Tokens` struct, a newtype wrapper around a vector of
lexer output. The wrapper lets us add a `kinds` method that returns an
iterator over the corresponding `TokenKind`s. The iterator is implemented
as a dedicated `TokenKindIter` struct so that the type can be named and so
that additional methods like `peek` can be provided directly on the
iterator.
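
Here is a minimal sketch of that shape. The `Tokens`, `kinds`, `TokenKindIter`, and `peek` names come from the description above; the `Tok` enum and its variants are simplified stand-ins, since ruff's real lexer output also carries source ranges and lexical errors:

```rust
/// Simplified stand-in for ruff's `Tok`: the full token, which may own
/// string data.
#[derive(Debug)]
enum Tok {
    Name(String),
    Number(String),
    Newline,
}

/// Lightweight, `Copy`able discriminant of a `Tok`.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum TokenKind {
    Name,
    Number,
    Newline,
}

impl Tok {
    fn kind(&self) -> TokenKind {
        match self {
            Tok::Name(_) => TokenKind::Name,
            Tok::Number(_) => TokenKind::Number,
            Tok::Newline => TokenKind::Newline,
        }
    }
}

/// Newtype wrapper around the lexer output.
struct Tokens(Vec<Tok>);

impl Tokens {
    /// Returns an iterator over the corresponding `TokenKind`s.
    fn kinds(&self) -> TokenKindIter<'_> {
        TokenKindIter {
            inner: self.0.iter().peekable(),
        }
    }
}

/// A named iterator type, so callers can store it and extra methods
/// like `peek` can live directly on the iterator.
struct TokenKindIter<'a> {
    inner: std::iter::Peekable<std::slice::Iter<'a, Tok>>,
}

impl TokenKindIter<'_> {
    /// Returns the next kind without consuming it.
    fn peek(&mut self) -> Option<TokenKind> {
        self.inner.peek().map(|tok| tok.kind())
    }
}

impl Iterator for TokenKindIter<'_> {
    type Item = TokenKind;

    fn next(&mut self) -> Option<Self::Item> {
        self.inner.next().map(|tok| tok.kind())
    }
}
```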

This gives the linter access to the stream of `TokenKind`s instead of
`Tok`s.
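
As an illustration against the sketch above (these helpers are hypothetical, not code from this PR), a check can work purely on kinds:

```rust
/// Peek at the first kind without consuming it; `peek` is available
/// directly on the iterator type.
fn starts_with_name(tokens: &Tokens) -> bool {
    tokens.kinds().peek() == Some(TokenKind::Name)
}

/// Plain `Iterator` usage: filter on the lightweight `TokenKind`
/// without touching the owned `Tok` data.
fn count_newlines(tokens: &Tokens) -> usize {
    tokens.kinds().filter(|&kind| kind == TokenKind::Newline).count()
}
```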

Edit: I've made the necessary downstream changes and plan to merge the
entire stack at once.
Dhruv Manilawala committed 2024-05-14 22:15:04 +05:30 (via GitHub)
parent 50f14d017e · commit 025768d303
9 changed files with 142 additions and 25 deletions


```diff
@@ -13,7 +13,6 @@ use ruff_linter::{
 use ruff_python_ast::PySourceType;
 use ruff_python_codegen::Stylist;
 use ruff_python_index::Indexer;
-use ruff_python_parser::lexer::LexResult;
 use ruff_python_parser::AsMode;
 use ruff_source_file::Locator;
 use ruff_text_size::Ranged;
@@ -76,7 +75,7 @@ pub(crate) fn check(
     let source_kind = SourceKind::Python(contents.to_string());
     // Tokenize once.
-    let tokens: Vec<LexResult> = ruff_python_parser::tokenize(contents, source_type.as_mode());
+    let tokens = ruff_python_parser::tokenize(contents, source_type.as_mode());
     // Map row and column locations to byte slices (lazily).
     let locator = Locator::with_index(contents, index);
```
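
The second hunk shows the call-site effect: assuming `tokenize` now returns the `Tokens` wrapper described above, the explicit `Vec<LexResult>` annotation is no longer needed, and downstream code can call `kinds()` whenever it wants the `TokenKind` stream.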