[red-knot] Goto type definition (#16901)

## Summary

Implement basic *Goto type definition* support for Red Knot's LSP.

This PR also builds the foundation for other LSP operations: Goto
definition, hover, and similar features should be able to reuse some, if
not most, of the logic introduced here.

The basic steps for resolving a type definition are (a rough sketch follows the list):

1. Find the closest token for the cursor offset. This is a bit more
subtle than I first anticipated because the cursor could be positioned
right between the callee and the `(` in `call(test)`, in which case we
want to resolve the type for `call`.
2. Find the node with the minimal range that fully encloses the token
found in step 1. I suspect that steps 1 and 2 could be done at the same
time, but combining them complicated things because we also need to
compute the spine (ancestor chain) for the node, and there's no guarantee
that the nodes found in each step share the same ancestors.
3. Reduce the node found in step 2 to a node that is a valid goto target.
This may require traversing upwards, e.g. to find the closest enclosing expression.
4. Resolve the type for the goto target.
5. Resolve the location for that type and return it to the LSP client.
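
A rough sketch of how these steps might fit together; most names here (`goto_type_definition`, `Db`, `File`, `Location`, `parsed_module`, `covering_node`, `find_goto_target`, `inferred_type`, `definition_locations`) are illustrative placeholders rather than the actual Red Knot API, while `Tokens::at_offset` and `TokenAt` are the helpers added in this PR:

```rust
// Rough sketch only: everything except `Tokens::at_offset` and `TokenAt` is a
// placeholder name, not the actual Red Knot API.
use ruff_python_parser::TokenAt;
use ruff_text_size::{Ranged, TextSize};

fn goto_type_definition(db: &dyn Db, file: File, offset: TextSize) -> Option<Vec<Location>> {
    let parsed = parsed_module(db, file);

    // 1. Find the closest token for the cursor offset. The cursor may sit exactly
    //    between two tokens, e.g. `call<CURSOR>(test)`.
    let token = match parsed.tokens().at_offset(offset) {
        TokenAt::None => return None,
        TokenAt::Single(token) => token,
        // Prefer the callee in `call<CURSOR>(test)`; the real implementation may
        // rank the two tokens differently (e.g. by token kind).
        TokenAt::Between(left, _right) => left,
    };

    // 2. Find the node with the minimal range that fully encloses the token,
    //    keeping the ancestor chain ("spine") so we can walk upwards later.
    let covering = covering_node(parsed.syntax(), token.range());

    // 3. Reduce the covering node to a valid goto target, e.g. by walking up to
    //    the closest enclosing expression.
    let goto_target = find_goto_target(&covering)?;

    // 4. Resolve the type for the goto target.
    let ty = goto_target.inferred_type(db)?;

    // 5. Resolve the location(s) for that type and return them to the LSP client.
    Some(ty.definition_locations(db))
}
```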

## Design decisions

The current implementation navigates to the inferred type. I think this
is what we want because it means that it correctly accounts for
narrowing (in which case we want to go to the narrowed type because
that's the value's type at the given position). However, it does have
the downside that Goto type definition doesn't work whenever we infer `T
& Unknown` because intersection types aren't supported. I'm not sure
what to do about this specific case, other than maybe ignoring `Unknown`
in Goto type definition if the type is an intersection?
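
For illustration only, "ignoring `Unknown`" could look roughly like the sketch below; `Type::Intersection`, `positive_elements`, and `is_unknown` are hypothetical names, not the actual Red Knot type API:

```rust
// Hypothetical sketch: when the inferred type is an intersection such as
// `T & Unknown`, drop the `Unknown` part before resolving goto locations so
// navigation still lands on `T`.
fn goto_relevant_types(ty: &Type) -> Vec<&Type> {
    match ty {
        Type::Intersection(intersection) => intersection
            .positive_elements()
            .iter()
            .filter(|element| !element.is_unknown())
            .collect(),
        other => vec![other],
    }
}
```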

## Known limitations

* Types defined in the vendored typeshed aren't supported because the
client can't open files from the Red Knot binary (we can either
implement our own file protocol and handler OR extract the typeshed
files to disk and point the client at them). See
https://github.com/astral-sh/ruff/issues/17041
* Red Knot only exposes an API to get types for expressions and
definitions. However, there are many other nodes with identifiers that
can have a type (e.g. names in a `global` statement, match patterns,
...). We can add support for those in separate PRs (after we figure out
how to query the types from the semantic model). See
https://github.com/astral-sh/ruff/issues/17113
* We should have a higher-level API for the LSP that doesn't directly
call semantic queries. I intentionally decided not to design that API
just yet.


## Test plan


https://github.com/user-attachments/assets/fa077297-a42d-4ec8-b71f-90c0802b4edb

Goto type definition on a union

<img width="1215" alt="Screenshot 2025-04-01 at 13 02 55"
src="https://github.com/user-attachments/assets/689cabcc-4a86-4a18-b14a-c56f56868085"
/>



Note: I recorded this using a custom typeshed path so that navigating to
builtins works.
Micha Reiser 2025-04-02 14:12:48 +02:00 committed by GitHub
parent 7e97910704
commit 2ae39edccf
32 changed files with 1765 additions and 84 deletions

@@ -63,7 +63,6 @@
//! [lexical analysis]: https://en.wikipedia.org/wiki/Lexical_analysis
//! [parsing]: https://en.wikipedia.org/wiki/Parsing
//! [lexer]: crate::lexer
use std::iter::FusedIterator;
use std::ops::Deref;
@@ -558,6 +557,86 @@ impl Tokens {
}
}
/// Searches for the token(s) at `offset`.
///
/// Returns [`TokenAt::Between`] if `offset` points directly in between two tokens
/// (the left token ends at `offset` and the right token starts at `offset`).
///
/// ## Examples
///
/// [Playground](https://play.ruff.rs/f3ad0a55-5931-4a13-96c7-b2b8bfdc9a2e?secondary=Tokens)
///
/// ```
/// # use ruff_python_ast::PySourceType;
/// # use ruff_python_parser::{Token, TokenAt, TokenKind};
/// # use ruff_text_size::{Ranged, TextSize};
///
/// let source = r#"
/// def test(arg):
/// arg.call()
/// if True:
/// pass
/// print("true")
/// "#.trim();
///
/// let parsed = ruff_python_parser::parse_unchecked_source(source, PySourceType::Python);
/// let tokens = parsed.tokens();
///
/// let collect_tokens = |offset: TextSize| {
/// tokens.at_offset(offset).into_iter().map(|t| (t.kind(), &source[t.range()])).collect::<Vec<_>>()
/// };
///
/// assert_eq!(collect_tokens(TextSize::new(4)), vec![(TokenKind::Name, "test")]);
/// assert_eq!(collect_tokens(TextSize::new(6)), vec![(TokenKind::Name, "test")]);
/// // between `arg` and `.`
/// assert_eq!(collect_tokens(TextSize::new(22)), vec![(TokenKind::Name, "arg"), (TokenKind::Dot, ".")]);
/// assert_eq!(collect_tokens(TextSize::new(36)), vec![(TokenKind::If, "if")]);
/// // Before the dedent token
/// assert_eq!(collect_tokens(TextSize::new(57)), vec![]);
/// ```
pub fn at_offset(&self, offset: TextSize) -> TokenAt {
match self.binary_search_by_key(&offset, ruff_text_size::Ranged::start) {
// The token at `index` starts exactly at `offset`.
// ```python
// object.attribute
// ^ OFFSET
// ```
Ok(index) => {
let token = self[index];
// `token` starts exactly at `offset`. Test if the offset is right between
// `token` and the previous token (if any).
if let Some(previous) = index.checked_sub(1).map(|idx| self[idx]) {
if previous.end() == offset {
return TokenAt::Between(previous, token);
}
}
TokenAt::Single(token)
}
// No token found that starts exactly at the given offset. But it's possible that
// the token starting before `offset` fully encloses `offset` (its range ends after `offset`).
// ```python
// object.attribute
// ^ OFFSET
// # or
// if True:
// print("test")
// ^ OFFSET
// ```
Err(index) => {
if let Some(previous) = index.checked_sub(1).map(|idx| self[idx]) {
if previous.range().contains_inclusive(offset) {
return TokenAt::Single(previous);
}
}
TokenAt::None
}
}
}
/// Returns a slice of tokens after the given [`TextSize`] offset.
///
/// If the given offset is between two tokens, the returned slice will start from the following
@@ -610,6 +689,39 @@ impl Deref for Tokens {
}
}
/// A token that encloses a given offset or ends exactly at it.
pub enum TokenAt {
/// There's no token at the given offset.
None,
/// There's a single token at the given offset.
Single(Token),
/// The offset falls exactly between two tokens. E.g. `CURSOR` in `call<CURSOR>(arguments)` is
/// positioned exactly between the `call` and `(` tokens.
Between(Token, Token),
}
impl Iterator for TokenAt {
type Item = Token;
fn next(&mut self) -> Option<Self::Item> {
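// For `Between`, yield the left token first, then the right;
// `Single` yields once and `None` yields nothing.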
match *self {
TokenAt::None => None,
TokenAt::Single(token) => {
*self = TokenAt::None;
Some(token)
}
TokenAt::Between(first, second) => {
*self = TokenAt::Single(second);
Some(first)
}
}
}
}
impl FusedIterator for TokenAt {}
impl From<&Tokens> for CommentRanges {
fn from(tokens: &Tokens) -> Self {
let mut ranges = vec![];