[ty] AST garbage collection (#18482)
## Summary

Garbage collect ASTs once we are done checking a given file. Queries with a cross-file dependency on the AST will reparse the file on demand. This reduces ty's peak memory usage by ~20-30%.

The primary change of this PR is adding a `node_index` field to every AST node, which is assigned during parsing. `ParsedModule` can use this to create a flat index of AST nodes any time the file is parsed (or reparsed). This allows `AstNodeRef` to simply index into the current instance of the `ParsedModule` instead of storing a pointer directly.

The indices are somewhat hackily assigned (using an atomic integer) by the `parsed_module` query instead of by the parser directly. Assigning the indices in source order in the (recursive) parser turns out to be difficult, and collecting the nodes during semantic indexing is impossible because `SemanticIndex` does not hold onto a specific `ParsedModuleRef`, which the pointers in the flat AST are tied to. This means we have to do an extra AST traversal to assign the indices and collect the nodes into the flat index, but the small performance impact (~3% on cold runs) seems worth it for the memory savings.

Part of https://github.com/astral-sh/ty/issues/214.
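As a rough illustration of the mechanism described above, here is a minimal sketch with invented types (not ty's actual `ParsedModule`/`AstNodeRef` definitions): each node carries a `node_index`, the parsed module keeps a flat table from index to node, and a node reference stores only the index so it can be re-resolved against whichever parse is current.

```rust
// Minimal sketch, not the real ty types: nodes carry a `node_index`, the
// parsed module owns a flat index over its nodes, and a node reference stores
// only the index so it can be resolved against whichever parse is current.

/// Index assigned to each AST node when the module is (re)parsed.
#[derive(Copy, Clone, Debug, PartialEq, Eq)]
struct NodeIndex(u32);

/// Toy AST node; the real AST has many node kinds.
#[derive(Debug)]
struct Node {
    node_index: NodeIndex,
}

/// Stand-in for `ParsedModule`: owns the nodes plus the flat index over them.
struct ParsedModule {
    nodes: Vec<Node>,
}

impl ParsedModule {
    /// Rebuilds the flat index; this happens every time the file is parsed,
    /// so a garbage-collected AST can simply be reparsed later.
    fn from_ast(mut nodes: Vec<Node>) -> Self {
        for (i, node) in nodes.iter_mut().enumerate() {
            node.node_index = NodeIndex(i as u32);
        }
        Self { nodes }
    }
}

/// Stand-in for `AstNodeRef`: stores an index instead of a pointer into one
/// particular parse of the file.
struct AstNodeRef {
    index: NodeIndex,
}

impl AstNodeRef {
    /// Resolves against the *current* parsed module rather than the parse the
    /// reference was originally created from.
    fn node<'a>(&self, module: &'a ParsedModule) -> &'a Node {
        &module.nodes[self.index.0 as usize]
    }
}

fn main() {
    let module = ParsedModule::from_ast(vec![
        Node { node_index: NodeIndex(0) },
        Node { node_index: NodeIndex(0) },
    ]);
    let node_ref = AstNodeRef { index: NodeIndex(1) };
    // Dropping `module` and reparsing would yield a new `ParsedModule` that
    // this same index resolves against.
    assert_eq!(node_ref.node(&module).node_index, NodeIndex(1));
}
```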
This commit is contained in:
parent
76d9009a6e
commit
c9dff5c7d5
824 changed files with 25243 additions and 804 deletions
```diff
@@ -178,10 +178,18 @@ where
     V: Visitor<'a> + ?Sized,
 {
     match module {
-        ast::Mod::Module(ast::ModModule { body, range: _ }) => {
+        ast::Mod::Module(ast::ModModule {
+            body,
+            range: _,
+            node_index: _,
+        }) => {
             visitor.visit_body(body);
         }
-        ast::Mod::Expression(ast::ModExpression { body, range: _ }) => visitor.visit_expr(body),
+        ast::Mod::Expression(ast::ModExpression {
+            body,
+            range: _,
+            node_index: _,
+        }) => visitor.visit_expr(body),
     }
 }
 
```
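The hunk above shows the mechanical side of the change: because every node struct gains a `node_index` field, exhaustive struct patterns like the ones in this visitor must now destructure it, here with `node_index: _`. The summary also mentions that the indices themselves are assigned with an atomic integer in an extra traversal after parsing; below is a minimal sketch of that idea using toy types rather than ty's actual visitor or node definitions.

```rust
// Hedged sketch (toy types, not ty's actual code) of assigning `node_index`
// values with an atomic counter in a single extra traversal after parsing.

use std::sync::atomic::{AtomicU32, Ordering};

/// Toy AST node standing in for the many real node kinds.
struct Node {
    node_index: u32,
    children: Vec<Node>,
}

/// Walks the tree in source order, giving each node the next counter value.
fn assign_indices(node: &mut Node, counter: &AtomicU32) {
    node.node_index = counter.fetch_add(1, Ordering::Relaxed);
    for child in &mut node.children {
        assign_indices(child, counter);
    }
}

fn main() {
    let mut root = Node {
        node_index: 0,
        children: vec![
            Node { node_index: 0, children: vec![] },
            Node { node_index: 0, children: vec![] },
        ],
    };
    let counter = AtomicU32::new(0);
    assign_indices(&mut root, &counter);
    assert_eq!(root.node_index, 0);
    assert_eq!(root.children[1].node_index, 2);
}
```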