We decided that their type mapping is not ready for the public API
yet. For example, for duration we'd like to replace the opaque i64
with another opaque type that has convenient conversions to
std::time::Duration (Rust) or std::chrono::milliseconds (C++). We
can't use either directly because we need FFI compatibility, in order
to create an instance in C++ and pass it to the Rust run-time.
So this hides such properties and instead produces a warning.
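A rough sketch of the kind of type we have in mind (the name and
layout here are hypothetical, not what will eventually land; the
std::chrono::milliseconds conversions would live on the C++ side of
the bindings):

    // Hypothetical FFI-compatible duration value: plain #[repr(C)]
    // data so an instance can be created in C++ and passed to the
    // Rust run-time, with convenient conversions on the Rust side.
    #[repr(C)]
    #[derive(Clone, Copy, Debug, PartialEq)]
    pub struct DurationMs {
        pub milliseconds: i64,
    }

    impl From<DurationMs> for std::time::Duration {
        fn from(d: DurationMs) -> Self {
            std::time::Duration::from_millis(d.milliseconds as u64)
        }
    }

    impl From<std::time::Duration> for DurationMs {
        fn from(d: std::time::Duration) -> Self {
            DurationMs { milliseconds: d.as_millis() as i64 }
        }
    }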
Improve the Parser functionality by allowing access to the nth Token,
resulting in a much more flexible API.
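As a rough illustration of what this enables (the types and fields
below are made up, not the actual parser internals):

    // Illustrative look-ahead: return the token `n` positions after
    // the cursor without consuming it; nth(0) is the current token.
    #[derive(Clone, Default, Debug)]
    struct Token {
        text: String,
    }

    struct Parser {
        tokens: Vec<Token>,
        cursor: usize,
    }

    impl Parser {
        fn nth(&self, n: usize) -> Token {
            self.tokens
                .get(self.cursor + n)
                .cloned()
                .unwrap_or_default()
        }
    }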
Signed-off-by: Patrick José Pereira <patrickelectric@gmail.com>
Sometimes re-usable components need to act as containers that allow the
user to place other items inside. The component needs to be able to
control the placement of these user-provided elements. That is what
the new $children expression inside elements does.
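For illustration, a hedged sketch in the markup language (the
component and element names are made up; only the $children
expression itself comes from this change):

    // A re-usable container component: elements that the user places
    // inside an instance of MyContainer end up where $children appears.
    MyContainer := Rectangle {
        $children
    }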
Instead of doing the SyntaxNodeWithSourceFile construction as late as
possible (in Document::from_node), we can do it as early as possible.
That'll reduce the chances of missing the source file and prepare
for dependency loading based just on a SNWS :)
This also makes the source_file optional in the SNWS, but that's
consistent with what the diagnostics expect.
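Roughly the shape this leaves us with (a simplified sketch, not the
actual definitions; the node type is kept generic here only to keep
the example self-contained):

    use std::path::PathBuf;
    use std::rc::Rc;

    // Stands in for the real source file representation.
    #[derive(Debug)]
    struct SourceFile {
        path: PathBuf,
    }

    // Simplified SyntaxNodeWithSourceFile: the file handle travels
    // with the node from construction time on, and is optional
    // because some nodes have no file, which is consistent with
    // what the diagnostics expect.
    #[derive(Clone)]
    struct SyntaxNodeWithSourceFile<Node: Clone> {
        node: Node,
        source_file: Option<Rc<SourceFile>>,
    }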
Avoid unnecessary SyntaxKind variants. The import URI is simply the
first string literal in the ImportSpecifier.
Also cover the case of an empty import list, and use expect() instead
of test() to produce a meaningful error message.
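To illustrate the test()/expect() distinction (Parser and SyntaxKind
below are minimal stand-ins, not the real types):

    #[derive(Clone, Copy, PartialEq, Debug)]
    enum SyntaxKind {
        StringLiteral,
    }

    #[derive(Default)]
    struct Parser {
        current: Option<SyntaxKind>,
        errors: Vec<String>,
    }

    impl Parser {
        // test(): consume the token if it matches, stay silent otherwise.
        fn test(&mut self, kind: SyntaxKind) -> bool {
            if self.current == Some(kind) {
                self.current = None;
                true
            } else {
                false
            }
        }

        // expect(): like test(), but record a meaningful error when the
        // expected token (e.g. the import URI string literal) is missing.
        fn expect(&mut self, kind: SyntaxKind) -> bool {
            let ok = self.test(kind);
            if !ok {
                self.errors.push(format!("Expected {:?}", kind));
            }
            ok
        }
    }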
When going from the plain rowan::SyntaxNode tree to the syntax_nodes::*
elements, attach the source file and keep track of it from that point
on. That'll pave the way for proper multi-file diagnostics generated
later on from the passes, where we store syntax_nodes::* types.
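A sketch of the idea (all types below are simplified placeholders;
the point is only that the source file is attached when the typed
wrapper is built):

    use std::rc::Rc;

    struct SourceFile;   // stands in for the real source file type
    struct UntypedNode;  // stands in for the plain rowan::SyntaxNode

    // Stand-in for one of the typed syntax_nodes::* wrappers.
    struct Element {
        node: UntypedNode,
        source_file: Rc<SourceFile>,
    }

    impl Element {
        // Hypothetical constructor: the source file is attached right
        // at the boundary between the untyped and the typed tree, so
        // every later pass can tell which file a node came from.
        fn from_node(node: UntypedNode, source_file: Rc<SourceFile>) -> Self {
            Element { node, source_file }
        }
    }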
Let the bulk of the push_error() calls take a Spanned trait impl, so
that we can pass the node at the call sites. Then, when we later
change the underlying trait to pass something that can also provide
the source file, we won't have to change all call sites again.
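Roughly, with the trait and method shapes simplified:

    // Simplified sketch: diagnostics only depend on the Spanned trait,
    // so its definition can later grow a way to also provide the source
    // file without touching every push_error() caller.
    #[derive(Clone, Copy, Debug, Default)]
    struct Span {
        offset: usize,
    }

    trait Spanned {
        fn span(&self) -> Span;
    }

    #[derive(Default)]
    struct Diagnostics {
        errors: Vec<(String, Span)>,
    }

    impl Diagnostics {
        // Call sites pass the node itself instead of extracting a span
        // by hand.
        fn push_error(&mut self, message: String, node: &impl Spanned) {
            self.errors.push((message, node.span()));
        }
    }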
Don't require the callers to hold on to the source code string until
an eventual diagnostics code path is hit. Instead it turns out to be
simpler to let the parser consume the source code as a String;
internally, after tokenizing, it can be moved into the diagnostics
and from there into the code map if needed.
There are a few places where we now clone the source code, but that's
only in cases where we also extract things separately (the test code)
or in the syntax updater.
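The intended ownership flow, as a heavily simplified sketch:

    // Simplified sketch: the caller hands over the source string;
    // after tokenizing, the parser moves it into the diagnostics
    // instead of requiring the caller to keep it alive.
    #[derive(Default)]
    struct Diagnostics {
        source: Option<String>,
        messages: Vec<String>,
    }

    struct SyntaxNode; // stands in for the parsed tree

    fn parse(source: String, diagnostics: &mut Diagnostics) -> SyntaxNode {
        // ... tokenize `source` and build the tree ...
        // The string is no longer needed here, so move it into the
        // diagnostics; from there it can go into the code map if a
        // diagnostic actually has to be rendered.
        diagnostics.source = Some(source);
        SyntaxNode
    }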