When we switched to the Rust parser by default, mybinder stopped working, as reported in https://github.com/Instagram/LibCST/issues/1054. This is because the Binder Docker image does not include a Rust compiler or build tools; this change installs them via the apt.txt file.
Co-authored-by: Alvaro Leiva Geisse <aleivag@meta.com>
Moves PEP 621 metadata from `setup.py` and `requirements*.txt` into the
`[project]` table of `pyproject.toml`. This enables using hatch as a
task runner for the project, where previously one would need to remember
a bunch of different commands, or repeatedly consult the readme's
developer guide to find all of the relevant commands.
This creates the following hatch commands:
- docs
- fixtures
- format
- lint
- test
- typecheck
It also updates all of the GitHub Actions workflows and the readme's
developer guide to use the appropriate hatch commands, so that there is
a single source of truth for how tests are run.
The "test" workflows now drop the matrix distinction between "pure" or
"native", and run tests in both modes from a single build.
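The metadata and task-runner change above might look roughly like the following `pyproject.toml` fragment. All of the values below are illustrative assumptions (the script names just mirror the command list above); see the PR's actual `pyproject.toml` for the real contents:

```toml
[project]
name = "libcst"
# Illustrative; the real file pins the actual supported Python range.
requires-python = ">=3.7"

[tool.hatch.envs.default.scripts]
# Hypothetical script bodies; the real ones live in the PR.
docs = "sphinx-build docs/source docs/build"
format = "black libcst"
lint = "flake8 libcst"
test = "python -m unittest discover libcst"
typecheck = "pyre check"
```

With scripts defined this way, each workflow can invoke a single `hatch run <script>` command instead of remembering each tool's CLI.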
ghstack-source-id: 8834da7825
Pull Request resolved: https://github.com/Instagram/LibCST/pull/893
This massive PR implements an alternative Python parser that will allow LibCST to parse Python 3.10's new grammar features. The parser is implemented in Rust, but it is turned off by default; it is controlled by the `LIBCST_PARSER_TYPE` environment variable, which can be set to `native` to enable it. The PR also enables new CI steps that test just the Rust parser, as well as steps that produce binary wheels for a variety of CPython versions and platforms.
Note: this PR aims to be roughly feature-equivalent to the main branch, so it doesn't include new 3.10 syntax features. That will be addressed as a follow-up PR.
The new parser is implemented in the `native/` directory and is organized into two Rust crates: `libcst_derive` contains some macros to facilitate various features of CST nodes, and `libcst` contains the `parser` itself (including the Python grammar), a `tokenizer` implementation by @bgw, and a very basic representation of CST `nodes`. Parsing is done by:
1. **tokenizing** the input UTF-8 string (bytes are not supported at the Rust layer; they are converted to UTF-8 strings by the Python wrapper)
2. running the **PEG parser** on the tokenized input, which also captures certain anchor tokens in the resulting syntax tree
3. using the anchor tokens to **inflate** the syntax tree into a proper CST
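The three steps above can be sketched with a toy grammar in Python. This is purely illustrative: the real implementation is in Rust, the real grammar is a PEG grammar for Python (not this `num (+ num)*` toy), and all names here are made up for the sketch. The point is the flow: tokens carry offsets, the parser keeps the relevant tokens as anchors, and inflation uses anchor offsets to recover surrounding whitespace from the source.

```python
import re

# Step 1: tokenize the UTF-8 input into (kind, text, offset) tuples.
def tokenize(src):
    token_re = re.compile(r"\s+|\d+|\+")
    tokens, pos = [], 0
    while pos < len(src):
        match = token_re.match(src, pos)
        text = match.group()
        kind = "ws" if text.isspace() else ("num" if text.isdigit() else "op")
        tokens.append((kind, text, pos))
        pos = match.end()
    return tokens

# Step 2: parse "num (+ num)*", keeping the tokens themselves as anchors
# in the syntax tree (a stand-in for what the PEG parser captures).
def parse(tokens):
    sig = [tok for tok in tokens if tok[0] != "ws"]
    tree = {"kind": "num", "anchor": sig[0]}
    for i in range(1, len(sig), 2):
        tree = {
            "kind": "binop",
            "left": tree,
            "op_anchor": sig[i],
            "right": {"kind": "num", "anchor": sig[i + 1]},
        }
    return tree

# Step 3: "inflate" the syntax tree into a CST, using each anchor's
# offset to recover the whitespace that precedes it in the source.
def ws_before(src, offset):
    start = offset
    while start > 0 and src[start - 1].isspace():
        start -= 1
    return src[start:offset]

def inflate(node, src):
    if node["kind"] == "num":
        anchor = node["anchor"]
        return {"kind": "Integer", "value": anchor[1],
                "ws_before": ws_before(src, anchor[2])}
    op = node["op_anchor"]
    return {"kind": "BinaryOp",
            "left": inflate(node["left"], src),
            "operator": {"text": op[1], "ws_before": ws_before(src, op[2])},
            "right": inflate(node["right"], src)}

src = "1 +  2"
cst = inflate(parse(tokenize(src)), src)
```

Because the CST keeps the recovered whitespace (note the two spaces before `2` surviving as `ws_before`), re-rendering it can reproduce the input byte-for-byte, which is the property LibCST cares about.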
Co-authored-by: Benjamin Woodruff <github@benjam.info>
* Use setuptools-scm to derive the current version from git metadata
* Add GitHub Actions equivalent to the current CircleCI tasks
* Run pyre integration test in GH action / tox
* Add tests to verify that LibCST handles string annotations.
This is an important property for certain use cases, so it
makes sense to verify it in tests so that we can safely
depend on it.
At present, the reason we want to be able to rely on this is:
- at the moment, imports added by infer can make pysa traces
hard to understand, because the line numbers are off
- if we add the ability to use fully-qualified string annotations
for the stubs from infer, then we can do so without adding
any import lines and pyre will understand the types.
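As a tiny illustration of the property being verified (shown here with the stdlib `ast` module rather than LibCST): a fully-qualified string annotation is just a string literal in the tree, so it parses without any corresponding import line existing.

```python
import ast

# A stub-style function using a fully-qualified string annotation;
# no "import foo.bar" line is needed for this to parse.
source = 'def f() -> "foo.bar.Baz": ...'

tree = ast.parse(source)
returns = tree.body[0].returns

# The annotation survives as a plain string constant in the tree.
assert isinstance(returns, ast.Constant)
assert returns.value == "foo.bar.Baz"
```

This is exactly why infer-generated stubs with string annotations avoid the off-by-line-number problem: no import lines need to be inserted, yet pyre still understands the types.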
* ApplyTypeAnnotations: add unit test of how import statements are merged
Add a unit test illustrating how the codemod handles various cases
of import statements in the stub file.
Explicitly call out each of the unsupported patterns:
- bare imports (we probably should support this)
- relative imports (we probably should support this)
- star imports (we probably don't want to support this)
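For context, the supported pattern boils down to merging `from`-imports by module. A minimal sketch of that idea follows; this is a hypothetical helper for illustration, not the codemod's actual implementation, and it simply ignores the bare/relative/star cases listed above.

```python
# Hypothetical helper: merge "from X import a, b" style imports by module.
# Bare, relative, and star imports (the unsupported cases) are out of scope.
def merge_from_imports(existing, stub):
    """Both arguments map module name -> set of imported names."""
    merged = {mod: set(names) for mod, names in existing.items()}
    for mod, names in stub.items():
        merged.setdefault(mod, set()).update(names)
    return merged

code_imports = {"typing": {"List"}}
stub_imports = {"typing": {"Optional"}, "collections": {"OrderedDict"}}
merged = merge_from_imports(code_imports, stub_imports)
```

Here the stub's `Optional` is folded into the existing `from typing import ...`, while `collections` becomes a new import.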
* Add .python-version to .gitignore
This will be helpful for anyone using pyenv (I accidentally committed
my python version file in a draft branch).
- Add `libcst/__init__.py` back which I accidentally deleted in another
commit.
- Add `*.egg-info/` to the gitignore, because `libcst.egg-info` is
  created by pip/setuptools when locally installing libcst, and it's
  annoying.
- Changed the version number from `0.1.dev` to `0.1.dev0`, since pip was
  warning that it was normalizing the version number from the former to
  the latter.
- Add a `python_requires` field, since we know that libcst only works on
3.6+.
- Add an `install_requires`. Pip uses this to find dependencies, and
ignores `requirements.txt` (since `requirements.txt` is really only
intended to be a freeze file).
- Add the dataclasses backport as a dependency for Python 3.6. I
validated that installing and using libcst works in both 3.6 and 3.7.
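Taken together, the packaging changes above amount to something like this `setup.py` sketch (the version and the exact dependency list are illustrative; see the actual file for the real values):

```python
from setuptools import find_packages, setup

setup(
    name="libcst",
    version="0.1.dev0",
    packages=find_packages(),
    # LibCST only works on Python 3.6+.
    python_requires=">=3.6",
    # Pip reads install_requires; requirements.txt is just a freeze file.
    install_requires=[
        "parso",
        "typing_extensions",
        # dataclasses backport, selected for 3.6 via an environment marker.
        'dataclasses; python_version < "3.7"',
    ],
)
```

The environment marker on `dataclasses` is what lets a single sdist work on both 3.6 (where the backport is installed) and 3.7+ (where it is skipped).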
**Test Plan:**
```
$ python3 -m venv libcst-install-test # my system python is 3.7
$ libcst-install-test/bin/pip install --upgrade pip ipython
Cache entry deserialization failed, entry ignored
Collecting pip
Using cached
be401c0032/pip-19.1.1-py2.py3-none-any.whl
Collecting ipython
... # lots of output
$ ~/libcst-install-test/bin/pip install ~/libcst/
Processing ./libcst
Requirement already satisfied: parso in
./libcst-install-test/lib/python3.7/site-packages (from
libcst==0.1.dev0) (0.4.0)
Collecting typing_extensions (from libcst==0.1.dev0)
Using cached
c66e553258/typing_extensions-3.7.2-py3-none-any.whl
Installing collected packages: typing-extensions, libcst
Running setup.py install for libcst ... done
Successfully installed libcst-0.1.dev0 typing-extensions-3.7.2
$ ~/libcst-install-test/bin/ipython
Python 3.7.3 (default, Apr 3 2019, 05:39:12)
Type 'copyright', 'credits' or 'license' for more information
IPython 7.5.0 -- An enhanced Interactive Python. Type '?' for help.
In [1]: from libcst import parser
In [2]: parser.parse_expression("None")
Out[2]:
Name(
value='None',
lpar=[],
rpar=[],
)
In [3]:
```
I then repeated the same with a copy of CPython 3.6 that I built from
source.