tokenize is just broken on test_pep3131.py

Benjamin Peterson 2011-08-13 00:33:21 -05:00
parent be66287e20
commit 963e40256a


@@ -520,6 +520,9 @@ pass the '-ucpu' option to process the full directory.
 >>> tempdir = os.path.dirname(f) or os.curdir
 >>> testfiles = glob.glob(os.path.join(tempdir, "test*.py"))
+tokenize is broken on test_pep3131.py because regular expressions are broken on
+the obscure unicode identifiers in it. *sigh*
+>>> testfiles.remove(os.path.join(tempdir, "test_pep3131.py"))
 >>> if not support.is_resource_enabled("cpu"):
 ...     testfiles = random.sample(testfiles, 10)
 ...
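The doctest being patched collects `test*.py` files with `glob`, drops the known-problematic `test_pep3131.py`, and samples a subset when the `cpu` resource is disabled. That selection logic can be sketched as a standalone helper; the function name, `exclude` default, and `full_run`/`sample_size` parameters are assumptions for illustration, not part of the commit:

```python
import glob
import os
import random

def select_testfiles(tempdir, exclude=("test_pep3131.py",),
                     full_run=False, sample_size=10):
    """Collect test*.py files from tempdir, drop excluded names,
    and sample a subset unless a full (e.g. -ucpu) run was requested."""
    testfiles = glob.glob(os.path.join(tempdir, "test*.py"))
    for name in exclude:
        # Mirrors the patch: remove the file tokenize cannot handle.
        path = os.path.join(tempdir, name)
        if path in testfiles:
            testfiles.remove(path)
    if not full_run and len(testfiles) > sample_size:
        # Quick run: tokenize only a random sample of the files.
        testfiles = random.sample(testfiles, sample_size)
    return testfiles
```

Removing the file before sampling matters: if the exclusion happened after `random.sample`, the broken file could still be drawn into the sample on a quick run.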