Turns out Neil didn't intend for *all* of his gen-branch work to get
committed.

tokenize.py:  I like these changes, and have tested them extensively
without even realizing it, so I just updated the docstring and the docs.
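For context, the generator-based tokenizing API kept here can be exercised like this under modern Python 3 (a minimal sketch; `io.StringIO` stands in for a real file's `readline`, and the sample source string is invented for illustration):

```python
import io
import tokenize

source = "def f(x):\n    return x + 1\n"

# generate_tokens() yields TokenInfo 5-tuples lazily instead of driving
# a tokeneater() callback for every token.
tokens = tokenize.generate_tokens(io.StringIO(source).readline)
names = [tok.string for tok in tokens if tok.type == tokenize.NAME]
# names == ['def', 'f', 'x', 'return', 'x']
```

Because the result is a generator, callers can stop consuming tokens early without tokenizing the rest of the input.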

tabnanny.py:  Also liked this, but did a little code fiddling.  I should
really rewrite this to *exploit* generators, but that's near the bottom
of my effort/benefit scale so I doubt I'll get to it anytime soon (it
would be most useful as a non-trivial example of ideal use of generators;
but test_generators.py has already grown plenty of food-for-thought
examples).

inspect.py:  I'm sure Ping intended for this to continue running even
under 1.5.2, so I reverted this to the last pre-gen-branch version.  The
"bugfix" I checked in in-between was actually repairing a bug *introduced*
by the conversion to generators, so it's OK that the reverted version
doesn't reflect that checkin.
This commit is contained in:
Tim Peters 2001-06-29 23:51:08 +00:00
parent 88e66254f9
commit 4efb6e9643
4 changed files with 79 additions and 47 deletions

@@ -349,28 +349,32 @@ class ListReader:
             return self.lines[i]
         else: return ''
 
+class EndOfBlock(Exception): pass
+
+class BlockFinder:
+    """Provide a tokeneater() method to detect the end of a code block."""
+    def __init__(self):
+        self.indent = 0
+        self.started = 0
+        self.last = 0
+
+    def tokeneater(self, type, token, (srow, scol), (erow, ecol), line):
+        if not self.started:
+            if type == tokenize.NAME: self.started = 1
+        elif type == tokenize.NEWLINE:
+            self.last = srow
+        elif type == tokenize.INDENT:
+            self.indent = self.indent + 1
+        elif type == tokenize.DEDENT:
+            self.indent = self.indent - 1
+            if self.indent == 0: raise EndOfBlock, self.last
+
 def getblock(lines):
     """Extract the block of code at the top of the given list of lines."""
-    indent = 0
-    started = 0
-    last = 0
-    tokens = tokenize.generate_tokens(ListReader(lines).readline)
-    for (type, token, (srow, scol), (erow, ecol), line) in tokens:
-        if not started:
-            if type == tokenize.NAME:
-                started = 1
-        elif type == tokenize.NEWLINE:
-            last = srow
-        elif type == tokenize.INDENT:
-            indent = indent + 1
-        elif type == tokenize.DEDENT:
-            indent = indent - 1
-            if indent == 0:
-                return lines[:last]
-    else:
-        raise ValueError, "unable to find block"
+    try:
+        tokenize.tokenize(ListReader(lines).readline, BlockFinder().tokeneater)
+    except EndOfBlock, eob:
+        return lines[:eob.args[0]]
 
 def getsourcelines(object):
     """Return a list of source lines and starting line number for an object.