Merged revisions 68503,68507,68694,69054,69673,69679-69681,70991,70999,71003,71695 via svnmerge from
svn+ssh://pythondev@svn.python.org/sandbox/trunk/2to3/lib2to3

........
  r68503 | benjamin.peterson | 2009-01-10 14:14:49 -0600 (Sat, 10 Jan 2009) | 1 line

  use variable
........
  r68507 | benjamin.peterson | 2009-01-10 15:13:16 -0600 (Sat, 10 Jan 2009) | 1 line

  rewrap
........
  r68694 | benjamin.peterson | 2009-01-17 17:55:59 -0600 (Sat, 17 Jan 2009) | 1 line

  test for specific node type
........
  r69054 | guilherme.polo | 2009-01-28 10:01:54 -0600 (Wed, 28 Jan 2009) | 2 lines

  Added mapping for the ttk module.
........
  r69673 | benjamin.peterson | 2009-02-16 09:38:22 -0600 (Mon, 16 Feb 2009) | 1 line

  fix handling of 'as' imports #5279
........
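  This fix covers renamed-module imports that use an alias. A minimal
  illustrative sketch (fix_imports maps StringIO to io; the alias name
  here is hypothetical):

      # Python 2 input:
      import StringIO as sio
      buf = sio.StringIO()

      # Output after 2to3's fix_imports (the alias is preserved):
      import io as sio
      buf = sio.StringIO()
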
  r69679 | benjamin.peterson | 2009-02-16 11:36:06 -0600 (Mon, 16 Feb 2009) | 1 line

  make Base.get_next_sibling() and Base.get_prev_sibling() properties
........
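  After this change, sibling access is attribute-style. A minimal sketch,
  assuming node is any node in a parsed lib2to3 tree:

      # Old API: node.get_next_sibling(), node.get_prev_sibling()
      # New API: properties on pytree.Base
      nxt = node.next_sibling    # node's right neighbor, or None
      prev = node.prev_sibling   # node's left neighbor, or None
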
  r69680 | benjamin.peterson | 2009-02-16 11:41:48 -0600 (Mon, 16 Feb 2009) | 1 line

  normalize docstrings in pytree according to PEP 257
........
  r69681 | benjamin.peterson | 2009-02-16 11:43:09 -0600 (Mon, 16 Feb 2009) | 1 line

  use a set
........
  r70991 | benjamin.peterson | 2009-04-01 15:54:50 -0500 (Wed, 01 Apr 2009) | 1 line

  map urllib.urlopen to urllib.request.urlopen #5637
........
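  Illustrative before/after for this mapping (fix_urllib also rewrites
  the import statement itself to the matching urllib submodules):

      # Python 2 input:
      data = urllib.urlopen("http://example.com/").read()

      # Python 3 output after 2to3's fix_urllib:
      data = urllib.request.urlopen("http://example.com/").read()
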
  r70999 | benjamin.peterson | 2009-04-01 17:36:47 -0500 (Wed, 01 Apr 2009) | 1 line

  add experimental (very alpha) support for running 2to3 concurrently with multiprocessing
........
  r71003 | benjamin.peterson | 2009-04-01 18:10:43 -0500 (Wed, 01 Apr 2009) | 1 line

  fix the case where multiprocessing is unavailable or not in use
........
  r71695 | benjamin.peterson | 2009-04-17 22:21:29 -0500 (Fri, 17 Apr 2009) | 1 line

  refactor multiprocessing support so it is less hacky to use and only imports multiprocessing when needed
........
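  With this refactoring, concurrency is opt-in via the -j/--processes
  option added in r70999, e.g. 2to3 -j 4 -w mypackage/. A programmatic
  sketch of the same invocation (mypackage/ is a placeholder path):

      from lib2to3.main import main

      # Run the standard fixers over mypackage/ with 4 worker processes,
      # writing changes back to disk (-w):
      main("lib2to3.fixes", args=["-j", "4", "-w", "mypackage/"])
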
diff --git a/Lib/lib2to3/patcomp.py b/Lib/lib2to3/patcomp.py
index 353b960..ab525f2 100644
--- a/Lib/lib2to3/patcomp.py
+++ b/Lib/lib2to3/patcomp.py
@@ -30,7 +30,7 @@
 
 def tokenize_wrapper(input):
     """Tokenizes a string suppressing significant whitespace."""
-    skip = (token.NEWLINE, token.INDENT, token.DEDENT)
+    skip = set((token.NEWLINE, token.INDENT, token.DEDENT))
     tokens = tokenize.generate_tokens(driver.generate_lines(input).next)
     for quintuple in tokens:
         type, value, start, end, line_text = quintuple
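
The set() conversion above makes the per-token membership test
(type in skip) a hash lookup rather than a linear tuple scan. A
standalone illustration:

    import token

    # A set hashes the queried element once; a tuple would be scanned
    # element by element on every lookup.
    skip = set((token.NEWLINE, token.INDENT, token.DEDENT))
    assert token.NEWLINE in skip
    assert token.NAME not in skip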