Merged revisions 53005-53303 via svnmerge from
svn+ssh://pythondev@svn.python.org/python/trunk

........
  r53012 | walter.doerwald | 2006-12-12 22:55:31 +0100 (Tue, 12 Dec 2006) | 2 lines

  Fix typo.
........
  r53023 | brett.cannon | 2006-12-13 23:31:37 +0100 (Wed, 13 Dec 2006) | 2 lines

  Remove an unneeded import of 'warnings'.
........
  r53025 | brett.cannon | 2006-12-14 00:02:38 +0100 (Thu, 14 Dec 2006) | 2 lines

  Remove unneeded imports of 'warnings'.
........
  r53026 | brett.cannon | 2006-12-14 00:09:53 +0100 (Thu, 14 Dec 2006) | 4 lines

  Add test.test_support.guard_warnings_filter.  This function returns a context
  manager that restores warnings.filters to its original state once the
  context is exited.
........
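A minimal sketch of how such a guard might be used (the filter tweak inside the
block is illustrative):

    from test import test_support
    import warnings

    with test_support.guard_warnings_filter():
        # changes made inside the block are discarded on exit
        warnings.filterwarnings("ignore", category=DeprecationWarning)
        warnings.warn("old API", DeprecationWarning)   # silently ignored
    # warnings.filters is back to its previous state here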
  r53029 | george.yoshida | 2006-12-14 03:22:44 +0100 (Thu, 14 Dec 2006) | 2 lines

  Note that guard_warnings_filter was added in 2.6
........
  r53031 | vinay.sajip | 2006-12-14 09:53:55 +0100 (Thu, 14 Dec 2006) | 1 line

  Added news on recent changes to logging
........
  r53032 | andrew.kuchling | 2006-12-14 19:57:53 +0100 (Thu, 14 Dec 2006) | 1 line

  [Patch #1599256 from David Watson] check that os.fsync is available before using it
........
  r53042 | kurt.kaiser | 2006-12-15 06:13:11 +0100 (Fri, 15 Dec 2006) | 6 lines

  1. Avoid hang when encountering a duplicate in a completion list. Bug 1571112.
  2. Duplicate some old entries from Python's NEWS to IDLE's NEWS.txt

  M    AutoCompleteWindow.py
  M    NEWS.txt
........
  r53048 | andrew.kuchling | 2006-12-18 18:12:31 +0100 (Mon, 18 Dec 2006) | 1 line

  [Bug #1618083] Add missing word; make a few grammar fixes
........
  r53050 | andrew.kuchling | 2006-12-18 18:16:05 +0100 (Mon, 18 Dec 2006) | 1 line

  Bump version
........
  r53051 | andrew.kuchling | 2006-12-18 18:22:07 +0100 (Mon, 18 Dec 2006) | 1 line

  [Bug #1616726] Fix description of generator.close(); if the generator raises some other exception, that exception propagates and doesn't trigger a RuntimeError
........
  r53052 | andrew.kuchling | 2006-12-18 18:38:14 +0100 (Mon, 18 Dec 2006) | 1 line

  Describe new methods in Queue module
........
  r53053 | andrew.kuchling | 2006-12-18 20:22:24 +0100 (Mon, 18 Dec 2006) | 1 line

  [Patch #1615868 by Lars Gustaebel] Use Py_off_t to fix BZ2File.seek() for offsets > 2Gb
........
  r53057 | andrew.kuchling | 2006-12-18 22:29:07 +0100 (Mon, 18 Dec 2006) | 1 line

  Fix markup
........
  r53063 | thomas.wouters | 2006-12-19 09:17:50 +0100 (Tue, 19 Dec 2006) | 5 lines


  Make sre's SubPattern objects accept slice objects like it already accepts
  simple slices.
........
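A small illustration of the new behaviour (the pattern is arbitrary):

    import sre_parse

    p = sre_parse.parse("abc")
    sub = p[slice(0, 2)]
    # with this change the slice yields a SubPattern wrapping the sliced
    # opcodes, rather than a bare list
    print isinstance(sub, sre_parse.SubPattern)    # True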
  r53065 | andrew.kuchling | 2006-12-19 15:13:05 +0100 (Tue, 19 Dec 2006) | 6 lines

  [Patch #1618455 by Ben Maurer] Improve speed of HMAC by using str.translate()
     instead of a more general XOR that has to construct a list.

  Slightly modified from Maurer's patch: the _strxor() function is no longer
  necessary at all.
........
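The idea, roughly: precompute a 256-entry translation table so the XOR of every
key byte happens in one C-level str.translate() pass instead of a per-character
Python loop. A rough sketch (the key is made up):

    trans_36 = "".join([chr(x ^ 0x36) for x in xrange(256)])

    key = "secret-key".ljust(64, chr(0))        # pad to the HMAC block size
    ipad_key = key.translate(trans_36)          # XOR every byte with 0x36

    # equivalent to the older, list-building approach:
    assert ipad_key == "".join([chr(ord(c) ^ 0x36) for c in key])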
  r53066 | andrew.kuchling | 2006-12-19 15:28:23 +0100 (Tue, 19 Dec 2006) | 9 lines

  [Bug #1613651] Document socket.recv_into, socket.recvfrom_into

  Also, the text for recvfrom told you to read recv() for an explanation of the
  'flags' argument, but recv() just pointed you at the man page.  Copied the
  man-page text to recvfrom(), recvfrom_into, recv_into to avoid the pointless
  redirection.

  I don't have LaTeX on this machine; hope my markup is OK.
........
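A usage sketch of the buffer-filling variant being documented (socketpair is
POSIX-only and used here just to have two connected sockets):

    import array, socket

    a, b = socket.socketpair()
    b.send("hello")
    buf = array.array('c', '\0' * 16)
    nbytes = a.recv_into(buf)                   # fills buf in place
    print buf.tostring()[:nbytes]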
  r53067 | andrew.kuchling | 2006-12-19 15:29:04 +0100 (Tue, 19 Dec 2006) | 1 line

  Comment typo
........
  r53068 | andrew.kuchling | 2006-12-19 16:11:41 +0100 (Tue, 19 Dec 2006) | 1 line

  [Patch #1617413 from Dug Song] Fix HTTP Basic authentication via HTTPS
........
  r53071 | andrew.kuchling | 2006-12-19 16:18:12 +0100 (Tue, 19 Dec 2006) | 1 line

  [Patch #1600491 from Jim Jewett] Describe how to build help files on Windows
........
  r53073 | andrew.kuchling | 2006-12-19 16:43:10 +0100 (Tue, 19 Dec 2006) | 6 lines

  [Patch #1587139 by kxroberto] Protect lock acquisition/release with
  try...finally to ensure the lock is always released.  This could use
  the 'with' statement, but the patch uses 'finally'.

  2.5 backport candidate.
........
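The pattern being applied, in outline (names are illustrative):

    import threading

    lock = threading.Lock()

    lock.acquire()
    try:
        pass                      # work that may raise
    finally:
        lock.release()            # always runs, so the lock can't leak

    # the equivalent 'with' form mentioned above:
    with lock:
        pass                      # work that may raise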
  r53074 | vinay.sajip | 2006-12-19 19:29:11 +0100 (Tue, 19 Dec 2006) | 1 line

  Updated documentation for findCaller() to indicate that a 3-tuple is now returned, rather than a 2-tuple.
........
  r53090 | georg.brandl | 2006-12-19 23:06:46 +0100 (Tue, 19 Dec 2006) | 3 lines

  Patch #1484695: The tarfile module now raises a HeaderError exception
  if a buffer given to frombuf() is invalid.
........
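A quick sketch of the new failure mode (the buffer content is nonsense on
purpose):

    import tarfile

    try:
        tarfile.TarInfo.frombuf("x" * 100)      # not a valid 512-byte header
    except tarfile.HeaderError:
        pass                                    # previously a plain ValueError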
  r53099 | raymond.hettinger | 2006-12-20 07:42:06 +0100 (Wed, 20 Dec 2006) | 5 lines

  Bug #1590891:   random.randrange doesn't return the correct value for big numbers

  Needs to be backported.
........
  r53106 | georg.brandl | 2006-12-20 12:55:16 +0100 (Wed, 20 Dec 2006) | 3 lines

  Testcase for patch #1484695.
........
  r53110 | andrew.kuchling | 2006-12-20 20:48:20 +0100 (Wed, 20 Dec 2006) | 17 lines

  [Apply length-checking.diff from bug #1599254]

  Add length checking to single-file mailbox formats: before doing a
  flush() on a mailbox, seek to the end and verify its length is
  unchanged, raising ExternalClashError if the file's length has
  changed.

  This fix avoids potential data loss if some other process appends to
  the mailbox file after the table of contents has been generated;
  instead of overwriting the modified file, you'll get the exception.

  I also noticed that the self._lookup() call in self.flush() wasn't
  necessary (everything that sets self._pending to True also calls
  self._lookup()), so I replaced it with an assertion.

  2.5 backport candidate.
........
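Roughly what the new check means for callers (the path is hypothetical):

    import mailbox

    box = mailbox.mbox("INBOX.mbox")
    box.lock()
    try:
        box.add("From: a@example.com\n\nhello\n")
        try:
            box.flush()
        except mailbox.ExternalClashError:
            # some other process appended to the file after it was scanned;
            # the modified file is left untouched
            pass
    finally:
        box.unlock()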
  r53112 | andrew.kuchling | 2006-12-20 20:57:10 +0100 (Wed, 20 Dec 2006) | 1 line

  [Bug #1619674] Make sum() use the term iterable, not sequence
........
  r53113 | andrew.kuchling | 2006-12-20 20:58:11 +0100 (Wed, 20 Dec 2006) | 1 line

  Two grammar fixes
........
  r53115 | andrew.kuchling | 2006-12-20 21:11:12 +0100 (Wed, 20 Dec 2006) | 5 lines

  Some other built-in functions are described with 'sequence' arguments
  that should really be 'iterable'; this commit changes them.

  Did I miss any?  Did I introduce any errors?
........
  r53117 | andrew.kuchling | 2006-12-20 21:20:42 +0100 (Wed, 20 Dec 2006) | 1 line

  [Bug #1619680] in_dll() arguments are documented in the wrong order
........
  r53120 | neal.norwitz | 2006-12-21 05:38:00 +0100 (Thu, 21 Dec 2006) | 1 line

  Lars asked for permission on python-dev to work on tarfile.py
........
  r53125 | andrew.kuchling | 2006-12-21 14:40:29 +0100 (Thu, 21 Dec 2006) | 1 line

  Mention the os.SEEK_* constants
........
  r53129 | walter.doerwald | 2006-12-21 19:06:30 +0100 (Thu, 21 Dec 2006) | 2 lines

  Fix typo.
........
  r53131 | thomas.heller | 2006-12-21 19:30:56 +0100 (Thu, 21 Dec 2006) | 3 lines

  Fix wrong markup of an argument in a method signature.
  Will backport.
........
  r53137 | andrew.kuchling | 2006-12-22 01:50:56 +0100 (Fri, 22 Dec 2006) | 1 line

  Typo fix
........
  r53139 | andrew.kuchling | 2006-12-22 14:25:02 +0100 (Fri, 22 Dec 2006) | 1 line

  [Bug #737202; fix from Titus Brown] Make CGIHTTPServer work for scripts in sub-directories
........
  r53141 | andrew.kuchling | 2006-12-22 16:04:45 +0100 (Fri, 22 Dec 2006) | 6 lines

  [Bug #802128] Make the mode argument of dumbdbm actually work the way it's
  described, and add a test for it.

  2.5 bugfix candidate, maybe; arguably this patch changes the API of
  dumbdbm and shouldn't be added in a point-release.
........
  r53142 | andrew.kuchling | 2006-12-22 16:16:58 +0100 (Fri, 22 Dec 2006) | 6 lines

  [Bug #802128 continued] Modify mode depending on the process umask.

  Is there really no other way to read the umask than to set it?

  Hope this works on Windows...
........
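The trick used to read the umask, since os has no getter for it: set it to
zero, capture the previous value, and immediately restore it.

    import os

    um = os.umask(0)        # returns the old umask
    os.umask(um)            # put it back

    mode = 0666 & ~um       # drop any bits that the umask would mask off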
  r53145 | andrew.kuchling | 2006-12-22 17:43:26 +0100 (Fri, 22 Dec 2006) | 1 line

  [Bug #776202] Apply Walter Doerwald's patch to use text mode for encoded files
........
  r53146 | andrew.kuchling | 2006-12-22 19:41:42 +0100 (Fri, 22 Dec 2006) | 9 lines

  [Patch #783050 from Patrick Lynch] The emulation of forkpty() is incorrect;
  the master should close the slave fd.

  Added a test to test_pty.py that reads from the master_fd after doing
  a pty.fork(); without the fix it hangs forever instead of raising an
  exception.  (<crossing fingers for the buildbots>)

  2.5 backport candidate.
........
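The behaviour the fix restores, in outline (this mirrors the test described
above; with the emulated forkpty the parent's read no longer hangs):

    import os, pty

    pid, master_fd = pty.fork()
    if pid == 0:                      # child: stdio is the slave end
        os.write(1, "hello from the child\n")
        os._exit(0)
    print os.read(master_fd, 1024)    # parent: returns data or raises os.error
    os.waitpid(pid, 0)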
  r53147 | andrew.kuchling | 2006-12-22 20:06:16 +0100 (Fri, 22 Dec 2006) | 1 line

  [Patch #827559 from Chris Gonnerman] Make SimpleHTTPServer redirect when a directory URL is missing the trailing slash; this lets relative links work.
........
  r53149 | andrew.kuchling | 2006-12-22 20:21:27 +0100 (Fri, 22 Dec 2006) | 1 line

  Darn; this test works when you run test_pty.py directly, but fails when regrtest runs it (the os.read() raises os.error).  I can't figure out the cause, so I am commenting out the test.
........
  r53150 | andrew.kuchling | 2006-12-22 22:48:19 +0100 (Fri, 22 Dec 2006) | 1 line

  Frak; this test also fails
........
  r53153 | lars.gustaebel | 2006-12-23 17:40:13 +0100 (Sat, 23 Dec 2006) | 5 lines

  Patch #1230446: tarfile.py: fix ExFileObject so that read() and tell()
  work correctly together with readline().

  Will backport to 2.5.
........
  r53155 | lars.gustaebel | 2006-12-23 18:57:23 +0100 (Sat, 23 Dec 2006) | 5 lines

  Patch #1262036: Prevent TarFiles from being added to themselves under
  certain conditions.

  Will backport to 2.5.
........
  r53159 | andrew.kuchling | 2006-12-27 04:25:31 +0100 (Wed, 27 Dec 2006) | 4 lines

  [Part of patch #1182394] Move the HMAC blocksize to be a class-level
  constant; this allows changing it in a subclass.  To accommodate this,
  copy() now uses __class__.  Also add some text to a comment.
........
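A minimal sketch of the kind of subclass this enables (the class name is made
up; 128 bytes is the SHA-512 block size):

    import hmac, hashlib

    class HMAC_SHA512(hmac.HMAC):
        blocksize = 128               # override the default 64-byte block

    h = HMAC_SHA512("key", "message", hashlib.sha512)
    print h.hexdigest()
    h2 = h.copy()                     # copy() uses __class__, so h2 is an
                                      # HMAC_SHA512 instance too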
  r53160 | andrew.kuchling | 2006-12-27 04:31:24 +0100 (Wed, 27 Dec 2006) | 1 line

  [Rest of patch #1182394] Add ._current() method so that we can use the written-in-C .hexdigest() method
........
  r53161 | lars.gustaebel | 2006-12-27 11:30:46 +0100 (Wed, 27 Dec 2006) | 4 lines

  Patch #1504073: Fix tarfile.open() for mode "r" with a fileobj argument.

  Will backport to 2.5.
........
  r53165 | neal.norwitz | 2006-12-28 05:39:20 +0100 (Thu, 28 Dec 2006) | 1 line

  Remove a stray (old) macro name left around (I guess)
........
  r53188 | neal.norwitz | 2006-12-29 04:01:53 +0100 (Fri, 29 Dec 2006) | 1 line

  SF bug #1623890, fix argument name in docstring
........
  r53200 | raymond.hettinger | 2006-12-30 05:01:17 +0100 (Sat, 30 Dec 2006) | 1 line

  For sets with cyclical reprs, emit an ellipsis instead of infinitely recursing.
........
  r53232 | brett.cannon | 2007-01-04 01:23:49 +0100 (Thu, 04 Jan 2007) | 3 lines

  Add EnvironmentVarGuard to test.test_support.  Provides a context manager to
  temporarily set or unset environment variables.
........
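A usage sketch, assuming the guard exposes set()/unset() helpers (the variable
names are illustrative):

    import os
    from test import test_support

    with test_support.EnvironmentVarGuard() as env:
        env.set("PYTHONDOCS", "/tmp/docs")
        env.unset("HOME")
        # code under test sees the modified os.environ here
    # the original environment is restored on exit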
  r53235 | neal.norwitz | 2007-01-04 07:25:31 +0100 (Thu, 04 Jan 2007) | 1 line

  SF #1627373, fix typo in CarbonEvt.
........
  r53244 | raymond.hettinger | 2007-01-04 18:53:34 +0100 (Thu, 04 Jan 2007) | 1 line

  Fix stability of heapq's nlargest() and nsmallest().
........
  r53249 | martin.v.loewis | 2007-01-04 22:06:12 +0100 (Thu, 04 Jan 2007) | 3 lines

  Bug #1566280: Explicitly invoke threading._shutdown from Py_Main,
  to avoid relying on atexit.
  Will backport to 2.5.
........
  r53252 | gregory.p.smith | 2007-01-05 02:59:42 +0100 (Fri, 05 Jan 2007) | 3 lines

  Support linking of the bsddb module against BerkeleyDB 4.5.x
  (will backport to 2.5)
........
  r53253 | gregory.p.smith | 2007-01-05 03:06:17 +0100 (Fri, 05 Jan 2007) | 2 lines

  bump module version to match supported berkeleydb version
........
  r53255 | neal.norwitz | 2007-01-05 06:25:22 +0100 (Fri, 05 Jan 2007) | 6 lines

  Prevent a crash on shutdown which can occur if we are finalizing,
  the module dict has already been cleared, and some object
  raises a warning (e.g. in a __del__).

  Will backport.
........
  r53258 | gregory.p.smith | 2007-01-05 08:21:35 +0100 (Fri, 05 Jan 2007) | 2 lines

  typo fix
........
  r53260 | neal.norwitz | 2007-01-05 09:06:43 +0100 (Fri, 05 Jan 2007) | 1 line

  Add Collin Winter for access to update PEP 3107
........
  r53262 | andrew.kuchling | 2007-01-05 15:22:17 +0100 (Fri, 05 Jan 2007) | 1 line

  [Bug #1622533] Make docstrings raw strings because they contain control characters (\0, \1)
........
  r53264 | andrew.kuchling | 2007-01-05 16:51:24 +0100 (Fri, 05 Jan 2007) | 1 line

  [Patch #1520904] Fix bsddb tests to write to the temp directory instead of the Lib/bsddb/test directory
........
  r53279 | brett.cannon | 2007-01-05 22:45:09 +0100 (Fri, 05 Jan 2007) | 3 lines

  Silence a warning from gcc 4.0.1 by specifying a function's parameter list
  as 'void' instead of just a set of empty parentheses.
........
  r53285 | raymond.hettinger | 2007-01-06 02:14:41 +0100 (Sat, 06 Jan 2007) | 2 lines

  SF# 1409443:  Expand comment to cover the interaction between f->f_lasti and the PREDICT macros.
........
  r53286 | anthony.baxter | 2007-01-06 05:45:54 +0100 (Sat, 06 Jan 2007) | 1 line

  Update the (c) years to include 2007
........
  r53291 | neal.norwitz | 2007-01-06 22:24:35 +0100 (Sat, 06 Jan 2007) | 1 line

  Add Josiah to SF for maintaining asyncore/asynchat
........
  r53293 | peter.astrand | 2007-01-07 09:53:46 +0100 (Sun, 07 Jan 2007) | 1 line

  Re-implemented fix for #1531862 once again, in a way that works with Python 2.2. Fixes bug #1603424.
........
  r53295 | peter.astrand | 2007-01-07 15:34:16 +0100 (Sun, 07 Jan 2007) | 1 line

  Avoid O(N**2) bottleneck in _communicate(). Fixes #1598181.
........
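The gist of the fix: track an offset into the input string and hand os.write()
a buffer() view, instead of re-slicing the string after every write (each slice
copies the whole remainder, which is O(N**2) overall). A standalone sketch
writing to /dev/null:

    import os

    data = "x" * (1 << 20)
    fd = os.open(os.devnull, os.O_WRONLY)
    offset = 0
    while offset < len(data):
        # buffer() is a zero-copy view of the next chunk
        offset += os.write(fd, buffer(data, offset, 512))
    os.close(fd)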
  r53300 | raymond.hettinger | 2007-01-08 19:09:20 +0100 (Mon, 08 Jan 2007) | 1 line

  Fix zero-length corner case for iterating over a mutating deque.
........
  r53301 | vinay.sajip | 2007-01-08 19:50:32 +0100 (Mon, 08 Jan 2007) | 4 lines

  Bare except clause removed from SMTPHandler.emit(). Now, only ImportError is trapped.
  Bare except clause removed from SocketHandler.createSocket(). Now, only socket.error is trapped.
  (SF #411881)
........
  r53302 | vinay.sajip | 2007-01-08 19:51:46 +0100 (Mon, 08 Jan 2007) | 2 lines

  Bare except clause removed from LogRecord.__init__. Now, only ValueError, TypeError and AttributeError are trapped.
  (SF #411881)
........
  r53303 | vinay.sajip | 2007-01-08 19:52:36 +0100 (Mon, 08 Jan 2007) | 1 line

  Added entries about removal of some bare except clauses from logging.
........
diff --git a/Lib/CGIHTTPServer.py b/Lib/CGIHTTPServer.py
index 7a5c819..c119c9a 100644
--- a/Lib/CGIHTTPServer.py
+++ b/Lib/CGIHTTPServer.py
@@ -105,17 +105,36 @@
 
     def run_cgi(self):
         """Execute a CGI script."""
+        path = self.path
         dir, rest = self.cgi_info
+        
+        i = path.find('/', len(dir) + 1)
+        while i >= 0:
+            nextdir = path[:i]
+            nextrest = path[i+1:]
+
+            scriptdir = self.translate_path(nextdir)
+            if os.path.isdir(scriptdir):
+                dir, rest = nextdir, nextrest
+                i = path.find('/', len(dir) + 1)
+            else:
+                break
+
+        # find an explicit query string, if present.
         i = rest.rfind('?')
         if i >= 0:
             rest, query = rest[:i], rest[i+1:]
         else:
             query = ''
+
+        # dissect the part after the directory name into a script name &
+        # a possible additional path, to be stored in PATH_INFO.
         i = rest.find('/')
         if i >= 0:
             script, rest = rest[:i], rest[i:]
         else:
             script, rest = rest, ''
+
         scriptname = dir + '/' + script
         scriptfile = self.translate_path(scriptname)
         if not os.path.exists(scriptfile):
diff --git a/Lib/SimpleHTTPServer.py b/Lib/SimpleHTTPServer.py
index fae551a..86c669e 100644
--- a/Lib/SimpleHTTPServer.py
+++ b/Lib/SimpleHTTPServer.py
@@ -66,6 +66,12 @@
         path = self.translate_path(self.path)
         f = None
         if os.path.isdir(path):
+            if not self.path.endswith('/'):
+                # redirect browser - doing basically what apache does
+                self.send_response(301)
+                self.send_header("Location", self.path + "/")
+                self.end_headers()
+                return None
             for index in "index.html", "index.htm":
                 index = os.path.join(path, index)
                 if os.path.exists(index):
diff --git a/Lib/StringIO.py b/Lib/StringIO.py
index 7d57d80..9394360 100644
--- a/Lib/StringIO.py
+++ b/Lib/StringIO.py
@@ -139,7 +139,7 @@
         return r
 
     def readline(self, length=None):
-        """Read one entire line from the file.
+        r"""Read one entire line from the file.
 
         A trailing newline character is kept in the string (but may be absent
         when a file ends with an incomplete line). If the size argument is
diff --git a/Lib/bsddb/dbobj.py b/Lib/bsddb/dbobj.py
index 346c1ad..987f773 100644
--- a/Lib/bsddb/dbobj.py
+++ b/Lib/bsddb/dbobj.py
@@ -55,8 +55,9 @@
         return self._cobj.set_lg_max(*args, **kwargs)
     def set_lk_detect(self, *args, **kwargs):
         return self._cobj.set_lk_detect(*args, **kwargs)
-    def set_lk_max(self, *args, **kwargs):
-        return self._cobj.set_lk_max(*args, **kwargs)
+    if db.version() < (4,5):
+        def set_lk_max(self, *args, **kwargs):
+            return self._cobj.set_lk_max(*args, **kwargs)
     def set_lk_max_locks(self, *args, **kwargs):
         return self._cobj.set_lk_max_locks(*args, **kwargs)
     def set_lk_max_lockers(self, *args, **kwargs):
diff --git a/Lib/bsddb/test/test_1413192.py b/Lib/bsddb/test/test_1413192.py
index 3c13536..436f407 100644
--- a/Lib/bsddb/test/test_1413192.py
+++ b/Lib/bsddb/test/test_1413192.py
@@ -14,7 +14,7 @@
 env_name = '.'
 
 env = db.DBEnv()
-env.open(env_name, db.DB_CREATE | db.DB_INIT_TXN)
+env.open(env_name, db.DB_CREATE | db.DB_INIT_TXN | db.DB_INIT_MPOOL)
 the_txn = env.txn_begin()
 
 map = db.DB(env)
diff --git a/Lib/bsddb/test/test_associate.py b/Lib/bsddb/test/test_associate.py
index 33a7837..7ae7c53 100644
--- a/Lib/bsddb/test/test_associate.py
+++ b/Lib/bsddb/test/test_associate.py
@@ -91,7 +91,7 @@
 class AssociateErrorTestCase(unittest.TestCase):
     def setUp(self):
         self.filename = self.__class__.__name__ + '.db'
-        homeDir = os.path.join(os.path.dirname(sys.argv[0]), 'db_home')
+        homeDir = os.path.join(tempfile.gettempdir(), 'db_home')
         self.homeDir = homeDir
         try:
             os.mkdir(homeDir)
diff --git a/Lib/bsddb/test/test_basics.py b/Lib/bsddb/test/test_basics.py
index e0452df..48ecdb9 100644
--- a/Lib/bsddb/test/test_basics.py
+++ b/Lib/bsddb/test/test_basics.py
@@ -54,7 +54,7 @@
 
     def setUp(self):
         if self.useEnv:
-            homeDir = os.path.join(os.path.dirname(sys.argv[0]), 'db_home')
+            homeDir = os.path.join(tempfile.gettempdir(), 'db_home')
             self.homeDir = homeDir
             try:
                 shutil.rmtree(homeDir)
diff --git a/Lib/bsddb/test/test_dbobj.py b/Lib/bsddb/test/test_dbobj.py
index bba6a5b..b15de2f 100644
--- a/Lib/bsddb/test/test_dbobj.py
+++ b/Lib/bsddb/test/test_dbobj.py
@@ -2,6 +2,7 @@
 import sys, os, string
 import unittest
 import glob
+import tempfile
 
 try:
     # For Pythons w/distutils pybsddb
@@ -19,7 +20,7 @@
     db_name = 'test-dbobj.db'
 
     def setUp(self):
-        homeDir = os.path.join(os.path.dirname(sys.argv[0]), 'db_home')
+        homeDir = os.path.join(tempfile.gettempdir(), 'db_home')
         self.homeDir = homeDir
         try: os.mkdir(homeDir)
         except os.error: pass
diff --git a/Lib/bsddb/test/test_dbshelve.py b/Lib/bsddb/test/test_dbshelve.py
index 374ccd8..1da6546 100644
--- a/Lib/bsddb/test/test_dbshelve.py
+++ b/Lib/bsddb/test/test_dbshelve.py
@@ -242,7 +242,7 @@
 class BasicEnvShelveTestCase(DBShelveTestCase):
     def do_open(self):
         self.homeDir = homeDir = os.path.join(
-            os.path.dirname(sys.argv[0]), 'db_home')
+            tempfile.gettempdir(), 'db_home')
         try: os.mkdir(homeDir)
         except os.error: pass
         self.env = db.DBEnv()
diff --git a/Lib/bsddb/test/test_dbtables.py b/Lib/bsddb/test/test_dbtables.py
index 2ff93a3..a31fcec 100644
--- a/Lib/bsddb/test/test_dbtables.py
+++ b/Lib/bsddb/test/test_dbtables.py
@@ -26,6 +26,7 @@
     pickle = cPickle
 except ImportError:
     import pickle
+import tempfile
 
 import unittest
 from .test_all import verbose
@@ -46,7 +47,7 @@
     db_name = 'test-table.db'
 
     def setUp(self):
-        homeDir = os.path.join(os.path.dirname(sys.argv[0]), 'db_home')
+        homeDir = os.path.join(tempfile.gettempdir(), 'db_home')
         self.homeDir = homeDir
         try: os.mkdir(homeDir)
         except os.error: pass
diff --git a/Lib/bsddb/test/test_env_close.py b/Lib/bsddb/test/test_env_close.py
index 43dcabe..12e1037 100644
--- a/Lib/bsddb/test/test_env_close.py
+++ b/Lib/bsddb/test/test_env_close.py
@@ -33,7 +33,7 @@
 
 class DBEnvClosedEarlyCrash(unittest.TestCase):
     def setUp(self):
-        self.homeDir = os.path.join(os.path.dirname(sys.argv[0]), 'db_home')
+        self.homeDir = os.path.join(tempfile.gettempdir(), 'db_home')
         try: os.mkdir(self.homeDir)
         except os.error: pass
         tempfile.tempdir = self.homeDir
diff --git a/Lib/bsddb/test/test_join.py b/Lib/bsddb/test/test_join.py
index 6e98b0b..5e307ae 100644
--- a/Lib/bsddb/test/test_join.py
+++ b/Lib/bsddb/test/test_join.py
@@ -49,7 +49,7 @@
 
     def setUp(self):
         self.filename = self.__class__.__name__ + '.db'
-        homeDir = os.path.join(os.path.dirname(sys.argv[0]), 'db_home')
+        homeDir = os.path.join(tempfile.gettempdir(), 'db_home')
         self.homeDir = homeDir
         try: os.mkdir(homeDir)
         except os.error: pass
diff --git a/Lib/bsddb/test/test_lock.py b/Lib/bsddb/test/test_lock.py
index 53f11a8..61bdae8 100644
--- a/Lib/bsddb/test/test_lock.py
+++ b/Lib/bsddb/test/test_lock.py
@@ -30,7 +30,7 @@
 class LockingTestCase(unittest.TestCase):
 
     def setUp(self):
-        homeDir = os.path.join(os.path.dirname(sys.argv[0]), 'db_home')
+        homeDir = os.path.join(tempfile.gettempdir(), 'db_home')
         self.homeDir = homeDir
         try: os.mkdir(homeDir)
         except os.error: pass
diff --git a/Lib/bsddb/test/test_misc.py b/Lib/bsddb/test/test_misc.py
index 88f700b..6b2df07 100644
--- a/Lib/bsddb/test/test_misc.py
+++ b/Lib/bsddb/test/test_misc.py
@@ -4,6 +4,7 @@
 import os
 import sys
 import unittest
+import tempfile
 
 try:
     # For Pythons w/distutils pybsddb
@@ -17,7 +18,7 @@
 class MiscTestCase(unittest.TestCase):
     def setUp(self):
         self.filename = self.__class__.__name__ + '.db'
-        homeDir = os.path.join(os.path.dirname(sys.argv[0]), 'db_home')
+        homeDir = os.path.join(tempfile.gettempdir(), 'db_home')
         self.homeDir = homeDir
         try:
             os.mkdir(homeDir)
diff --git a/Lib/bsddb/test/test_recno.py b/Lib/bsddb/test/test_recno.py
index e325aac..35399b5 100644
--- a/Lib/bsddb/test/test_recno.py
+++ b/Lib/bsddb/test/test_recno.py
@@ -203,10 +203,10 @@
         just a line in the file, but you can set a different record delimiter
         if needed.
         """
-        source = os.path.join(os.path.dirname(sys.argv[0]),
-                              'db_home/test_recno.txt')
-        if not os.path.isdir('db_home'):
-            os.mkdir('db_home')
+        homeDir = os.path.join(tempfile.gettempdir(), 'db_home')
+        source = os.path.join(homeDir, 'test_recno.txt')
+        if not os.path.isdir(homeDir):
+            os.mkdir(homeDir)
         f = open(source, 'w') # create the file
         f.close()
 
diff --git a/Lib/bsddb/test/test_thread.py b/Lib/bsddb/test/test_thread.py
index 6942aa2..bf19d21 100644
--- a/Lib/bsddb/test/test_thread.py
+++ b/Lib/bsddb/test/test_thread.py
@@ -53,7 +53,7 @@
         if verbose:
             dbutils._deadlock_VerboseFile = sys.stdout
 
-        homeDir = os.path.join(os.path.dirname(sys.argv[0]), 'db_home')
+        homeDir = os.path.join(tempfile.gettempdir(), 'db_home')
         self.homeDir = homeDir
         try:
             os.mkdir(homeDir)
diff --git a/Lib/cookielib.py b/Lib/cookielib.py
index e8fee0e..ce037b0 100644
--- a/Lib/cookielib.py
+++ b/Lib/cookielib.py
@@ -1316,26 +1316,28 @@
         """
         _debug("add_cookie_header")
         self._cookies_lock.acquire()
+        try:
 
-        self._policy._now = self._now = int(time.time())
+           self._policy._now = self._now = int(time.time())
 
-        cookies = self._cookies_for_request(request)
+           cookies = self._cookies_for_request(request)
 
-        attrs = self._cookie_attrs(cookies)
-        if attrs:
-            if not request.has_header("Cookie"):
-                request.add_unredirected_header(
-                    "Cookie", "; ".join(attrs))
+           attrs = self._cookie_attrs(cookies)
+           if attrs:
+               if not request.has_header("Cookie"):
+                   request.add_unredirected_header(
+                       "Cookie", "; ".join(attrs))
 
-        # if necessary, advertise that we know RFC 2965
-        if (self._policy.rfc2965 and not self._policy.hide_cookie2 and
-            not request.has_header("Cookie2")):
-            for cookie in cookies:
-                if cookie.version != 1:
-                    request.add_unredirected_header("Cookie2", '$Version="1"')
-                    break
-
-        self._cookies_lock.release()
+           # if necessary, advertise that we know RFC 2965
+           if (self._policy.rfc2965 and not self._policy.hide_cookie2 and
+               not request.has_header("Cookie2")):
+               for cookie in cookies:
+                   if cookie.version != 1:
+                       request.add_unredirected_header("Cookie2", '$Version="1"')
+                       break
+   
+        finally:
+           self._cookies_lock.release()
 
         self.clear_expired_cookies()
 
@@ -1602,12 +1604,15 @@
     def set_cookie_if_ok(self, cookie, request):
         """Set a cookie if policy says it's OK to do so."""
         self._cookies_lock.acquire()
-        self._policy._now = self._now = int(time.time())
+        try:
+            self._policy._now = self._now = int(time.time())
 
-        if self._policy.set_ok(cookie, request):
-            self.set_cookie(cookie)
+            if self._policy.set_ok(cookie, request):
+                self.set_cookie(cookie)
+         
 
-        self._cookies_lock.release()
+        finally:
+            self._cookies_lock.release()
 
     def set_cookie(self, cookie):
         """Set a cookie, without checking whether or not it should be set."""
@@ -1626,13 +1631,15 @@
         """Extract cookies from response, where allowable given the request."""
         _debug("extract_cookies: %s", response.info())
         self._cookies_lock.acquire()
-        self._policy._now = self._now = int(time.time())
+        try:
+           self._policy._now = self._now = int(time.time())
 
-        for cookie in self.make_cookies(response, request):
-            if self._policy.set_ok(cookie, request):
-                _debug(" setting cookie: %s", cookie)
-                self.set_cookie(cookie)
-        self._cookies_lock.release()
+           for cookie in self.make_cookies(response, request):
+               if self._policy.set_ok(cookie, request):
+                   _debug(" setting cookie: %s", cookie)
+                   self.set_cookie(cookie)
+        finally:
+           self._cookies_lock.release()
 
     def clear(self, domain=None, path=None, name=None):
         """Clear some cookies.
@@ -1669,10 +1676,12 @@
 
         """
         self._cookies_lock.acquire()
-        for cookie in self:
-            if cookie.discard:
-                self.clear(cookie.domain, cookie.path, cookie.name)
-        self._cookies_lock.release()
+        try:
+           for cookie in self:
+               if cookie.discard:
+                   self.clear(cookie.domain, cookie.path, cookie.name)
+        finally:
+           self._cookies_lock.release()
 
     def clear_expired_cookies(self):
         """Discard all expired cookies.
@@ -1685,11 +1694,13 @@
 
         """
         self._cookies_lock.acquire()
-        now = time.time()
-        for cookie in self:
-            if cookie.is_expired(now):
-                self.clear(cookie.domain, cookie.path, cookie.name)
-        self._cookies_lock.release()
+        try:
+           now = time.time()
+           for cookie in self:
+               if cookie.is_expired(now):
+                   self.clear(cookie.domain, cookie.path, cookie.name)
+        finally:
+           self._cookies_lock.release()
 
     def __iter__(self):
         return deepvalues(self._cookies)
@@ -1761,16 +1772,18 @@
             else: raise ValueError(MISSING_FILENAME_TEXT)
 
         self._cookies_lock.acquire()
-
-        old_state = copy.deepcopy(self._cookies)
-        self._cookies = {}
         try:
-            self.load(filename, ignore_discard, ignore_expires)
-        except (LoadError, IOError):
-            self._cookies = old_state
-            raise
 
-        self._cookies_lock.release()
+           old_state = copy.deepcopy(self._cookies)
+           self._cookies = {}
+           try:
+               self.load(filename, ignore_discard, ignore_expires)
+           except (LoadError, IOError):
+               self._cookies = old_state
+               raise
+
+        finally:
+           self._cookies_lock.release()
 
 from _LWPCookieJar import LWPCookieJar, lwp_cookie_str
 from _MozillaCookieJar import MozillaCookieJar
diff --git a/Lib/difflib.py b/Lib/difflib.py
index 408079b..831840d 100644
--- a/Lib/difflib.py
+++ b/Lib/difflib.py
@@ -1310,7 +1310,7 @@
 
 def _mdiff(fromlines, tolines, context=None, linejunk=None,
            charjunk=IS_CHARACTER_JUNK):
-    """Returns generator yielding marked up from/to side by side differences.
+    r"""Returns generator yielding marked up from/to side by side differences.
 
     Arguments:
     fromlines -- list of text lines to compared to tolines
diff --git a/Lib/dumbdbm.py b/Lib/dumbdbm.py
index e00d9e8..ee2f39e 100644
--- a/Lib/dumbdbm.py
+++ b/Lib/dumbdbm.py
@@ -68,7 +68,8 @@
         try:
             f = _open(self._datfile, 'r')
         except IOError:
-            f = _open(self._datfile, 'w', self._mode)
+            f = _open(self._datfile, 'w')
+            self._chmod(self._datfile)
         f.close()
         self._update()
 
@@ -106,7 +107,8 @@
         except self._os.error:
             pass
 
-        f = self._open(self._dirfile, 'w', self._mode)
+        f = self._open(self._dirfile, 'w')
+        self._chmod(self._dirfile)
         for key, pos_and_siz_pair in self._index.iteritems():
             f.write("%r, %r\n" % (key, pos_and_siz_pair))
         f.close()
@@ -152,7 +154,8 @@
     # the in-memory index dict, and append one to the directory file.
     def _addkey(self, key, pos_and_siz_pair):
         self._index[key] = pos_and_siz_pair
-        f = _open(self._dirfile, 'a', self._mode)
+        f = _open(self._dirfile, 'a')
+        self._chmod(self._dirfile)
         f.write("%r, %r\n" % (key, pos_and_siz_pair))
         f.close()
 
@@ -211,6 +214,9 @@
 
     __del__ = close
 
+    def _chmod (self, file):
+        if hasattr(self._os, 'chmod'):
+            self._os.chmod(file, self._mode)
 
 
 def open(file, flag=None, mode=0666):
@@ -227,4 +233,15 @@
 
     """
     # flag argument is currently ignored
+
+    # Modify mode depending on the umask
+    try:
+        um = _os.umask(0)
+        _os.umask(um)
+    except AttributeError:
+        pass
+    else:
+        # Turn off any bits that are set in the umask
+        mode = mode & (~um)
+        
     return _Database(file, mode)
diff --git a/Lib/heapq.py b/Lib/heapq.py
index 04725cd..753c3b7 100644
--- a/Lib/heapq.py
+++ b/Lib/heapq.py
@@ -130,7 +130,7 @@
            'nsmallest']
 
 from itertools import islice, repeat, count, imap, izip, tee
-from operator import itemgetter
+from operator import itemgetter, neg
 import bisect
 
 def heappush(heap, item):
@@ -315,8 +315,6 @@
 
     Equivalent to:  sorted(iterable, key=key)[:n]
     """
-    if key is None:
-        return _nsmallest(n, iterable)
     in1, in2 = tee(iterable)
     it = izip(imap(key, in1), count(), in2)                 # decorate
     result = _nsmallest(n, it)
@@ -328,10 +326,8 @@
 
     Equivalent to:  sorted(iterable, key=key, reverse=True)[:n]
     """
-    if key is None:
-        return _nlargest(n, iterable)
     in1, in2 = tee(iterable)
-    it = izip(imap(key, in1), count(), in2)                 # decorate
+    it = izip(imap(key, in1), imap(neg, count()), in2)      # decorate
     result = _nlargest(n, it)
     return map(itemgetter(2), result)                       # undecorate
 
diff --git a/Lib/hmac.py b/Lib/hmac.py
index 41d6c6c..88c3fd5 100644
--- a/Lib/hmac.py
+++ b/Lib/hmac.py
@@ -3,13 +3,11 @@
 Implements the HMAC algorithm as described by RFC 2104.
 """
 
-def _strxor(s1, s2):
-    """Utility method. XOR the two strings s1 and s2 (must have same length).
-    """
-    return "".join(map(lambda x, y: chr(ord(x) ^ ord(y)), s1, s2))
+trans_5C = "".join ([chr (x ^ 0x5C) for x in xrange(256)])
+trans_36 = "".join ([chr (x ^ 0x36) for x in xrange(256)])
 
 # The size of the digests returned by HMAC depends on the underlying
-# hashing module used.
+# hashing module used.  Use digest_size from the instance of HMAC instead.
 digest_size = None
 
 # A unique object passed by HMAC.copy() to the HMAC constructor, in order
@@ -22,6 +20,7 @@
 
     This supports the API for Cryptographic Hash Functions (PEP 247).
     """
+    blocksize = 64  # 512-bit HMAC; can be changed in subclasses.
 
     def __init__(self, key, msg = None, digestmod = None):
         """Create a new HMAC object.
@@ -49,16 +48,13 @@
         self.inner = self.digest_cons()
         self.digest_size = self.inner.digest_size
 
-        blocksize = 64
-        ipad = "\x36" * blocksize
-        opad = "\x5C" * blocksize
-
+        blocksize = self.blocksize
         if len(key) > blocksize:
             key = self.digest_cons(key).digest()
 
         key = key + chr(0) * (blocksize - len(key))
-        self.outer.update(_strxor(key, opad))
-        self.inner.update(_strxor(key, ipad))
+        self.outer.update(key.translate(trans_5C))
+        self.inner.update(key.translate(trans_36))
         if msg is not None:
             self.update(msg)
 
@@ -75,13 +71,22 @@
 
         An update to this copy won't affect the original object.
         """
-        other = HMAC(_secret_backdoor_key)
+        other = self.__class__(_secret_backdoor_key)
         other.digest_cons = self.digest_cons
         other.digest_size = self.digest_size
         other.inner = self.inner.copy()
         other.outer = self.outer.copy()
         return other
 
+    def _current(self):
+        """Return a hash object for the current state.
+
+        To be used only internally with digest() and hexdigest().
+        """
+        h = self.outer.copy()
+        h.update(self.inner.digest())
+        return h
+
     def digest(self):
         """Return the hash value of this hashing object.
 
@@ -89,15 +94,14 @@
         not altered in any way by this function; you can continue
         updating the object after calling this function.
         """
-        h = self.outer.copy()
-        h.update(self.inner.digest())
+        h = self._current()
         return h.digest()
 
     def hexdigest(self):
         """Like digest(), but returns a string of hexadecimal digits instead.
         """
-        return "".join([hex(ord(x))[2:].zfill(2)
-                        for x in tuple(self.digest())])
+        h = self._current()
+        return h.hexdigest()
 
 def new(key, msg = None, digestmod = None):
     """Create a new hashing object and return it.
diff --git a/Lib/idlelib/AutoCompleteWindow.py b/Lib/idlelib/AutoCompleteWindow.py
index 8bed034..7f8adaf 100644
--- a/Lib/idlelib/AutoCompleteWindow.py
+++ b/Lib/idlelib/AutoCompleteWindow.py
@@ -118,8 +118,11 @@
             i = 0
             while i < len(lts) and i < len(selstart) and lts[i] == selstart[i]:
                 i += 1
-            while cursel > 0 and selstart[:i] <= self.completions[cursel-1]:
+            previous_completion = self.completions[cursel - 1]
+            while cursel > 0 and selstart[:i] <= previous_completion:
                 i += 1
+                if selstart == previous_completion:
+                    break  # maybe we have a duplicate?
             newstart = selstart[:i]
         self._change_start(newstart)
 
diff --git a/Lib/idlelib/NEWS.txt b/Lib/idlelib/NEWS.txt
index 43e5b45..5f73a69 100644
--- a/Lib/idlelib/NEWS.txt
+++ b/Lib/idlelib/NEWS.txt
@@ -3,9 +3,14 @@
 
 *Release date: XX-XXX-200X*
 
+- Avoid hang when encountering a duplicate in a completion list. Bug 1571112.
+
 - Patch #1362975: Rework CodeContext indentation algorithm to
   avoid hard-coding pixel widths.
 
+- Bug #813342: Start the IDLE subprocess with -Qnew if the parent
+  is started with that option.
+
 - Some syntax errors were being caught by tokenize during the tabnanny
   check, resulting in obscure error messages.  Do the syntax check
   first.  Bug 1562716, 1562719
@@ -14,6 +19,12 @@
   the Python release of which it's a part.
 
 
+What's New in IDLE 1.2?
+=======================
+
+*Release date: 19-SEP-2006*
+
+
 What's New in IDLE 1.2c1?
 =========================
 
@@ -44,6 +55,13 @@
 
 *Release date: 03-AUG-2006*
 
+- Bug #1525817: Don't truncate short lines in IDLE's tool tips.
+
+- Bug #1517990: IDLE keybindings on MacOS X now work correctly
+
+- Bug #1517996: IDLE no longer shows the default Tk menu when a
+  path browser, class browser or debugger is the frontmost window on MacOS X
+
 - EditorWindow.test() was failing.  Bug 1417598
 
 - EditorWindow failed when used stand-alone if sys.ps1 not set.
@@ -80,6 +98,8 @@
 
 *Release date: 05-APR-2006*
 
+- Patch #1162825: Support non-ASCII characters in IDLE window titles.
+
 - Source file f.flush() after writing; trying to avoid lossage if user
   kills GUI.
 
diff --git a/Lib/logging/__init__.py b/Lib/logging/__init__.py
index b57a9af..615dcc7 100644
--- a/Lib/logging/__init__.py
+++ b/Lib/logging/__init__.py
@@ -41,8 +41,8 @@
 
 __author__  = "Vinay Sajip <vinay_sajip@red-dove.com>"
 __status__  = "production"
-__version__ = "0.4.9.9"
-__date__    = "06 February 2006"
+__version__ = "0.5.0.0"
+__date__    = "08 January 2007"
 
 #---------------------------------------------------------------------------
 #   Miscellaneous module data
@@ -243,7 +243,7 @@
         try:
             self.filename = os.path.basename(pathname)
             self.module = os.path.splitext(self.filename)[0]
-        except:
+        except (TypeError, ValueError, AttributeError):
             self.filename = pathname
             self.module = "Unknown module"
         self.exc_info = exc_info
diff --git a/Lib/logging/handlers.py b/Lib/logging/handlers.py
index 17eca8a..82896ad 100644
--- a/Lib/logging/handlers.py
+++ b/Lib/logging/handlers.py
@@ -347,7 +347,7 @@
             try:
                 self.sock = self.makeSocket()
                 self.retryTime = None # next time, no delay before trying
-            except:
+            except socket.error:
                 #Creation failed, so set the retry time and return.
                 if self.retryTime is None:
                     self.retryPeriod = self.retryStart
@@ -738,7 +738,7 @@
             import smtplib
             try:
                 from email.Utils import formatdate
-            except:
+            except ImportError:
                 formatdate = self.date_time
             port = self.mailport
             if not port:
diff --git a/Lib/mailbox.py b/Lib/mailbox.py
index 0843430..8d1df4b 100755
--- a/Lib/mailbox.py
+++ b/Lib/mailbox.py
@@ -510,6 +510,7 @@
         self._next_key = 0
         self._pending = False   # No changes require rewriting the file.
         self._locked = False
+        self._file_length = None        # Used to record mailbox size
 
     def add(self, message):
         """Add message and return assigned key."""
@@ -563,7 +564,21 @@
         """Write any pending changes to disk."""
         if not self._pending:
             return
-        self._lookup()
+
+        # In order to be writing anything out at all, self._toc must
+        # already have been generated (and presumably has been modified
+        # by adding or deleting an item).
+        assert self._toc is not None
+        
+        # Check length of self._file; if it's changed, some other process
+        # has modified the mailbox since we scanned it.
+        self._file.seek(0, 2)
+        cur_len = self._file.tell()
+        if cur_len != self._file_length:
+            raise ExternalClashError('Size of mailbox file changed '
+                                     '(expected %i, found %i)' %
+                                     (self._file_length, cur_len))
+        
         new_file = _create_temporary(self._path)
         try:
             new_toc = {}
@@ -639,6 +654,7 @@
         offsets = self._install_message(message)
         self._post_message_hook(self._file)
         self._file.flush()
+        self._file_length = self._file.tell()  # Record current length of mailbox
         return offsets
 
 
@@ -730,6 +746,7 @@
                 break
         self._toc = dict(enumerate(zip(starts, stops)))
         self._next_key = len(self._toc)
+        self._file_length = self._file.tell()
 
 
 class MMDF(_mboxMMDF):
@@ -773,6 +790,8 @@
                 break
         self._toc = dict(enumerate(zip(starts, stops)))
         self._next_key = len(self._toc)
+        self._file.seek(0, 2)
+        self._file_length = self._file.tell()
 
 
 class MH(Mailbox):
@@ -1198,7 +1217,9 @@
         self._toc = dict(enumerate(zip(starts, stops)))
         self._labels = dict(enumerate(label_lists))
         self._next_key = len(self._toc)
-
+        self._file.seek(0, 2)
+        self._file_length = self._file.tell()
+        
     def _pre_mailbox_hook(self, f):
         """Called before writing the mailbox to file f."""
         f.write('BABYL OPTIONS:%sVersion: 5%sLabels:%s%s\037' %
@@ -1884,7 +1905,8 @@
 def _sync_flush(f):
     """Ensure changes to file f are physically on disk."""
     f.flush()
-    os.fsync(f.fileno())
+    if hasattr(os, 'fsync'):
+        os.fsync(f.fileno())
 
 def _sync_close(f):
     """Close file f, ensuring all changes are physically on disk."""
diff --git a/Lib/pty.py b/Lib/pty.py
index 889113c..d3eb64f 100644
--- a/Lib/pty.py
+++ b/Lib/pty.py
@@ -121,7 +121,9 @@
         # Explicitly open the tty to make it become a controlling tty.
         tmp_fd = os.open(os.ttyname(STDOUT_FILENO), os.O_RDWR)
         os.close(tmp_fd)
-
+    else:
+        os.close(slave_fd)
+        
     # Parent and child process.
     return pid, master_fd
 
diff --git a/Lib/pydoc.py b/Lib/pydoc.py
index 3459fbe..d1ab01c 100755
--- a/Lib/pydoc.py
+++ b/Lib/pydoc.py
@@ -1741,6 +1741,9 @@
 Sorry, topic and keyword documentation is not available because the Python
 HTML documentation files could not be found.  If you have installed them,
 please set the environment variable PYTHONDOCS to indicate their location.
+
+On the Microsoft Windows operating system, the files can be built by
+running "hh -decompile . PythonNN.chm" in the C:\PythonNN\Doc> directory.
 ''')
             return
         target = self.topics.get(topic, self.keywords.get(topic))
diff --git a/Lib/random.py b/Lib/random.py
index ae2d434..b80f1a1 100644
--- a/Lib/random.py
+++ b/Lib/random.py
@@ -205,7 +205,7 @@
             raise ValueError, "empty range for randrange()"
 
         if n >= maxwidth:
-            return istart + self._randbelow(n)
+            return istart + istep*self._randbelow(n)
         return istart + istep*int(self.random() * n)
 
     def randint(self, a, b):
diff --git a/Lib/sre_parse.py b/Lib/sre_parse.py
index 319bf43..e63f2ac 100644
--- a/Lib/sre_parse.py
+++ b/Lib/sre_parse.py
@@ -134,6 +134,8 @@
     def __delitem__(self, index):
         del self.data[index]
     def __getitem__(self, index):
+        if isinstance(index, slice):
+            return SubPattern(self.pattern, self.data[index])
         return self.data[index]
     def __setitem__(self, index, code):
         self.data[index] = code
diff --git a/Lib/subprocess.py b/Lib/subprocess.py
index 68ab05e..62b70ba 100644
--- a/Lib/subprocess.py
+++ b/Lib/subprocess.py
@@ -166,7 +166,7 @@
 communicate(input=None)
     Interact with process: Send data to stdin.  Read data from stdout
     and stderr, until end-of-file is reached.  Wait for process to
-    terminate.  The optional stdin argument should be a string to be
+    terminate.  The optional input argument should be a string to be
     sent to the child process, or None, if no data should be sent to
     the child.
 
@@ -1005,8 +1005,12 @@
 
                     # Close pipe fds.  Make sure we don't close the same
                     # fd more than once, or standard fds.
-                    for fd in set((p2cread, c2pwrite, errwrite))-set((0,1,2)):
-                        if fd: os.close(fd)
+                    if p2cread and p2cread not in (0,):
+                        os.close(p2cread)
+                    if c2pwrite and c2pwrite not in (p2cread, 1):
+                        os.close(c2pwrite)
+                    if errwrite and errwrite not in (p2cread, c2pwrite, 2):
+                        os.close(errwrite)
 
                     # Close all other fds, if asked for
                     if close_fds:
@@ -1108,6 +1112,7 @@
                 read_set.append(self.stderr)
                 stderr = []
 
+            input_offset = 0
             while read_set or write_set:
                 rlist, wlist, xlist = select.select(read_set, write_set, [])
 
@@ -1115,9 +1120,9 @@
                     # When select has indicated that the file is writable,
                     # we can write up to PIPE_BUF bytes without risk
                     # blocking.  POSIX defines PIPE_BUF >= 512
-                    bytes_written = os.write(self.stdin.fileno(), input[:512])
-                    input = input[bytes_written:]
-                    if not input:
+                    bytes_written = os.write(self.stdin.fileno(), buffer(input, input_offset, 512))
+                    input_offset += bytes_written 
+                    if input_offset >= len(input):
                         self.stdin.close()
                         write_set.remove(self.stdin)
 
diff --git a/Lib/tarfile.py b/Lib/tarfile.py
index 14553a7..3ffdff3 100644
--- a/Lib/tarfile.py
+++ b/Lib/tarfile.py
@@ -147,7 +147,10 @@
     # There are two possible encodings for a number field, see
     # itn() below.
     if s[0] != chr(0200):
-        n = int(s.rstrip(NUL + " ") or "0", 8)
+        try:
+            n = int(s.rstrip(NUL + " ") or "0", 8)
+        except ValueError:
+            raise HeaderError("invalid header")
     else:
         n = 0L
         for i in xrange(len(s) - 1):
@@ -282,6 +285,9 @@
 class StreamError(TarError):
     """Exception for unsupported operations on stream-like TarFiles."""
     pass
+class HeaderError(TarError):
+    """Exception for invalid headers."""
+    pass
 
 #---------------------------
 # internal stream interface
@@ -624,64 +630,158 @@
 #------------------------
 # Extraction file object
 #------------------------
-class ExFileObject(object):
-    """File-like object for reading an archive member.
-       Is returned by TarFile.extractfile(). Support for
-       sparse files included.
+class _FileInFile(object):
+    """A thin wrapper around an existing file object that
+       provides a part of its data as an individual file
+       object.
     """
 
-    def __init__(self, tarfile, tarinfo):
-        self.fileobj = tarfile.fileobj
-        self.name    = tarinfo.name
-        self.mode    = "r"
-        self.closed  = False
-        self.offset  = tarinfo.offset_data
-        self.size    = tarinfo.size
-        self.pos     = 0L
-        self.linebuffer = ""
-        if tarinfo.issparse():
-            self.sparse = tarinfo.sparse
-            self.read = self._readsparse
-        else:
-            self.read = self._readnormal
+    def __init__(self, fileobj, offset, size, sparse=None):
+        self.fileobj = fileobj
+        self.offset = offset
+        self.size = size
+        self.sparse = sparse
+        self.position = 0
 
-    def __read(self, size):
-        """Overloadable read method.
+    def tell(self):
+        """Return the current file position.
         """
+        return self.position
+
+    def seek(self, position):
+        """Seek to a position in the file.
+        """
+        self.position = position
+
+    def read(self, size=None):
+        """Read data from the file.
+        """
+        if size is None:
+            size = self.size - self.position
+        else:
+            size = min(size, self.size - self.position)
+
+        if self.sparse is None:
+            return self.readnormal(size)
+        else:
+            return self.readsparse(size)
+
+    def readnormal(self, size):
+        """Read operation for regular files.
+        """
+        self.fileobj.seek(self.offset + self.position)
+        self.position += size
         return self.fileobj.read(size)
 
-    def readline(self, size=-1):
-        """Read a line with approx. size. If size is negative,
-           read a whole line. readline() and read() must not
-           be mixed up (!).
+    def readsparse(self, size):
+        """Read operation for sparse files.
         """
-        if size < 0:
-            size = sys.maxint
+        data = []
+        while size > 0:
+            buf = self.readsparsesection(size)
+            if not buf:
+                break
+            size -= len(buf)
+            data.append(buf)
+        return "".join(data)
 
-        nl = self.linebuffer.find("\n")
-        if nl >= 0:
-            nl = min(nl, size)
+    def readsparsesection(self, size):
+        """Read a single section of a sparse file.
+        """
+        section = self.sparse.find(self.position)
+
+        if section is None:
+            return ""
+
+        size = min(size, section.offset + section.size - self.position)
+
+        if isinstance(section, _data):
+            realpos = section.realpos + self.position - section.offset
+            self.fileobj.seek(self.offset + realpos)
+            self.position += size
+            return self.fileobj.read(size)
         else:
-            size -= len(self.linebuffer)
-            while (nl < 0 and size > 0):
-                buf = self.read(min(size, 100))
-                if not buf:
+            self.position += size
+            return NUL * size
+#class _FileInFile
+
+
+class ExFileObject(object):
+    """File-like object for reading an archive member.
+       Is returned by TarFile.extractfile().
+    """
+    blocksize = 1024
+
+    def __init__(self, tarfile, tarinfo):
+        self.fileobj = _FileInFile(tarfile.fileobj,
+                                   tarinfo.offset_data,
+                                   tarinfo.size,
+                                   getattr(tarinfo, "sparse", None))
+        self.name = tarinfo.name
+        self.mode = "r"
+        self.closed = False
+        self.size = tarinfo.size
+
+        self.position = 0
+        self.buffer = ""
+
+    def read(self, size=None):
+        """Read at most size bytes from the file. If size is not
+           present or None, read all data until EOF is reached.
+        """
+        if self.closed:
+            raise ValueError("I/O operation on closed file")
+
+        buf = ""
+        if self.buffer:
+            if size is None:
+                buf = self.buffer
+                self.buffer = ""
+            else:
+                buf = self.buffer[:size]
+                self.buffer = self.buffer[size:]
+
+        if size is None:
+            buf += self.fileobj.read()
+        else:
+            buf += self.fileobj.read(size - len(buf))
+
+        self.position += len(buf)
+        return buf
+
+    def readline(self, size=-1):
+        """Read one entire line from the file. If size is present
+           and non-negative, return a string with at most that
+           size, which may be an incomplete line.
+        """
+        if self.closed:
+            raise ValueError("I/O operation on closed file")
+
+        if "\n" in self.buffer:
+            pos = self.buffer.find("\n") + 1
+        else:
+            buffers = [self.buffer]
+            while True:
+                buf = self.fileobj.read(self.blocksize)
+                buffers.append(buf)
+                if not buf or "\n" in buf:
+                    self.buffer = "".join(buffers)
+                    pos = self.buffer.find("\n") + 1
+                    if pos == 0:
+                        # no newline found.
+                        pos = len(self.buffer)
                     break
-                self.linebuffer += buf
-                size -= len(buf)
-                nl = self.linebuffer.find("\n")
-            if nl == -1:
-                s = self.linebuffer
-                self.linebuffer = ""
-                return s
-        buf = self.linebuffer[:nl]
-        self.linebuffer = self.linebuffer[nl + 1:]
-        while buf[-1:] == "\r":
-            buf = buf[:-1]
-        return buf + "\n"
+
+        if size != -1:
+            pos = min(size, pos)
+
+        buf = self.buffer[:pos]
+        self.buffer = self.buffer[pos:]
+        self.position += len(buf)
+        return buf
 
     def readlines(self):
-        """Return a list with all (following) lines.
+        """Return a list with all remaining lines.
         """
         result = []
         while True:
@@ -690,74 +790,34 @@
             result.append(line)
         return result
 
-    def _readnormal(self, size=None):
-        """Read operation for regular files.
-        """
-        if self.closed:
-            raise ValueError("file is closed")
-        self.fileobj.seek(self.offset + self.pos)
-        bytesleft = self.size - self.pos
-        if size is None:
-            bytestoread = bytesleft
-        else:
-            bytestoread = min(size, bytesleft)
-        self.pos += bytestoread
-        return self.__read(bytestoread)
-
-    def _readsparse(self, size=None):
-        """Read operation for sparse files.
-        """
-        if self.closed:
-            raise ValueError("file is closed")
-
-        if size is None:
-            size = self.size - self.pos
-
-        data = []
-        while size > 0:
-            buf = self._readsparsesection(size)
-            if not buf:
-                break
-            size -= len(buf)
-            data.append(buf)
-        return "".join(data)
-
-    def _readsparsesection(self, size):
-        """Read a single section of a sparse file.
-        """
-        section = self.sparse.find(self.pos)
-
-        if section is None:
-            return ""
-
-        toread = min(size, section.offset + section.size - self.pos)
-        if isinstance(section, _data):
-            realpos = section.realpos + self.pos - section.offset
-            self.pos += toread
-            self.fileobj.seek(self.offset + realpos)
-            return self.__read(toread)
-        else:
-            self.pos += toread
-            return NUL * toread
-
     def tell(self):
         """Return the current file position.
         """
-        return self.pos
+        if self.closed:
+            raise ValueError("I/O operation on closed file")
 
-    def seek(self, pos, whence=0):
+        return self.position
+
+    def seek(self, pos, whence=os.SEEK_SET):
         """Seek to a position in the file.
         """
-        self.linebuffer = ""
-        if whence == 0:
-            self.pos = min(max(pos, 0), self.size)
-        if whence == 1:
+        if self.closed:
+            raise ValueError("I/O operation on closed file")
+
+        if whence == os.SEEK_SET:
+            self.position = min(max(pos, 0), self.size)
+        elif whence == os.SEEK_CUR:
             if pos < 0:
-                self.pos = max(self.pos + pos, 0)
+                self.position = max(self.position + pos, 0)
             else:
-                self.pos = min(self.pos + pos, self.size)
-        if whence == 2:
-            self.pos = max(min(self.size + pos, self.size), 0)
+                self.position = min(self.position + pos, self.size)
+        elif whence == os.SEEK_END:
+            self.position = max(min(self.size + pos, self.size), 0)
+        else:
+            raise ValueError("Invalid argument")
+
+        self.buffer = ""
+        self.fileobj.seek(self.position)
 
     def close(self):
         """Close the file object.
@@ -765,20 +825,13 @@
         self.closed = True
 
     def __iter__(self):
-        """Get an iterator over the file object.
+        """Get an iterator over the file's lines.
         """
-        if self.closed:
-            raise ValueError("I/O operation on closed file")
-        return self
-
-    def next(self):
-        """Get the next item from the file iterator.
-        """
-        result = self.readline()
-        if not result:
-            raise StopIteration
-        return result
-
+        while True:
+            line = self.readline()
+            if not line:
+                break
+            yield line
 #class ExFileObject
 
 #------------------
@@ -821,9 +874,13 @@
         """Construct a TarInfo object from a 512 byte string buffer.
         """
         if len(buf) != BLOCKSIZE:
-            raise ValueError("truncated header")
+            raise HeaderError("truncated header")
         if buf.count(NUL) == BLOCKSIZE:
-            raise ValueError("empty header")
+            raise HeaderError("empty header")
+
+        chksum = nti(buf[148:156])
+        if chksum not in calc_chksums(buf):
+            raise HeaderError("bad checksum")
 
         tarinfo = cls()
         tarinfo.buf = buf
@@ -833,7 +890,7 @@
         tarinfo.gid = nti(buf[116:124])
         tarinfo.size = nti(buf[124:136])
         tarinfo.mtime = nti(buf[136:148])
-        tarinfo.chksum = nti(buf[148:156])
+        tarinfo.chksum = chksum
         tarinfo.type = buf[156:157]
         tarinfo.linkname = buf[157:257].rstrip(NUL)
         tarinfo.uname = buf[265:297].rstrip(NUL)
@@ -845,8 +902,6 @@
         if prefix and not tarinfo.issparse():
             tarinfo.name = prefix + "/" + tarinfo.name
 
-        if tarinfo.chksum not in calc_chksums(buf):
-            raise ValueError("invalid header")
         return tarinfo
 
     def tobuf(self, posix=False):
@@ -999,7 +1054,7 @@
            can be determined, `mode' is overridden by `fileobj's mode.
            `fileobj' is not closed, when TarFile is closed.
         """
-        self.name = name
+        self.name = os.path.abspath(name)
 
         if len(mode) > 1 or mode not in "raw":
             raise ValueError("mode must be 'r', 'a' or 'w'")
@@ -1011,7 +1066,7 @@
             self._extfileobj = False
         else:
             if self.name is None and hasattr(fileobj, "name"):
-                self.name = fileobj.name
+                self.name = os.path.abspath(fileobj.name)
             if hasattr(fileobj, "mode"):
                 self.mode = fileobj.mode
             self._extfileobj = True
@@ -1088,9 +1143,13 @@
             # Find out which *open() is appropriate for opening the file.
             for comptype in cls.OPEN_METH:
                 func = getattr(cls, cls.OPEN_METH[comptype])
+                if fileobj is not None:
+                    saved_pos = fileobj.tell()
                 try:
                     return func(name, "r", fileobj)
                 except (ReadError, CompressionError):
+                    if fileobj is not None:
+                        fileobj.seek(saved_pos)
                     continue
             raise ReadError("file could not be opened successfully")
 
@@ -1147,24 +1206,12 @@
         except (ImportError, AttributeError):
             raise CompressionError("gzip module is not available")
 
-        pre, ext = os.path.splitext(name)
-        pre = os.path.basename(pre)
-        if ext == ".tgz":
-            ext = ".tar"
-        if ext == ".gz":
-            ext = ""
-        tarname = pre + ext
-
         if fileobj is None:
             fileobj = _open(name, mode + "b")
 
-        if mode != "r":
-            name = tarname
-
         try:
-            t = cls.taropen(tarname, mode,
-                gzip.GzipFile(name, mode, compresslevel, fileobj)
-            )
+            t = cls.taropen(name, mode,
+                gzip.GzipFile(name, mode, compresslevel, fileobj))
         except IOError:
             raise ReadError("not a gzip file")
         t._extfileobj = False
@@ -1183,21 +1230,13 @@
         except ImportError:
             raise CompressionError("bz2 module is not available")
 
-        pre, ext = os.path.splitext(name)
-        pre = os.path.basename(pre)
-        if ext == ".tbz2":
-            ext = ".tar"
-        if ext == ".bz2":
-            ext = ""
-        tarname = pre + ext
-
         if fileobj is not None:
             fileobj = _BZ2Proxy(fileobj, mode)
         else:
             fileobj = bz2.BZ2File(name, mode, compresslevel=compresslevel)
 
         try:
-            t = cls.taropen(tarname, mode, fileobj)
+            t = cls.taropen(name, mode, fileobj)
         except IOError:
             raise ReadError("not a bzip2 file")
         t._extfileobj = False
@@ -1402,8 +1441,7 @@
             arcname = name
 
         # Skip if somebody tries to archive the archive...
-        if self.name is not None \
-            and os.path.abspath(name) == os.path.abspath(self.name):
+        if self.name is not None and os.path.abspath(name) == self.name:
             self._dbg(2, "tarfile: Skipped %r" % name)
             return
 
@@ -1795,16 +1833,14 @@
 
                 tarinfo = self.proc_member(tarinfo)
 
-            except ValueError, e:
+            except HeaderError, e:
                 if self.ignore_zeros:
-                    self._dbg(2, "0x%X: empty or invalid block: %s" %
-                              (self.offset, e))
+                    self._dbg(2, "0x%X: %s" % (self.offset, e))
                     self.offset += BLOCKSIZE
                     continue
                 else:
                     if self.offset == 0:
-                        raise ReadError("empty, unreadable or compressed "
-                                        "file: %s" % e)
+                        raise ReadError(str(e))
                     return None
             break
 
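
Illustrative only, not part of the patch: a minimal sketch of the reworked
tarfile behaviour above, assuming a hypothetical archive "sample.tar" with a
regular member "member.txt".  TarInfo.frombuf() now raises tarfile.HeaderError
(instead of ValueError) for truncated, empty or checksum-damaged headers, and
ExFileObject.seek() understands the os.SEEK_* constants.

    import os
    import tarfile

    # Damaged headers now surface as HeaderError.
    try:
        tarfile.TarInfo.frombuf("\0" * 512)
    except tarfile.HeaderError, e:
        print "bad header:", e                 # "empty header"

    # ExFileObject.seek() accepts os.SEEK_SET/SEEK_CUR/SEEK_END and rejects
    # anything else with ValueError; tell() fails once the file is closed.
    tar = tarfile.open("sample.tar")           # hypothetical archive
    fobj = tar.extractfile("member.txt")       # hypothetical member
    fobj.seek(0, os.SEEK_END)
    size = fobj.tell()
    fobj.seek(-min(size, 10), os.SEEK_CUR)     # step back a few bytes
    tail = fobj.read()
    tar.close()
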
diff --git a/Lib/test/test_compile.py b/Lib/test/test_compile.py
index 73ef2d4..b517daa 100644
--- a/Lib/test/test_compile.py
+++ b/Lib/test/test_compile.py
@@ -1,5 +1,4 @@
 import unittest
-import warnings
 import sys
 from test import test_support
 
diff --git a/Lib/test/test_deque.py b/Lib/test/test_deque.py
index 4c5d1ee..56031a7 100644
--- a/Lib/test/test_deque.py
+++ b/Lib/test/test_deque.py
@@ -396,6 +396,12 @@
         d.pop()
         self.assertRaises(RuntimeError, it.next)
 
+    def test_runtime_error_on_empty_deque(self):
+        d = deque()
+        it = iter(d)
+        d.append(10)
+        self.assertRaises(RuntimeError, it.next)
+
 class Deque(deque):
     pass
 
diff --git a/Lib/test/test_dumbdbm.py b/Lib/test/test_dumbdbm.py
index 63b14b0..e5dfe1d 100644
--- a/Lib/test/test_dumbdbm.py
+++ b/Lib/test/test_dumbdbm.py
@@ -38,6 +38,24 @@
         self.read_helper(f)
         f.close()
 
+    def test_dumbdbm_creation_mode(self):
+        # On platforms without chmod, don't do anything.
+        if not (hasattr(os, 'chmod') and hasattr(os, 'umask')):
+            return
+
+        try:
+            old_umask = os.umask(0002)
+            f = dumbdbm.open(_fname, 'c', 0637)
+            f.close()
+        finally:
+            os.umask(old_umask)
+            
+        import stat
+        st = os.stat(_fname + '.dat')
+        self.assertEqual(stat.S_IMODE(st.st_mode), 0635)
+        st = os.stat(_fname + '.dir')
+        self.assertEqual(stat.S_IMODE(st.st_mode), 0635)
+        
     def test_close_twice(self):
         f = dumbdbm.open(_fname)
         f['a'] = 'b'
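
As an aside (not part of the patch), the mode the new dumbdbm test expects is
just the requested mode filtered through the process umask:

    requested = 0637
    umask     = 0002
    assert requested & ~umask == 0635    # what stat.S_IMODE() should report
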
diff --git a/Lib/test/test_exceptions.py b/Lib/test/test_exceptions.py
index abce41e..7619aae 100644
--- a/Lib/test/test_exceptions.py
+++ b/Lib/test/test_exceptions.py
@@ -3,7 +3,6 @@
 import os
 import sys
 import unittest
-import warnings
 import pickle
 try:
     import cPickle
diff --git a/Lib/test/test_heapq.py b/Lib/test/test_heapq.py
index 1916449..b652d41 100644
--- a/Lib/test/test_heapq.py
+++ b/Lib/test/test_heapq.py
@@ -104,20 +104,20 @@
             self.assertEqual(heap_sorted, sorted(data))
 
     def test_nsmallest(self):
-        data = [random.randrange(2000) for i in range(1000)]
-        f = lambda x:  x * 547 % 2000
-        for n in (0, 1, 2, 10, 100, 400, 999, 1000, 1100):
-            self.assertEqual(nsmallest(n, data), sorted(data)[:n])
-            self.assertEqual(nsmallest(n, data, key=f),
-                             sorted(data, key=f)[:n])
+        data = [(random.randrange(2000), i) for i in range(1000)]
+        for f in (None, lambda x:  x[0] * 547 % 2000):
+            for n in (0, 1, 2, 10, 100, 400, 999, 1000, 1100):
+                self.assertEqual(nsmallest(n, data), sorted(data)[:n])
+                self.assertEqual(nsmallest(n, data, key=f),
+                                 sorted(data, key=f)[:n])
 
     def test_nlargest(self):
-        data = [random.randrange(2000) for i in range(1000)]
-        f = lambda x:  x * 547 % 2000
-        for n in (0, 1, 2, 10, 100, 400, 999, 1000, 1100):
-            self.assertEqual(nlargest(n, data), sorted(data, reverse=True)[:n])
-            self.assertEqual(nlargest(n, data, key=f),
-                             sorted(data, key=f, reverse=True)[:n])
+        data = [(random.randrange(2000), i) for i in range(1000)]
+        for f in (None, lambda x:  x[0] * 547 % 2000):
+            for n in (0, 1, 2, 10, 100, 400, 999, 1000, 1100):
+                self.assertEqual(nlargest(n, data), sorted(data, reverse=True)[:n])
+                self.assertEqual(nlargest(n, data, key=f),
+                                 sorted(data, key=f, reverse=True)[:n])
 
 
 #==============================================================================
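
Not part of the patch, just a small sketch of the key= behaviour the rewritten
heapq tests exercise (the data here is made up); without a key the tuples
themselves are compared, so equal first elements are tie-broken
deterministically by the second one:

    from heapq import nlargest, nsmallest

    data = [(3, 'c'), (1, 'a'), (2, 'b'), (1, 'd')]
    print nsmallest(2, data)                        # [(1, 'a'), (1, 'd')]
    print nlargest(2, data, key=lambda x: x[0])     # [(3, 'c'), (2, 'b')]
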
diff --git a/Lib/test/test_import.py b/Lib/test/test_import.py
index e37378f..58de944 100644
--- a/Lib/test/test_import.py
+++ b/Lib/test/test_import.py
@@ -1,10 +1,11 @@
-from test.test_support import TESTFN, run_unittest
+from test.test_support import TESTFN, run_unittest, guard_warnings_filter
 
 import unittest
 import os
 import random
 import sys
 import py_compile
+import warnings
 
 
 def remove_files(name):
@@ -204,15 +205,11 @@
         self.assert_(y is test.test_support, y.__name__)
 
     def test_import_initless_directory_warning(self):
-        import warnings
-        oldfilters = warnings.filters[:]
-        warnings.simplefilter('error', ImportWarning);
-        try:
+        with guard_warnings_filter():
             # Just a random non-package directory we always expect to be
             # somewhere in sys.path...
+            warnings.simplefilter('error', ImportWarning)
             self.assertRaises(ImportWarning, __import__, "site-packages")
-        finally:
-            warnings.filters = oldfilters
 
 def test_main(verbose=None):
     run_unittest(ImportTest)
diff --git a/Lib/test/test_pty.py b/Lib/test/test_pty.py
index 59e5162..8a83e39 100644
--- a/Lib/test/test_pty.py
+++ b/Lib/test/test_pty.py
@@ -115,6 +115,12 @@
     os._exit(4)
 else:
     debug("Waiting for child (%d) to finish."%pid)
+    ##line = os.read(master_fd, 80)
+    ##lines = line.replace('\r\n', '\n').split('\n')
+    ##if False and lines != ['In child, calling os.setsid()',
+    ##             'Good: OSError was raised.', '']:
+    ##    raise TestFailed("Unexpected output from child: %r" % line)
+            
     (pid, status) = os.waitpid(pid, 0)
     res = status >> 8
     debug("Child (%d) exited with status %d (%d)."%(pid, res, status))
@@ -127,6 +133,15 @@
     elif res != 4:
         raise TestFailed, "pty.fork() failed for unknown reasons."
 
+    ##debug("Reading from master_fd now that the child has exited")
+    ##try:
+    ##    s1 = os.read(master_fd, 1024)
+    ##except os.error:
+    ##    pass
+    ##else:
+    ##    raise TestFailed("Read from master_fd did not raise exception")
+    
+    
 os.close(master_fd)
 
 # pty.fork() passed.
diff --git a/Lib/test/test_random.py b/Lib/test/test_random.py
index afcf113..7ec130d 100644
--- a/Lib/test/test_random.py
+++ b/Lib/test/test_random.py
@@ -180,10 +180,9 @@
 
     def test_bigrand(self):
         # Verify warnings are raised when randrange is too large for random()
-        oldfilters = warnings.filters[:]
-        warnings.filterwarnings("error", "Underlying random")
-        self.assertRaises(UserWarning, self.gen.randrange, 2**60)
-        warnings.filters[:] = oldfilters
+        with test_support.guard_warnings_filter():
+            warnings.filterwarnings("error", "Underlying random")
+            self.assertRaises(UserWarning, self.gen.randrange, 2**60)
 
 class SystemRandom_TestBasicOps(TestBasicOps):
     gen = random.SystemRandom()
@@ -441,6 +440,14 @@
             self.assertEqual(k, numbits)        # note the stronger assertion
             self.assert_(2**k > n > 2**(k-1))   # note the stronger assertion
 
+    def test_randrange_bug_1590891(self):
+        start = 1000000000000
+        stop = -100000000000000000000
+        step = -200
+        x = self.gen.randrange(start, stop, step)
+        self.assert_(stop < x <= start)
+        self.assertEqual((x+stop)%step, 0)
+
 _gammacoeff = (0.9999999999995183, 676.5203681218835, -1259.139216722289,
               771.3234287757674,  -176.6150291498386, 12.50734324009056,
               -0.1385710331296526, 0.9934937113930748e-05, 0.1659470187408462e-06)
diff --git a/Lib/test/test_repr.py b/Lib/test/test_repr.py
index 1dfa282..823298b 100644
--- a/Lib/test/test_repr.py
+++ b/Lib/test/test_repr.py
@@ -136,7 +136,6 @@
             '<built-in method split of str object at 0x'))
 
     def test_xrange(self):
-        import warnings
         eq = self.assertEquals
         eq(repr(xrange(1)), 'xrange(1)')
         eq(repr(xrange(1, 2)), 'xrange(1, 2)')
diff --git a/Lib/test/test_set.py b/Lib/test/test_set.py
index a4830d4..6641ff8 100644
--- a/Lib/test/test_set.py
+++ b/Lib/test/test_set.py
@@ -21,6 +21,11 @@
     def __eq__(self, other):
         raise RuntimeError
 
+class ReprWrapper:
+    'Used to test self-referential repr() calls'
+    def __repr__(self):
+        return repr(self.value)
+
 class TestJointOps(unittest.TestCase):
     # Tests common to both set and frozenset
 
@@ -244,6 +249,30 @@
             self.assertRaises(RuntimeError, s.discard, BadCmp())
             self.assertRaises(RuntimeError, s.remove, BadCmp())
 
+    def test_cyclical_repr(self):
+        w = ReprWrapper()
+        s = self.thetype([w])
+        w.value = s
+        if self.thetype == set:
+            self.assertEqual(repr(s), '{set(...)}')
+        else:
+            name = repr(s).partition('(')[0]    # strip class name
+            self.assertEqual(repr(s), '%s([%s(...)])' % (name, name))
+
+    def test_cyclical_print(self):
+        w = ReprWrapper()
+        s = self.thetype([w])
+        w.value = s
+        try:
+            fo = open(test_support.TESTFN, "wb")
+            print >> fo, s,
+            fo.close()
+            fo = open(test_support.TESTFN, "rb")
+            self.assertEqual(fo.read(), repr(s))
+        finally:
+            fo.close()
+            os.remove(test_support.TESTFN)
+
 class TestSet(TestJointOps):
     thetype = set
 
diff --git a/Lib/test/test_struct.py b/Lib/test/test_struct.py
index 302698b..d4744dd 100644
--- a/Lib/test/test_struct.py
+++ b/Lib/test/test_struct.py
@@ -50,22 +50,17 @@
 
 def with_warning_restore(func):
     def _with_warning_restore(*args, **kw):
-        # The `warnings` module doesn't have an advertised way to restore
-        # its filter list.  Cheat.
-        save_warnings_filters = warnings.filters[:]
-        # Grrr, we need this function to warn every time.  Without removing
-        # the warningregistry, running test_tarfile then test_struct would fail
-        # on 64-bit platforms.
-        globals = func.func_globals
-        if '__warningregistry__' in globals:
-            del globals['__warningregistry__']
-        warnings.filterwarnings("error", r"""^struct.*""", DeprecationWarning)
-        warnings.filterwarnings("error", r""".*format requires.*""",
-                                DeprecationWarning)
-        try:
+        with test.test_support.guard_warnings_filter():
+            # Grrr, we need this function to warn every time.  Without removing
+            # the warningregistry, running test_tarfile then test_struct would fail
+            # on 64-bit platforms.
+            globals = func.func_globals
+            if '__warningregistry__' in globals:
+                del globals['__warningregistry__']
+            warnings.filterwarnings("error", r"""^struct.*""", DeprecationWarning)
+            warnings.filterwarnings("error", r""".*format requires.*""",
+                                    DeprecationWarning)
             return func(*args, **kw)
-        finally:
-            warnings.filters[:] = save_warnings_filters[:]
     return _with_warning_restore
 
 def deprecated_err(func, *args):
diff --git a/Lib/test/test_support.py b/Lib/test/test_support.py
index 2829c55..2c19698 100644
--- a/Lib/test/test_support.py
+++ b/Lib/test/test_support.py
@@ -3,7 +3,9 @@
 if __name__ != 'test.test_support':
     raise ImportError, 'test_support must be imported from the test package'
 
+from contextlib import contextmanager
 import sys
+import warnings
 
 class Error(Exception):
     """Base class for regression test exceptions."""
@@ -267,6 +269,48 @@
     print >> get_original_stdout(), '\tfetching %s ...' % url
     fn, _ = urllib.urlretrieve(url, filename)
     return open(fn)
+    
+@contextmanager
+def guard_warnings_filter():
+    """Guard the warnings filter from being permanently changed."""
+    original_filters = warnings.filters[:]
+    try:
+        yield
+    finally:
+        warnings.filters = original_filters
+
+class EnvironmentVarGuard(object):
+
+    """Helper class for saving and restoring environment variables.  Can be
+    used as a context manager."""
+
+    def __init__(self):
+        from os import environ
+        self._environ = environ
+        self._unset = set()
+        self._reset = dict()
+
+    def set(self, envvar, value):
+        if envvar not in self._environ:
+            self._unset.add(envvar)
+        else:
+            self._reset[envvar] = self._environ[envvar]
+        self._environ[envvar] = value
+
+    def unset(self, envvar):
+        if envvar in self._environ:
+            self._reset[envvar] = self._environ[envvar]
+            del self._environ[envvar]
+
+    def __enter__(self):
+        return self
+
+    def __exit__(self, *ignore_exc):
+        for envvar, value in self._reset.iteritems():
+            self._environ[envvar] = value
+        for unset in self._unset:
+            del self._environ[unset]
+
 
 #=======================================================================
 # Decorator for running a function in a different locale, correctly resetting
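
A minimal usage sketch (illustrative, not part of the patch) for the two
helpers added to test.test_support above; the environment variable name is
made up:

    from __future__ import with_statement
    import warnings
    from test import test_support

    # Filter changes made inside the block are rolled back on exit.
    with test_support.guard_warnings_filter():
        warnings.simplefilter("error", DeprecationWarning)
        pass  # run the code under test here

    # Environment changes are rolled back the same way.
    with test_support.EnvironmentVarGuard() as env:
        env.set("SOME_TEST_VAR", "value")    # hypothetical variable
        pass  # run the code under test here
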
diff --git a/Lib/test/test_tarfile.py b/Lib/test/test_tarfile.py
index 0cebb29..2b39715 100644
--- a/Lib/test/test_tarfile.py
+++ b/Lib/test/test_tarfile.py
@@ -110,7 +110,7 @@
         """Test seek() method of _FileObject, incl. random reading.
         """
         if self.sep != "|":
-            filename = "0-REGTYPE"
+            filename = "0-REGTYPE-TEXT"
             self.tar.extract(filename, dirname())
             f = open(os.path.join(dirname(), filename), "rb")
             data = f.read()
@@ -149,6 +149,16 @@
             s2 = fobj.readlines()
             self.assert_(s1 == s2,
                          "readlines() after seek failed")
+            fobj.seek(0)
+            self.assert_(len(fobj.readline()) == fobj.tell(),
+                         "tell() after readline() failed")
+            fobj.seek(512)
+            self.assert_(len(fobj.readline()) + 512 == fobj.tell(),
+                         "tell() after seek() and readline() failed")
+            fobj.seek(0)
+            line = fobj.readline()
+            self.assert_(fobj.read() == data[len(line):],
+                         "read() after readline() failed")
             fobj.close()
 
     def test_old_dirtype(self):
@@ -280,6 +290,20 @@
             else:
                 self.dst.addfile(tarinfo, f)
 
+    def test_add_self(self):
+        dstname = os.path.abspath(self.dstname)
+
+        self.assertEqual(self.dst.name, dstname, "archive name must be absolute")
+
+        self.dst.add(dstname)
+        self.assertEqual(self.dst.getnames(), [], "added the archive to itself")
+
+        cwd = os.getcwd()
+        os.chdir(dirname())
+        self.dst.add(dstname)
+        os.chdir(cwd)
+        self.assertEqual(self.dst.getnames(), [], "added the archive to itself")
+
 
 class Write100Test(BaseTest):
     # The name field in a tar header stores strings of at most 100 chars.
@@ -601,6 +625,38 @@
         self.assertEqual(tarfile.filemode(0755), '-rwxr-xr-x')
         self.assertEqual(tarfile.filemode(07111), '---s--s--t')
 
+class HeaderErrorTest(unittest.TestCase):
+
+    def test_truncated_header(self):
+        self.assertRaises(tarfile.HeaderError, tarfile.TarInfo.frombuf, "")
+        self.assertRaises(tarfile.HeaderError, tarfile.TarInfo.frombuf, "filename\0")
+        self.assertRaises(tarfile.HeaderError, tarfile.TarInfo.frombuf, "\0" * 511)
+        self.assertRaises(tarfile.HeaderError, tarfile.TarInfo.frombuf, "\0" * 513)
+
+    def test_empty_header(self):
+        self.assertRaises(tarfile.HeaderError, tarfile.TarInfo.frombuf, "\0" * 512)
+
+    def test_invalid_header(self):
+        buf = tarfile.TarInfo("filename").tobuf()
+        buf = buf[:148] + "foo\0\0\0\0\0" + buf[156:] # invalid number field.
+        self.assertRaises(tarfile.HeaderError, tarfile.TarInfo.frombuf, buf)
+
+    def test_bad_checksum(self):
+        buf = tarfile.TarInfo("filename").tobuf()
+        b = buf[:148] + "        " + buf[156:] # clear the checksum field.
+        self.assertRaises(tarfile.HeaderError, tarfile.TarInfo.frombuf, b)
+        b = "a" + buf[1:] # manipulate the buffer, so checksum won't match.
+        self.assertRaises(tarfile.HeaderError, tarfile.TarInfo.frombuf, b)
+
+class OpenFileobjTest(BaseTest):
+    # Test for SF bug #1496501.
+
+    def test_opener(self):
+        fobj = StringIO.StringIO("foo\n")
+        try:
+            tarfile.open("", "r", fileobj=fobj)
+        except tarfile.ReadError:
+            self.assertEqual(fobj.tell(), 0, "fileobj's position has moved")
 
 if bz2:
     # Bzip2 TestCases
@@ -646,6 +702,8 @@
 
     tests = [
         FileModeTest,
+        HeaderErrorTest,
+        OpenFileobjTest,
         ReadTest,
         ReadStreamTest,
         ReadDetectTest,
diff --git a/Lib/test/test_uu.py b/Lib/test/test_uu.py
index 7786316..16a55e4 100644
--- a/Lib/test/test_uu.py
+++ b/Lib/test/test_uu.py
@@ -114,11 +114,11 @@
 
     def test_encode(self):
         try:
-            fin = open(self.tmpin, 'wb')
+            fin = open(self.tmpin, 'w')
             fin.write(plaintext)
             fin.close()
 
-            fin = open(self.tmpin, 'rb')
+            fin = open(self.tmpin, 'r')
             fout = open(self.tmpout, 'w')
             uu.encode(fin, fout, self.tmpin, mode=0644)
             fin.close()
@@ -130,7 +130,7 @@
             self.assertEqual(s, encodedtextwrapped % (0644, self.tmpin))
 
             # in_file and out_file as filenames
-            uu.encode(self.tmpin, self.tmpout, mode=0644)
+            uu.encode(self.tmpin, self.tmpout, self.tmpin, mode=0644)
             fout = open(self.tmpout, 'r')
             s = fout.read()
             fout.close()
@@ -142,11 +142,11 @@
 
     def test_decode(self):
         try:
-            f = open(self.tmpin, 'wb')
+            f = open(self.tmpin, 'w')
             f.write(encodedtextwrapped % (0644, self.tmpout))
             f.close()
 
-            f = open(self.tmpin, 'rb')
+            f = open(self.tmpin, 'r')
             uu.decode(f)
             f.close()
 
@@ -163,11 +163,11 @@
         try:
             f = cStringIO.StringIO(encodedtextwrapped % (0644, self.tmpout))
 
-            f = open(self.tmpin, 'rb')
+            f = open(self.tmpin, 'r')
             uu.decode(f)
             f.close()
 
-            f = open(self.tmpin, 'rb')
+            f = open(self.tmpin, 'r')
             self.assertRaises(uu.Error, uu.decode, f)
             f.close()
         finally:
diff --git a/Lib/test/test_weakref.py b/Lib/test/test_weakref.py
index 1165980..06f4537 100644
--- a/Lib/test/test_weakref.py
+++ b/Lib/test/test_weakref.py
@@ -189,7 +189,7 @@
     # None as the value for the callback, where either means "no
     # callback".  The "no callback" ref and proxy objects are supposed
     # to be shared so long as they exist by all callers so long as
-    # they are active.  In Python 2.3.3 and earlier, this guaranttee
+    # they are active.  In Python 2.3.3 and earlier, this guarantee
     # was not honored, and was broken in different ways for
     # PyWeakref_NewRef() and PyWeakref_NewProxy().  (Two tests.)
 
diff --git a/Lib/threading.py b/Lib/threading.py
index 5655dde..fecd3cc 100644
--- a/Lib/threading.py
+++ b/Lib/threading.py
@@ -636,13 +636,11 @@
         _active_limbo_lock.acquire()
         _active[_get_ident()] = self
         _active_limbo_lock.release()
-        import atexit
-        atexit.register(self.__exitfunc)
 
     def _set_daemon(self):
         return False
 
-    def __exitfunc(self):
+    def _exitfunc(self):
         self._Thread__stop()
         t = _pickSomeNonDaemonThread()
         if t:
@@ -715,9 +713,11 @@
 
 from thread import stack_size
 
-# Create the main thread object
+# Create the main thread object,
+# and make it available for the interpreter
+# (Py_Main) as threading._shutdown.
 
-_MainThread()
+_shutdown = _MainThread()._exitfunc
 
 # get thread-local implementation, either from the thread
 # module, or from the python fallback
diff --git a/Lib/urllib.py b/Lib/urllib.py
index 90f7aa0..27ec2c9 100644
--- a/Lib/urllib.py
+++ b/Lib/urllib.py
@@ -405,8 +405,8 @@
                 h.putheader('Content-Length', '%d' % len(data))
             else:
                 h.putrequest('GET', selector)
-            if proxy_auth: h.putheader('Proxy-Authorization: Basic %s' % proxy_auth)
-            if auth: h.putheader('Authorization: Basic %s' % auth)
+            if proxy_auth: h.putheader('Proxy-Authorization', 'Basic %s' % proxy_auth)
+            if auth: h.putheader('Authorization', 'Basic %s' % auth)
             if realhost: h.putheader('Host', realhost)
             for args in self.addheaders: h.putheader(*args)
             h.endheaders()
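
For context (not part of the patch): httplib's putheader() takes the header
name and its value as separate arguments, which is what the urllib fix above
restores.  Host and credentials below are placeholders:

    import base64
    import httplib

    h = httplib.HTTP("www.example.com")              # placeholder host
    h.putrequest("GET", "/")
    auth = base64.b64encode("user:password")         # placeholder credentials
    h.putheader("Authorization", "Basic %s" % auth)  # name and value apart
    h.putheader("Host", "www.example.com")
    h.endheaders()
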