Add untokenize() function to allow full round-trip tokenization.

Should significantly enhance the utility of the module by supporting
the creation of tools that modify the token stream and write back the
modified result.
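
As an illustration (not part of this patch; the sample source string is
arbitrary), the intended round trip looks roughly like this:

    import tokenize
    from StringIO import StringIO   # Python 2 idioms, as in this patch

    source = "if x == 1:\n    print x\n"

    # Lex the source, keeping only the (type, string) pairs.
    tokens = [tok[:2] for tok in
              tokenize.generate_tokens(StringIO(source).readline)]

    # Rebuild source text from the pairs.  The output is not
    # guaranteed to match the input byte-for-byte, but it tokenizes
    # back to the same token stream.
    newcode = tokenize.untokenize(tokens)
    readline = iter(newcode.splitlines(1)).next
    assert [tok[:2] for tok in
            tokenize.generate_tokens(readline)] == tokens

The second call to generate_tokens() also exercises the other new
behavior: the readline callable here is a bound next() method that
signals end of input by raising StopIteration rather than returning ''.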
diff --git a/Misc/NEWS b/Misc/NEWS
index f15e873..4763598 100644
--- a/Misc/NEWS
+++ b/Misc/NEWS
@@ -141,6 +141,11 @@
 Library
 -------
 
+- The tokenize module has a new untokenize() function to support a full
+  round trip from lexed tokens back to Python source code.  In addition,
+  the readline callable passed to generate_tokens() may now signal the
+  end of input by raising StopIteration instead of returning ''.
+
 - Bug #1196315: fix weakref.WeakValueDictionary constructor.
 
 - Bug #1213894: os.path.realpath didn't resolve symlinks that were the first