tokenizer.c: make coding markup work again.

io.open() now takes all of its parameters positionally; the keyword-only
restriction is gone, so we can conveniently call it from C code.
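
For example, the function can now be called with everything positional.
This is a minimal sketch of what the new signature allows; the filename
and argument values are hypothetical:

    import io

    # encoding and newline no longer have to be passed as keywords:
    f = io.open("spam.txt", "w", None, "utf-8", "\n")
    f.write("hello\n")
    f.close()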

test_tarfile.py no longer uses u"..." literals, but is otherwise still
badly broken.
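
The literal changes are of this kind (a hypothetical example, not a line
taken from the test):

    # Before:
    #     name = u"testtar.tar"
    # Now (plain str literals are already unicode on this branch):
    name = "testtar.tar"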

This is a checkpoint; some more stuff now breaks.
diff --git a/Lib/io.py b/Lib/io.py
index df224e6..f1be881 100644
--- a/Lib/io.py
+++ b/Lib/io.py
@@ -49,7 +49,7 @@
         self.characters_written = characters_written
 
 
-def open(file, mode="r", buffering=None, *, encoding=None, newline=None):
+def open(file, mode="r", buffering=None, encoding=None, newline=None):
     """Replacement for the built-in open function.
 
     Args:
@@ -59,7 +59,6 @@
       buffering: optional int >= 0 giving the buffer size; values
                  can be: 0 = unbuffered, 1 = line buffered,
                  larger = fully buffered.
-    Keywords (for text modes only; *must* be given as keyword arguments):
       encoding: optional string giving the text encoding.
       newline: optional newlines specifier; must be None, '\n' or '\r\n';
                specifies the line ending expected on input and written on