Fix an issue in the tokenizer where a file is opened by file descriptor, but the underlying PyFileIO object was created with the closefd attribute true. As a result, closing the stream also closed a descriptor the tokenizer still owned.
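
A minimal Python sketch of the resulting double close (the file name "example.txt" is a placeholder):

    import io, os

    fd = os.open("example.txt", os.O_RDONLY)
    f = io.FileIO(fd, "r")         # closefd defaults to True: f now owns fd
    f.close()                      # also closes the underlying descriptor
    try:
        os.close(fd)               # the original owner closes it again...
    except OSError as e:
        print("double close:", e)  # ...EBADF on POSIX; a CRT-level error on Windows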
Also fix error handling for close() in _fileio.c. The check was incorrect, looking for a negative refcount, so errors were never raised. This is why this issue wasn't caught.
There is a second reason why it isn't seen: class IOBase in io.py wraps the close() call in its __del__() method in a bare try:/except:, which also masks these error conditions.
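
A small sketch of that masking pattern, using a hypothetical class that mirrors the try:/except: in IOBase.__del__():

    class FileLike:
        def close(self):
            raise OSError("close failed")   # an error that ought to surface

        def __del__(self):
            try:
                self.close()
            except Exception:
                pass                        # silently swallows the failure

    f = FileLike()
    del f   # the OSError from close() never propagates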

This issue was discovered by removing the _set_invalid_parameter_handler() fiddling, thus enabling the C runtime checks on Windows.
diff --git a/Parser/tokenizer.c b/Parser/tokenizer.c
index ce8129d..3d52bed 100644
--- a/Parser/tokenizer.c
+++ b/Parser/tokenizer.c
@@ -452,8 +452,8 @@
 		stream = PyObject_CallMethod(io, "open", "ssis",
 					     tok->filename, "r", -1, enc);
 	else
-		stream = PyObject_CallMethod(io, "open", "isis",
-				fileno(tok->fp), "r", -1, enc);
+		stream = PyObject_CallMethod(io, "open", "isisOOO",
+				fileno(tok->fp), "r", -1, enc, Py_None, Py_None, Py_False);
 	if (stream == NULL)
 		goto cleanup;
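
For reference, the fixed call is roughly equivalent to the following Python-level open() call; "spam.py" and "utf-8" are placeholders standing in for the tokenizer's descriptor and enc:

    import io, os

    fd = os.open("spam.py", os.O_RDONLY)    # stands in for fileno(tok->fp)
    stream = io.open(fd, "r", -1, "utf-8",  # buffering=-1, encoding=enc
                     None, None, False)     # errors, newline, closefd=False
    text = stream.read()
    stream.close()                          # leaves fd open (closefd=False)
    os.close(fd)                            # the caller still owns the fd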