SF bug #1224621: tokenize module does not detect inconsistent dedents
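
With this fix, tokenizing source whose dedent does not line up with any
enclosing indentation level raises IndentationError instead of being
accepted silently.  A minimal sketch of the new behaviour (the bad_source
string and surrounding scaffolding are illustrative, not part of the patch,
and are written against a current Python for readability):

    import io
    import tokenize

    # The third line dedents to column 4, which was never recorded as an
    # indentation level, so it cannot be matched to an outer block.
    bad_source = (
        "if x:\n"
        "        pass\n"
        "    pass\n"
    )

    try:
        # generate_tokens() takes a readline callable and yields tokens
        # lazily, so the error surfaces while iterating past the bad line.
        for token in tokenize.generate_tokens(io.StringIO(bad_source).readline):
            pass
    except IndentationError as exc:
        print("caught:", exc)
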
diff --git a/Misc/NEWS b/Misc/NEWS
index 80f7dc7..1fcd798 100644
--- a/Misc/NEWS
+++ b/Misc/NEWS
@@ -147,6 +147,9 @@
 Library
 -------
 
+- The tokenize module now detects inconsistent dedents and reports them
+  by raising IndentationError.  Bug #1224621.
+
 - The tokenize module has a new untokenize() function to support a full
   roundtrip from lexed tokens back to Python source code.  In addition,
   the generate_tokens() function now accepts a callable argument that