When there is a SyntaxError after reading the last input character from the tokenizer, and no newline follows it, the error message used to be `unexpected EOF while parsing`, which is wrong.
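To make the scenario concrete, here is a minimal sketch of how the bug can be observed from Python code. The helper `error_message` is a hypothetical name introduced for illustration; the exact message text depends on the Python version (the old parser reported the misleading `unexpected EOF while parsing` here), so the example only inspects the message rather than asserting a specific wording.

```python
def error_message(source):
    """Return the SyntaxError message produced by compiling *source*,
    or None if the source compiles cleanly."""
    try:
        compile(source, "<string>", "exec")
    except SyntaxError as exc:
        return exc.msg
    return None

# The source ends right after the "+" token, with no newline following it.
msg = error_message("1 +")
print(msg)  # wording varies by version; the old parser said
            # "unexpected EOF while parsing" for this case
```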
This looks straightforward enough. I have one niggling thought. Why is tok->done set to E_EOF in the first place?
Note that if the tokenizer reaches EOF, it cannot back up, because it would go into an endless loop if it were to do so.
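The point about backing up at EOF can be illustrated with a toy sketch. This is not CPython's actual tokenizer (the class and method names here are hypothetical); it only shows why pushing EOF back onto the input stream would make a read/back-up loop never terminate, which is why reaching EOF has to be recorded as a terminal state instead.

```python
EOF = ""  # sentinel for end of input, reported again on every read

class TinyTokenizer:
    """Toy character reader (hypothetical; not CPython's tokenizer)."""

    def __init__(self, text):
        self.text = text
        self.pos = 0
        self.done = False  # analogous to tok->done = E_EOF

    def getc(self):
        """Return the next character, or EOF once input is exhausted."""
        if self.pos >= len(self.text):
            self.done = True  # record the terminal state
            return EOF
        ch = self.text[self.pos]
        self.pos += 1
        return ch

    def backup(self, ch):
        """Push one character back onto the input.

        Backing up a real character rewinds the position, but backing up
        EOF must be a no-op: EOF consumed no input, so rewinding would
        make the next getc() return EOF again, and a loop of "read EOF,
        back up, read EOF" would never make progress.
        """
        if ch != EOF:
            self.pos -= 1
```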
I'm not sure I entirely believe that, but it does look like it always has to do with the final operator ending the file, so you're close.
Ohh, you're right! It's not the last character, it's the last token. I still think that this fix catches all these cases (I tested your example and a few more; should I maybe add tests for these?) and does not introduce any new problems.
OK, I'll merge now.
https://bugs.python.org/issue40267