[Python-Dev] Unicode identifiers in test files?
Eric V. Smith
eric at trueblade.com
Sat May 4 03:54:04 EDT 2019
On 5/4/19 2:48 AM, Serhiy Storchaka wrote:
> 04.05.19 05:46, Eric V. Smith wrote:
>> Is there a policy against using Unicode identifiers in test files?
>>
>> As part of adding !d to f-strings, there's a code path that's only
>> executed if the text of the expression is non-ASCII. The easiest way
>> to exercise it, and the way I found a bug, is by using an identifier
>> with Unicode characters. I know we have a policy against this in Lib/, but
>> what about Lib/test/?
>>
>> I could work around this by exec'ing some strings, but that seems
>> like added confusion that a real Unicode identifier would avoid.
>
> Could you use string literals with non-ASCII characters? They are more
> commonly used in tests.
Hi, Serhiy.
I could and will, yes: thanks for the suggestion. But for this specific
feature, I also want to test with simple variable-only expressions. I'm
either going to use a Unicode identifier directly in the code, or eval one
from a string. Doing the eval dance just seems like extra work for a kind
of purity that I don't think is important in a test, unless someone can
think of a good reason for it.
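For reference, a minimal sketch (not the actual CPython tests) contrasting
the options discussed above; the identifier name päivä and the values are
hypothetical, and a plain f-string stands in for the !d-specific code path:

    # Option 1: a non-ASCII string literal, as Serhiy suggests.
    assert f'{"päivä"}' == 'päivä'

    # Option 2: a Unicode identifier used directly in the test source.
    päivä = 42
    assert f'{päivä}' == '42'

    # Option 3: the "eval dance" -- keep the test file ASCII-only and
    # build the non-ASCII source as a string, run in a scratch namespace.
    ns = {}
    exec("p\u00e4iv\u00e4 = 42\nresult = f'{p\u00e4iv\u00e4}'", ns)
    assert ns['result'] == '42'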
Eric