Here's one more thing we'd like to add to PEP 484. The description is
best gleaned from the issue, in particular
https://github.com/python/mypy/issues/1284#issuecomment-199021176 and
following (we're going with option (A)).
Really brief example:
from typing import NewType
UserId = NewType('UserId', int)
Now to the type checker UserId is a new type that's compatible with
int, but converting an int to a UserId requires a special cast form,
UserId(x). At runtime UserId instances are just ints (not a subclass!)
and UserId() is a dummy function that just returns its argument.
For use cases see the issue. Also send feedback there please.
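A slightly fuller sketch of how this would look in practice (the names UserId and get_user_name are mine, purely illustrative):

```python
from typing import NewType

UserId = NewType('UserId', int)

def get_user_name(user_id: UserId) -> str:
    # illustrative function: the annotation means it accepts only
    # UserId, not a plain int, as far as the type checker is concerned
    return "user-%d" % user_id

uid = UserId(42)           # the special "cast" form
name = get_user_name(uid)  # OK for the type checker
# get_user_name(42)        # would be rejected by the type checker
print(type(uid))           # at runtime uid is just an int: <class 'int'>
```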
--
--Guido van Rossum (python.org/~guido)
(I hope this is something appropriate for the list - if not, my
apologies!)
I am getting the following messages (here using c99 as the compiler
name, but the same issue occurs with xlc as the compiler name):
c99 -qarch=pwr4 -qbitfields=signed -DNDEBUG -O -I. -IInclude -I./Include
-I/data/prj/aixtools/python/python-2.7.11.2/Include
-I/data/prj/aixtools/python/python-2.7.11.2 -c
/data/prj/aixtools/python/python-2.7.11.2/Modules/_ctypes/_ctypes_test.c
-o
build/temp.aix-5.3-2.7/data/prj/aixtools/python/python-2.7.11.2/Modules/_ctypes/_ctypes_test.o
"/data/prj/aixtools/python/python-2.7.11.2/Modules/_ctypes/_ctypes_test.c",
line 387.5: 1506-009 (S) Bit field M must be of type signed int,
unsigned int or int.
"/data/prj/aixtools/python/python-2.7.11.2/Modules/_ctypes/_ctypes_test.c",
line 387.5: 1506-009 (S) Bit field N must be of type signed int,
unsigned int or int.
"/data/prj/aixtools/python/python-2.7.11.2/Modules/_ctypes/_ctypes_test.c",
line 387.5: 1506-009 (S) Bit field O must be of type signed int,
unsigned int or int.
"/data/prj/aixtools/python/python-2.7.11.2/Modules/_ctypes/_ctypes_test.c",
line 387.5: 1506-009 (S) Bit field P must be of type signed int,
unsigned int or int.
"/data/prj/aixtools/python/python-2.7.11.2/Modules/_ctypes/_ctypes_test.c",
line 387.5: 1506-009 (S) Bit field Q must be of type signed int,
unsigned int or int.
"/data/prj/aixtools/python/python-2.7.11.2/Modules/_ctypes/_ctypes_test.c",
line 387.5: 1506-009 (S) Bit field R must be of type signed int,
unsigned int or int.
"/data/prj/aixtools/python/python-2.7.11.2/Modules/_ctypes/_ctypes_test.c",
line 387.5: 1506-009 (S) Bit field S must be of type signed int,
unsigned int or int.
for:
struct BITS {
    int A: 1, B: 2, C: 3, D: 4, E: 5, F: 6, G: 7, H: 8, I: 9;
    short M: 1, N: 2, O: 3, P: 4, Q: 5, R: 6, S: 7;
};
In short, xlC v11 does not like "short" bit fields (xlC v7 might have
accepted them, but "32-bit machines were common then"). I am guessing
that 16-bit bit fields are not well liked on 64-bit hardware now.
Reference for xlC v7, where short was (apparently) still accepted:
http://www.serc.iisc.ernet.in/facilities/ComputingFacilities/systems/cluste…
I am assuming this is from the xlC v7 documentation based on the URL,
not because I know it personally.
So - my question: if "short" is unacceptable for POWER, or maybe only
for xlC (not tried with gcc) - how terrible is this, and is it possible
to adjust the test so that the test is accurate?
I am going to modify the test code so it is:
struct BITS {
    signed int A: 1, B: 2, C: 3, D: 4, E: 5, F: 6, G: 7, H: 8, I: 9;
    unsigned int M: 1, N: 2, O: 3, P: 4, Q: 5, R: 6, S: 7;
};
and see what happens. But what impact does this have on Python,
assuming that "short" bit fields are not supported?
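For reference, the ctypes-level analogue of such a struct (my own abbreviated sketch, not the actual test code) looks like this; the third element of each _fields_ tuple is the bit width:

```python
import ctypes

# illustrative ctypes mirror of a struct with int/unsigned int
# bit fields (abbreviated to a few fields)
class BITS(ctypes.Structure):
    _fields_ = [
        ("A", ctypes.c_int, 1), ("B", ctypes.c_int, 2),
        ("C", ctypes.c_int, 3), ("D", ctypes.c_int, 4),
        ("M", ctypes.c_uint, 1), ("N", ctypes.c_uint, 2),
        ("O", ctypes.c_uint, 3), ("P", ctypes.c_uint, 4),
    ]

b = BITS()
b.B = 1  # fits in a 2-bit signed field
b.M = 1  # fits in a 1-bit unsigned field
```

This is the Python-side feature that _ctypes_test.c exercises, which is why the compiler's rejection of "short" bit fields matters here.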
p.s. I am not submitting this as a bug (for now), as it may just be
that "you" consider it a bug in xlC not to support (signed) short bit
fields.
p.p.s. Note: xlc, by default, considers bit fields to be unsigned. I
was trying to force them to signed with -qbitfields=signed - and I
still got the messages. So, I am going back to the defaults.
I came across a file that had two different coding cookies -- one on
the first line and one on the second. CPython uses the first, but mypy
happens to use the second. I couldn't find anything in the spec or
docs ruling out the second interpretation. Does anyone have a
suggestion (apart from following CPython)?
Reference: https://github.com/python/mypy/issues/1281
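A quick way to see CPython's rule in action (a sketch using the stdlib tokenize module; the byte string is a made-up example with two cookies):

```python
import io
import tokenize

# a file with two different coding cookies on the first two lines
source = b"# coding: utf-8\n# coding: latin-1\nx = 1\n"

# tokenize.detect_encoding follows CPython's behaviour:
# the cookie on the first line wins
encoding, _ = tokenize.detect_encoding(io.BytesIO(source).readline)
print(encoding)  # utf-8
```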
--
--Guido van Rossum (python.org/~guido)
There's a more fundamental PEP 484 update that I'd like to add. The
discussion is in https://github.com/python/typing/issues/107.
Currently we don't have a way to talk about arguments and variables
whose type is itself a type or class. The only annotation you can use
for this is 'type' which says "this argument/variable is a type
object" (or a class). But it's often useful to be able to say "this is
a class and it must be a subclass of X".
In fact this was proposed in the original rounds of discussion about
PEP 484, but at the time it felt too far removed from practice to know
quite how it should be used, so I just put it off. But it's been one
of the features that's been requested most by the early adopters of
PEP 484 at Dropbox. So I'd like to add it now.
At runtime this shouldn't do much; Type would be just a generic class
of one parameter that records its one type parameter. The real magic
would happen in the type checker, which will be able to use types
involving Type. It should also be possible to use this with type
variables, so we could write e.g.
T = TypeVar('T', bound=int)
def factory(c: Type[T]) -> T:
<implementation>
This would define factory() as a function whose argument must be a
subclass of int and returning an instance of that subclass. (The
bound= option to TypeVar() is already described in PEP 484, although
mypy hasn't implemented it yet.)
(If I screwed up this example, hopefully Jukka will correct me. :-)
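To make that concrete, here is one possible complete sketch (the UserId class and the body of factory() are mine, purely illustrative):

```python
from typing import Type, TypeVar

T = TypeVar('T', bound=int)

def factory(c: Type[T]) -> T:
    # illustrative body: instantiate whatever int subclass was passed in
    return c(0)

class UserId(int):
    pass

uid = factory(UserId)  # the type checker would infer uid: UserId
```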
Again, I'd like this to go out with 3.5.2, because it requires adding
something to typing.py (and again, that's allowed because PEP 484 is
provisional -- see PEP 411 for an explanation).
--
--Guido van Rossum (python.org/~guido)
Hi,
I have an API question for you.
I would like to add a new parameter to the showwarning() function of
the warnings module. Problem: it's not possible to do that without
breaking backward compatibility (the warnings module allows, and even
promotes, replacing warnings.showwarning() in applications).
I proposed a patch to add a new showmsg() function which takes a
warnings.WarningMessage object:
https://bugs.python.org/issue26568
The design is inspired by the logging module and its logging.LogRecord
class. The warnings.WarningMessage class already exists. Since it's a
class, it's easy to add new attributes without breaking the API.
- If warnings.showwarning() is replaced by an application, this
function will be called in practice to log the warning.
- If warnings.showmsg() is replaced, again, this function will be
called in practice.
- If both functions are replaced, showmsg() will be called (the
replaced showwarning() is ignored).
I'm not sure about function names: showmsg() and formatmsg(). Maybe:
showwarnmsg() and formatwarnmsg()? Bikeshedding.... fight!
The final goal is to log the traceback where the destroyed object was
allocated when a ResourceWarning warning is logged:
https://bugs.python.org/issue26567
Adding a new parameter to warnings makes the implementation much
simpler and gives the logger more freedom to decide how to format
the warning.
Victor
Hello everyone,
I am Wasim Thabraze, a Computer Science undergraduate. I have
thoroughly gone through the Core Python GSoC ideas page and have
narrowed my choices down to the project 'Improving Roundup GitHub
integration'.
I have experience building tools that integrate with GitHub; Openflock
(http://www.openflock.co) is one such product that I developed.
Can someone please help me learn more about the project? I wanted to
know how and where GitHub should be integrated into
https://bugs.python.org.
I hope I can code with Core Python this summer.
Regards,
Wasim
www.thabraze.me
github.com/waseem18
Hey all,
I've put up an incredibly sketchy, terrible ideas page for Summer of
Code with core Python:
https://wiki.python.org/moin/SummerOfCode/2016/python-core
I'm pretty much the worst person to do this since I'm always swamped
with admin stuff and particularly out of touch with what's needed in
Core Python around this time of year, so I'm counting on you all to tell
me that it's terrible and how to fix it. :)
If you don't already have edit privileges on the python wiki and need
them for this, let me know your wiki username and I can get you set up.
(Or similarly, just tell me what needs fixing and make it my problem.)
We're also still looking for more volunteers who can help mentor
students. Let me know if this interests you -- we have a few folk who
can do higher level code reviews, but we need more day-to-day mentors
who can help students keep on track and help them figure out who to ask
for help if they get stuck. I can also find folk who will provide
mentoring for mentors if you'd like to try but don't think you could be
a primary mentor without help!
If you're interested in mentoring, email gsoc-admins(a)python.org and we
can get you the link to sign up. And if you emailed before and didn't
get a response or haven't found a group to work with, feel free to email
again!
Terri
On 13 March 2016 at 01:13, Russell Keith-Magee <russell(a)keith-magee.com> wrote:
> The patches that I've uploaded to Issue23670 [1] show a full cross-platform
> build process. After you apply that patch, the iOS directory contains a
> meta-Makefile that manages the build process.
> [1] http://bugs.python.org/issue23670
Thanks very much for pointing that out. This has helped me understand
a lot more things. Only now do I realize that the four files generated
by pgen and _freeze_importlib are actually already committed into the
Mercurial repository:
Include/graminit.h
Python/graminit.c
Python/importlib.h
Python/importlib_external.h
A question for other Python developers: Why are these generated files
stored in the repository? The graminit ones seem to have been there
since forever (1990). It seems the importlib ones were there due to a
bootstrapping problem, but now that is solved. Antoine
<https://bugs.python.org/issue14928#msg163048> said he kept them in
the repository on purpose, but I want to know why.
If we ignore the cross compiling use case, would there be any other
consequences of removing these generated files from the repository?
E.g. would it affect the Windows build process?
I have two possible solutions in mind: either remove the generated
files from the repository and always build them, or keep them but do
not automatically regenerate them every build. Since they are
generated files, not source files, I would prefer to remove them, but
I want to know the consequences first.
> On Sat, Mar 12, 2016 at 8:48 AM, Martin Panter <vadmium+py(a)gmail.com> wrote:
>> On 11 March 2016 at 23:16, Russell Keith-Magee <russell(a)keith-magee.com>
>> wrote:
>> >
>> > On Sat, Mar 12, 2016 at 6:38 AM, Martin Panter <vadmium+py(a)gmail.com>
>> > wrote:
>> >> I don't understand. After I run Make, it looks like I get working
>> >> executables leftover at Programs/_freeze_importlib and Parser/pgen. Do
>> >> you mean to install these programs with "make install" or something?
>> >
>> >
>> > Making them part of the installable artefacts would be one option, but
>> > they
>> > don't have to be installed, just preserved.
>>
>> What commands are you running that cause them to not be preserved at
>> the end of the build?
>
>
> I don't know - this is where I hit the end of my understanding of the build
> process. All I know for certain is that 3.4.2 doesn't have this problem;
> 3.5.1 does, and Issue22359 (fixed in [3]) is the source of the problem. A
> subsequent fix [4] introduced an additional problem with _freeze_importlib.
>
> [3] https://hg.python.org/cpython/rev/565b96093ec8
> [4] https://hg.python.org/cpython/rev/02e3bf65b2f8
After my realization about the generated files, I think I can solve
this one. Before the changes you identified, the build process
probably thought the generated files were up to date, so it didn't
need to cross-compile pgen or _freeze_importlib. If the generated file
timestamps were out of date (e.g. depending on the order they are
checked out or extracted), the first native build stage would have
fixed them up.
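The timestamp check involved is essentially make's dependency rule, which can be sketched as follows (a simplified illustration of the logic, not the actual Makefile code):

```python
import os

def needs_regen(target, sources):
    # make-style dependency check: regenerate if the target is
    # missing, or if it is older than any of its sources
    if not os.path.exists(target):
        return True
    target_mtime = os.path.getmtime(target)
    return any(os.path.getmtime(src) > target_mtime for src in sources)
```

If a checkout happens to leave the committed generated files newer than their inputs, a check like this concludes nothing needs regenerating, so the cross-compiled pgen and _freeze_importlib are never invoked.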
As a relative newcomer I may have missed a long previous discussion re:
linking with OpenSSL and/or LibreSSL.
In an ideal world this would be rtl linking, i.e., the underlying
complexities of the *SSL libraries would be hidden from applications.
In short, when I saw http://bugs.python.org/issue26465 (Title: Upgrade
OpenSSL shipped with python installers), it reminded me that I need to
start looking at LibreSSL again - and that, if not already done, it
might be something "secure" for Python as well.
Michael