Re: [Python-ideas] Proposal: Use mypy syntax for function annotations
I am strongly opposed to this entire proposal. As Juancarlo points out, Python programs are small, but very understandable. I think this syntax detracts from that. I'll suggest an alternative further down in my reply.

One benefit of Python that makes it so attractive to new and old programmers alike is that you can usually pick up any piece of Python code and begin to understand it immediately. Even if you come from a different programming language, Python is written in English, explicitly using words like "and" and "or". Those constructs, as opposed to "&&" or "||", make the language less scary for new developers and in general easier to read as well. It's also easier to type regular English words (no need to use the shift key). Using the annotation syntax this heavily will detract very much from the readability of Python and from its overall usability as well. Programs are read more times than they are written.

Several years ago, before I had any programming experience in any language at all, I needed to edit some Python code to make something I was doing work. Without any experience at all, I was able to look through the (small) program I was editing and figure out exactly what I needed to adjust. Without Python being such a clean, almost-English language, that would have been impossible.

Though the annotation syntax is already present in Python 3, I would argue that using it for type annotations will get very messy very quickly. If I'm understanding the syntax correctly, writing any function using a large library with many nested subpackages could result in code like this:

    import twisted.protocols.mice.mouseman

    def process_mouseman(inputMouseMan: twisted.protocols.mice.mouseman.MouseMan) -> twisted.protocols.mice.mouseman.MouseMan:
        pass

That function definition is 122 characters long -- far more than what PEP8 recommends.
Though this example was crafted to illustrate my point (I don't think most people would really write code like this), it is easy to see that this kind of code is possible and may sometimes be written by less experienced programmers. It demonstrates how messy things can get even with just one parameter. It is also very difficult to parse out what is going on in that function. Adding type annotations inline makes it hard to quickly get an idea of what arguments a function takes and in what order. It detracts from the overall readability of a program and can also lead to very poorly formatted programs that break the guidelines in PEP8. Though I have only demonstrated this for function declarations, the example could also be extended to inline statement comments as well. Things get too messy too quickly.

My Alternative Proposal:

As an alternative, I would like to propose a syntax that PyCharm already supports: http://www.jetbrains.com/pycharm/webhelp/using-docstrings-to-specify-types.h...

Since this type information isn't going to be used at runtime in the regular Python interpreter anyway, why not have it in the function docstring instead? This provides both readability and type checking. Standardizing that syntax, or at least adding it as an optional way to check your program, would in my opinion be a much better addition to the language. This approach needs no new syntax, keeps readability, and allows the programmer to add additional documentation without going over the 80 character limit. Additionally, this approach can be used by documentation generators as well, and removes any duplication between the function declaration and the docstring.
Here's a taste of what that looks like:

    class SimpleEquation(object):

        def demo(self, a, b, c):
            """
            This function returns the product of a, b and c
            @type self: SimpleEquation
            :param a: int - The first number
            :param b: int
            :param c: int - The third number should not be zero and should
                also only be -1 if you enjoy carrots (this comment spans
                2 lines)
            :return: int
            """
            return a * b * c

Overall, I think overloading function declarations and inline comments is a bad idea. It promotes writing code with poor readability and in general adds a lot of extra bits to the language that (from the sounds of your proposal) aren't even going to be used by the main interpreter.

On the original proposal: these changes really do seem to be overestimating the staticness of Python programs. What about functions that don't care about the type? What about functions that only want you to pass in an object that implements __iter__? Python should not become a language where developers are required to add hundreds of odd cast() calls every time they choose to pass a different, but still compatible, type to a function. This syntax makes too many assumptions about what developers know about their code. What if I develop a similar, but different, class that is compatible with an existing function? If that function doesn't specify that my class can be used, my perfectly valid code will be rejected.

-1 to adding mypy annotations to Python 3.

Sunjay

On Thu, Aug 14, 2014 at 10:59 AM, Juancarlo Añez <apalala@gmail.com> wrote:
On Wed, Aug 13, 2014 at 3:14 PM, Guido van Rossum <guido@python.org> wrote:
I am proposing that we adopt whatever mypy uses here, keeping discussion of the details (mostly) out of the PEP. The goal is to make it possible to add type checking annotations to 3rd party modules (and even to the stdlib) while allowing unaltered execution of the program by the (unmodified) Python 3.5 interpreter.
To the bottom of things...
About the second time I wrote about Python ("Why not Python", 2007), I dismissed it as a serious software development environment because the lack of static type checking hindered the creation of proper software development environments.
http://blog.neogeny.org/why-not-python.html
So, why do I now have doubts about adding support for static type checking?
I've been programming in almost-only Python for several years now, and this discussion had me think hard about "Why?".
The answer is simple: I never was as productive as I've been since I've centered on Python.
But, again, why?
Despite what my '07 article says, the IDE I use is pythonized-VIM and the command line. Where does the productivity come from?
1. Readability with the right amount of succinctness. Python programs are very small, but understandable.

2. The breadth and design consistency of the standard library. Some 70%? of what I need is there, and the design consistency makes it easy (intuitive) to use.

3. PyPI covers another 28%.

4. The Zen of Python (import this) permeates all of the above, including most third-party packages. The ecosystem is consistent too. It's a culture.
What do I fear? I think it is that Python may be transformed into a programming language different from the one that now makes me so productive.
I studied Ruby, and I don't like it. I've been studying Go, and I don't like it. One must like the concepts and the power, sure, but the syntax required for some day-to-day stuff stinks like trouble; simple stuff is so complicated to express and so easy to get wrong...
I hate "List[str]" and "Dict[str:int]". Where did those come from? Shouldn't they (as others have proposed) be "[str]" and "{str:int}"? What about tuples? Why not write a similar, but different programming language that targets the Cython runtime and includes all the desired features?
All said, this is my proposal.
The PSF could support (even fund) MyPy and similar projects, promoting their maturity and their convergence. The changes in 3.5 would be limited, but enough to enable those efforts, and those of the several IDE tool-smiths (changes in annotations, and maybe in ABCs). Basically, treat MyPy as PyPy or NumPy (which got '::'). It's in Python's history to enable third-party developments and then adopt whatever matures or becomes the de-facto standard.

Then, on a separate line of work, it would be good to think about how to enable different programming languages to target the CPython environment (because of #2, #3, and #4 above), maybe by improving AST creation and AST-to-bytecode. There could be other languages targeting the CPython runtime, which is the relationship that Scala, Jython, IronPython, and others have to their own runtimes.
-1 for standardizing static type checking in 3.5
Cheers,
-- Juancarlo *Añez*
_______________________________________________ Python-ideas mailing list Python-ideas@python.org https://mail.python.org/mailman/listinfo/python-ideas Code of Conduct: http://python.org/psf/codeofconduct/
-- Sunjay Varma Python Programmer & Web Developer www.sunjay.ca
On 08/14/2014 09:01 AM, Sunjay Varma wrote:
Additionally, this approach can be used by documentation generators as well and removes any duplication from the function declaration and the docstring.
Here's a taste of what that looks like:

    class SimpleEquation(object):

        def demo(self, a, b, c):
            """
            This function returns the product of a, b and c
            @type self: SimpleEquation
            :param a: int - The first number
            :param b: int
            :param c: int - The third number should not be zero and should
                also only be -1 if you enjoy carrots (this comment spans
                2 lines)
            :return: int
            """
            return a * b * c
+1 I like this much more. -- ~Ethan~
On Thu, Aug 14, 2014 at 6:29 PM, Ethan Furman <ethan@stoneleaf.us> wrote:
On 08/14/2014 09:01 AM, Sunjay Varma wrote:
Additionally, this approach can be used by documentation generators as well and removes any duplication from the function declaration and the docstring.
Here's a taste of what that looks like:

    class SimpleEquation(object):

        def demo(self, a, b, c):
            """
            This function returns the product of a, b and c
            @type self: SimpleEquation
            :param a: int - The first number
            :param b: int
            :param c: int - The third number should not be zero and should
                also only be -1 if you enjoy carrots (this comment spans
                2 lines)
            :return: int
            """
            return a * b * c
+1 I like this much more.
+1 from me as well. I like this much, much more. It is simply far more readable and easier on the eyes, especially for functions with complicated definitions. This feels like something I would actually use, without it being a burden. I think it could also work very nicely for keyword arguments.

N.
On Thu, Aug 14, 2014 at 12:01:37PM -0400, Sunjay Varma wrote:
Though the annotation syntax is already present in Python 3, I would argue that using this for type annotations will get very messy very quickly. If I'm understanding the syntax correctly, writing any function using a large library with many nested subpackages could result in code like this:
    import twisted.protocols.mice.mouseman

    def process_mouseman(inputMouseMan: twisted.protocols.mice.mouseman.MouseMan) -> twisted.protocols.mice.mouseman.MouseMan:
        pass
I would write that like this:

    from twisted.protocols.mice.mouseman import MouseMan

    def process_mouseman(inputMouseMan: MouseMan) -> MouseMan:
        pass
That function definition is 122 characters long.
Or 58.
It is also easy to see that it is very difficult to parse out what is going on in that function.
Only because I have no idea what MouseMan means :-)
As an alternative, I would like to propose a syntax that PyCharm already supports: http://www.jetbrains.com/pycharm/webhelp/using-docstrings-to-specify-types.h... [...]

Here's a taste of what that looks like:

    class SimpleEquation(object):

        def demo(self, a, b, c):
            """
            This function returns the product of a, b and c
            @type self: SimpleEquation
            :param a: int - The first number
            :param b: int
            :param c: int - The third number should not be zero and should
                also only be -1 if you enjoy carrots (this comment spans
                2 lines)
            :return: int
            """
            return a * b * c
I really dislike that syntax. I dislike adding cruft like "@type" and ":param" into docstrings, which should be written for human readers, not linters. I dislike that you have documented that self is a SimpleEquation. (What else could it be?) I dislike that the syntax clashes with ReST syntax. I dislike that it isn't obvious to me why the first parameter uses @type while the second parameter uses :param.
Overall, I think overloading function declarations and inline comments is a bad idea. It promotes writing code with poor readability
I like the annotation syntax. I'm not completely convinced that the mypy syntax is mature enough to bless, but the basic idea of type annotations is pretty common in dozens of languages. I think you are in a tiny minority if you think that putting the type declaration right next to the parameter makes it *less* clear than putting the type declaration in a completely different part of the code.

    # the type is together with the parameter
    def frobinate(x: Spam, y: Eggs) -> Breakfast:
        ...

    # the type declaration and parameter are distantly apart
    def frobinate(x, y):
        """Return the frobinated x and y.

        Some more text goes here. Perhaps lots of text.

        :param x: Spam
        :param y: Eggs
        :return: Breakfast
        """
On the original proposal: These changes really do seem to be overestimating the staticness of Python programs as well. What about functions that don't care about the type?
They can declare that they are object. Or not declare a type at all.
What about functions that only want you to pass in an object that implements __iter__?
I would expect this should work:

    from typing import Iter

    def func(it: Iter):
        ...
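For what it's worth, the name that eventually shipped in the typing module for this is Iterable rather than Iter. A minimal runnable sketch using the released spelling (the function total is just an illustration, not from the proposal):

```python
from typing import Iterable

def total(it: Iterable[int]) -> int:
    # The annotation documents that any object implementing __iter__
    # is acceptable; the plain interpreter does not enforce it, but a
    # static checker such as mypy can verify call sites against it.
    return sum(it)

# Lists, tuples, generators -- anything iterable -- are accepted:
print(total([1, 2, 3]))            # 6
print(total(x for x in range(4)))  # 6
```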
Python should not become a language where developers are required to add hundreds of odd cast() calls every time they choose to pass a different, but still compatible type, to a function.
I'm not sure how you go from *optional* static typing to developers being *required* to cast values. As I see it, one HUGE advantage of this proposal is that people who want strict static typing currently might write code like this:

    def make_sandwich(filling):
        if not isinstance(filling, Ham):
            raise TypeError
        ...

With the new proposal, they will probably write this:

    def make_sandwich(filling: Ham):
        ...

and allow the static type check to occur at compile time. That means that if I want to pass a Spam instance instead of a Ham instance, all I need do is disable the compile-time type check, and make_sandwich will happily accept anything that has the same duck-type interface as Ham, like Spam. If I pass an int instead, I'll get the same run-time error that I would have got if make_sandwich did not include an explicit type check.

So, I think this proposal might actually lead to *more* duck typing rather than less, since you can always turn off the type checking.

-- Steven
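The claim that the annotated version still permits duck typing at run time can be checked directly. A sketch, with Ham and Spam as hypothetical stand-in classes (not from any real library):

```python
class Ham:
    def flavour(self):
        return "salty"

class Spam:
    # An unrelated class that merely shares Ham's duck-type interface.
    def flavour(self):
        return "spiced"

def make_sandwich(filling: Ham) -> str:
    # The annotation is only stored in __annotations__; the plain
    # interpreter never checks the type of `filling` at run time.
    return "a %s sandwich" % filling.flavour()

# With no static checker in the loop, duck typing still works:
print(make_sandwich(Spam()))          # a spiced sandwich
print(make_sandwich.__annotations__)  # the annotation is still visible here
```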
On 14 August 2014 19:15, Steven D'Aprano <steve@pearwood.info> wrote:
I really dislike that syntax. I dislike adding cruft like "@type" and ":param" into docstrings, which should be written for human readers, not linters.
That ship has long-since sailed. Sphinx uses exactly this :param: and :return: syntax for its docstring parsing. It is by now a common convention (at least, I see it all over the place in open source code), and should not be considered a surprise. I appreciate that it doesn't lead to clean docstrings, but I've found it leads to docstrings that are genuinely written to be read (because they're part of your documentation).
So, I think this proposal might actually lead to *more* duck typing rather than less, since you can always turn off the type checking.
I found this conclusion impossible to understand: have I missed something, Steven? To my eyes, the fact that much duck typing will fail when the code is run by a user who knows nothing about the static type checker will clearly not lead to more duck typing. It will lead either to a) less duck typing, because of all the bug reports ("your code breaks whenever I try to run it!"), or b) everyone turning the static type checker off.

That objection assumes the static checker would be on by default. If it were off by default but available, both of these problems go away, but we're back in the situation we're in right now. In that case, I don't see why we'd add this to CPython.
On Thu, Aug 14, 2014 at 07:25:04PM +0100, Cory Benfield wrote:
On 14 August 2014 19:15, Steven D'Aprano <steve@pearwood.info> wrote:
I really dislike that syntax. I dislike adding cruft like "@type" and ":param" into docstrings, which should be written for human readers, not linters.
That ship has long-since sailed. Sphinx uses exactly this :param: and :return: syntax for its docstring parsing. It is by now a common convention (at least, I see it all over the place in open source code), and should not be considered a surprise.
I've seen it too, but not in docstrings written in vanilla ReST. It's a disappointment to hear that Sphinx uses it, because I think it is hideously ugly :-(
So, I think this proposal might actually lead to *more* duck typing rather than less, since you can always turn off the type checking.
I found this conclusion impossible to understand: have I missed something Steven? To my eyes, the fact that when run by a user who knows nothing about the static type checker much duck typing will fail will clearly not lead to more duck typing. It will lead either to a) less duck typing because of all the bug reports (your code breaks whenever I try to run it!), or b) everyone turning the static type checker off.
Let me explain my reasoning. Back in the Old Days, before Python 2.2, there was no isinstance(). We were strongly discouraged from doing type checks; instead we were encouraged to rely on duck-typing, and on functions failing loudly if passed the wrong argument. With the introduction of isinstance, Python code has slowly, gradually begun using more and more explicit run-time type checks with isinstance. Some people do this more than others.

Let's consider Fred, who is a Java programmer at heart and so writes code like this:

    def foo(x):
        if not isinstance(x, float):
            raise TypeError("Why doesn't python check this for me?")
        return (x+1)/2

I want to pass a Decimal to foo(), but can't, because of the explicit type check. I am sad.

But with this proposal, Fred may write his function like this:

    def foo(x: float) -> float:
        return (x+1)/2

and rely on mypy to check the types at compile time. Fred is happy: he has static type checks, Python does it automatically for him (once he has set up his build system to call mypy), and he is now convinced that foo() is type-safe and an isinstance check at run-time would be a waste of cycles.

I want to pass a Decimal to foo(). All I have to do is *not* install mypy, or disable it, and lo and behold, like magic, the type checking doesn't happen, and foo() operates by duck-typing just like in the glory days of Python 1.5. Both Fred and I are now happy, and with the explicit isinstance check removed, the only type checking that occurs when I run Fred's library is the run-time duck-typing check.

-- Steven
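The Decimal scenario runs exactly as described, because the interpreter stores the annotation without enforcing it. A runnable sketch:

```python
from decimal import Decimal

def foo(x: float) -> float:
    return (x + 1) / 2

# Without mypy in the build, the float annotation is inert, so any
# type supporting + and / works by duck typing:
print(foo(3.0))           # 2.0
print(foo(Decimal("3")))  # Decimal('2')
```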
On Fri, Aug 15, 2014 at 04:52:45AM +1000, Steven D'Aprano <steve@pearwood.info> wrote:
But with this proposal, Fred may write his function like this:
def foo(x:float)->float: return (x+1)/2
and rely on mypy to check the types at compile time. Fred is happy: he has static type checks, Python does it automatically for him (once he has set up his build system to call mypy), and he is now convinced that foo() is type-safe and an isinstance check at run-time would be a waste of cycles.
I want to pass a Decimal to foo(). All I have to do is *not* install mypy, or disable it, and lo and behold, like magic, the type checking doesn't happen, and foo() operates by duck-typing just like in the glory days of Python 1.5. Both Fred and I are now happy, and with the explicit isinstance check removed, the only type checking that occurs when I run Fred's library are the run-time duck-typing checks.
Well, that's funny. Static type checking as a way to subvert type checking! (-: Oleg. -- Oleg Broytman http://phdru.name/ phd@phdru.name Programmers don't die, they just GOSUB without RETURN.
On 14 August 2014 19:52, Steven D'Aprano <steve@pearwood.info> wrote:
I want to pass a Decimal to foo(). All I have to do is *not* install mypy, or disable it, and lo and behold, like magic, the type checking doesn't happen, and foo() operates by duck-typing just like in the glory days of Python 1.5. Both Fred and I are now happy, and with the explicit isinstance check removed, the only type checking that occurs when I run Fred's library are the run-time duck-typing checks.
Thanks for explaining Steven, that's a lot clearer. I understand where you're coming from now.

I still don't agree, however. I suspect what's more likely to happen is that Fred writes his code, a user goes to run it with duck typing, and it breaks. Assuming the static checker is in CPython and on by default, there are a number of options here, most of which are bad:

1. The user doesn't know about the type checker and Googles the problem. They find there's a flag they can pass to make the problem go away, so they do. They have now learned a bad habit: to silence these errors, pass this flag. They can no longer gain any benefits from the type checker: it may as well have been not there (or off by default).

2. The user doesn't know about the type checker and blames Fred's library, opening a bug report. In extreme cases, for popular libraries, this will happen so often that Fred will either relent and remove the annotations, or get increasingly frustrated and take it out on the users. (I'm speaking from experience in this regard.)

3. The user knows about the type checker and isn't using it. They turn it off. Fine, this is ok.

4. The user knows about the type checker but is using it for their own code in the same program. They're between a rock and a hard place: either they turn off the checker and lose the benefit in their own code, or they stop duck typing. This is actually the worst of these cases.

Basically, my objection is to the following (admittedly extreme) case: a static type checker that is a) present in the core distribution, b) on by default, and c) with the only available scope being per-program. I think that such an implementation is a recipe for having everyone learn to turn the checker off, wasting all the effort associated with it.

I am much happier if (b) goes away. Off by default is fine. Not in the core distribution at all is also fine (because it's effectively off by default).
Allowing refined scopes is also a good idea, but doesn't solve the core problem: people will continue to just turn it off. I am not averse to having static checking be an option for Python and for annotations to be the mechanism by which such typing is done. I just think we should be really cautious about ever including it in CPython.
On 08/14/2014 01:33 PM, Cory Benfield wrote:
On 14 August 2014 19:52, Steven D'Aprano wrote:
I want to pass a Decimal to foo(). All I have to do is *not* install mypy, or disable it, and lo and behold, like magic, the type checking doesn't happen, and foo() operates by duck-typing just like in the glory days of Python 1.5. Both Fred and I are now happy, and with the explicit isinstance check removed, the only type checking that occurs when I run Fred's library are the run-time duck-typing checks.
Thanks for explaining Steven, that's a lot clearer. I understand where you're coming from now.
I still don't agree, however. I suspect what's more likely to happen is that Fred writes his code, a user goes to run it with duck typing, and it breaks. Assuming the static checker is in CPython and on by default, there are a number of options here, most of which are bad:
These are bad assumptions, since the PEP is about defining how annotations are to be used and specifically states there will be *no run-time checking*. To be of use at all you have to get a third-party program (mypy at this point) and use it. So the scenario you list simply isn't going to happen... at least, not like that.

What could happen is a newbie team member tries to check something in, but mypy and annotations are in the pre-check, no-one has told the newbie about mypy, or the newbie forgot and is too embarrassed to go ask someone, so the same basic problem arises. That, however, is mostly outside the concerns of developing Python.

-- ~Ethan~
On Thu, Aug 14, 2014 at 1:33 PM, Cory Benfield <cory@lukasa.co.uk> wrote:
On 14 August 2014 19:52, Steven D'Aprano <steve@pearwood.info> wrote:
I want to pass a Decimal to foo(). All I have to do is *not* install mypy, or disable it, and lo and behold, like magic, the type checking doesn't happen, and foo() operates by duck-typing just like in the glory days of Python 1.5. Both Fred and I are now happy, and with the explicit isinstance check removed, the only type checking that occurs when I run Fred's library are the run-time duck-typing checks.
Thanks for explaining Steven, that's a lot clearer. I understand where you're coming from now.
I still don't agree, however. I suspect what's more likely to happen is that Fred writes his code, a user goes to run it with duck typing, and it breaks. Assuming the static checker is in CPython and on by default, there are a number of options here, most of which are bad:
These assumptions are incorrect.

* Adding the checker to CPython is not part of this proposal.
* Turning it on by default is not part of this proposal.
* Having the type checker in any way associated with the runtime is not part of this proposal.

What is part of this proposal?

* An effort to standardize on a particular syntax for optional type annotations, using Python 3 function annotations.
* The syntax will be handwavingly based on what's in mypy, and thus implies some semantics, but not the implementation of the type checker/inference/etc.
* A suggestion to use tools such as mypy to provide a static type checking pass, in much the same way that people use linters today.
* A suggestion that IDEs, documentation tools, etc. take advantage of the information provided by the type annotations.
* Deprecation of using function annotations for any other purpose. They can't really be composed, and ideally type checkers can work primarily at the syntax level and not have to evaluate the module with a full Python interpreter in order to extract the annotations, so it's best to keep the syntax uniform.
I am not averse to having static checking be an option for Python and for annotations to be the mechanism by which such typing is done. I just think we should be really cautious about ever including it in CPython.
So, it sounds like you're not averse to what has been proposed. :)

-bob
On Thu, Aug 14, 2014 at 01:54:20PM -0700, Bob Ippolito wrote:
What is part of this proposal?
* An effort to standardize on a particular syntax for optional type annotations using Python 3 function annotations
+1 on that.
* The syntax will be handwavingly based on what's in mypy, and thus implies some semantics but not the implementation of the type checker/inference/etc.
I have some reservations as to whether the mypy syntax is mature enough to bake in as standard, but other than that reservation, I prefer the mypy approach over putting type declarations in docstrings. +1
* A suggestion to use tools such as mypy to provide a static type checking pass in much the same way that people use linters today
"Tools such as mypy" is a very important point. If there is a standard syntax for type annotations, there's no reason why other linters couldn't support it as well. +1
* A suggestion that IDEs, documentation tools, etc. take advantage of the information provided by the type annotations
+1
* Deprecation of using function annotations for any other purpose. They can't really be composed, and ideally type checkers can work primarily at the syntax level and not have to evaluate the module with a full Python interpreter in order to extract the annotations, so it's best to keep the syntax uniform.
-1

I don't think this one is justified. At the very least, I think the decision to deprecate or not should be deferred until at least 3.7. It's enough to say that:

- you can use function annotations for type checking; or
- you can use function annotations for something else;

but not both at the same time. I don't think it is a big burden to have mypy and other linters support an "opt-out" decorator, say, so that projects can use annotations for something else without confusing the linter.

As I understand it, the current behaviour of mypy is that you have to import typing in the module before it will type check the module, so that already gives you a way to skip type annotations on a per-module basis: just don't import typing.

-- Steven
On Fri, Aug 15, 2014 at 10:14 AM, Steven D'Aprano <steve@pearwood.info> wrote:
I don't think this one is justified. At the very least, I think the decision to deprecate or not should be deferred until at least 3.7. It's enough to say that:
- you can use function annotations for type checking; or
- you can use function annotations for something else;
But who is "you"? Presumably the application author. What about imported modules - what will they use annotations for? Will conflicting uses break stuff? Or, conversely, will every use of annotations have to be meta-annotated with its purpose, to try to avoid breaking things? Simpler to just say "this feature is standardly used for this purpose", and let people break that convention at their own risk if they like. Deprecation of other options fits this. ChrisA
On Fri, Aug 15, 2014 at 10:29:21AM +1000, Chris Angelico wrote:
On Fri, Aug 15, 2014 at 10:14 AM, Steven D'Aprano <steve@pearwood.info> wrote:
I don't think this one is justified. At the very least, I think the decision to deprecate or not should be deferred until at least 3.7. It's enough to say that:
- you can use function annotations for type checking; or
- you can use function annotations for something else;
But who is "you"? Presumably the application author.
The author of the code containing the annotations.
What about imported modules - what will they use annotations for?
Whatever they choose. If they import typing and don't explicitly disable typechecking, they will get the default meaning of annotations, which is typechecking. If they perform whatever step is required to tell the linter "don't check here", the linter will treat that module as if it had no annotations. (That doesn't necessarily mean ignoring the module, it only means ignore the annotations. A type checker with type inference might still be able to work with the module, so long as it needs no type hints.)
Will conflicting uses break stuff? Or, conversely, will every use of annotations have to be meta-annotated with its purpose, to try to avoid breaking things?
No. I think Guido has the right instinct to make annotations' default purpose be type-checking, but it's easy enough to make it opt-out. Since this is Python-Ideas, here are some ideas:

* If "typing" is not imported at all in the module, linters should ignore annotations inside that module.

* Have a decorator that tells linters and other tools "these are not type annotations":

    from typing import skip

    @skip
    def function(x: Spam, y: Eggs) -> Cheese:
        pass

Compliant linters and IDEs will now skip function, treating it as if it had no annotations at all, leaving the interpretation of the annotations up to the author of the module. Non-compliant linters, of course, should be beaten with a large halibut, like any other buggy software :-)

The last one will probably require a standardized convention for how compliant tools recognise whether or not annotations are for them. Perhaps something like:

    if '+state' in func.__annotations__:
        # skip type-checking
        ...

I picked '+state' because it is an invalid identifier and so cannot clash with any parameter name. The value of __annotations__['+state'] can remain unspecified; different tools could use it for whatever they like without Python's blessing. Only the existence of that key is enough to mark the function as "don't treat these as type annotations".

-- Steven
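The '+state' marker convention can be simulated today with a hand-written decorator. A sketch (the skip decorator here is hypothetical, written to match the idea above; it is not part of any released typing module):

```python
def skip(func):
    # Hypothetical opt-out marker: record a key that cannot clash with
    # any parameter name, signalling "these are not type annotations".
    func.__annotations__['+state'] = None
    return func

@skip
def function(x: "Spam", y: "Eggs") -> "Cheese":
    # String annotations keep this sketch runnable without actually
    # defining Spam, Eggs or Cheese.
    pass

# A compliant checker would test for the marker and stand down:
print('+state' in function.__annotations__)  # True
```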
On Thu, Aug 14, 2014 at 5:14 PM, Steven D'Aprano <steve@pearwood.info> wrote:
On Thu, Aug 14, 2014 at 01:54:20PM -0700, Bob Ippolito wrote:
* Deprecation of using function annotations for any other purpose. They can't really be composed, and ideally type checkers can work primarily at the syntax level and not have to evaluate the module with a full Python interpreter in order to extract the annotations, so it's best to keep the syntax uniform.
-1
I don't think this one is justified. At the very least, I think the decision to deprecate or not should be deferred until at least 3.7. It's enough to say that:
- you can use function annotations for type checking; or
- you can use function annotations for something else;
but not both at the same time. I don't think it is a big burden to have mypy and other linters support an "opt-out" decorator, say, so that projects can use annotations for something else without confusing the linter.
Certainly using annotations for purposes other than type checking would be permissible, but discouraged. No enforcement of any kind is proposed, just language to go in the documentation. It would be non-trivial for the Python compiler or runtime to enforce it anyway! For that sort of annotation, a decorator probably isn't best, some module-level syntax would likely be more appropriate.
As I understand it, the current behaviour of mypy is that you have to import typing in the module before it will type check the module, so that already gives you a way to skip type annotations on a per-module basis: just don't import typing.
The current implementation of mypy will typecheck any module that uses annotations or imports from typing. -bob
On Thu, Aug 14, 2014 at 2:22 PM, Steven D'Aprano <steve@pearwood.info> wrote:
def foo(x:float)->float: return (x+1)/2
and rely on mypy to check the types at compile time. Fred is happy: he has static type checks, Python does it automatically for him (once he has set up his build system to call mypy), and he is now convinced that foo() is type-safe and an isinstance check at run-time would be a waste of cycles.
The foo() kind of examples won't cut it. The standard library and other important Python libraries will take an argument of any of several "reasonable" but otherwise unrelated types... and do the right thing. Such is duck-typing. Trying to find the "common abstract type" to cast it in stone is a waste of time. For example, see https://docs.python.org/3.3/library/json.html#json.dump, or take a look at http://pyyaml.org/.

In fact, why aren't we instead discussing the much more interesting topic of *type inference*, as it's going on in projects like PyDev, PyCharm, Rope, and others? I would be all-in for an approach that helps make code-completion tools more precise and more available, and that would enable linters to tell me why a certain call will probably not work. It could be a milder approach, with "type hinting" for when the tools can't infer the type, instead of one of "type specification".

    def set_color(self, color):
        ...

    o.set_color("red")

It could well be that 'color' should actually be one of the constants in a Color enum, and the call should fail statically... if it was Java. In the Python way, the method will probably figure out the caller's intentions, and do the right thing.

Cheers,

p.s. I just had an epiphany: This discussion is not about empowering Python programmers, but again about finding "magic" to allow bad (read "cheap") programmers to write good programs. I'm out! -- Juancarlo *Añez*
On 08/14/2014 11:15 AM, Steven D'Aprano wrote:
I like the annotation syntax. I'm not completely convinced that the mypy syntax is mature enough to bless, but the basic idea of type annotations is pretty common in dozens of languages. I think you are in a tiny minority if you think that putting the type declaration right next to the parameter makes it *less* clear than putting the type declaration in a completely different part of the code.
    # the type is together with the parameter
    def frobinate(x: Spam, y: Egg) -> Breakfast:
    # the type declaration and parameter are distantly apart
    def frobinate(x, y):
        """Return the frobinated x and y.

        Some more text goes here. Perhaps lots of text.

        :param x: Spam
        :param y: Eggs
        :return: Breakfast
        """
Sure, keeping that info in the annotations makes more sense, but I'd rather see it in the doc string instead of ruling out all other possible uses of annotations -- particularly for something that's supposed to be /optional/. -- ~Ethan~
On Aug 14, 2014, at 11:33 AM, Ethan Furman wrote:
    def frobinate(x, y):
        """Return the frobinated x and y.

        Some more text goes here. Perhaps lots of text.

        :param x: Spam
        :param y: Eggs
        :return: Breakfast
        """
Sure, keeping that info in the annotations makes more sense, but I'd rather see it in the doc string instead of ruling out all other possible uses of annotations -- particularly for something that's supposed to be /optional/.
Docstring annotations almost by definition can contain more information useful to the human reader than type annotations can, especially if you carefully use the reST-ish epydoc convention of both :param: and :type:. The latter contains the type annotation (which an automated system could utilize) while the former contains the exposition (for the benefit of the human reader). It's the explanations that are missing from any type annotations. I suppose you could intersperse comments with your type annotations, resulting in a long multiline function signature. I doubt that would be a readability improvement. Cheers, -Barry
On 08/15/2014 03:49 PM, Barry Warsaw wrote:
On Aug 14, 2014, at 11:33 AM, Ethan Furman wrote:
    def frobinate(x, y):
        """Return the frobinated x and y.

        Some more text goes here. Perhaps lots of text.

        :param x: Spam
        :param y: Eggs
        :return: Breakfast
        """
Sure, keeping that info in the annotations makes more sense, but I'd rather see it in the doc string instead of ruling out all other possible uses of annotations -- particularly for something that's supposed to be /optional/.
Docstring annotations almost by definition can contain more information useful to the human reader than type annotations can, especially if you carefully use the reST-ish epydoc convention of both :param: and :type:. The latter contains the type annotation (which an automated system could utilize) while the former contains the exposition (for the benefit of the human reader). It's the explanations that are missing from any type annotations.
I suppose you could intersperse comments with your type annotations, resulting in a long multiline function signature. I doubt that would be a readability improvement.
Sounds like what we *really* need is a decorator that will parse the docstring and fill in the annotations automatically. :) -- ~Ethan~
On Fri, Aug 15, 2014 at 7:46 PM, Ethan Furman <ethan@stoneleaf.us> wrote:
Sounds like what we *really* need is a decorator that will parse the docstring and fill in the annotations automatically. :)
While it may be contrary to the TOOWTDI principle, I think decorator- and annotations-based type specifications can co-exist. I have been using the following decorator for about a year:

    def returns(type):
        """emulate -> attribute setting in python 2"""
        def decorator(func):
            func.__dict__.setdefault('__annotations__', {})['return'] = type
            return func
        return decorator

and I often find code that uses, for example,

    @returns(int)

more readable than the code that uses

    ... -> int:

The advantage of argument annotations over any decorator-based solution is avoidance of repetition, but I often miss K&R-style declarations in C. Probably because

    int copy(from, to)
        char *from;
        char *to;
    {..}

looks more "pythonic" than

    int copy(char *from, char *to) {}

In Python, what is more readable:

    def copy(from:Sequence, to:MutableSequence):
        ..

or

    @arg_types(from=Sequence, to=MutableSequence)
    def copy(from, to):
        ..

or

    def copy(from, to):
        ..
    set_arg_types(copy, from=Sequence, to=MutableSequence)

? I believe that having type specifications out of the way in the last variant more than outweighs the need to repeat the names of arguments. Even the repetition can be avoided if we do something like

    @arg_types(Sequence, MutableSequence)
    def copy(from, to):
        ..

but this is still more intrusive than the after-body variant, where function name repetition is unavoidable and argument names repetition improves readability.

I can imagine cases where having type specifications follow each argument is helpful. For example:

    def configure(d:Drawing, c:Configuration):
        d.background = c.get_parameter('background')
        d.height = c.get_parameter('height')
        ..

It really depends on the problem at hand and on the coding style whether you may want type specification before or within the function declaration, or out of the way in a docstring, after the function body, or in a separate "stubs" file altogether.
If Python gets an official standard for specifying types in function attributes, it should not be hard to standardize docstring and decorator alternatives as well. As a bonus, these alternatives will be immediately available to 2.x users.
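[Editorial sketch of the @arg_types decorator proposed above. The decorator is hypothetical; note also that `from` in the quoted examples is a reserved word, so valid parameter names are used here.]

```python
# Hypothetical out-of-line type specification, implemented by simply
# updating the function's __annotations__ mapping (the place Python 3
# stores inline annotations anyway).
from collections.abc import Sequence, MutableSequence

def arg_types(**types):
    """Attach argument types without touching the signature itself."""
    def decorator(func):
        func.__annotations__.update(types)
        return func
    return decorator

@arg_types(src=Sequence, dst=MutableSequence)
def copy(src, dst):
    dst.extend(src)
```

After decoration, copy.__annotations__ carries the same information a linter would find in inline annotations, so tools would not need to care which spelling the author chose.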
On Fri, Aug 15, 2014 at 06:49:09PM -0400, Barry Warsaw wrote:
Docstring annotations almost by definition can contain more information useful to the human reader than type annotations can, especially if you carefully use the reST-ish epydoc convention of both :param: and :type:. The latter contains the type annotation (which an automated system could utilize) while the former contains the exposition (for the benefit of the human reader). It's the explanations that are missing from any type annotations.
I suppose you could intersperse comments with your type annotations, resulting in a long multiline function signature. I doubt that would be a readability improvement.
I'm sorry, I missed the part of Guido's proposal where he said that docstrings would cease to be used for documentation :-) I don't think that it is a serious risk that docstrings will disappear, or that people will try to shove usage comments inside the parameter list:

    def frobnicate(
            # this is the thing to be frobnicated
            obj:int,
            # pass a truthy object to use the blue frob instead of red frob
            blue:object=False,
            # an extra helping of spam
            yes_please_more_spam:list[Spam]
            ):

any more than they already do. (I think I've written a comment inside a parameter list maybe twice in the last decade.) I don't think there's much risk of that changing.

A more exciting outcome would be for documentation tools to start using the type annotations directly, without needing the writer to include the type annotation in two places (the parameter list and the docstring). But before the doc tools can do that, there needs to be a standard for annotations. -- Steven
On Aug 16, 2014, at 1:16 AM, Steven D'Aprano <steve@pearwood.info> wrote:
On Fri, Aug 15, 2014 at 06:49:09PM -0400, Barry Warsaw wrote:
Docstring annotations almost by definition can contain more information useful to the human reader than type annotations can, especially if you carefully use the reST-ish epydoc convention of both :param: and :type:. The latter contains the type annotation (which an automated system could utilize) while the former contains the exposition (for the benefit of the human reader). It's the explanations that are missing from any type annotations.
I suppose you could intersperse comments with your type annotations, resulting in a long multiline function signature. I doubt that would be a readability improvement.
I'm sorry, I missed the part of Guido's proposal where he said that docstrings would cease to be used for documentation :-)
I don't think that it is a serious risk that docstrings will disappear, or that people will try to shove usage comments inside the parameter list:
    def frobnicate(
            # this is the thing to be frobnicated
            obj:int,
            # pass a truthy object to use the blue frob instead of red frob
            blue:object=False,
            # an extra helping of spam
            yes_please_more_spam:list[Spam]
            ):
any more than they already do. (I think I've written a comment inside a parameter list maybe twice in the last decade.) I don't think there's much risk of that changing.
A more exciting outcome would be for documentation tools to start using the type annotations directly, without needing the writer to include the type annotation in two places (the parameter list and the docstring). But before the doc tools can do that, there needs to be a standard for annotations.
Couldn’t the documentation tools also just pull that information from the annotations? --- Donald Stufft PGP: 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
On Aug 15, 2014, at 04:15 AM, Steven D'Aprano wrote:
Here's a taste of what that looks like:

    class SimpleEquation(object):

        def demo(self, a, b, c):
            """
            This function returns the product of a, b and c
            @type self: SimpleEquation
            :param a: int - The first number
            :param b: int
            :param c: int - The third number should not be zero and should
                also only be -1 if you enjoy carrots (this comment spans 2
                lines)
            :return: int
            """
            return a * b * c
I really dislike that syntax. I dislike adding cruft like "@type" and ":param" into docstrings, which should be written for human readers, not linters.
For me, the whole point of using syntax like this is for the human reader, especially because I don't have a linter that parses this... yet. <wink> Used judiciously, this syntax could benefit both the human reader *and* automated tools.
I dislike that you have documented that self is a SimpleEquation. (What else could it be?)
FWIW, I never document 'self'.
I dislike that the syntax clashes with ReST syntax.
It needn't. http://epydoc.sourceforge.net/manual-fields.html
I dislike that it isn't obvious to me why the first parameter uses @type while the second parameter uses :param.
It needn't.
I like the annotation syntax. I'm not completely convinced that the mypy syntax is mature enough to bless, but the basic idea of type annotations is pretty common in dozens of languages. I think you are in a tiny minority if you think that putting the type declaration right next to the parameter makes it *less* clear than putting the type declaration in a completely different part of the code.
Of course, it's not a completely different part of the code. docstrings naturally live right after the function signature (indeed, or it wouldn't get stuffed into __doc__), so it's always close to the source. That makes it quite easy for the third party human reader, but also for the author to keep up-to-date. Plus, you can *always* fit these reST-ish epydoc field descriptions in PEP 8 line lengths. It always bugs me to see multiline function signatures. Currently those are pretty rare (and IMHO are a code smell), but with type annotations, I suspect they'll be the norm. Cheers, -Barry
Some people like the epydoc-style convention of putting type annotations in docstrings: [...]
    def demo(self, a, b, c):
        """
        This function returns the product of a, b and c
        @type self: SimpleEquation
        :param a: int - The first number
        :param b: int
        :param c: int - The third number should not be zero and should
            also only be -1 if you enjoy carrots (this comment spans 2
            lines)
        :return: int
        """
One issue I haven't seen raised is that annotations are available at runtime, whereas docstrings may not be. (The -OO switch removes docstrings.) A linter may be able to parse the docstrings at compile time before the docstrings are discarded, or it may not, but using docstrings means the information is not always available for introspection at runtime. I think that's a major disadvantage. Although I admit I don't always remember to test my code using -O and -OO, I do try very hard to do this and I have found bugs in my code from doing so. I think anything which makes testing -O and -OO modes harder is a bad thing. [Quoting Barry Warsaw]
docstrings naturally live right after the function signature (indeed, or it wouldn't get stuffed into __doc__), so it's always close to the source. That makes it quite easy for the third party human reader, but also for the author to keep up-to-date.
*Close to the source* is not the same as *part of the source*. In the example above, the difference is as high as eight lines, compared to zero:

    # Function annotations
    def demo(self, a:int, b:int, c:int)->int:

Using docstring annotations splits the information about parameters into two places. Those two places might be close, but there are still two sources of ultimate truth instead of one. You have the name of the parameter in the parameter list, and the type of the parameter inside the docstring, separated by some arbitrary number of lines of code. -- Steven
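[Editorial illustration of the runtime-availability point. Annotations are ordinary objects reachable through the standard inspect module, independent of __doc__, which the -OO switch strips; this is not part of any proposal in the thread.]

```python
import inspect

def demo(self, a: int, b: int, c: int) -> int:
    """Return the product of a, b and c."""
    return a * b * c

# Annotations survive as runtime data even when docstrings do not:
# collect the annotated parameter types from the signature.
sig = inspect.signature(demo)
param_types = {name: p.annotation
               for name, p in sig.parameters.items()
               if p.annotation is not inspect.Parameter.empty}
```

Here param_types recovers the int annotations on a, b and c (self carries none), and sig.return_annotation gives the return type; a docstring-based scheme has nothing comparable once -OO has run.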
On 08/15/2014 10:04 PM, Steven D'Aprano wrote:
Using docstring annotations splits the information about parameters into two places. Those two places might be close, but there are still two sources of ultimate truth instead of one. You have the name of the parameter in the parameter list, and the type of the parameter inside the docstring separated by some arbitrary number of lines of code.
You say that like it's a bad thing. Not having everything crammed into one spot can be good. Too dense is just as bad as too sparse. -- ~Ethan~
Ethan Furman <ethan@stoneleaf.us> wrote:
On 08/15/2014 10:04 PM, Steven D'Aprano wrote:
Using docstring annotations splits the information about parameters into two places. Those two places might be close, but there are still two sources of ultimate truth instead of one. You have the name of the parameter in the parameter list, and the type of the parameter inside the docstring separated by some arbitrary number of lines of code.
You say that like it's a bad thing. Not having everything crammed into one spot can be good. Too dense is just as bad as too sparse.
It *is* a bad thing. ;) Humans can recognize patterns easily, and Steven's example can be understood at a single glance. The information is declarative and the density isn't that high. Density may become a problem if the information is functional and dense, like in APL. Even that can be overcome by training. Stefan Krah
On Sat Aug 16 2014 at 1:05:16 AM Steven D'Aprano <steve@pearwood.info> wrote:
Some people like the epydoc-style convention of putting type annotations in docstrings:
    def demo(self, a, b, c):
        """
        This function returns the product of a, b and c
        @type self: SimpleEquation
        :param a: int - The first number
        :param b: int
        :param c: int - The third number should not be zero and should
            also only be -1 if you enjoy carrots (this comment spans 2
            lines)
        :return: int
        """
One issue I haven't seen raised is that annotations are available at runtime, whereas docstrings may not be. (The -OO switch removes docstrings.) A linter may be able to parse the docstrings at compile time before the docstrings are discarded, or it may not, but using docstrings means the information is not always available for introspection at runtime. I think that's a major disadvantage.
Although I admit I don't always remember to test my code using -O and -OO, I do try very hard to do this and I have found bugs in my code from doing so. I think anything which makes testing -O and -OO modes harder is a bad thing.
[Quoting Barry Warsaw]
docstrings naturally live right after the function signature (indeed, or it wouldn't get stuffed into __doc__), so it's always close to the source. That makes it quite easy for the third party human reader, but also for the author to keep up-to-date.
*Close to the source* is not the same as *part of the source*. In the example above, the difference is as high as eight lines, compared to zero:
    # Function annotations
    def demo(self, a:int, b:int, c:int)->int:
Using docstring annotations splits the information about parameters into two places. Those two places might be close, but there are still two sources of ultimate truth instead of one. You have the name of the parameter in the parameter list, and the type of the parameter inside the docstring separated by some arbitrary number of lines of code.
I'm with Steven on this. I actively hate docstrings that list every parameter, their expected interface, etc. The parameter list exists for a reason, and a majority of the time I don't need an explanation of what a parameter does. In those rare instances where I need clarification I can write a quick sentence in the docstring explaining the special case.

    """Returns the product of a, b, and c.

    The 'c' parameter should not be zero. If you like carrots, set it
    to -1 (this comment spans two lines).
    """

That's 4 lines compared to 7 (which was missing a blank line to begin with, so it really should be 8). We're all adults, and properly worded parameter names tell you a lot. What we are trying to do here is help programmatic tools know things that we know to be true.

Now that is not to say whatever comes out of typing.py shouldn't be legible and not noisy. I'm sure the reason it uses CapWords for e.g. Dict is so you can do `from typing import *`, which makes `def demo(a: Int, b: Int, c: Int) -> Int` read just as cleanly as if you used 'int' itself (this might become the one time I promote using import * so enjoy it while you can =).

And as others have pointed out, if you really like the docstring approach you can always set up a decorator to do the translation for you, but you can't go the other way from annotation to docstring when examining source. So while you can promote and use your docstring approach and even argue for the inclusion of such a decorator in typing.py, you can't promote docstrings exclusively without completely cutting off the function annotation approach. And I think enough of us like the function annotation approach that cutting it off entirely isn't acceptable.
On 14 Aug 2014 17:02, "Sunjay Varma" <varma.sunjay@gmail.com> wrote:

    Here's a taste of what that looks like:

    class SimpleEquation(object):

        def demo(self, a, b, c):
            """
            This function returns the product of a, b and c
            @type self: SimpleEquation
            :param a: int - The first number
            :param b: int
            :param c: int - The third number should not be zero and should
                also only be -1 if you enjoy carrots (this comment spans 2
                lines)
            :return: int
            """
            return a * b * c

There are at least three existing, popular, standardized syntaxes for these kinds of docstring annotations in use: plain ReST, Google's docstring standard, and numpy's docstring standard. All are supported by Sphinx out of the box. (The latter two require enabling the "napoleon" extension, but this is literally a one line config file switch.) Would you suggest that python-dev should pick one of these and declare it to be the official standard, or...? -n
Frankly, I'd just really like to get all of this noise out of the function declaration. Any reasonable, readable and consistent documentation format is fine with me. I chose the sphinx format because it is already well supported in pycharm and that was mentioned in the first few responses. I actually don't like the colon syntax very much (it's awkward and unnatural to type), so if anyone has a different suggestion I'd be very open to that. Mainly I want to ensure that Python doesn't sacrifice readability and line length (which is part of readability) just because annotations are already built in. I suggest we decide on a standard format that can be used in documentation strings and also used with type checking. Let's enhance our documentation with types, not obfuscate function declarations. Sunjay On Aug 14, 2014 3:14 PM, "Nathaniel Smith" <njs@pobox.com> wrote:
On 14 Aug 2014 17:02, "Sunjay Varma" <varma.sunjay@gmail.com> wrote:

    Here's a taste of what that looks like:

    class SimpleEquation(object):

        def demo(self, a, b, c):
            """
            This function returns the product of a, b and c
            @type self: SimpleEquation
            :param a: int - The first number
            :param b: int
            :param c: int - The third number should not be zero and should
                also only be -1 if you enjoy carrots (this comment spans 2
                lines)
            :return: int
            """
            return a * b * c

There are at least three existing, popular, standardized syntaxes for these kinds of docstring annotations in use: plain ReST, Google's docstring standard, and numpy's docstring standard. All are supported by Sphinx out of the box. (The latter two require enabling the "napoleon" extension, but this is literally a one line config file switch.)

Would you suggest that python-dev should pick one of these and declare it to be the official standard, or...?

-n
On Thu Aug 14 2014 at 9:02:54 AM Sunjay Varma <varma.sunjay@gmail.com> wrote:
I am strongly opposed to this entire proposal. As Juancarlo points out, Python programs are small, but very understandable. I think this syntax detracts from that. I'll suggest an alternative further down in my reply.
Small? I've got tens of millions of lines of Python code to wrangle that says otherwise. We're trying to create an analyzer and type inferencer so that we can actually make sense of it all to make it easier to both (a) maintain and (b) migrate to Python 3. :)
One benefit of Python that makes it so attractive for new programmers and even old programmers alike is that you can usually pick out any piece of Python code and begin to understand it immediately. Even if you come from a different programming language, Python is written in english explicitly using words like "and" and "or". Those constructs, as opposed to "&&" or "||" make the language less scary for new developers and in general easier to read as well. It's also easier to type regular english words (no need to use the shift key). Using the annotation syntax this heavily will detract very much from the readability of Python and from the overall usability as well. Programs are read more times than they are written.
Several years ago, before I had any programming experience in any language at all, I needed to edit some Python code to make something I was doing work. Without any experience at all, I was able to look through the (small) program I was editing and figure out exactly what I needed to adjust. Without Python being such a clean, almost English language, that would have been impossible.
Though the annotation syntax is already present in Python 3, I would argue that using this for type annotations will get very messy very quickly. If I'm understanding the syntax correctly, writing any function using a large library with many nested subpackages could result in code like this:
import twisted.protocols.mice.mouseman
    def process_mouseman(inputMouseMan: twisted.protocols.mice.mouseman.MouseMan) -> twisted.protocols.mice.mouseman.MouseMan:
        pass
That function definition is 122 characters long. Far more than what PEP8 recommends. Though this example was crafted to illustrate my point (I don't think most people would really write code like this), it is easy to see that this kind of code is possible and may sometimes be written by some less experienced programmers. It demonstrates how messy things can get even with just one parameter.
It is also easy to see that it is very difficult to parse out what is going on in that function. Adding type annotations inline makes it very difficult to quickly get an idea of what arguments a function takes and in what order. It detracts from the overall readability of a program and can also lead to very poorly formatted programs that break the guidelines in PEP8. Though I have only demonstrated this for function declarations, the example could also be extended to inline statement comments as well. Things get too messy too quickly.
My Alternative Proposal: As an alternative, I would like to propose a syntax that Pycharm already supports: http://www.jetbrains.com/pycharm/webhelp/using-docstrings-to-specify-types.h...
Since this type information isn't going to be used at runtime in the regular Python interpreter anyway, why not have it in the function docstring instead? This provides both readability and type checking. Standardizing that syntax or at least adding it as an optional way to check your program would in my opinion be a much better addition to the language. This approach needs no new syntax, keeps readability and allows the programmer to add additional documentation without going over the 80 character limit.
Without commenting on the specific format of the docstring, there is an added benefit to using docstrings for parameter and return value type information: it encourages people to write documentation. (The exact format of types in a docstring could turn into its own bikeshed even grander than this thread already is.) JavaScript successfully uses this approach for type annotations. (Sure, it's in comments, but that's because they don't _have_ docstrings.) In a way, using docstrings is similar to what argument clinic does for extension modules. We could provide a standard way to do it and have the language runtime parse them and turn them into actual annotation objects when the __annotations__ attribute is first accessed on anything. If someone really wanted to, they could have it hide the information from the docstring at the same time (I don't recommend that; argument clinic is "special" and had a real need for this). That laziness *mostly* avoids the forward referencing or forward declaration mess that you'd otherwise have. -gps
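[Editorial sketch of the lazy parsing idea above. The runtime hook Gregory describes would need interpreter support; in pure Python the laziness can only be approximated, e.g. by parsing :type:/:rtype: fields out of the docstring on first request and caching the result. The field names follow the epydoc convention; everything else here is an assumption.]

```python
import re

_TYPE_FIELD = re.compile(r':type\s+(\w+):\s*(\S+)')
_RTYPE_FIELD = re.compile(r':rtype:\s*(\S+)')

def lazy_annotations(func):
    """Parse type fields from func's docstring on first access only,
    caching the parsed mapping on the function object."""
    cached = getattr(func, '_parsed_annotations', None)
    if cached is None:
        doc = func.__doc__ or ''
        cached = dict(_TYPE_FIELD.findall(doc))
        m = _RTYPE_FIELD.search(doc)
        if m:
            cached['return'] = m.group(1)
        func._parsed_annotations = cached
    return cached

def demo(a, b, c):
    """Return the product of a, b and c.

    :type a: int
    :type b: int
    :type c: int
    :rtype: int
    """
    return a * b * c
```

The types stay as strings here; resolving them to actual objects would need the defining module's namespace, which is exactly the forward-reference problem that deferring the work helps with.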
On Aug 14, 2014, at 12:01 PM, Sunjay Varma wrote:
Here's a taste of what that looks like:

    class SimpleEquation(object):

        def demo(self, a, b, c):
            """
            This function returns the product of a, b and c
            @type self: SimpleEquation
            :param a: int - The first number
            :param b: int
            :param c: int - The third number should not be zero and should
                also only be -1 if you enjoy carrots (this comment spans 2
                lines)
            :return: int
            """
            return a * b * c
I use this all the time. I think I originally adopted this from epydoc: http://epydoc.sourceforge.net/manual-fields.html (i.e. the reStructuredText flavor of epydoc fields.) E.g.

    def inject_message(mlist, msg, recipients=None, switchboard=None, **kws):
        """Inject a message into a queue.

        If the message does not have a Message-ID header, one is added. An
        X-Message-Id-Hash header is also always added.

        :param mlist: The mailing list this message is destined for.
        :type mlist: IMailingList
        :param msg: The Message object to inject.
        :type msg: a Message object
        :param recipients: Optional set of recipients to put into the
            message's metadata.
        :type recipients: sequence of strings
        :param switchboard: Optional name of switchboard to inject this
            message into. If not given, the 'in' switchboard is used.
        :type switchboard: string
        :param kws: Additional values for the message metadata.
        :type kws: dictionary
        """

With perhaps a little more formalism (or care on the part of the author <wink>), e.g. a better spelling of "sequence of strings", this format seems much more readable to me. And being tucked away in a docstring, it can really be safely ignored. It also seems like a processor like mypy could use this information just as easily as type annotations.

For whatever reason, this style seems much more comfortable to me than trying to encode everything in the function signature. It also has the added benefit of actually describing the purpose and detail of the arguments to a human reader, rather than leaving it up to a mental mapping to translate annotated types to semantics or content detail.

Cheers, -Barry
participants (16)

- Alexander Belopolsky
- Barry Warsaw
- Bob Ippolito
- Brett Cannon
- Chris Angelico
- Cory Benfield
- Donald Stufft
- Ethan Furman
- Gregory P. Smith
- Juancarlo Añez
- Nathaniel Smith
- Nicholas Cole
- Oleg Broytman
- Stefan Krah
- Steven D'Aprano
- Sunjay Varma