[Numpy-discussion] Tensor Typing presentation and discussion on Wednesday! (As part of the NumPy Community Meeting)
ALFONSO LUIS CASTAÑO MARIN
alfonsoluis.castanom at um.es
Tue Jun 15 09:59:06 EDT 2021
Hi Matthew & Juan,
Thank you for the kind words and for the feedback! I totally agree
with what you are suggesting, so I will just provide a little context
on why it was presented that way.
First of all, when talking about type arithmetic I suggest referring
to the latest presentation, the one for the summit rather than the one
from the Tensor Typing meeting
(https://docs.google.com/presentation/d/1NZzZfbUCXi78LDDe8C3hAaK9on8ZiX3Z2k4BlZXwsJ8/edit).
There I intentionally use the +, -, *, // syntax rather than the Add
one. Moreover, I also write 1 + A rather than Literal[1] + A, assuming
that we could write a PEP to enable this behavior if needed; as
Matthew said, it seems feasible based on what was said in the PyCon
discussion.
So yes, we totally agree that it is important to make the syntax as
simple as possible, especially because expressions can get quite
complicated (e.g. convolutions). Nonetheless, I will also answer the
question of "Why did the presentation at the Tensor Typing meeting
use the prefix operators (Add), and why was Subtract not directly
supported?"
The answer is that the goal was to show what was technically possible
to achieve with type arithmetic, and syntax does not matter in that
regard. Since we knew that the syntax could change at any point later
on, we just went with the prefix one after a quick 30-minute
discussion in Pyre. However, the implementation for doing A + B
instead of Add[A, B] would have been just as complex, and it can be
changed with a few lines.
About Subtract, it was, once again, because we did not need a custom
Subtract operator to show what type arithmetic can do. However,
supporting it would have been a matter of a few lines in Pyre; I even
had a PR ready, but we decided to make as few changes as possible,
since it would not add any new functionality. After all, the idea was
to show how to bring a big feature like type arithmetic to the
language with as few changes as possible.
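To make the comparison concrete, here is a small, purely illustrative
runtime sketch of the two spellings being discussed. The Dim class and
its operators are hypothetical and are not part of typing, Pyre, or
any PEP; nothing here is real type-checker machinery. The point is
only how Add[A, B] versus A + B (and 1 + A) read:

```python
# Hypothetical illustration: prefix vs. infix spellings of type arithmetic.
# Dim just records dimension expressions as strings so the two notations
# can be compared side by side; no actual type checking happens.

def _expr(x):
    # Render either a Dim or a plain literal such as 1 or -1.
    return x.expr if isinstance(x, Dim) else str(x)

class Dim:
    def __init__(self, expr):
        self.expr = expr
    def __add__(self, other):
        return Dim(f"({self.expr} + {_expr(other)})")
    def __radd__(self, other):
        return Dim(f"({_expr(other)} + {self.expr})")
    def __sub__(self, other):
        return Dim(f"({self.expr} - {_expr(other)})")
    def __mul__(self, other):
        return Dim(f"({self.expr} * {_expr(other)})")
    def __repr__(self):
        return self.expr

class Add:
    # Prefix form: subscription builds the same expression as infix "+".
    def __class_getitem__(cls, args):
        a, b = args
        return Dim(f"({_expr(a)} + {_expr(b)})")

A, B = Dim("A"), Dim("B")
print(Add[A, B])  # prefix spelling -> (A + B)
print(A - B)      # infix spelling  -> (A - B)
print(1 + A)      # literal + dim   -> (1 + A)
```

Both spellings carry the same information; the infix one simply reuses
Python's existing operator syntax, which is why a PEP would mostly be a
matter of blessing the notation rather than adding new machinery.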
Quoting Matthew Rahtz <mrahtz at google.com>:
> Hi Juan
> Thanks for the kind words!
>> why isn’t there a Subtract operator?
> I /think/ this is just because the implementation in Pyre is
> still an early prototype. Alfonso?
>> is there a prospect for using actual operators and actual literals here?
> That's a really good question. We're optimistic that actual
> literals should be possible - we assumed there was some subtle
> reason we'd have to use Literal, but when we briefly talked about it
> at PyCon, Guido was like "Nope, you should just write a PEP for
> that". As for actual operators - I can't think of any
> /technical/ reason this shouldn't be possible - when it comes time
> to write a PEP for type arithmetic it's definitely something we
> should propose. Alfonso, any thoughts?
> On Mon, 14 Jun 2021 at 03:51, Juan Nunez-Iglesias
> <jni at fastmail.com> wrote:
>> Thank you so much Matthew, Pradeep, and everyone else helping with
>> this effort!
>> And thanks Matthew for sharing the slides — very informative!
>> I’ve asked Sebastian for the recording but in the meantime I
>> thought I’d ask a couple of questions about the syntax:
>> * why isn’t there a Subtract operator? It seems that
>> Subtract[A, B] is much more readable than Add[A,
>> Multiply[Literal[-1], B]]?
>> * is there a prospect for using actual operators and actual
>> literals here? I think that a lot of reticence about Python typing
>> comes from the annotations being so difficult to read. There’s been
>> some great improvements recently with “Optional[Tuple[Union[Float,
>> Int]]]” becoming “tuple[float | int] | None”, and I hope we can
>> leapfrog the awkward phase with this tensor typing stuff. Again,
>> A - B is much more readable than Subtract[A, B], and B - 2 is
>> much more readable than Add[B, Literal[-2]].
>> Other than that, love the work, thank you again!
>>> On 11 Jun 2021, at 5:08 am, Matthew Rahtz <mrahtz at google.com> wrote:
>>> Some links I promised to send around:
>>> * https://github.com/deepmind/tensor_annotations
>>> which has a bunch of experimental shape-aware stubs for TensorFlow
>>> and JAX using 'semantic' axis labels
>>> * https://github.com/pradeep90/pytorch_examples
>>> ditto for TensorFlow and PyTorch using value-based axis labels
>>> * https://arxiv.org/pdf/2102.13254.pdf Similar work
>>> in TensorFlow for Swift
>>> * Slides for our presentation
>>> * Tensor typing meeting details
>>> Thanks again, everyone!
>>> On Wed, 9 Jun 2021 at 22:28, Sebastian Berg
>>> <sebastian at sipsolutions.net> wrote:
>>>> On Tue, 2021-06-08 at 11:40 -0700, Stefan van der Walt wrote:
>>>>> On Tue, Jun 8, 2021, at 03:23, Juan Nunez-Iglesias wrote:
>>>>> > Hello! Any chance this could be recorded for those of us in useless
>>>>> > time zones? 😂
>>>>> I added a note to the agenda; we'll ask the speakers and record if
>>>>> they don't mind.
>>>> We have made a recording, but I did not plan on making it available
>>>> publicly. Please contact me directly if you are interested.
>>>>> NumPy-Discussion mailing list
>>>>> NumPy-Discussion at python.org