I have written a short PEP as a complement/alternative to PEP 549. I will be grateful for comments and suggestions. The PEP should appear online soon.

-- Ivan

***********************************************************

PEP: 562
Title: Module __getattr__
Author: Ivan Levkivskyi <levkivskyi@gmail.com>
Status: Draft
Type: Standards Track
Content-Type: text/x-rst
Created: 09-Sep-2017
Python-Version: 3.7
Post-History: 09-Sep-2017

Abstract
========

It is proposed to support a ``__getattr__`` function defined on modules to provide basic customization of module attribute access.

Rationale
=========

It is sometimes convenient to customize or otherwise have control over access to module attributes. A typical example is managing deprecation warnings. Typical workarounds are assigning ``__class__`` of a module object to a custom subclass of ``types.ModuleType``, or replacing the ``sys.modules`` item with a custom wrapper instance. It would be convenient to simplify this procedure by recognizing ``__getattr__`` defined directly in a module, which would act like a normal ``__getattr__`` method, except that it will be defined on module *instances*. For example::

    # lib.py

    from warnings import warn

    deprecated_names = ["old_function", ...]

    def _deprecated_old_function(arg, other):
        ...

    def __getattr__(name):
        if name in deprecated_names:
            warn(f"{name} is deprecated", DeprecationWarning)
            return globals()[f"_deprecated_{name}"]
        raise AttributeError(f"module {__name__} has no attribute {name}")

    # main.py

    from lib import old_function  # Works, but emits the warning

There is a related proposal, PEP 549, that proposes to support instance properties for similar functionality. The difference is that this PEP proposes a faster and simpler mechanism, but provides more basic customization. An additional motivation for this proposal is that PEP 484 already defines the use of module ``__getattr__`` for this purpose in Python stub files, see [1]_.

Specification
=============

The ``__getattr__`` function at the module level should accept one argument which is the name of an attribute, and return the computed value or raise an ``AttributeError``::

    def __getattr__(name: str) -> Any: ...

This function will be called only if ``name`` is not found in the module through the normal attribute lookup.

The reference implementation for this PEP can be found in [2]_.

Backwards compatibility and impact on performance
=================================================

This PEP may break code that uses the module-level (global) name ``__getattr__``. The performance implications of this PEP are minimal, since ``__getattr__`` is called only for missing attributes.

References
==========

.. [1] PEP 484 section about ``__getattr__`` in stub files
   (https://www.python.org/dev/peps/pep-0484/#stub-files)

.. [2] The reference implementation
   (https://github.com/ilevkivskyi/cpython/pull/3/files)

Copyright
=========

This document has been placed in the public domain.
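For concreteness, here is a minimal sketch of the ``__class__``-reassignment workaround that the Rationale above refers to (the class name ``_DeprecationModule`` is made up for illustration; this is the kind of boilerplate the proposed module ``__getattr__`` would replace)::

    # lib.py -- pre-PEP-562 workaround, works on Python 3.5+
    import sys
    import types
    from warnings import warn

    def _deprecated_old_function(arg, other):
        ...

    class _DeprecationModule(types.ModuleType):
        def __getattr__(self, name):
            # Called only when normal attribute lookup on the module fails.
            if name == "old_function":
                warn(f"{name} is deprecated", DeprecationWarning)
                return _deprecated_old_function
            raise AttributeError(f"module {self.__name__} has no attribute {name}")

    # Swap the class of the already-imported module object in place.
    sys.modules[__name__].__class__ = _DeprecationModule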
I'd really love to find a way to enable lazy loading by default, maybe with a way to opt out old/problematic/legacy modules instead of opting in via __future__ or anything else. In my experience, easily 95%+ of modules in the wild today will not even notice (I wrote an application bundler in the past that enabled it globally by default without much fuss for several years). The only small annoyance is that when it does cause problems, the error can jump around, depending on how the lazy import was triggered.

module.__getattr__ works pretty well for normal access, after being imported by another module, but it doesn't properly trigger loading by functions defined in the module's own namespace. These functions are bound to module.__dict__ as their __globals__, so lazy loading of this variety really depends on a custom module.__dict__ that implements __getitem__ or __missing__.

I think this approach is the ideal path over the existing PEPs. I've done it in the past and it worked very well. The implementation looked something like this (a sketch follows below):

* Import statements and __import__ generate lazy-load marker objects instead of performing imports.
* Marker objects record the desired import and what identifiers were supposed to be added to the namespace.
* module.__dict__.__setitem__ recognizes markers and records their identifiers as lazily imported somewhere, but **does not add them to the namespace**.
* module.__getattribute__ will request the lazy attribute via module.__dict__ like regular objects do, and functions will request it via their bound __globals__.
* Both will trigger module.__dict__.__missing__, which checks whether the requested identifier was previously marked as a lazy import, and if so, performs the import, saves the result to the namespace properly, and returns the real import.

-- C Anthony
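A minimal, self-contained sketch of the marker-based bookkeeping described above (the names ``_LazyMarker`` and ``_LazyNamespace`` are hypothetical; note that CPython requires module ``__dict__`` and function ``__globals__`` to be plain dicts, so this only illustrates the __setitem__/__missing__ logic in isolation, not a drop-in implementation)::

    import importlib

    class _LazyMarker:
        """Records a deferred import: which module, and optionally which attribute."""
        def __init__(self, module_name, attr=None):
            self.module_name = module_name
            self.attr = attr

    class _LazyNamespace(dict):
        def __init__(self):
            super().__init__()
            self._pending = {}  # identifier -> _LazyMarker

        def __setitem__(self, key, value):
            if isinstance(value, _LazyMarker):
                # Record the marker but keep the name out of the namespace,
                # so a later lookup falls through to __missing__.
                self._pending[key] = value
                return
            super().__setitem__(key, value)

        def __missing__(self, key):
            marker = self._pending.pop(key, None)
            if marker is None:
                raise KeyError(key)
            module = importlib.import_module(marker.module_name)
            value = getattr(module, marker.attr) if marker.attr else module
            super().__setitem__(key, value)  # cache the real object
            return value

    ns = _LazyNamespace()
    ns["parser"] = _LazyMarker("email.parser")  # nothing imported yet
    print(ns["parser"])                         # first lookup performs the import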
The main two use cases I know of for this and PEP 549 are lazy imports of submodules, and deprecating attributes. If we assume that you only want lazy imports to show up in dir() and don't want deprecated attributes to show up in dir() (and I'm not sure this is what you want 100% of the time, but it seems like the most reasonable default to me), then currently you need one of the PEPs for one of the cases and the other PEP for the other case.

Would it make more sense to add direct support for lazy imports and attribute deprecation to ModuleType? This might look something like metamodule's FancyModule type:

https://github.com/njsmith/metamodule/blob/ee54d49100a9a06ffff341bb10a4d3549642139f/metamodule.py#L20

-n
Sorry for top posting! I'm on a phone.

I still think the better way to solve the custom dir() issue would be to change the module __dir__ method to check whether __all__ is defined and use it to generate the result if it exists. This seems like a logical enhancement to me, and I'm planning on writing a patch to implement it. Whether it would be accepted is still an open issue though.

Cody
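A rough sketch of the rule Cody describes, expressed as a ModuleType subclass rather than a change to the C-level module_dir() (the class name ``_AllAwareModule`` is made up for illustration)::

    import types

    class _AllAwareModule(types.ModuleType):
        def __dir__(self):
            # Prefer __all__ when the module defines it; otherwise fall back
            # to the usual behavior of listing the module namespace.
            ns = self.__dict__
            if "__all__" in ns:
                return list(ns["__all__"])
            return sorted(ns)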
It looks simple and easy to understand.

To achieve lazy import without breaking backward compatibility, I want to add one more rule: when a package defines both __getattr__ and __all__, automatic import of submodules is disabled (sorry, I don't have a pointer to the specification of this behavior).

For example, some modules depend on email.parser or email.feedparser. But since email/__init__.py uses __all__, all submodules are imported eagerly. See https://github.com/python/cpython/blob/master/Lib/email/__init__.py#L7-L25

Changing __all__ would break backward compatibility. With __getattr__, this can become a lazy import::

    import importlib

    def __getattr__(name):
        if name in __all__:
            return importlib.import_module("." + name, __name__)
        raise AttributeError(f"module {__name__!r} has no attribute {name!r}")

Regards,
I don't think submodules are automatically imported, unless there are import statements in __init__.py.
-- --Guido van Rossum (python.org/~guido)
Oh, shame on me. Only when `from email import *` is used are the __all__ submodules imported.

INADA Naoki <songofacandy@gmail.com>
There's no need for shame! I regularly find out that there are Python features I didn't know about. It's called perpetual learning. :-)
-- --Guido van Rossum (python.org/~guido)
@Anthony
module.__getattr__ works pretty well for normal access, after being imported by another module, but it doesn't properly trigger loading by functions defined in the module's own namespace.
The idea of my PEP is to be very simple (both semantically and in terms of implementation). This is why I don't want to add any complex logic. People who want to use __getattr__ for lazy loading can still do this by importing submodules.

@Nathaniel @INADA
The main two use cases I know of for this and PEP 549 are lazy imports of submodules, and deprecating attributes.
Yes, lazy loading seems to be a popular idea :-) I will add the simple recipe by Inada to the PEP, since it already works.

@Cody
I still think the better way to solve the custom dir() would be to change the module __dir__ method to check if __all__ is defined and use it to generate the result if it exists. This seems like a logical enhancement to me, and I'm planning on writing a patch to implement this. Whether it would be accepted is still an open issue though.
This seems like a reasonable rule to me; I can also make this patch if you don't have time.

@Guido What do you think about the above idea?

-- Ivan
If I recall, there was a proposal a few months ago for a "lazy" keyword that would render anything lazy, including imports. Instead of just adding laziness to generators, then to imports, then who knows where, maybe it's time to consider that laziness is a hell of a good general concept and try to generalize it? For imports, that would mean::

    lazy from module import stuff
    lazy import foo

For the rest::

    bar = lazy 1 + 1

When you think about it, it's syntactic sugar to avoid manually wrapping everything in functions, storing stuff in closures and calling that later.
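A minimal sketch of the manual wrapping that such a ``lazy`` keyword would be sugar for (the ``Lazy`` class here is hypothetical, purely to illustrate the deferred-evaluation idea)::

    class Lazy:
        """Defer a computation until its result is first requested."""
        def __init__(self, thunk):
            self._thunk = thunk
            self._done = False
            self._value = None

        def force(self):
            if not self._done:
                self._value = self._thunk()
                self._done = True
            return self._value

    bar = Lazy(lambda: 1 + 1)  # roughly what `bar = lazy 1 + 1` might desugar to
    print(bar.force())         # 2, computed only on first use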
Picking up this thread as part of the PEP 562 and PEP 549 review. I like PEP 562 most, but I propose to add special-casing for `__dir__`. Not quite as proposed above (making the C-level module_dir() look for `__all__`) but a bit more general -- making module_dir() look for `__dir__` and call that if present and callable. Ivan, what do you think of that idea? It should be simple to add to your existing implementation. (https://github.com/ilevkivskyi/cpython/pull/3#issuecomment-343591293)
-- --Guido van Rossum (python.org/~guido)
On 10 November 2017 at 22:27, Guido van Rossum <guido@python.org> wrote:
Picking up this thread as part of the PEP 562 and PEP 549 review. I like PEP 562 most, but I propose to add special-casing for `__dir__`. Not quite as proposed above (making the C level module_dir() look for `__all__`) but a bit more general -- making module_dir() look for `__dir__` and call that if present and callable. Ivan what do you think of that idea? It should be simple to add to your existing implementation. (https://github.com/ilevkivskyi/cpython/pull/3#issuecomment-343591293)
I like this idea. I was thinking about yet another option: *extending* the result of the current dir() search with the contents of __all__ if present (not just returning the contents of __all__). But it looks like your idea covers more use cases, so I would stick with it.

-- Ivan
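A small sketch of what a module-level ``__dir__`` could look like under the special-casing Guido suggests (hypothetical code, assuming module_dir() falls back to a ``__dir__`` defined in the module; it pairs naturally with the deprecation example from the PEP by hiding the private names)::

    # lib.py
    deprecated_names = ["old_function"]

    def _deprecated_old_function(arg, other):
        ...

    def __dir__():
        # Expose only the public surface; __getattr__ still serves the
        # deprecated names (with a warning) when they are accessed.
        hidden = {f"_deprecated_{name}" for name in deprecated_names}
        return sorted(set(globals()) - hidden)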
Wouldn't a better approach be a way to customize the type of the module? That would allow people to define behavior for almost anything (__call__, __getattr__, __setattr__, __dir__, various operators, etc.). The question "why can't I customize behavior X in a module when I can do it for a class?" shouldn't need to exist. Why go halfway?

Thanks,
-- Ionel Cristian Mărieș, http://blog.ionelmc.ro
On Sep 12, 2017 7:08 AM, "Ionel Cristian Mărieș via Python-ideas" <python-ideas@python.org> wrote:

Wouldn't a better approach be a way to customize the type of the module?

If you're ok with replacing the object in sys.modules, then the ability to totally customize your module's type has existed since the dawn era. And if you're not ok with that, then it's still existed since 3.5 via the mechanism of assigning to __class__ to change the type in-place. So this discussion isn't about adding new functionality per se, but about trying to find some way to provide a little bit of sugar that provides most of the value in a less obscure way.

(And unfortunately there's a chicken-and-egg problem for using custom module types *without* the __class__ assignment hack, because you can't load any code from a package until after you've created the top-level module object. So we've kind of taken custom module types as far as they can go already.)

-n
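A minimal sketch of the sys.modules replacement Nathaniel mentions (the class name ``_CallableModule`` is made up; note the caveats raised elsewhere in this thread about boilerplate and interpreter-shutdown quirks)::

    # lib.py -- replace this module with a custom-typed instance
    import sys
    import types

    class _CallableModule(types.ModuleType):
        def __call__(self, *args, **kwargs):
            # Arbitrary behavior becomes possible once the type is custom.
            return f"lib called with {args!r} {kwargs!r}"

    _mod = _CallableModule(__name__)
    _mod.__dict__.update(globals())   # carry over everything defined so far
    sys.modules[__name__] = _mod      # a later `import lib` returns _mod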
On 2017-09-12, Nathaniel Smith wrote:
If you're ok with replacing the object in sys.modules then the ability to totally customize your module's type has existed since the dawn era. And if you're not ok with that, then it's still existed since 3.5 via the mechanism of assigning to __class__ to change the type in-place.
It doesn't quite work though. Swapping out or assigning to __class__, and then running::

    exec(code, module.__dict__)

does not have the expected behavior. LOAD_NAME/LOAD_GLOBAL does not care about your efforts. Accessing module globals from outside the module does work, as that is a getattr call. That is a weird inconsistency that should be fixed if it is not too painful. Coming up with handy syntax or whatever is a minor problem.
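A short sketch demonstrating the inconsistency Neil describes (assumes Python 3.5+, where module __class__ assignment is allowed)::

    import types

    class Mod(types.ModuleType):
        def __getattr__(self, name):
            return f"<computed {name}>"

    m = types.ModuleType("m")
    m.__class__ = Mod

    # Attribute access goes through the custom __getattr__:
    print(m.missing)                      # <computed missing>

    # But name lookup inside code executed in the module's namespace
    # (LOAD_NAME/LOAD_GLOBAL) consults only the plain dict:
    try:
        exec("print(missing)", m.__dict__)
    except NameError as exc:
        print("exec lookup ignored __getattr__:", exc)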
On Tue, Sep 12, 2017 at 10:32 PM Nathaniel Smith <njs@pobox.com> wrote:
If you're ok with replacing the object in sys.modules then the ability to totally customize your module's type has existed since the dawn era.
I'm down with that. Just make it easier: mucking with sys.modules isn't a walk in the park, and there's the boilerplate and the crazy issues with interpreter shutdown.

Thanks,
-- Ionel Cristian Mărieș, http://blog.ionelmc.ro
On 10.09.17 21:48, Ivan Levkivskyi wrote:
# lib.py
from warnings import warn
deprecated_names = ["old_function", ...]
def _deprecated_old_function(arg, other): ...
def __getattr__(name):
    if name in deprecated_names:
        warn(f"{name} is deprecated", DeprecationWarning)
        return globals()[f"_deprecated_{name}"]
    raise AttributeError(f"module {__name__} has no attribute {name}")
# main.py
from lib import old_function # Works, but emits the warning
I think the PEP should provide better examples, because they will be copied into third-party code.

For keeping all functionality (in particular, being pickleable) the deprecated function should preserve its name::

    def old_function(arg, other):
        ...
    _deprecated_old_function = old_function
    del old_function

or::

    def _deprecated_old_function(arg, other):
        ...
    _deprecated_old_function.__name__ = 'old_function'
    _deprecated_old_function.__qualname__ = 'old_function'

(I prefer the former variant.)

I'm wondering whether it is worth providing a special helper that renames the deprecated function and creates the corresponding __getattr__ function. It could also be possible to create helpers that implement module-level properties.
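A sketch of what such a helper could look like under the proposed module ``__getattr__`` (the names ``deprecate`` and ``_deprecated_registry`` are made up for illustration; the deprecated function keeps its original __name__, so pickling keeps working because pickle's getattr lookup goes through __getattr__ and finds the very same object)::

    # lib.py
    import warnings

    _deprecated_registry = {}

    def deprecate(name):
        # Pull the function out of the module namespace so that normal
        # lookup fails and __getattr__ takes over on access.
        _deprecated_registry[name] = globals().pop(name)

    def __getattr__(name):
        if name in _deprecated_registry:
            warnings.warn(f"{name} is deprecated", DeprecationWarning)
            return _deprecated_registry[name]
        raise AttributeError(f"module {__name__!r} has no attribute {name!r}")

    def old_function(arg, other):
        ...

    deprecate("old_function")   # `from lib import old_function` still works, with a warning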
On Tue, Nov 14, 2017 at 12:24 AM, Serhiy Storchaka <storchaka@gmail.com> wrote:
I think the PEP should provide better examples, because they will be copied in third-party code.
For keeping all functionality (in particular, being pickleable) the deprecated function should preserve its name.
def old_function(arg, other):
    ...
_deprecated_old_function = old_function
del old_function
or
def _deprecated_old_function(arg, other):
    ...
_deprecated_old_function.__name__ = 'old_function'
_deprecated_old_function.__qualname__ = 'old_function'
(I prefer the former variant.)
However the latter could be done by a decorator.
I'm wondering whether it is worth providing a special helper that renames the deprecated function and creates the corresponding __getattr__ function.
It could be possible to create helpers that implement module-level properties too.
OTOH not everyone cares about whether their functions are picklable. Maybe a discussion of how to make deprecated functions picklable could go in an appendix, along with another example of how to implement more functional lazy loading. Or even more complete examples could be maintained in a separate GitHub repo (like Ethan did for PEP 561).

-- --Guido van Rossum (python.org/~guido)
participants (10)

- C Anthony Risinger
- Cody Piersall
- Guido van Rossum
- INADA Naoki
- Ionel Cristian Mărieș
- Ivan Levkivskyi
- Michel Desmoulin
- Nathaniel Smith
- Neil Schemenauer
- Serhiy Storchaka