On Thu, Oct 19, 2017 at 10:05 AM, Stephan Houben <stephanh42@gmail.com> wrote:
Hi Steve,

2017-10-19 1:59 GMT+02:00 Steven D'Aprano <steve@pearwood.info>:
On Wed, Oct 18, 2017 at 02:51:37PM +0200, Stefan Krah wrote:

> $ softlimit -m 1000000000 python3
[...]
> MemoryError
>
>
> People who are worried could make a python3 alias or use Ctrl-\.

I just tried that on two different Linux computers I have, and neither
has softlimit.


Yeah, not sure what "softlimit" is either.
I'd suggest sticking to the POSIX-standard ulimit, or putting
something like this in .pythonrc.py:

import resource
# cap the data segment at 2 GiB (soft and hard limits)
resource.setrlimit(resource.RLIMIT_DATA, (2 * 1024**3, 2 * 1024**3))
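A quick sketch of how such a cap behaves, assuming a POSIX system. It
sets the limit in a child interpreter rather than the current process,
and uses RLIMIT_AS instead of RLIMIT_DATA, since older Linux kernels
did not count mmap-backed allocations against RLIMIT_DATA:

```python
# Hedged sketch (POSIX-only): run a child interpreter under an
# address-space cap and watch a large allocation fail with MemoryError.
import subprocess
import sys
import textwrap

child = textwrap.dedent("""
    import resource
    cap = 1 * 1024**3  # 1 GiB of address space
    resource.setrlimit(resource.RLIMIT_AS, (cap, cap))
    try:
        bytearray(2 * 1024**3)  # 2 GiB allocation, well over the cap
    except MemoryError:
        print("MemoryError")
""")

out = subprocess.run([sys.executable, "-c", child],
                     capture_output=True, text=True)
print(out.stdout.strip())  # MemoryError
```

The child-process approach keeps the limit from affecting the shell or
editor you launched Python from; the limit dies with the child.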

Nor (presumably) would this help Windows users.

I (quickly) tried to get something to work using the win32 package,
in particular the win32job functions.
However, setting "ProcessMemoryLimit" using
win32job.SetInformationJobObject seemed to have no effect:
a subsequent win32job.QueryInformationJobObject still showed the
limit as 0.

People with stronger Windows-fu may know what is going on here...

Stephan 

I wasn't aware Windows was capable of setting such limits in a per-process fashion. You gave me a good idea for psutil:
https://github.com/giampaolo/psutil/issues/1149
According to this cmdline tool, the limit should kick in only when
the system memory is full, whatever that means:

<<-r n: Limit the resident set size to n bytes. This limit is not
enforced unless physical memory is full.>>

...so that would explain why it had no effect.

--