JSON-encoding very long iterators
ian.g.kelly at gmail.com
Tue Sep 30 03:58:47 CEST 2014
On Mon, Sep 29, 2014 at 7:19 PM, <alfred at 54.org> wrote:
> I would like to add the ability to JSONEncode large iterators. Right now there is no way to do this without modifying the code.
> The JSONEncoder.default() doc string suggests to do this:
> For example, to support arbitrary iterators, you could
> implement default like this::
> def default(self, o):
>     try:
>         iterable = iter(o)
>     except TypeError:
>         pass
>     else:
>         return list(iterable)
>     # Let the base class default method raise the TypeError
>     return JSONEncoder.default(self, o)
> but this method requires the whole serialized object to fit in memory, and there's a good chance you chose an iterator precisely to save memory in the first place.
> By changing the code to accept iterators, it becomes possible to stream JSON, as I did here:
> It would be ideal if this were included in the standard library. Is there any reason why it shouldn't be?
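For reference, a complete, runnable version of that docstring pattern looks like the following (the `IterEncoder` class name is just an illustration, not anything from the stdlib):

```python
import json

class IterEncoder(json.JSONEncoder):
    """Encoder that serializes any iterable by materializing it as a list."""
    def default(self, o):
        try:
            iterable = iter(o)
        except TypeError:
            pass
        else:
            return list(iterable)
        # Let the base class default method raise the TypeError
        return json.JSONEncoder.default(self, o)

# Example: range objects are not JSON-serializable by default,
# but with this encoder they are emitted as JSON arrays.
print(json.dumps(range(3), cls=IterEncoder))  # prints [0, 1, 2]
```

Note that `list(iterable)` still builds the entire list in memory before encoding, which is exactly the limitation the original poster is pointing out.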
This would cause things that aren't lists to be encoded as lists.
Sometimes that may be desirable, but in general if e.g. a file object
sneaks its way into your JSON encode call, it is more likely correct
to raise an error than to silently encode the file as if it were a
list of strings. So it should not be the default behavior. That said,
it sounds like it could be made easier to enable streaming from
iterators as an option for those cases where it's desired.
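As an opt-in approach that works today without modifying the stdlib, one can stream an iterator by encoding each element separately and yielding text chunks, so only one element is ever held in memory at a time. This is a minimal sketch (the `iterencode_stream` helper is hypothetical, not part of the `json` module):

```python
import json

def iterencode_stream(items):
    """Yield JSON text chunks for an iterable, one element at a time,
    without materializing the whole sequence as a list first."""
    yield "["
    first = True
    for item in items:
        if not first:
            yield ", "
        first = False
        # Each element is encoded independently with json.dumps.
        yield json.dumps(item)
    yield "]"

# Chunks can be written to a socket or file as they are produced;
# joining them here just demonstrates the final output.
print("".join(iterencode_stream(range(5))))  # prints [0, 1, 2, 3, 4]
```

Because the encoding is explicit and per-element, a non-list object like a file handle never gets silently turned into an array: the caller chooses which iterables to stream.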