On Mon, Aug 12, 2019 at 3:58 PM Chris Angelico <rosuav@gmail.com> wrote:
>> But if there is a way to support that use-case without the footgun, then I think that's a better option.
>
> It's more that there's no reason to go to great lengths to *stop* you
> from encoding invalid JSON. For it to block it, it would have to
> explicitly check for it, instead of simply assuming that you aren't
> doing anything to break it. Permissiveness is the simple option here.

Sure -- which I presume is what the author of the "separators" parameter was thinking.

But to bring it down to the specific topic at hand:

If the goal is to provide a way to encode arbitrary precision Decimals in JSON, then you could:

* (1) support encoding the Decimal type directly (my suggestion)
* (2) support allowing users to control how they encode JSON numbers (Richard's suggestion, I think)
* (3) support allowing users to control how they encode any JSON type.

Frankly, each of these is a simple option, but each is more permissive than the last. I suggest we use the least permissive solution that addresses the goal.
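To make the status quo concrete -- with the stdlib json module today, Decimal isn't serializable at all, and the obvious float workaround silently drops digits:

```python
import json
from decimal import Decimal

d = Decimal("1.100000000000000000001")

# The stdlib encoder rejects Decimal outright:
try:
    json.dumps(d)
except TypeError as e:
    print(e)  # Object of type Decimal is not JSON serializable

# Converting to float first "works", but the extra digits are gone:
print(json.dumps(float(d)))  # 1.1
```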


(1) would not fully solve the OP's use-case.
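(For reference, the lossless output that (1) is aiming at can only be approximated today with hacks, e.g. substituting placeholders after encoding. A rough sketch of the idea, not a proposed API:)

```python
import json
import uuid
from decimal import Decimal

def dumps_with_decimals(obj):
    # Encode each Decimal as a unique placeholder string, then splice
    # the exact decimal literal back into the output text afterwards.
    registry = {}

    def default(o):
        if isinstance(o, Decimal):
            key = uuid.uuid4().hex
            registry['"' + key + '"'] = str(o)
            return key
        raise TypeError(f"not serializable: {o!r}")

    text = json.dumps(obj, default=default)
    for placeholder, literal in registry.items():
        text = text.replace(placeholder, literal)
    return text

print(dumps_with_decimals({"price": Decimal("1.100000000000000000001")}))
# {"price": 1.100000000000000000001}
```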

(2) would not solve the use case of people wanting control over full Unicode normalization (there is also the escaping of non-ASCII characters, which may require normalization as well).
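To illustrate that point: json.dumps lets you toggle escaping via ensure_ascii, but it offers no hook for normalization, so NFC and NFD spellings of the same text encode differently:

```python
import json
import unicodedata

nfc = unicodedata.normalize("NFC", "é")  # one code point, U+00E9
nfd = unicodedata.normalize("NFD", "é")  # 'e' + combining acute U+0301

print(json.dumps(nfc))  # "\u00e9"
print(json.dumps(nfd))  # "e\u0301"

# Same rendered text, different JSON -- and no encoder hook to unify them:
print(nfc == nfd)  # False
```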

Because these are different goals.

So the decision needs to be driven by which goals we are trying to accomplish.


Christopher Barker, PhD

Python Language Consulting
  - Teaching
  - Scientific Software Development
  - Desktop GUI and Web Development
  - wxPython, numpy, scipy, Cython