On Tue, Aug 13, 2019 at 7:55 AM Christopher Barker <email@example.com> wrote:
> That may mean the cat's out of the bag, and we can neglect any role of the json module to try to enforce valid JSON, but still...
The *decoder* will enforce valid JSON, but the *encoder* doesn't need
to stop you from doing what you've chosen to do.
It doesn't "need" to, but it would be nice. Having an encoder/decoder that can encode things that it can't decode seems a bit asymmetric to me.
Perhaps the goal of the json module is to decode valid JSON, and to encode an arbitrary set of JSON-like encodings, but I don't *think* that is a stated goal of the module.
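A small illustration of that asymmetry, as I understand the current behavior: by default the encoder will happily emit NaN/Infinity, which are not valid JSON, though Python's own decoder accepts them as an extension.

```python
import json
import math

# json.dumps emits "NaN" by default (allow_nan=True), even though
# NaN is not legal JSON per the spec.
text = json.dumps(float("nan"))
print(text)  # NaN

# Python's decoder accepts it as an extension...
value = json.loads(text)
print(math.isnan(value))  # True

# ...but strict encoding refuses it, which is the symmetric behavior.
try:
    json.dumps(float("nan"), allow_nan=False)
except ValueError as e:
    print("rejected:", e)
```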
I'm all for practicality -- if there is a use case we want to support, and supporting that use case suggests a solution that allows shoot-footing, fine. But if there is a way to support that use case without the foot gun, then I think that's a better option.
An example here -- some years back, when I didn't know a damn thing about XML (I still don't know much), I discovered that ElementTree made it easy to write invalid XML. Whether the decision not to have it do some checking was good or bad, I don't know, but it was surprising, and it does make the module less useful to someone who is not very familiar with XML.
And the current json module does do a pretty good job of ensuring that you get valid JSON, and the fact that the "encode custom types" machinery is designed as it is (rather than providing a hook to generate "raw JSON") makes me think the original designers had that in mind.
In fact, I'm pretty sure that setting custom separators is the only way to get it to generate invalid JSON now. Do we want to add more ways?
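To show what I mean, here's a minimal sketch of how custom separators can produce output that the module's own decoder then rejects:

```python
import json

# separators is (item_separator, key_separator); nonstandard values
# produce output that is no longer valid JSON.
data = {"a": 1, "b": 2}
text = json.dumps(data, separators=("; ", " = "))
print(text)  # {"a" = 1; "b" = 2}

# The decoder enforces the grammar, so round-tripping fails.
try:
    json.loads(text)
except json.JSONDecodeError:
    print("the json module itself cannot decode what it encoded")
```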
Maybe, maybe not.