@evan And it is a spectacularly bad take.
There were over two dozen network hypertext systems before the Web. The best known was Gopher, but there was also Hyper-G and quite a few more. The only one to survive was Adobe Acrobat.
The same was true of SAML: we had 20 vendors selling AAA schemes, and SAML made it 21.
The reason it is not a problem is that market share is what matters, not the number of standards. If you have 20 standards and none of them meets more than 20% of use cases, you are never going to get to a ubiquitous standard.
If you have a spec that can meet 90% of use cases and none of the incumbents has more than 20% market share, then it stands a really good chance of becoming the dominant standard.
I have thought about this problem. JSON is not quite what is needed because binary floating point values don't round-trip through its decimal text representation, so merely saving and reloading the data can change the results.
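For example, a minimal Python sketch of the round-trip problem, assuming a serializer that emits 15 significant digits rather than the 17 a double needs:

```
import json, struct

x = 0.1 + 0.2                       # an IEEE 754 double: 0.30000000000000004

# Many JSON emitters print doubles with 15 significant digits; 17 are
# needed to guarantee a lossless round trip.
text = "%.15g" % x                  # what such an emitter writes: '0.3'
y = json.loads(text)                # what a reader parses back

print(struct.pack(">d", x).hex())   # 3fd3333333333334
print(struct.pack(">d", y).hex())   # 3fd3333333333333
print(x == y)                       # False: the stored value silently changed
```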
I proposed a set of tiered extensions to JSON that are all supersets: every JSON-B deserializer will also accept plain JSON. JSON-B adds binary encoding, so data doesn't expand with every encryption pass due to the need for Base64 encoding.
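A rough illustration of that expansion (a hypothetical 3 kB ciphertext, plain Python, nothing JSON-B specific):

```
import base64, os

payload = os.urandom(3000)          # stand-in for a 3 kB ciphertext
sizes = [len(payload)]

# Each layer that wraps binary data in a JSON string must Base64-encode
# it, growing the payload by roughly 4/3 every pass.
for _ in range(3):
    payload = base64.b64encode(payload)
    sizes.append(len(payload))

print(sizes)                        # [3000, 4000, 5336, 7116]
```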
JSON-C adds compression. JSON-D adds binary encoding of floating point values in IEEE 754 format, plus Intel extended floating point.
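The idea behind the binary float encoding, in sketch form (illustrative only, not the actual JSON-D wire format):

```
import struct

x = 0.1 + 0.2

# Decimal text needs 17 significant digits to be lossless, and the
# reader still has to redo a decimal-to-binary conversion on parse.
text = "%.17g" % x                  # '0.30000000000000004', 19 bytes

# Shipping the 8 IEEE 754 bytes directly is exact by construction.
raw = struct.pack("<d", x)
(y,) = struct.unpack("<d", raw)

print(len(text), len(raw))          # 19 vs 8 bytes
print(x == y)                       # True: bit-for-bit identical
```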
There are many binary JSON encodings around. Mine is the only one that is backwards compatible as far as I know.
https://www.ietf.org/archive/id/draft-hallambaker-jsonbcd-23.html