Metadata debate for ages
On Tue, Feb 11, 2014 at 4:33 PM, Paul Moore <notifications@github.com> wrote:
and I doubt that will happen, as the metadata debate has been ongoing for ages, and it's *very* late to be coming up with requirements with significant implications out of the blue like this
This is from the thread with my rant about versioning, which put a dumb question in my head: what is the role of the dot-zero part in metadata versioning? I mean, if there are so many debates, how about choosing a minimal first thing that can be released? Reach consensus, release. Increase the version. Go another round of debate. Release. Test. Increment the version. Release.

It seems that people are trying to make it perfect, losing the real conflicting points in a ton of information. I am looking at the 50+ page PEP at http://www.python.org/dev/peps/pep-0426/ and the first thing I notice is that I don't understand why there are provisions like MUST, and why a PEP named "Metadata" regulates what tools should or should not do and what tools should exist at all.

To me, the "Metadata 2.0" PEP should specify only two things:

1. format(s) of the data - how it can be represented so that people can process it
2. structure of the data - a set of fields, their values, and the cases when they apply

That's it. Things like "Automated tools, especially public index servers, MAY impose additional length restrictions on metadata beyond those enumerated in this PEP. Such limits SHOULD be imposed where necessary to protect the integrity of a service, based on the available resources and the service provider's judgment of reasonable metadata capacity requirements." are out of scope of Metadata entirely. This stuff belongs elsewhere.

If a problem turns up that metadata grows too large, it should be recorded as an issue, and the next version of metadata should say "fixed issues #..., #..., ..." - or just include a separate PEP for those who don't know how to handle the load on their servers.

What do you think?
--
anatoly t.
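To make the two-point idea concrete, here is a minimal sketch of what such a spec-driven validator could look like. The field names and the choice of JSON are illustrative assumptions loosely modeled on PEP 426, not anything the thread actually agreed on:

```python
import json

# Hypothetical minimal "Metadata 2.0": point 1 is the serialization
# format (JSON, assumed here), point 2 is the set of fields.
# These field names are illustrative, not a real specification.
REQUIRED_FIELDS = {"metadata_version", "name", "version"}

def validate(raw):
    """Parse a metadata document and check that required fields exist."""
    data = json.loads(raw)
    missing = REQUIRED_FIELDS - data.keys()
    if missing:
        raise ValueError("missing fields: %s" % ", ".join(sorted(missing)))
    return data

example = json.dumps({
    "metadata_version": "2.0",
    "name": "example-project",
    "version": "0.1",
})
print(validate(example)["name"])  # example-project
```

Everything beyond this (server limits, tool behavior) would live outside the format definition, which is the point being argued above.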