I’ve been carefully reading this thread (and the previous threads on the subject), and I’m wondering if we’re not over-engineering the whole backward/forward compatibility thing.
With Python modules there are often dependencies on other modules. Therefore, both backward and forward compatibility are important: if your module depends on dep1 and dep2, and dep1 has been ported to the new world but dep2 has not, you are basically hosed — you cannot take your module to the new world and you cannot remain in the old world. That is why things like six were wonderful: they allowed module creators to maintain Python 2 compatibility at little developer cost.
But for Python extension modules there is much less interdependency. Actually, the only cases I can think of off the top of my head are things in the scipy/numpy area. Because there is much less interdependency, the value of backward compatibility is also much lower. (Note I’m not saying “no value”, only “less value”.)
That means that a simple measure, such as supplying a header file that provides backward compatibility back to 3.6 and telling extension module distributors to copy this file into their distribution, may be just as good as a more complex solution. And if you want to get fancy, you could add a script “genpycompatheader” that generates that include file in future Python distributions.
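To make the idea concrete, here is a minimal sketch of what such a copy-in compatibility header could look like. The file name `pycompat.h` is hypothetical; the example backfills `Py_NewRef` (added to the C API in CPython 3.10) for older versions by checking `PY_VERSION_HEX`, which is the usual pattern for this kind of shim:

```c
/* pycompat.h — hypothetical compatibility header an extension module
 * would copy into its own source tree.  Each block backfills an API
 * that newer CPython provides, guarded by a version check so the
 * definitions vanish when building against a CPython that already
 * has them. */
#ifndef PYCOMPAT_H
#define PYCOMPAT_H

#include <Python.h>

/* Py_NewRef() was added in CPython 3.10. */
#if PY_VERSION_HEX < 0x030A0000
static inline PyObject *
_PyCompat_NewRef(PyObject *obj)
{
    Py_INCREF(obj);
    return obj;
}
#define Py_NewRef(obj) _PyCompat_NewRef(obj)
#endif

#endif /* PYCOMPAT_H */
```

An extension author would `#include "pycompat.h"` instead of relying on the running Python’s headers for the newer calls, and the same source would then build unchanged back to the oldest version the header covers.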