There has been talk in the past of setting up a cron job somewhere that backs up all of the data stored at GitHub for contingency purposes. I think it's a totally reasonable thing to do, but I don't plan to hold up the migration for it.
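For concreteness, here is a rough sketch of what such a job could look like: a bare mirror of the repository refreshed from cron. The URL and backup path are placeholders, and note that this only captures the git data; issues and PR comments live outside git (more on that below):

#!/usr/bin/env python3
"""Mirror a GitHub repository for contingency backups.

A sketch only; run it from cron, e.g.:
    0 3 * * * /usr/bin/python3 /opt/backups/mirror_repo.py
"""
import subprocess
from pathlib import Path

REPO_URL = "https://github.com/python/cpython.git"    # placeholder
BACKUP_DIR = Path("/var/backups/github/cpython.git")  # placeholder


def mirror():
    if BACKUP_DIR.exists():
        # Refresh the existing mirror: fetch all refs, prune deleted ones.
        subprocess.run(["git", "remote", "update", "--prune"],
                       cwd=str(BACKUP_DIR), check=True)
    else:
        # First run: create a bare mirror clone (all branches, tags, refs).
        BACKUP_DIR.parent.mkdir(parents=True, exist_ok=True)
        subprocess.run(["git", "clone", "--mirror", REPO_URL,
                        str(BACKUP_DIR)], check=True)


if __name__ == "__main__":
    mirror()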

On Fri, 29 Jul 2016 at 04:09 Petr Viktorin <encukou@gmail.com> wrote:
On 07/28/2016 06:21 PM, Senthil Kumaran wrote:
> On Thu, Jul 28, 2016 at 7:48 AM, R. David Murray <rdmurray@bitdance.com> wrote:
>> The obvious alternative is to just post a link to the PR with a
>> summary of what was posted (perhaps who commented, and how many
>> comments there are?).  I don't have an opinion at this point on which
>> would be better, but the core of the proposed patch would be the same;
>> only the output would be different.
>
> This sounds better to me. If a person on the nosy list is
> interested, they can be added to the nosy list of the PR, so that they
> can follow subsequent updates from there.
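For illustration, the bpo note in that scheme could be as simple as
something like this (the helper name and message format are made up):

# Hypothetical sketch of the summary note the bot would post on the
# bpo issue; the helper and its format are invented for illustration.
def summary_note(pr_number, author, comment_count):
    url = "https://github.com/python/cpython/pull/{}".format(pr_number)
    return ("New comment by {} on PR #{} ({} comments so far): {}"
            .format(author, pr_number, comment_count, url))

print(summary_note(42, "example-user", 5))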

One argument for copying the comments is to create an archive in case
GitHub goes bankrupt or turns evil. That seems very unlikely now, but
Python should plan for long-term contingencies. AFAIK there's no
formal agreement between the PSF and GitHub that PR archives will remain
accessible.
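Copying them ourselves needn't be elaborate, though. A minimal sketch
using the GitHub REST API (v3) and the `requests` package; the PR number
is a placeholder, and real use would need authentication, rate-limit
handling, and a loop over all PRs:

import json
import requests

OWNER, REPO, PR_NUMBER = "python", "cpython", 1  # placeholders


def archive_pr_comments():
    comments = []
    # Conversation comments live under /issues/, line-by-line review
    # comments under /pulls/; archive both kinds.
    for kind in ("issues", "pulls"):
        url = ("https://api.github.com/repos/{}/{}/{}/{}/comments"
               .format(OWNER, REPO, kind, PR_NUMBER))
        while url:
            resp = requests.get(url)
            resp.raise_for_status()
            comments.extend(resp.json())
            # Follow pagination via the Link header, if any.
            url = resp.links.get("next", {}).get("url")
    with open("pr-{}-comments.json".format(PR_NUMBER), "w") as f:
        json.dump(comments, f, indent=2)


if __name__ == "__main__":
    archive_pr_comments()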
