[Moin-user] wget a wiki page and its subpages

Nir Soffer nirs at freeshell.org
Tue Jan 31 06:21:10 EST 2006


On 31 Jan, 2006, at 15:42, Ralf Gross wrote:

> moin-dump doesn't care about the attachments. But I found
> http://moinmoin.wikiwikiweb.de/MoinDump, which dumps the whole wiki
> including attachments and fixing of the attachment paths. It'd be
> nice if I could limit the output to the pages I really need, but it's
> ok for now.

moin-dump gets the full page list and then dumps all pages. If you want
a partial dump, add page filtering by name in the loop. You can use a
regular expression, a list of pages, or whatever you like.
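As a rough sketch of that idea, the filtering could look like the snippet below. The function and page names are hypothetical, not part of moin-dump itself; the point is only how a regular expression can narrow the page list before the dump loop runs.

```python
import re

def filter_pages(page_names, pattern):
    """Keep only page names matching the given regular expression."""
    rx = re.compile(pattern)
    return [name for name in page_names if rx.match(name)]

# Hypothetical example: dump only "ProjectDocs" and its subpages.
pages = ["FrontPage", "ProjectDocs", "ProjectDocs/Setup", "HelpContents"]
wanted = filter_pages(pages, r"^ProjectDocs(/|$)")
# wanted is ["ProjectDocs", "ProjectDocs/Setup"]
```

In moin-dump you would apply such a filter to the page list before iterating, so only the matching pages (and their attachments) are written out.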


Best Regards,

Nir Soffer

