[Moin-user] wget a wiki page and its subpage

Thomas Waldmann tw-public at gmx.de
Tue Jan 31 04:17:05 EST 2006


> I'm trying to get a wiki page and its subpage with many images through
> wget.

Some user agents (including wget's default one) get special treatment, 
because they are often used to DoS wikis. So make sure you change the 
user agent wget sends (and if it is not your own wiki: USE CAREFULLY).
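For example, a polite mirror run with a non-default user agent could look
roughly like this (the URL and the user-agent string are only placeholders;
adjust the recursion depth to how deeply your subpages are nested):

  wget --recursive --level=2 \
       --page-requisites --convert-links --html-extension \
       --no-parent --wait=1 \
       --user-agent="DocExport (documentation snapshot)" \
       "http://wiki.example.com/MyPage"

--page-requisites pulls in the images, --convert-links rewrites the links
so the local copy works offline, --no-parent keeps wget from wandering
above the start page, and --wait=1 spaces out the requests so you do not
hammer the wiki.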

> I need to do this, because we want to give the pages as documentation
> to a customer. After retrieving the pages in html I'm going to try to
> convert them to pdf.

Maybe look at moin-dump, too; it writes the wiki pages out as static
HTML, which may be easier to feed into a PDF converter.




