[Moin-devel] RE: RE: Performance Importing Pages
matthew warren
matthew_j_warren at hotmail.com
Sun May 29 03:56:32 EDT 2005
Hello,
Yes, I agree it looks like it shouldn't work, but it does. Anyway, I altered
my code to include the info as you have in your code, i.e.:
for PageName in UniqHeadings:
    request = RequestCLI(url='localhost/tsmwiki', pagename=PageName)
    editor = PageEditor(request, PageName)
    filename, revision, exists = editor.get_rev()
    if exists:
        raise RuntimeError("Page already exists: %s" % PageName)
    editor.saveText(editor.normalizeText(Articles[PageName]), 0)
...yet I still only get around 1 page every 10 seconds on a P4 3.4 GHz, and
that's a long time for 4,500 pages.
The wiki is served from a local Apache instance. When serving the wiki,
performance and response seem fine. I have disabled firewalls and
virus checkers, just in case they were interfering.
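To see where those 10 seconds per page actually go, a generic profiling sketch like the following (not from the original thread; `save_page` is a stand-in for the real RequestCLI/PageEditor loop body) would narrow the bottleneck down to a specific call:

```python
import cProfile
import pstats
import io

def save_page(name):
    # Stand-in for the real save logic; swap in the
    # RequestCLI/PageEditor calls when profiling the actual import.
    return name.upper()

profiler = cProfile.Profile()
profiler.enable()
for name in ["PageOne", "PageTwo"]:
    save_page(name)
profiler.disable()

# Print the ten call sites with the highest cumulative time.
stream = io.StringIO()
stats = pstats.Stats(profiler, stream=stream).sort_stats("cumulative")
stats.print_stats(10)
print(stream.getvalue())
```

If most of the cumulative time shows up inside one MoinMoin call (for example the save or request setup), that is the place to investigate first.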
Matthew.
I think you missed something - it looks like this code should not work - how
does the code know where your wiki data dir is with no url?
I tested this code on my idle machine (G5 Dual 2G):
import sys
sys.path = [# The path to the wiki directory
            "/Volumes/Home/nir/wiki/fix",
            # The path to moinmoin, not needed if it's installed with setup.py
            "/Volumes/Home/nir/Projects/moin/fix"] + sys.path

from MoinMoin.PageEditor import PageEditor
from MoinMoin.request import RequestCLI

def save(url, pagename, text):
    request = RequestCLI(url=url, pagename=pagename)
    editor = PageEditor(request, pagename)
    text = editor.normalizeText(text)
    dummy, revision, exists = editor.get_rev()
    return editor.saveText(text, revision)
import os

url = "localhost/fix/"
dir = "imports"
files = [name for name in os.listdir(dir)
         if not name.startswith(".")]

for name in files:
    text = file(os.path.join(dir, name)).read()
    save(url, name, text)
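As a complementary check (not from the original thread), wall-clock timing of each save call would confirm whether the save itself accounts for the ~10 seconds or whether the cost lies elsewhere; `timed` below is a hypothetical helper, demonstrated here on a trivial function:

```python
import time

def timed(fn, *args):
    # Measure the wall-clock time of a single call, so each page
    # save can be reported individually during the import.
    start = time.time()
    result = fn(*args)
    elapsed = time.time() - start
    return result, elapsed

# Hypothetical usage inside the import loop:
#   rev, secs = timed(save, url, name, text)
#   print "%s took %.2fs" % (name, secs)
result, secs = timed(len, "example")
```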