[Baypiggies] Programming Pattern/Django/Best Practice?
glen at glenjarvis.com
Fri Apr 3 20:23:33 CEST 2009
> I'm not so sure that ajax is the silver bullet in your situation.
> It will probably be a UI improvement and there are great reasons to
> use it, but the django implementation will end up being fairly
> similar. From my understanding of what you are asking, the
> subprocess on the server side could run for longer than any
> reasonable timeout for the ajax response, so you're still going to
> want to return a success response just to tell the user that the
> request was submitted. After that, returning a json (or xml)
> response via django to subsequent ajax requests will be no different
> than returning an html page.
Agreed. When I was writing this, it suddenly hit me that I was
ignoring any Ajax solution, so I threw it in. I was hoping it might
be a silver bullet. But within about three minutes of clicking Send,
I was thinking 'hmmmm.' I think I still hit Send because,
subconsciously, I knew it wasn't all fixed yet.
> The good news is that either solution should be pretty simple using
> the standard django framework.
> Your urls.py entry doesn't have to be dynamically generated, you
> would just pass a unique job_id in the url, with a urls.py entry like:
> (r'^status/(?P<job_id>[0-9]+)$', myapp.views.status)
Of course! I was thinking of a static URL pattern -- especially since
I was originally thinking of temporary static pages. I was thinking
less of a job_id than of a uniquely generated short string, like a
'tiny url'. But with a pattern like this, that dynamic URL becomes
much simpler. And you have me thinking of a job_id instead of a
unique string, which may be simpler still. This was a great help --
thank you! I need to think about this a bit more...
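To make that concrete for myself, here's a rough sketch of the scheme as I
understand it -- the regex is the one from the quoted urls.py entry, and the
view is reduced to plain functions so the shape is clear (the dict standing in
for a Job model, and the field names, are just illustrative):

```python
import json
import re

# The suggested urls.py entry would look like:
#   (r'^status/(?P<job_id>[0-9]+)$', myapp.views.status)
STATUS_URL = re.compile(r'^status/(?P<job_id>[0-9]+)$')

def parse_job_id(path):
    """Extract the job_id from a request path, or return None if it
    doesn't match -- this is what the urlconf does for the view."""
    m = STATUS_URL.match(path)
    return int(m.group('job_id')) if m else None

def status_payload(job):
    """Build the JSON body a status view would return to an Ajax poll.
    `job` is a plain dict standing in for a Job model instance."""
    return json.dumps({'job_id': job['id'], 'state': job['state']})
```

The actual view would just look the job up by primary key and return
status_payload(...) in an HttpResponse.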
> Unless I've misunderstood your requirements, it all looks pretty
> standard and inside the box from a django perspective, as long as
> the subtasks / subprocesses are well-behaved in reporting their
> status "somewhere" that your django instance can fetch the info.
Nope. You understood perfectly. And the concepts you shared weren't at
all new -- I've used them in Django in other programs. But I was very
tunnel-visioned in my approach. You've really simplified things in my
mind.
> And if you're concerned about load / scale / redundancy, you'll want
> to make that "somewhere" a place that the instance of django
> reporting on the status doesn't need to be the one that started the
> job. A Job model would probably be where I'd start, although I
> obviously don't have that much info on your app. Fwiw, I've found
> using the django models and framework in a separate, non-webserver
> process to be quite straightforward as well.
Yes, this is my major concern in trying to be forward thinking. These
subprocesses may be poorly written (I'm writing a framework/API that
other developers may freely plug into), so I want all processes to be
managed by an API and run as separately as I can from everything
else. I don't want bad future code bringing everything to a grinding
halt. This is why I prefer keeping these subprocesses as pure Python
processes. I was also trying to separate things cleanly so that this
wasn't a subprocess of a Django process, but a separately launched
(external) batch process.
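For the launching side, what I have in mind is roughly this -- the web
process starts the worker and returns immediately, never waiting on it (the
worker script name and arguments here are hypothetical, just to show the
shape):

```python
import subprocess
import sys

def launch_batch(args):
    """Start a worker as its own process so the web request can return
    right away. `args` is the command line after the interpreter, e.g.
    ['run_job.py', '42'] -- the script name is app-specific.

    stdin/stdout/stderr are detached so the worker can't block on, or
    write into, the web process's streams."""
    return subprocess.Popen([sys.executable] + args,
                            stdin=subprocess.DEVNULL,
                            stdout=subprocess.DEVNULL,
                            stderr=subprocess.DEVNULL,
                            close_fds=True)
```

A fully external batch process (launched by cron or a queue runner rather
than by Django at all) would decouple things even further; this is the
middle ground.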
I may be optimizing prematurely (the root of all evil) a bit. And I
may not need to separate the processes (although I'd prefer it). It
may be easier to leave this in Django.
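Either way, the "somewhere" for status could be as simple as a shared
table that both the web process and the worker touch. A minimal sketch --
plain sqlite3 standing in for the suggested Django Job model so it runs on
its own; the table and column names are illustrative:

```python
import sqlite3

def init_db(conn):
    """Create the shared status table (a models.Model in real Django)."""
    conn.execute("CREATE TABLE IF NOT EXISTS jobs ("
                 "id INTEGER PRIMARY KEY, state TEXT NOT NULL)")

def submit_job(conn):
    """Web process: record the job as pending and return its id."""
    cur = conn.execute("INSERT INTO jobs (state) VALUES ('pending')")
    conn.commit()
    return cur.lastrowid

def update_state(conn, job_id, state):
    """Batch process: report progress as it runs."""
    conn.execute("UPDATE jobs SET state = ? WHERE id = ?", (state, job_id))
    conn.commit()

def job_state(conn, job_id):
    """Any web process handling /status/<job_id>: read current state."""
    row = conn.execute("SELECT state FROM jobs WHERE id = ?",
                       (job_id,)).fetchone()
    return row[0] if row else None
```

Because the state lives in the database rather than in any one process, the
Django instance answering the status request doesn't need to be the one
that started the job -- which was exactly the point about redundancy above.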
Regardless, this seems to be a fairly common design pattern I've seen
over the years. Does it have a name? "Asynchronous web batch process
while grabbing final status report at end" is a bit awkward.
Thank you again for helping me clarify things as I start
architecting this project. More feedback now saves ripping out bad
code and incomplete ideas later as I design. And I still seem to do a
lot of that... code, rip, code, rip, code, until I finally get a
design I like. I'd like to minimize that when I can.