python file synchronization

silentnights silentquote at
Tue Feb 7 10:33:21 CET 2012

Hi All,

I have the following problem: I have an appliance (A) which generates
records and writes them into a file (X). The appliance is accessible
through FTP from a server (B). I have another central server (C) that
runs a Django app, and I need it to continuously get the records from
file (X).

The problems are as follows:
1. (A) is heavily writing to the file, so copying the file will result
in an incomplete line at the end.
2. I have many (A)s and (B)s that I need to get the data from.
3. I can't afford to lose any records from file (X).

My current implementation is as follows:
1. Server (B) copies the file (X) through FTP.
2. Server (B) makes a copy of file (X) to file (Y.time_stamp), ignoring
the last line to avoid incomplete lines.
3. Server (B) periodically makes new copies of file (X) and appends the
lines starting from the previously ignored line to file (Y.time_stamp).

4. Server (C) mounts the diffs_dir locally.
5. Server (C) creates file (Y.time_stamp.lock) in target_dir, then copies
file (Y.time_stamp) to the local target_dir, then deletes the lock file.

6. A daemon running on Server (C) reads the file list from target_dir
and processes those files that don't have a matching *.lock file; this
avoids reading a file before it's completely copied.

The above is implemented and working; the problem is that it requires
so many syncs, has a high overhead, and is hard to debug.

I greatly appreciate your thoughts and suggestions.

Lastly, I want to note that I'm not a programming guru, still a noob,
but I am trying to learn from the experts. :-)

More information about the Python-list mailing list