Concurrent writes to the same file
Cameron Simpson
cs at zip.com.au
Thu Jul 11 02:31:58 EDT 2013
On 10Jul2013 22:57, Jason Friedman <jsf80238 at gmail.com> wrote:
| Other than using a database, what are my options for allowing two processes
| to edit the same file at the same time? When I say same time, I can accept
| delays. I considered lock files, but I cannot conceive of how I avoid race
| conditions.
Sure. (Assuming UNIX. Windows probably excludes both processes
having the file open at the one time, but that's not a show stopper.)
You need a lock file to ensure only one process accesses the file at a time: a process takes the lock, accesses the file while it holds the lock, then releases it.
There are two basic approaches here:

1) Each process opens the file only while it holds the lock:

    take lock
    open file for write, modify the file, close it
    release lock

2) Both processes have the file open for update:

    take lock
    modify file, flush buffers
    release lock
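On UNIX, the second approach can also be done without a separate lock file, using fcntl.flock() on the open file descriptor itself. This is an alternative to the lock-file scheme described below, sketched as an assumption; the function name append_record is just for illustration:

```python
import fcntl
import os

def append_record(f, record):
    # Take an exclusive advisory lock on the open file itself.
    # Blocks until any other flock() holder releases it.
    fcntl.flock(f.fileno(), fcntl.LOCK_EX)
    try:
        f.seek(0, os.SEEK_END)   # another writer may have extended the file
        f.write(record)
        f.flush()                # flush Python's own buffer
        os.fsync(f.fileno())     # push it to the file system before unlocking
    finally:
        fcntl.flock(f.fileno(), fcntl.LOCK_UN)
```

Both processes keep the file open in append/update mode and call append_record() for each modification; the seek-to-end under the lock picks up the other process's writes.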
The code I use to take a lockfile is my "lockfile" context manager:
https://bitbucket.org/cameron_simpson/css/src/374f650025f156554a986fb3fd472003d2a2519a/lib/python/cs/fileutils.py?at=default#cl-408
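For readers who don't want to chase the link, a minimal sketch of such a context manager might look like this. It is not the actual cs.fileutils implementation, just the core idea: create "<path>.lock" with O_EXCL, which succeeds for exactly one process at a time, and busy-wait otherwise:

```python
import os
import time
from contextlib import contextmanager

@contextmanager
def lockfile(path, poll=0.1):
    # Minimal sketch, not the cs.fileutils version: the lock is a
    # sidecar file whose exclusive creation only one process can win.
    lockpath = path + ".lock"
    while True:
        try:
            fd = os.open(lockpath, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
            os.close(fd)
            break
        except FileExistsError:
            time.sleep(poll)     # another process holds the lock; wait
    try:
        yield
    finally:
        os.remove(lockpath)      # release the lock
```

A real implementation would also want a timeout and some protection against stale lock files left by crashed processes.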
An example caller goes:
    with lockfile(self.csvpath):
        backup = "%s.bak-%s" % (self.csvpath, datetime.datetime.now().isoformat())
        copyfile(self.csvpath, backup)
        write_csv_file(self.csvpath, self.nodedb.nodedata())
        if not self.keep_backups:
            os.remove(backup)
as shown here:
https://bitbucket.org/cameron_simpson/css/src/374f650025f156554a986fb3fd472003d2a2519a/lib/python/cs/nodedb/csvdb.py?at=default#cl-171
Simplify as necessary; you just want:

    with lockfile(mainfile):
        modify(mainfile)

if you have no wish for backups in case of a failed modify.
If both processes need to see each other's changes then there's
some more logic needed to monitor the file for changes. I've got
such a scheme for CSV files in beta at present for a personal
project. It won't suit all use cases; mine is well defined.
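One simple way to notice another process's changes (not necessarily the scheme above, just a common sketch) is to record the file's (mtime, size) after your own last access and poll os.stat() for a difference before re-reading:

```python
import os

def file_changed(path, last_state):
    # Poll-based change detection: compare (mtime, size) against the
    # state recorded after our last read/write. Returns (changed, state).
    st = os.stat(path)
    state = (st.st_mtime, st.st_size)
    return state != last_state, state
```

Checking size as well as mtime helps on file systems with coarse timestamp resolution; a robust version would do the stat and any re-read under the same lock.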
Cheers,
--
Cameron Simpson <cs at zip.com.au>
Is it true, Sen. Bedfellow, that your wife rides with bikers? - Milo Bloom