Best approach to create humongous amount of files

Cem Karan cfkaran2 at gmail.com
Thu May 21 23:42:31 EDT 2015


On May 20, 2015, at 7:44 AM, Parul Mogra <scoria.799 at gmail.com> wrote:

> Hello everyone,
> My objective is to create large amount of data files (say a million *.json files), using a pre-existing template file (*.json). Each file would have a unique name, possibly by incorporating time stamp information. The files have to be generated in a folder specified.
> 
> What is the best strategy to achieve this task, so that the files will be generated in the shortest possible time? Say within an hour.

If you absolutely don't care about the name, then something like the following will work:

    import uuid
    for counter in range(1000000):
        with open(uuid.uuid1().hex.upper() + ".json", "w") as f:
            f.write(templateString)

where templateString is the template you want to write to each file.  The only problem is that the files won't be in any particular order; they'll just be uniquely named.  As a test, I ran the code above and killed the loop after about 10 minutes, by which point about 500,000 files had been created.  Note that my laptop is about 6 years old, so you may get better performance on your machine.
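
If you do want timestamp-based names dropped into a particular folder, a minimal sketch along the same lines might look like this (target_dir is just a made-up destination here, and I'm assuming it already exists; the counter suffix is only there so that names created within the same microsecond don't collide):

    import datetime
    import os

    target_dir = "output"  # hypothetical destination folder; change as needed
    for counter in range(1000000):
        # Microsecond-resolution timestamp plus a zero-padded counter suffix
        # keeps every name unique even if several files land in the same
        # microsecond.
        stamp = datetime.datetime.now().strftime("%Y%m%dT%H%M%S%f")
        name = "{0}_{1:07d}.json".format(stamp, counter)
        with open(os.path.join(target_dir, name), "w") as f:
            f.write(templateString)

A nice side effect is that the names sort chronologically, and the extra string formatting per file shouldn't change the timing much.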

Thanks,
Cem Karan
