Creating huge data in very less time.

andrea kerny404 at
Tue Mar 31 12:30:33 CEST 2009

On 31 Mar, 12:14, "venutaurus... at" <venutaurus... at> wrote:
> That time is reasonable. The randomness should be such that no two
> files share the same MD5 checksum. The main reason for generating such
> a huge amount of data is stress testing of our product.

If randomness is not necessary (as I understood), you can just create
one single file and then modify one bit of it iteratively, 1000 times.
That's enough to make the checksum change.
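A minimal sketch of that idea (the payload size and iteration count are
illustrative): flip one bit at a rotating position, and each resulting
version hashes differently.

```python
import hashlib

# Illustrative base payload; in practice this would be one of the large
# generated files.
data = bytearray(1024)  # 1 KiB of zeros

checksums = set()
for i in range(1000):
    data[i % len(data)] ^= 1  # flip the low bit of one byte per iteration
    checksums.add(hashlib.md5(data).hexdigest())

# Every modified version has a distinct MD5 checksum.
assert len(checksums) == 1000
```

Since each iteration touches a different byte, all 1000 intermediate
contents are distinct, and so are their checksums.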

Is there a way to create a big file without actually writing
anything in Python (just give me the garbage that is already on the
disk)?
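A common way to get a huge file without writing its contents is a sparse
file: call truncate() (or seek past the end and write one byte) and let
the filesystem record only the logical size. A minimal sketch, with an
illustrative size and path:

```python
import os
import tempfile

# Illustrative size and path -- adjust to taste.
size = 1024 * 1024 * 1024  # 1 GiB
path = os.path.join(tempfile.gettempdir(), "huge_test.dat")

with open(path, "wb") as f:
    f.truncate(size)  # sets the logical size without writing any data

# The file reports the full size, but on filesystems with sparse-file
# support it occupies almost no disk blocks.
assert os.path.getsize(path) == size
```

One caveat for the "garbage already on the disk" idea: when you read
back the unwritten hole, modern operating systems deliberately return
zeros rather than leftover disk contents, for security reasons.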

More information about the Python-list mailing list