I have a Python script that automatically downloads zip files containing large datasets from another server and then unzips them for further processing.
It runs as a geoprocessing service on ArcGIS Server.
The script works fine when the two datasets are each only a few kilobytes, but it stops halfway when the datasets are about 11,000 KB (roughly 11 MB).
I suspect the execution time exceeds some limit and ArcGIS Server simply kills the process (i.e. the service times out).
What actions can I try to reduce the execution time?
ArcGIS Server runs only as a 32-bit process, and I was told that the maximum memory it can utilise is 4 GBytes (the 32-bit address-space limit).
I would be grateful for any suggestions or recommendations.