String concatenation - which is the fastest way?

przemolicc at poczta.fm
Tue Aug 16 11:38:00 CEST 2011


On Fri, Aug 12, 2011 at 11:25:06AM +0200, Stefan Behnel wrote:
> przemolicc at poczta.fm, 11.08.2011 16:39:
>> On Thu, Aug 11, 2011 at 02:48:43PM +0100, Chris Angelico wrote:
>>> On Thu, Aug 11, 2011 at 2:46 PM,<przemolicc at poczta.fm>  wrote:
>>>> This is the way I am going to use.
>>>> But what is the best data type to hold so many rows and then operate on them ?
>>>>
>>>
>>> List of strings. Take it straight from your Oracle interface and work
>>> with it directly.
>>
>> Can I use this list in the following way ?
>> subprocess_1 - run on list between 1 and 10000
>> subprocess_2 - run on list between 10001 and 20000
>> subprocess_3 - run on list between 20001 and 30000
>> etc
>> ...
>
> Sure. Just read the data as it comes in from the database and fill up a  
> chunk, then hand that on to a process. You can also distribute it in  
> smaller packets, just check what size gives the best throughput.
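The chunking approach described above can be sketched as follows. This is a minimal, illustrative example, not code from the thread: `process_chunk`, the fake row data, and the chunk size of 10000 are all placeholders to be replaced with the real Oracle rows and per-row work.

```python
from multiprocessing import Pool

CHUNK_SIZE = 10000  # tune this; the best size depends on throughput measurements

def process_chunk(chunk):
    # Placeholder for the real per-row work on a slice of rows.
    return [row.upper() for row in chunk]

def chunks(rows, size):
    """Yield successive slices of `rows`, at most `size` items each."""
    for start in range(0, len(rows), size):
        yield rows[start:start + size]

if __name__ == "__main__":
    # Stand-in for the list of strings fetched from Oracle.
    rows = ["row-%d" % i for i in range(25000)]
    pool = Pool()  # one worker per CPU core by default
    # Each chunk is pickled and handed to a worker process.
    results = pool.map(process_chunk, chunks(rows, CHUNk_SIZE if False else CHUNK_SIZE))
    pool.close()
    pool.join()
    # results[0] covers rows 0..9999, results[1] covers 10000..19999, etc.
```

Distributing explicit slices like this mirrors the subprocess_1/subprocess_2/... split asked about earlier in the thread: each worker gets one contiguous range of rows and the ordering of `results` matches the ordering of the chunks.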

Since performance is critical, I wanted to use the multiprocessing module.
But once I have all the source rows in one list of strings, can I easily
share it between X processes?
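One common answer (an assumption on my part, not something stated in the thread) is that the list does not need to be shared at all: multiprocessing.Pool pickles slices of the input and ships them to the workers, and its `chunksize` argument controls how many rows travel per task. A minimal sketch, where `transform` and the generated rows are placeholders:

```python
from multiprocessing import Pool

def transform(row):
    # Placeholder per-row work; each worker operates on its own copy
    # of the rows it receives, so nothing is shared in memory.
    return row.strip().upper()

if __name__ == "__main__":
    rows = ["  line %d  " % i for i in range(30000)]  # stand-in for fetched rows
    pool = Pool(processes=3)  # "X" worker processes
    # chunksize=10000 sends the rows to workers in batches of 10000,
    # much like the subprocess_1 / subprocess_2 / subprocess_3 split above.
    results = pool.map(transform, rows, chunksize=10000)
    pool.close()
    pool.join()
```

The trade-off is copying cost: the rows are serialized once on the way to the workers, so for very large lists it can be cheaper to fetch and dispatch the data in batches as it arrives from the database rather than building the full list first.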


Regards
Przemyslaw Bak (przemol)

More information about the Python-list mailing list