make sublists of a list broken at nth certain list items

Joshua Landau joshua.landau.ws at gmail.com
Mon Jul 8 23:24:09 CEST 2013


On 8 July 2013 21:52, CM <cmpython at gmail.com> wrote:
> I'm looking for a Pythonic way to do the following:
>
> I have data in the form of a long list of tuples.  I would like to break that list into four sub-lists.  The break points would be based on the nth occasion of a particular tuple.  (The list represents behavioral data trials; the particular tuple represents the break between trials; I want to collect 20 trials at a time, so every 20th break between trials, start a new sublist).

I would do this like so:

from collections import deque
from itertools import chain

# Fast and hacky -- just how I like it
exhaust_iterable = deque(maxlen=0).extend

def chunk_of(data, *, length):
    count = 0
    for datum in data:
        # A first element of 1.0 marks the break between trials
        count += datum[0] == 1

        yield datum

        if count == length:
            break

def chunked(data, *, length=20):
    data = iter(data)
    while True:
        # Peek at the next datum so we stop cleanly once the data runs out
        try:
            first = next(data)
        except StopIteration:
            return
        chunk = chunk_of(chain([first], data), length=length)
        yield chunk
        exhaust_iterable(chunk)

You call "chunked(data)" and iterate over the chunks it yields. If you
advance to the next chunk before finishing the one you're on, the rest
of the previous chunk is discarded, so convert each chunk to a
permanent form (e.g. a list) first.
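For example, with a toy data set of three two-tuple "trials" (each ended by a boundary tuple whose first element is 1) and sublists of two trials each -- the definitions are repeated here so the snippet runs standalone:

```python
from collections import deque
from itertools import chain

exhaust_iterable = deque(maxlen=0).extend

def chunk_of(data, *, length):
    count = 0
    for datum in data:
        count += datum[0] == 1   # a first element of 1 marks a trial boundary
        yield datum
        if count == length:
            break

def chunked(data, *, length=20):
    data = iter(data)
    while True:
        try:
            first = next(data)   # peek so we stop when the data runs out
        except StopIteration:
            return
        chunk = chunk_of(chain([first], data), length=length)
        yield chunk
        exhaust_iterable(chunk)

# Three tiny "trials", each ended by a (1, ...) boundary tuple;
# collect them two trials at a time.
data = [(0, 'a'), (1, 'b1'), (0, 'c'), (1, 'b2'), (0, 'd'), (1, 'b3')]
sublists = [list(chunk) for chunk in chunked(data, length=2)]
print(sublists)
# [[(0, 'a'), (1, 'b1'), (0, 'c'), (1, 'b2')], [(0, 'd'), (1, 'b3')]]
```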

Looking at your code:

> for tup in data_list:
>     if tup[0] == 1.0: #Therefore the start of a new trial
>
>         #We have a match!  Therefore get the index in the data_list
>         data_list_index = data_list.index(tup)

This is no good (ninja'd by Fábio) -- list.index rescans the list from
the start each time and returns the first match, which is slow and gives
the wrong index when a tuple occurs more than once. The proper way to
keep an index is with enumerate:

for index, tup in enumerate(data_list):

>         trial_count += 1  #update the trial count.
>
>         if trial_count % 20 == 0:  #this will match on 0, 20, 40, 60, 80
>             trial_break_indexes_list.append(data_list_index)
>
> print 'This is trial_break_indexes_list: ', trial_break_indexes_list
...
> I sense there is a way more elegant/simpler/Pythonic way to approach this, let alone one that is actually correct, but I don't know of it.  Suggestions appreciated!

Yup.
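If you do want the index-collecting version working, here is a minimal sketch (toy data made up for illustration; variable names kept from your snippet, and the modulo test moved before the increment so the very first trial counts too):

```python
# Toy data: tuples whose first element is 1.0 mark the start of a trial
data_list = [(1.0, 'start'), (2.0, 'x'), (3.0, 'y'),
             (1.0, 'start'), (2.0, 'z')]

trial_break_indexes_list = []
trial_count = 0
for index, tup in enumerate(data_list):
    if tup[0] == 1.0:  # the start of a new trial
        # Test before incrementing, so this matches on trials 0, 20, 40, ...
        if trial_count % 20 == 0:
            trial_break_indexes_list.append(index)
        trial_count += 1

print(trial_break_indexes_list)   # [0] for this short toy list
```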
