[Tutor] design question -- nested loops considered harmful?

Brian van den Broek bvande at po-box.mcgill.ca
Tue Nov 30 05:49:23 CET 2004


Hi all,

thanks to Chad and Liam for the replies. Posts merged for reply:

Chad Crabtree said unto the world upon 2004-11-29 21:46:

<SNIP>

 > Perhaps the only refinement that I have picked up recently is internal
 > (nested) functions. Perhaps this would be more readable without
 > cluttering your namespace.
 >
 > def pars_file(list_of_lines):
 >     # item_flags is assumed to be defined at module level
 >     data_dict = {}   # bound before the nested def, since it is used
 >                      # as a default argument value below
 >     #### internal functions ####
 >     def check_flags(line, flags=item_flags, adict=data_dict):
 >         for item in flags:
 >             if line.startswith(item):
 >                 adict[item] = line[len(item):]   # keep the rest of the line
 >                 return
 >     #### end internal functions ####
 >     #### start main function suite ####
 >     for line in list_of_lines:
 >         check_flags(line)
 >     return data_dict
 >
 > Granted this adds a layer of indirection, but for more complex
 > examples I find this helpful when I need to look at this later,
 > because it hides some of the nesting.

I could see how nested defs would help with namespace issues -- they do 
seem one way to avoid modifying globals or passing around a bunch of 
parameters and multi-item returns. In the case I posted, I'm not sure
about the payoff, but I've been tempted to nest defs when dealing with 
more complicated logic. I've refrained, as I've heard tell that nested 
defs are widely thought to be "a bad thing". I don't know why, but I 
*do* know I've already lived to regret disregarding such bits of 
conventional wisdom in the past. So, I've decided to act as though I 
understood the point until either I do, or I come across a case where I 
*really* feel I need them.
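
For what it's worth, here's the sort of thing I have in mind (a toy 
sketch only -- the names and the record format are invented): a pair of 
nested defs that read and update names from the enclosing scope, so 
nothing needs to be a global and nothing needs to be passed back and 
forth.

def summarize(records):
    # totals and errors live in the enclosing scope; the nested defs
    # can read and update them without globals or extra parameters
    totals = {}
    errors = []

    def add(record):
        key, value = record
        totals[key] = totals.get(key, 0) + value

    def note_error(record):
        errors.append(record)

    for record in records:
        try:
            add(record)
        except (TypeError, ValueError):
            note_error(record)
    return totals, errors

Whether that really is clearer than passing totals and errors around 
explicitly is, I suppose, exactly the question.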

If anyone could either explain the conventional wisdom or set straight 
my belief that distaste for nested defs is indeed widespread and well 
founded, I'd be grateful.


Liam Clarke said unto the world upon 2004-11-29 22:29:

<SNIP>

>>Assuming that there would only be one occurrence of each one in a file -
>>
>>x = file("Brian's source file", 'r')
>>a = x.readlines()   # How big is it? If it's a huge file, this may not
>>                    # be the best approach
>>x.close()
>>a = "".join(a)      # Turns a list into a string; the "" is the joiner,
>>                    # i.e. a = ["Hello", "World"] would become "HelloWorld",
>>                    # whereas a = "?M?".join(a) would become "Hello?M?World".
>>                    # Just had a thought - a = str(a) would do exactly the
>>                    # same as a = "".join(a), wouldn't it?
>>item_flags = ["[Date]", "[Email]"]

<SNIP>

I perhaps should have made clear that the toy code I posted was aimed 
more at exhibiting flow control than at actual parsing functionality. It 
is, however, rather like code I've actually used. When I've used that 
structure, I've had enough information about the datafiles to make much 
of Liam's code unnecessary. (Thanks for posting it, though :-) The 
datafiles have been short, and either created by me or by a freeware app 
with a documented file format. So, I can disregard file size and count 
on the delimiter being at the front of the line; joining the lines into 
a single string would only make things harder. The approach Liam posted 
might well work better in the general case, where it isn't known that my 
constraints obtain.
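
In case it helps to see what I mean, here's roughly the shape of the 
thing (a sketch only -- the file name and flags are invented, and I'm 
assuming each flag occurs at most once, at the start of its line):

item_flags = ["[Date]", "[Email]"]

def parse_lines(lines, flags=item_flags):
    # the files are short and well behaved, so one pass over the
    # lines, matching each flag at the start of its line, is enough
    found = {}
    for line in lines:
        for flag in flags:
            if line.startswith(flag):
                found[flag] = line[len(flag):].strip()
                break
    return found

source = open('datafile.txt', 'r')
try:
    data = parse_lines(source.readlines())
finally:
    source.close()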

As for your embedded question, Liam: when in doubt, try it out ;-)

 >>> list_of_strings = ['a string\n', 'another one\n', 'yet one more\n']
 >>> a = ''.join(list_of_strings)
 >>> b = str(list_of_strings)
 >>> a == b
False
 >>> a
'a string\nanother one\nyet one more\n'
 >>> b
"['a string\\n', 'another one\\n', 'yet one more\\n']"

str(some_list) preserves the "listiness" of some_list: it gives you the 
list's printable representation, brackets, quotation marks, escaped 
newlines and all, rather than a concatenation of its elements.

Thanks again, fellows. Best to all,

Brian vdB

PS I forgot to mention in my earlier post today -- the list's input on 
datetime was indeed very helpful. I got it all sorted, and all I'm left 
with is a sense of puzzlement as to why I was leery of the datetime 
module :-)

