Looking for info on Python's memory allocation
steve at REMOVETHIScyber.com.au
Tue Oct 11 16:20:09 CEST 2005
On Tue, 11 Oct 2005 11:22:39 +0200, Lasse Vågsæther Karlsen wrote:
> This begs a different question along the same lines.
Er, no it doesn't. "Begs the question" does _not_ mean "asks the question"
or "suggests the question". It means "assumes the truth of that which
needs to be proven".
(Both of these sources are far more forgiving of the modern mis-usage than
I am. Obviously.)
> If I have a generator or other iterable producing a vast number of
> items, and use it like this:
> s = [k for k in iterable]
> if I know beforehand how many items iterable would possibly yield, would
> a construct like this be faster and "use" less memory?
> s = [None] * len(iterable)
> for i in xrange(len(iterable)):
>     s[i] = iterable.next()
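For what it's worth, here is a small runnable sketch of both constructs. It is an illustration, not the poster's actual code: a plain generator has no len(), so the sketch assumes the length `n` is known in advance, and it uses the modern next(gen) spelling of iterable.next().

```python
# Sketch of the two constructs from the post, assuming the
# number of items `n` is known ahead of time.
n = 1000
gen = (k * k for k in range(n))   # stand-in for "iterable"

# First construct: build the list directly from an iterator.
s1 = [k for k in (k * k for k in range(n))]

# Second construct: preallocate the list, then fill it slot by slot.
s2 = [None] * n
for i in range(n):
    s2[i] = next(gen)   # Python 3 spelling of iterable.next()

assert s1 == s2
```

Both end up holding references to the same N objects, which is why the memory argument below comes out the way it does.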
Faster? Maybe. Only testing can tell -- but I doubt it. But as for less
memory, look at the two situations.
In the first, you create a list of N objects.
In the second, you end up with the same list of N objects, plus an xrange
object. An xrange object stores only its start, stop and step, so it takes
a small, constant amount of memory no matter how large N is.
So it won't use *less* memory -- at best, it will use just slightly more.
Is there a way from within Python to find out how much memory a single
object uses, and how much memory Python is using in total?
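One partial answer that arrived after this thread (so it is offered as a later note, not something available in 2005): sys.getsizeof, added in Python 2.6, reports the size in bytes of a single object. It counts only the object itself, not the objects it refers to, so it is a lower bound rather than a total.

```python
import sys

# sys.getsizeof (Python 2.6+) gives the size in bytes of the object
# itself, excluding anything it merely references.
xs = list(range(1000))
r = range(1000)          # spelled xrange in Python 2

# The list's size grows with the number of elements it holds;
# the range object's size does not.
print(sys.getsizeof(xs))
print(sys.getsizeof(r))
```

There is still no single built-in call that reports Python's total memory use; that generally requires asking the operating system (or, today, modules like tracemalloc).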