Help with finding how much memory a variable is using

Rich Somerfield rich_somerfield at tertio.com
Mon Feb 26 10:09:46 CET 2001


I don't think compression would be the way to go, as it would cause massive
performance problems.

Technical Info:
The reason for the huge list is basically the AI tree of board
representations used for AI processing.  If the programme were specialised
to a particular game, I could implement a space-saving method of storing
the data.  However, as the code uses a config file to determine the rules
of the game (so the game is unknown until runtime), the AI engine has to be
completely generic - otherwise the code would be useless!

Look at this URL for more detailed info (and a download):
www.geocities.com/djterrier/SGEMain.htm

The format of the lists is:
thegamelist = [ [node, [board_representation]] , [.....] , [.....] ,
[.....] , ..... ]
thenodelist = [ [node, [children_nodes]] , [.....] , [.....] , [.....] ,
..... ]

If anybody can suggest a better method of storing the data, I would be
grateful [bear in mind that many functions need access to the data at each
node for processing / checking].  Also bear in mind that, due to the
interpreted nature of Python and the performance-demanding nature of the
application, any suggested implementation must not adversely affect
performance.
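One direction (just a sketch, not tested against the real engine - all
names below are hypothetical) would be to key the nodes by id in a single
dictionary and store the board and children as tuples: lookups become O(1)
instead of scanning a list for the matching node, and tuples carry slightly
less per-object overhead than lists:

```python
# Sketch: one dict mapping node id -> (board, children), replacing the two
# parallel lists.  Tuples are used because the tree data is read-mostly.
nodes = {
    0: ((".", ".", "."), (1, 2)),   # node -> (board_representation, children)
    1: (("x", ".", "."), ()),
    2: ((".", "o", "."), ()),
}

def board_of(node):
    """Return the board representation stored for a node id."""
    return nodes[node][0]

def children_of(node):
    """Return the tuple of child node ids for a node id."""
    return nodes[node][1]
```

Functions that need per-node access would then call board_of(n) /
children_of(n) rather than searching thegamelist and thenodelist.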

Regards
Rich.



Steve Purcell <stephen_purcell at yahoo.com> wrote in message
news:mailman.982950492.15473.python-list at python.org...
> Rich Somerfield wrote:
> > I am generating a huge, huge list of data (can't think of a decent way
> > to reduce the required storage size and have it in a usable form).
> > Every time I try to keep this list for future derivations I get a
> > memory problem from Windows (effectively terminating my Python script).
> >
> > I presume this is because of the huge list.  Is it possible to find out
> > the amount of memory a variable [not the type of the variable, the
> > actual data contained within the variable] is taking up?
>
> A huge, huge in-memory list is very likely to be the cause of such a
> memory problem! <wink>
>
> Compression isn't going to help you much if the list is so big that even
> a tenth of its size would still be too big.
>
> Post some more technical information and I'm sure you'll get more helpful
> suggestions/alternatives than you could have wished for*.
>
> -Steve
>
> * or wanted
>
> --
> Steve Purcell, Pythangelist
> Get testing at http://pyunit.sourceforge.net/
> Get servlets at http://pyserv.sourceforge.net/
> "Even snakes are afraid of snakes." -- Steven Wright
>
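As for actually measuring how much memory a variable takes: later Pythons
(2.6 and up, so after this thread) grew sys.getsizeof(), but it only
reports an object's own shallow size, so for nested lists you need a
recursive walk.  A rough sketch:

```python
import sys

def deep_size(obj, seen=None):
    """Rough recursive size estimate for nested lists/tuples/dicts.

    sys.getsizeof() (Python 2.6+) reports only the container's own
    overhead, so we walk the contained objects and sum their sizes,
    tracking ids to avoid double-counting shared objects.
    """
    if seen is None:
        seen = set()
    if id(obj) in seen:
        return 0
    seen.add(id(obj))
    size = sys.getsizeof(obj)
    if isinstance(obj, dict):
        for key, value in obj.items():
            size += deep_size(key, seen) + deep_size(value, seen)
    elif isinstance(obj, (list, tuple, set, frozenset)):
        for item in obj:
            size += deep_size(item, seen)
    return size

# Example: estimate the footprint of a small node list.
print(deep_size([[0, [1, 2]], [1, []]]))
```

Note the figure is approximate: it ignores allocator overhead and any
memory shared with interned objects.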

