[BangPypers] BangPypers Digest, Vol 41, Issue 27
Vikram Kamath
vikram.kmth at gmail.com
Sun Jan 16 23:04:01 CET 2011
Well, you could always try a different method of multiplication. A common
trick would be to use block multiplication.
http://en.wikipedia.org/wiki/Block_matrix
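A minimal sketch of what block multiplication looks like with plain numpy (the function name and block size `bs` are illustrative, not from any library). By itself this doesn't reduce the memory needed for the result, but because each step only touches a few `bs x bs` tiles, the same loop works when the operands live on disk (e.g. as memmaps) instead of in RAM:

```python
import numpy as np

def block_matmul(A, B, bs=256):
    """Multiply A (m x k) by B (k x n) tile by tile, so only a few
    bs x bs blocks of the operands need to be touched at a time."""
    m, k = A.shape
    k2, n = B.shape
    assert k == k2, "inner dimensions must match"
    C = np.zeros((m, n), dtype=np.result_type(A, B))
    for i in range(0, m, bs):
        for j in range(0, n, bs):
            for p in range(0, k, bs):
                # numpy slicing clamps at the array edge, so ragged
                # final blocks are handled automatically
                C[i:i+bs, j:j+bs] += np.dot(A[i:i+bs, p:p+bs],
                                            B[p:p+bs, j:j+bs])
    return C

A = np.random.rand(6, 5)
B = np.random.rand(5, 4)
assert np.allclose(block_matmul(A, B, bs=2), np.dot(A, B))
```

The accumulation over `p` is just the block-matrix identity from the Wikipedia page above: each output tile is a sum of products of operand tiles.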
On Sun, Jan 16, 2011 at 4:30 PM, <bangpypers-request at python.org> wrote:
> Send BangPypers mailing list submissions to
> bangpypers at python.org
>
> To subscribe or unsubscribe via the World Wide Web, visit
> http://mail.python.org/mailman/listinfo/bangpypers
> or, via email, send a message with subject or body 'help' to
> bangpypers-request at python.org
>
> You can reach the person managing the list at
> bangpypers-owner at python.org
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of BangPypers digest..."
>
>
> Today's Topics:
>
> 1. Multiplying very large matrices (kunal ghosh)
> 2. Re: Multiplying very large matrices (Santosh Rajan)
> 3. Re: Multiplying very large matrices (kunal ghosh)
> 4. Re: Multiplying very large matrices (kunal ghosh)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Sat, 15 Jan 2011 22:11:21 +0530
> From: kunal ghosh <kunal.t2 at gmail.com>
> To: Bangalore Python Users Group - India <bangpypers at python.org>
> Subject: [BangPypers] Multiplying very large matrices
> Message-ID:
> <AANLkTin0TWKncS7WaBq581NmLA2QO+hotjbbDS94+MKy at mail.gmail.com>
> Content-Type: text/plain; charset=ISO-8859-1
>
> Hi all,
> While implementing Locality Preserving Projections, at one point I have
> to compute X L X.transpose(). These matrices are large (32256 x 32256),
> so I get an "out of memory" error.
>
> I assume one would run into this problem as the dataset grows. How would
> one go about solving it? Is there a common trick for dealing with such
> problems, or does the workstation doing these calculations need HUGE
> amounts of physical memory?
>
> I am using Python with numpy / scipy.
>
> --
> regards
> -------
> Kunal Ghosh
> Dept of Computer Sc. & Engineering.
> Sir MVIT
> Bangalore,India
>
> permalink: member.acm.org/~kunal.t2
> Blog: kunalghosh.wordpress.com
> Website: www.kunalghosh.net46.net
>
>
> ------------------------------
>
> Message: 2
> Date: Sat, 15 Jan 2011 23:12:01 +0530
> From: Santosh Rajan <santrajan at gmail.com>
> To: Bangalore Python Users Group - India <bangpypers at python.org>
> Subject: Re: [BangPypers] Multiplying very large matrices
> Message-ID:
> <AANLkTikQ-yWRJJFDKGfW_t8_8-r2O0WXY9DmUXbFZEjk at mail.gmail.com>
> Content-Type: text/plain; charset=ISO-8859-1
>
> Hope this helps
> http://stackoverflow.com/questions/1053928/python-numpy-very-large-matrices
>
> On Sat, Jan 15, 2011 at 10:11 PM, kunal ghosh <kunal.t2 at gmail.com> wrote:
> > Hi all,
> > while implementing Locality Preserving Projections, at one point i have
> > to perform X L X.transpose(). these matrices are large (32256 x 32256)
> > so i get "out of memory" error. [...]
>
>
>
> --
> http://about.me/santosh.rajan
>
>
> ------------------------------
>
> Message: 3
> Date: Sat, 15 Jan 2011 23:34:41 +0530
> From: kunal ghosh <kunal.t2 at gmail.com>
> To: Bangalore Python Users Group - India <bangpypers at python.org>
> Subject: Re: [BangPypers] Multiplying very large matrices
> Message-ID:
> <AANLkTinVmsKaWNx=rxYABx5UoSOtsvBb3eWfPN5o4WYx at mail.gmail.com>
> Content-Type: text/plain; charset=ISO-8859-1
>
> Thanks Santosh,
>
> That Stack Overflow thread indeed discusses the exact problem I have.
> I wonder how I missed it :) in my preliminary searches.
>
> Thanks again!
>
> On Sat, Jan 15, 2011 at 11:12 PM, Santosh Rajan <santrajan at gmail.com>
> wrote:
>
> > Hope this helps
> > http://stackoverflow.com/questions/1053928/python-numpy-very-large-matrices
> > [...]
>
>
>
> --
> regards
> -------
> Kunal Ghosh
> Dept of Computer Sc. & Engineering.
> Sir MVIT
> Bangalore,India
>
> permalink: member.acm.org/~kunal.t2
> Blog: kunalghosh.wordpress.com
> Website: www.kunalghosh.net46.net
>
>
> ------------------------------
>
> Message: 4
> Date: Sun, 16 Jan 2011 12:32:01 +0530
> From: kunal ghosh <kunal.t2 at gmail.com>
> To: Bangalore Python Users Group - India <bangpypers at python.org>
> Subject: Re: [BangPypers] Multiplying very large matrices
> Message-ID:
> <AANLkTim5YiJ63oRHNtJMHNi4XAug+bx2mfX2Redva0iU at mail.gmail.com>
> Content-Type: text/plain; charset=ISO-8859-1
>
> Hi all,
> I found numpy.memmap to be very suitable when matrices larger than
> physical memory are required:
>
> 1. it is included in the standard numpy installation
> 2. it has a very low learning curve
>
> PyTables seems more suitable, but I somehow found its learning curve too
> steep. PyTables also needs a lot of initialization before anything can
> be done with it, compared to memmap.
>
> For these reasons I chose memmap over PyTables.
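A rough sketch of the memmap approach for X L X.transpose() (sizes, filenames, and the helper are illustrative, scaled down from the real 32256 x 32256 case). The arrays are backed by files, so the OS pages slices in and out on demand, and the product is built one block-row at a time; for a truly enormous L you would tile the inner dimension too, as in block multiplication:

```python
import os
import tempfile
import numpy as np

n, bs = 512, 128                     # illustrative; real case is 32256
tmpdir = tempfile.mkdtemp()

def disk_array(name):
    # float64 (n x n) array backed by a file rather than RAM
    return np.memmap(os.path.join(tmpdir, name), dtype='float64',
                     mode='w+', shape=(n, n))

X = disk_array('X.dat')
L = disk_array('L.dat')
M = disk_array('M.dat')              # intermediate M = X L
out = disk_array('out.dat')          # result out = X L X.T

X[:] = np.random.rand(n, n)          # fill with test data
L[:] = np.random.rand(n, n)

# Compute out = X L X.T one block-row at a time, so only
# O(bs * n) elements of each product are materialized per step.
for i in range(0, n, bs):
    M[i:i+bs, :] = np.dot(X[i:i+bs, :], L)
for i in range(0, n, bs):
    out[i:i+bs, :] = np.dot(M[i:i+bs, :], X.T)
out.flush()                          # push the result to disk
```

Reopening the result later is just `np.memmap(path, dtype='float64', mode='r', shape=(n, n))`; nothing is loaded until you slice it.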
>
> On Sat, Jan 15, 2011 at 11:34 PM, kunal ghosh <kunal.t2 at gmail.com> wrote:
>
> > Thanks Santosh,
> > This stack overflow thread indeed discusses the exact same problem i
> > have. [...]
>
>
> --
> regards
> -------
> Kunal Ghosh
> Dept of Computer Sc. & Engineering.
> Sir MVIT
> Bangalore,India
>
> permalink: member.acm.org/~kunal.t2
> Blog: kunalghosh.wordpress.com
> Website: www.kunalghosh.net46.net
>
>
> ------------------------------
>
> _______________________________________________
> BangPypers mailing list
> BangPypers at python.org
> http://mail.python.org/mailman/listinfo/bangpypers
>
>
> End of BangPypers Digest, Vol 41, Issue 27
> ******************************************
>