<div><div dir="auto">Thank you all for your feedback. </div><div dir="auto">The initial problem I raised wasn't about the definition of PCA but about what the sklearn method actually does. In practice I would always make sure the data is both centered and scaled before performing PCA; without scaling, the direction with the largest raw variance can wrongly appear to explain a huge fraction of the total variance. </div><div dir="auto">So my point was simply to clarify, in the help file and the user guide, precisely what the PCA class does, so as to leave no ambiguity for the reader. Moving forward, I have now submitted a pull request on GitHub, as initially suggested by Roman in this thread. </div><div dir="auto">Best,</div><div dir="auto">Ismael</div><br><div class="gmail_quote"><div>On Mon, 16 Oct 2017 at 11:49 AM, <<a href="mailto:scikit-learn-request@python.org">scikit-learn-request@python.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">Send scikit-learn mailing list submissions to<br>
<a href="mailto:scikit-learn@python.org" target="_blank">scikit-learn@python.org</a><br>
<br>
To subscribe or unsubscribe via the World Wide Web, visit<br>
<a href="https://mail.python.org/mailman/listinfo/scikit-learn" rel="noreferrer" target="_blank">https://mail.python.org/mailman/listinfo/scikit-learn</a><br>
or, via email, send a message with subject or body 'help' to<br>
<a href="mailto:scikit-learn-request@python.org" target="_blank">scikit-learn-request@python.org</a><br>
<br>
You can reach the person managing the list at<br>
<a href="mailto:scikit-learn-owner@python.org" target="_blank">scikit-learn-owner@python.org</a><br>
<br>
When replying, please edit your Subject line so it is more specific<br>
than "Re: Contents of scikit-learn digest..."<br>
<br>
<br>
Today's Topics:<br>
<br>
1. Re: 1. Re: unclear help file for sklearn.decomposition.pca<br>
(Andreas Mueller)<br>
2. Re: 1. Re: unclear help file for sklearn.decomposition.pca<br>
(Oliver Tomic)<br>
<br>
<br>
----------------------------------------------------------------------<br>
<br>
Message: 1<br>
Date: Mon, 16 Oct 2017 14:44:51 -0400<br>
From: Andreas Mueller <<a href="mailto:t3kcit@gmail.com" target="_blank">t3kcit@gmail.com</a>><br>
To: <a href="mailto:scikit-learn@python.org" target="_blank">scikit-learn@python.org</a><br>
Subject: Re: [scikit-learn] 1. Re: unclear help file for<br>
sklearn.decomposition.pca<br>
Message-ID: <<a href="mailto:35142868-fce9-6cb3-eba3-015a0b106163@gmail.com" target="_blank">35142868-fce9-6cb3-eba3-015a0b106163@gmail.com</a>><br>
Content-Type: text/plain; charset="utf-8"; Format="flowed"<br>
<br>
<br>
<br>
On 10/16/2017 02:27 PM, Ismael Lemhadri wrote:<br>
> @Andreas Mueller:<br>
> My references do not assume centering, e.g.<br>
> <a href="http://ufldl.stanford.edu/wiki/index.php/PCA" rel="noreferrer" target="_blank">http://ufldl.stanford.edu/wiki/index.php/PCA</a><br>
> any reference?<br>
><br>
It kinda does but is not very clear about it:<br>
<br>
This data has already been pre-processed so that each of the<br>
features x_1 and x_2 have about the same mean (zero)<br>
and variance.<br>
<br>
<br>
<br>
Wikipedia is much clearer:<br>
Consider a data matrix<br>
<<a href="https://en.wikipedia.org/wiki/Matrix_%28mathematics%29" rel="noreferrer" target="_blank">https://en.wikipedia.org/wiki/Matrix_%28mathematics%29</a>>, *X*, with<br>
column-wise zero empirical mean<br>
<<a href="https://en.wikipedia.org/wiki/Empirical_mean" rel="noreferrer" target="_blank">https://en.wikipedia.org/wiki/Empirical_mean</a>> (the sample mean of each<br>
column has been shifted to zero), where each of the *n* rows represents a<br>
different repetition of the experiment, and each of the *p* columns gives<br>
a particular kind of feature (say, the results from a particular sensor).<br>
<a href="https://en.wikipedia.org/wiki/Principal_component_analysis#Details" rel="noreferrer" target="_blank">https://en.wikipedia.org/wiki/Principal_component_analysis#Details</a><br>
<br>
I'm a bit surprised to find that ESL says "The SVD of the centered<br>
matrix X is another way of expressing the principal components of the<br>
variables in X",<br>
so they assume scaling? They don't really have a great treatment of PCA,<br>
though.<br>
<br>
Bishop <<a href="http://www.springer.com/us/book/9780387310732" rel="noreferrer" target="_blank">http://www.springer.com/us/book/9780387310732</a>> and Murphy<br>
<<a href="https://mitpress.mit.edu/books/machine-learning-0" rel="noreferrer" target="_blank">https://mitpress.mit.edu/books/machine-learning-0</a>> are pretty clear<br>
that they subtract the mean (or assume zero mean) but don't standardize.<br>
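To make the centering-versus-scaling distinction concrete, here is a minimal sketch (the data and its column scales are made up for illustration, not taken from the thread) contrasting plain PCA, which only centers, with PCA on standardized data:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.RandomState(0)
# Four independent features with very different scales (made-up data).
X = rng.rand(200, 4) * np.array([1.0, 10.0, 100.0, 1000.0])

# PCA centers the data but does not scale it, so the
# largest-scale column dominates the first component.
pca_raw = PCA(n_components=2).fit(X)

# Standardizing first (zero mean, unit variance) gives a
# correlation-matrix PCA where the features contribute equally.
pca_std = make_pipeline(StandardScaler(), PCA(n_components=2)).fit(X)

print(pca_raw.explained_variance_ratio_)      # first ratio is close to 1
print(pca_std[-1].explained_variance_ratio_)  # ratios are roughly equal
```

Whether to scale is a modeling choice; the point under discussion is only that the documentation should state that the class centers and does not scale.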
<br>
------------------------------<br>
<br>
Message: 2<br>
Date: Mon, 16 Oct 2017 20:48:29 +0200<br>
From: Oliver Tomic <<a href="mailto:olivertomic@zoho.com" target="_blank">olivertomic@zoho.com</a>><br>
To: "Scikit-learn mailing list" <<a href="mailto:scikit-learn@python.org" target="_blank">scikit-learn@python.org</a>><br>
Cc: <<a href="mailto:scikit-learn@python.org" target="_blank">scikit-learn@python.org</a>><br>
Subject: Re: [scikit-learn] 1. Re: unclear help file for<br>
sklearn.decomposition.pca<br>
Message-ID: <<a href="mailto:15f26840d65.e97b33c25239.3934951873824890747@zoho.com" target="_blank">15f26840d65.e97b33c25239.3934951873824890747@zoho.com</a>><br>
Content-Type: text/plain; charset="utf-8"<br>
<br>
Dear Ismael,<br>
<br>
<br>
<br>
PCA should always involve at least centering and, if the variables are to contribute equally, scaling as well. Here is a reference from the scientific field known as "chemometrics". In chemometrics, PCA is used not only for dimensionality reduction, but also for interpretation of variance by use of scores, loadings, correlation loadings, etc.<br>
<br>
<br>
<br>
If you scroll down to subsection "Preprocessing" you will find more info on centering and scaling.<br>
<br>
<br>
<a href="http://pubs.rsc.org/en/content/articlehtml/2014/ay/c3ay41907j" rel="noreferrer" target="_blank">http://pubs.rsc.org/en/content/articlehtml/2014/ay/c3ay41907j</a><br>
<br>
<br>
<br>
best<br>
<br>
Oliver<br>
<br>
<br>
<br>
<br>
---- On Mon, 16 Oct 2017 20:27:11 +0200 Ismael Lemhadri <<a href="mailto:lemhadri@stanford.edu" target="_blank">lemhadri@stanford.edu</a>> wrote ----<br>
<br>
<br>
<br>
<br>
@Andreas Mueller:<br>
<br>
My references do not assume centering, e.g. <a href="http://ufldl.stanford.edu/wiki/index.php/PCA" rel="noreferrer" target="_blank">http://ufldl.stanford.edu/wiki/index.php/PCA</a><br>
<br>
any reference?<br>
<br>
<br>
<br>
<br>
<br>
<br>
<br>
On Mon, Oct 16, 2017 at 10:20 AM, <<a href="mailto:scikit-learn-request@python.org" target="_blank">scikit-learn-request@python.org</a>> wrote:<br>
<br>
<br>
<br>
<br>
<br>
<br>
Today's Topics:<br>
<br>
<br>
<br>
1. Re: unclear help file for sklearn.decomposition.pca<br>
<br>
(Andreas Mueller)<br>
<br>
<br>
<br>
<br>
<br>
----------------------------------------------------------------------<br>
<br>
<br>
<br>
Message: 1<br>
<br>
Date: Mon, 16 Oct 2017 13:19:57 -0400<br>
<br>
From: Andreas Mueller <<a href="mailto:t3kcit@gmail.com" target="_blank">t3kcit@gmail.com</a>><br>
<br>
To: <a href="mailto:scikit-learn@python.org" target="_blank">scikit-learn@python.org</a><br>
<br>
Subject: Re: [scikit-learn] unclear help file for<br>
<br>
sklearn.decomposition.pca<br>
<br>
Message-ID: <<a href="mailto:04fc445c-d8f3-a3a9-4ab2-0535826a2d03@gmail.com" target="_blank">04fc445c-d8f3-a3a9-4ab2-0535826a2d03@gmail.com</a>><br>
<br>
Content-Type: text/plain; charset="utf-8"; Format="flowed"<br>
<br>
<br>
<br>
The definition of PCA has a centering step, but no scaling step.<br>
<br>
<br>
<br>
On 10/16/2017 11:16 AM, Ismael Lemhadri wrote:<br>
<br>
> Dear Roman,<br>
<br>
> My concern is actually not about not mentioning the scaling but about<br>
<br>
> not mentioning the centering.<br>
<br>
> That is, the sklearn PCA removes the mean but it does not mention it<br>
<br>
> in the help file.<br>
<br>
> This was quite messy for me to debug, as I expected it to either<br>
<br>
> center and scale simultaneously, or neither center nor scale.<br>
<br>
> It would be beneficial to make the behavior explicit in the help file, in my<br>
<br>
> opinion.<br>
<br>
> Ismael<br>
<br>
><br>
<br>
> On Mon, Oct 16, 2017 at 8:02 AM, <<a href="mailto:scikit-learn-request@python.org" target="_blank">scikit-learn-request@python.org</a>> wrote:<br>
<br>
><br>
<br>
<br>
><br>
<br>
><br>
<br>
> Today's Topics:<br>
<br>
><br>
<br>
>    1. unclear help file for sklearn.decomposition.pca (Ismael Lemhadri)<br>
>    2. Re: unclear help file for sklearn.decomposition.pca (Roman Yurchak)<br>
>    3. Question about LDA's coef_ attribute (Serafeim Loukas)<br>
>    4. Re: Question about LDA's coef_ attribute (Alexandre Gramfort)<br>
>    5. Re: Question about LDA's coef_ attribute (Serafeim Loukas)<br>
<br>
><br>
<br>
><br>
<br>
> ----------------------------------------------------------------------<br>
<br>
><br>
<br>
> Message: 1<br>
<br>
> Date: Sun, 15 Oct 2017 18:42:56 -0700<br>
<br>
> From: Ismael Lemhadri <<a href="mailto:lemhadri@stanford.edu" target="_blank">lemhadri@stanford.edu</a>><br>
<br>
> To: <a href="mailto:scikit-learn@python.org" target="_blank">scikit-learn@python.org</a> <mailto:<a href="mailto:scikit-learn@python.org" target="_blank">scikit-learn@python.org</a>><br>
<br>
> Subject: [scikit-learn] unclear help file for<br>
<br>
> ? ? ? ? sklearn.decomposition.pca<br>
<br>
> Message-ID:<br>
<br>
> <CANpSPFTgv+Oz7f97dandmrBBayqf_o9w=<a href="mailto:18oKHCFN0u5DNzj%2Bg@mail.gmail.com" target="_blank">18oKHCFN0u5DNzj+g@mail.gmail.com</a>><br>
<br>
> Content-Type: text/plain; charset="utf-8"<br>
<br>
><br>
<br>
> Dear all,<br>
<br>
> The help file for the PCA class is unclear about the preprocessing<br>
<br>
> performed on the data.<br>
<br>
> You can check on line 410 here:<br>
<br>
> <a href="https://github.com/scikit-learn/scikit-learn/blob/ef5cb84a/sklearn/decomposition/pca.py#L410" rel="noreferrer" target="_blank">https://github.com/scikit-learn/scikit-learn/blob/ef5cb84a/sklearn/decomposition/pca.py#L410</a><br>
<br>
> that the matrix is centered but NOT scaled, before performing the<br>
<br>
> singular<br>
<br>
> value decomposition.<br>
<br>
> However, the help files do not make any mention of it.<br>
<br>
> This is unclear for someone who, like me, just wanted to check<br>
<br>
> that<br>
<br>
> PCA and np.linalg.svd give the same results. In academic settings,<br>
<br>
> students<br>
<br>
> are often asked to compare different methods and to check that<br>
<br>
> they yield<br>
<br>
> the same results. I expect that many students have confronted this<br>
<br>
> problem<br>
<br>
> before...<br>
<br>
> Best,<br>
<br>
> Ismael Lemhadri<br>
<br>
<br>
><br>
<br>
> ------------------------------<br>
<br>
><br>
<br>
> Message: 2<br>
<br>
> Date: Mon, 16 Oct 2017 15:16:45 +0200<br>
<br>
> From: Roman Yurchak <<a href="mailto:rth.yurchak@gmail.com" target="_blank">rth.yurchak@gmail.com</a>><br>
<br>
> To: Scikit-learn mailing list <<a href="mailto:scikit-learn@python.org" target="_blank">scikit-learn@python.org</a>><br>
<br>
> Subject: Re: [scikit-learn] unclear help file for<br>
<br>
> ? ? ? ? sklearn.decomposition.pca<br>
<br>
> Message-ID: <<a href="mailto:b2abdcfd-4736-929e-6304-b93832932043@gmail.com" target="_blank">b2abdcfd-4736-929e-6304-b93832932043@gmail.com</a>><br>
<br>
> Content-Type: text/plain; charset=utf-8; format=flowed<br>
<br>
><br>
<br>
> Ismael,<br>
<br>
><br>
<br>
> as far as I saw the sklearn.decomposition.PCA doesn't mention<br>
<br>
> scaling at<br>
<br>
> all (except for the whiten parameter which is post-transformation<br>
<br>
> scaling).<br>
<br>
><br>
<br>
> So since it doesn't mention it, it makes sense that it doesn't do any<br>
<br>
> scaling of the input. Same as np.linalg.svd.<br>
<br>
><br>
<br>
> You can verify that PCA and np.linalg.svd yield the same results, with<br>
<br>
><br>
<br>
> ```<br>
> >>> import numpy as np<br>
> >>> from sklearn.decomposition import PCA<br>
> >>> X = np.random.RandomState(42).rand(10, 4)<br>
> >>> n_components = 2<br>
> >>> PCA(n_components, svd_solver='full').fit_transform(X)<br>
> ```<br>
<br>
> and<br>
<br>
> ```<br>
> >>> U, s, V = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)<br>
> >>> (X - X.mean(axis=0)).dot(V[:n_components].T)<br>
> ```<br>
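Putting the two snippets above together into one self-contained check (a sketch; the sign of each singular vector is arbitrary, so the comparison uses absolute values):

```python
import numpy as np
from sklearn.decomposition import PCA

X = np.random.RandomState(42).rand(10, 4)
n_components = 2

# sklearn's PCA with the full SVD solver.
T_pca = PCA(n_components, svd_solver='full').fit_transform(X)

# The same projection by hand: center (but do not scale), then SVD.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
T_svd = Xc.dot(Vt[:n_components].T)

# Components may differ by a per-column sign flip, so compare magnitudes.
print(np.allclose(np.abs(T_pca), np.abs(T_svd)))
```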
<br>
><br>
<br>
> --<br>
<br>
> Roman<br>
<br>
><br>
<br>
<br>
><br>
<br>
><br>
<br>
><br>
<br>
> ------------------------------<br>
<br>
><br>
<br>
> Message: 3<br>
<br>
> Date: Mon, 16 Oct 2017 15:27:48 +0200<br>
<br>
> From: Serafeim Loukas <<a href="mailto:seralouk@gmail.com" target="_blank">seralouk@gmail.com</a>><br>
<br>
> To: <a href="mailto:scikit-learn@python.org" target="_blank">scikit-learn@python.org</a><br>
<br>
> Subject: [scikit-learn] Question about LDA's coef_ attribute<br>
<br>
> Message-ID: <<a href="mailto:58C6D0DA-9DE5-4EF5-97C1-48159831F5A9@gmail.com" target="_blank">58C6D0DA-9DE5-4EF5-97C1-48159831F5A9@gmail.com</a>><br>
<br>
> Content-Type: text/plain; charset="us-ascii"<br>
<br>
><br>
<br>
> Dear Scikit-learn community,<br>
<br>
><br>
<br>
> Since the documentation of the LDA<br>
<br>
> (<a href="http://scikit-learn.org/stable/modules/generated/sklearn.discriminant_analysis.LinearDiscriminantAnalysis.html" rel="noreferrer" target="_blank">http://scikit-learn.org/stable/modules/generated/sklearn.discriminant_analysis.LinearDiscriminantAnalysis.html</a>)<br>
<br>
> is not so clear, I would like to ask if the lda.coef_ attribute<br>
<br>
> stores the eigenvectors from the SVD decomposition.<br>
<br>
><br>
<br>
> Thank you in advance,<br>
<br>
> Serafeim<br>
<br>
<br>
><br>
<br>
> ------------------------------<br>
<br>
><br>
<br>
> Message: 4<br>
<br>
> Date: Mon, 16 Oct 2017 16:57:52 +0200<br>
<br>
> From: Alexandre Gramfort <<a href="mailto:alexandre.gramfort@inria.fr" target="_blank">alexandre.gramfort@inria.fr</a>><br>
<br>
> To: Scikit-learn mailing list <<a href="mailto:scikit-learn@python.org" target="_blank">scikit-learn@python.org</a>><br>
<br>
> Subject: Re: [scikit-learn] Question about LDA's coef_ attribute<br>
<br>
> Message-ID: <<a href="mailto:CADeotZricOQhuHJMmW2Z14cqffEQyndYoxn-OgKAvTMQ7V0Y2g@mail.gmail.com" target="_blank">CADeotZricOQhuHJMmW2Z14cqffEQyndYoxn-OgKAvTMQ7V0Y2g@mail.gmail.com</a>><br>
<br>
> Content-Type: text/plain; charset="UTF-8"<br>
<br>
><br>
<br>
> No, it stores the direction of the decision function, to match the API of<br>
<br>
> linear models.<br>
<br>
><br>
<br>
> HTH<br>
<br>
> Alex<br>
<br>
><br>
<br>
> On Mon, Oct 16, 2017 at 3:27 PM, Serafeim Loukas<br>
<br>
> <<a href="mailto:seralouk@gmail.com" target="_blank">seralouk@gmail.com</a>> wrote:<br>
<br>
> > Dear Scikit-learn community,<br>
<br>
> ><br>
<br>
> > Since the documentation of the LDA<br>
<br>
> ><br>
<br>
> (<a href="http://scikit-learn.org/stable/modules/generated/sklearn.discriminant_analysis.LinearDiscriminantAnalysis.html" rel="noreferrer" target="_blank">http://scikit-learn.org/stable/modules/generated/sklearn.discriminant_analysis.LinearDiscriminantAnalysis.html</a>)<br>
<br>
> > is not so clear, I would like to ask if the lda.coef_ attribute<br>
<br>
> stores the<br>
<br>
> > eigenvectors from the SVD decomposition.<br>
<br>
> ><br>
<br>
> > Thank you in advance,<br>
<br>
> > Serafeim<br>
<br>
> ><br>
<br>
<br>
> ><br>
<br>
><br>
<br>
><br>
<br>
> ------------------------------<br>
<br>
><br>
<br>
> Message: 5<br>
<br>
> Date: Mon, 16 Oct 2017 17:02:46 +0200<br>
<br>
> From: Serafeim Loukas <<a href="mailto:seralouk@gmail.com" target="_blank">seralouk@gmail.com</a>><br>
<br>
> To: Scikit-learn mailing list <<a href="mailto:scikit-learn@python.org" target="_blank">scikit-learn@python.org</a>><br>
<br>
> Subject: Re: [scikit-learn] Question about LDA's coef_ attribute<br>
<br>
> Message-ID: <<a href="mailto:413210D2-56AE-41A4-873F-D171BB36539D@gmail.com" target="_blank">413210D2-56AE-41A4-873F-D171BB36539D@gmail.com</a>><br>
<br>
> Content-Type: text/plain; charset="us-ascii"<br>
<br>
><br>
<br>
> Dear Alex,<br>
<br>
><br>
<br>
> Thank you for the prompt response.<br>
<br>
><br>
<br>
> Are the eigenvectors stored in some variable ?<br>
<br>
> Does the lda.scalings_ attribute contain the eigenvectors ?<br>
<br>
><br>
<br>
> Best,<br>
<br>
> Serafeim<br>
<br>
><br>
<br>
> > On 16 Oct 2017, at 16:57, Alexandre Gramfort<br>
<br>
> <<a href="mailto:alexandre.gramfort@inria.fr" target="_blank">alexandre.gramfort@inria.fr</a>><br>
<br>
> wrote:<br>
<br>
> ><br>
<br>
> > No, it stores the direction of the decision function, to match the API of<br>
<br>
> > linear models.<br>
<br>
> ><br>
<br>
> > HTH<br>
<br>
> > Alex<br>
<br>
> ><br>
<br>
> > On Mon, Oct 16, 2017 at 3:27 PM, Serafeim Loukas<br>
<br>
> <<a href="mailto:seralouk@gmail.com" target="_blank">seralouk@gmail.com</a>> wrote:<br>
<br>
> >> Dear Scikit-learn community,<br>
<br>
> >><br>
<br>
> >> Since the documentation of the LDA<br>
<br>
> >><br>
<br>
> (<a href="http://scikit-learn.org/stable/modules/generated/sklearn.discriminant_analysis.LinearDiscriminantAnalysis.html" rel="noreferrer" target="_blank">http://scikit-learn.org/stable/modules/generated/sklearn.discriminant_analysis.LinearDiscriminantAnalysis.html</a>)<br>
<br>
> >> is not so clear, I would like to ask if the lda.coef_ attribute<br>
<br>
> stores the<br>
<br>
> >> eigenvectors from the SVD decomposition.<br>
<br>
> >><br>
<br>
> >> Thank you in advance,<br>
<br>
> >> Serafeim<br>
<br>
> >><br>
<br>
<br>
><br>
<br>
<br>
><br>
<br>
> ------------------------------<br>
<br>
><br>
<br>
<br>
> ------------------------------<br>
<br>
><br>
<br>
> End of scikit-learn Digest, Vol 19, Issue 25<br>
<br>
> ********************************************<br>
<br>
><br>
<br>
><br>
<br>
><br>
<br>
><br>
<br>
<br>
<br>
<br>
<br>
<br>
<br>
------------------------------<br>
<br>
<br>
<br>
<br>
<br>
<br>
<br>
<br>
------------------------------<br>
<br>
<br>
<br>
End of scikit-learn Digest, Vol 19, Issue 28<br>
<br>
********************************************<br>
<br>
<br>
<br>
<br>
<br>
<br>
<br>
------------------------------<br>
<br>
Subject: Digest Footer<br>
<br>
_______________________________________________<br>
scikit-learn mailing list<br>
<a href="mailto:scikit-learn@python.org" target="_blank">scikit-learn@python.org</a><br>
<a href="https://mail.python.org/mailman/listinfo/scikit-learn" rel="noreferrer" target="_blank">https://mail.python.org/mailman/listinfo/scikit-learn</a><br>
<br>
<br>
------------------------------<br>
<br>
End of scikit-learn Digest, Vol 19, Issue 31<br>
********************************************<br>
</blockquote></div></div><div dir="ltr">-- <br></div><div class="gmail_signature" data-smartmail="gmail_signature"><br>Sent from a mobile phone and may contain errors</div>