    <div class="moz-cite-prefix">On 6/11/19 11:47 AM, Eric J. Van der
      Velden wrote:<br>
    </div>
    <blockquote type="cite"
cite="mid:CAFxAhE-G1h7Qedawqb0ymC+qhM5utBYdX-EkWd2-uZdkxFYWFg@mail.gmail.com">
      <meta http-equiv="content-type" content="text/html; charset=UTF-8">
      <div dir="auto">Hi Nicolas, Andrew,
        <div dir="auto"><br>
        </div>
        <div dir="auto">Thanks!<br>
          <div dir="auto"><br>
          </div>
          <div dir="auto">I found out that it is the regularization
            term. Sklearn always has that term. When I program logistic
            regression with that term too, with \lambda=1, I get exactly
            the same answer as sklearn, when I look at the parameters
            you gave me.</div>
          <div dir="auto"><br>
          </div>
          <div dir="auto">Question is why sklearn always has that term
            in logistic regression. If you have enough data, do you need
            a regularization term?</div>
        </div>
      </div>
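
For reference, the only change needed is in my gradient step (a minimal sketch; it assumes the A1, y, and expit from my code quoted further down, and that the penalty also covers the bias, which is what the liblinear default effectively does):

    # Same update as logreg_gd below, plus the gradient of (lmda/2) * w.T.dot(w);
    # lmda = 1 corresponds to sklearn's default C = 1.
    def logreg_gd_l2(w, lmda=1.0, alpha=1.0/150):
        delta2 = expit(A1.dot(w)) - y
        return w - alpha * (A1.T.dot(delta2) + lmda * w)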
It's equivalent to setting C to a high value.
We now allow penalty='none' in LogisticRegression; see
https://github.com/scikit-learn/scikit-learn/pull/12860

I opened an issue on improving the docs:
https://github.com/scikit-learn/scikit-learn/issues/14070

Feel free to make suggestions there.

There's more discussion here as well:
https://github.com/scikit-learn/scikit-learn/issues/6738

    <blockquote type="cite"
cite="mid:CAFxAhE-G1h7Qedawqb0ymC+qhM5utBYdX-EkWd2-uZdkxFYWFg@mail.gmail.com"><br>
      <div class="gmail_quote">
        <div dir="ltr" class="gmail_attr">Op di 11 jun. 2019 10:08
          schreef Andrew Howe <<a href="mailto:ahowe42@gmail.com"
            moz-do-not-send="true">ahowe42@gmail.com</a>>:<br>
        </div>
        <blockquote class="gmail_quote" style="margin:0 0 0
          .8ex;border-left:1px #ccc solid;padding-left:1ex">
          <div dir="ltr">The coef_ attribute of the LogisticRegression
            object stores the parameters.
            <div><br>
            </div>
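Note that with the default fit_intercept=True the bias is stored separately, so to compare against a hand-rolled w that stacks the bias on top of the weights you need both attributes (using the fitted log_reg from the code below):

    log_reg.intercept_   # the bias term
    log_reg.coef_        # one weight per input feature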

Andrew
            <div><br clear="all">
              <div>
                <div dir="ltr"
                  class="m_3891885949610842405gmail_signature"
                  data-smartmail="gmail_signature">
                  <div dir="ltr">
                    <div>
                      <div dir="ltr">
                        <div dir="ltr">
                          <div dir="ltr"><~~~~~~~~~~~~~~~~~~~~~~~~~~~><br>
                            J. Andrew Howe, PhD</div>
                          <div dir="ltr"><a
                              href="http://www.linkedin.com/in/ahowe42"
                              target="_blank" rel="noreferrer"
                              moz-do-not-send="true">LinkedIn Profile</a></div>
                          <div><a
                              href="http://www.researchgate.net/profile/John_Howe12/"
                              target="_blank" rel="noreferrer"
                              moz-do-not-send="true">ResearchGate
                              Profile</a></div>
                          <div dir="ltr"><a
                              href="http://orcid.org/0000-0002-3553-1990"
                              target="_blank" rel="noreferrer"
                              moz-do-not-send="true">Open Researcher and
                              Contributor ID (ORCID)</a></div>
                          <div dir="ltr"><a
                              href="http://github.com/ahowe42"
                              target="_blank" rel="noreferrer"
                              moz-do-not-send="true">Github Profile</a><br>
                            <div><a href="http://www.andrewhowe.com"
                                target="_blank" rel="noreferrer"
                                moz-do-not-send="true">Personal Website</a></div>
                            <div>I live to learn, so I can learn to
                              live. - me<br>
                            </div>
                            <div><~~~~~~~~~~~~~~~~~~~~~~~~~~~></div>
                          </div>
                        </div>
                      </div>
                    </div>
                  </div>
                </div>
              </div>
              <br>
            </div>
          </div>
          <br>
          <div class="gmail_quote">
            <div dir="ltr" class="gmail_attr">On Sat, Jun 8, 2019 at
              6:58 PM Eric J. Van der Velden <<a
                href="mailto:ericjvandervelden@gmail.com"
                target="_blank" rel="noreferrer" moz-do-not-send="true">ericjvandervelden@gmail.com</a>>
              wrote:<br>
            </div>
            <blockquote class="gmail_quote" style="margin:0px 0px 0px
              0.8ex;border-left:1px solid
              rgb(204,204,204);padding-left:1ex">
              <div dir="ltr">
                <div dir="ltr">
                  <div>Here I have added what I had programmed. </div>
                  <div><br>
                  </div>
                  <div>With sklearn's LogisticRegression(), how can I
                    see the parameters it has found after .fit() where
                    the cost is minimal? I use the book of Geron about
                    scikit-learn and tensorflow and on page 137 he
                    trains the model of petal widths. I did the
                    following:</div>
                  <div><br>
                  </div>

    from sklearn import datasets
    from sklearn.linear_model import LogisticRegression

    iris = datasets.load_iris()
    a1 = iris['data'][:, 3:]                # petal width only
    y = (iris['target'] == 2).astype(int)   # 1 for Iris virginica
    log_reg = LogisticRegression()
    log_reg.fit(a1, y)

    log_reg.coef_
    # array([[2.61727777]])
    log_reg.intercept_
    # array([-4.2209364])

I did the logistic regression myself, with gradient descent and with Newton-Raphson, as I learned from my Coursera course and from Bishop's book, respectively. I used the gradient descent method like so:

    import numpy as np
    from sklearn import datasets
    from scipy.special import expit

    iris = datasets.load_iris()
    a1 = iris['data'][:, 3:]
    A1 = np.c_[np.ones((150, 1)), a1]   # prepend a column of ones for the bias
    y = (iris['target'] == 2).astype(int).reshape(-1, 1)
    lmda = 1   # learning rate

    def logreg_gd(w):
        z2 = A1.dot(w)
        a2 = expit(z2)        # sigmoid predictions
        delta2 = a2 - y       # prediction errors
        w = w - (lmda / len(a1)) * A1.T.dot(delta2)
        return w

    w = np.zeros((2, 1))
    for i in range(100000):
        w = logreg_gd(w)

    In [6219]: w
    Out[6219]:
    array([[-21.12563996],
           [ 12.94750716]])

I used Newton-Raphson like so (see Bishop, page 207).
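
In the notation of the code below, the update Bishop derives there (iterative reweighted least squares) is

    w_new = (A1^T R A1)^{-1} A1^T R z,    z = A1 w - R^{-1} (p - y),

where p = expit(A1 w) are the predictions and R = diag(p (1 - p)):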

    from sklearn import datasets
    from scipy.special import expit
    import numpy as np

    iris = datasets.load_iris()
    a1 = iris['data'][:, 3:]
    A1 = np.c_[np.ones(len(a1)), a1]
    y = (iris['target'] == 2).astype(int).reshape(-1, 1)

    def logreg_nr(w):
        z1 = A1.dot(w)
        p = expit(z1)                      # predictions (Bishop's y)
        R = np.diag((p * (1 - p))[:, 0])   # weighting matrix
        H = A1.T.dot(R).dot(A1)            # Hessian
        z = A1.dot(w) - np.linalg.inv(R).dot(p - y)
        return np.linalg.inv(H).dot(A1.T).dot(R).dot(z)

    w = np.zeros((2, 1))
    for i in range(10):
        w = logreg_nr(w)

    In [5149]: w
    Out[5149]:
    array([[-21.12563996],
           [ 12.94750716]])

Notice how much faster Newton-Raphson converges than gradient descent (10 iterations versus 100,000, since it uses the curvature information in the Hessian), but they give the same result.

How can I see which parameters LogisticRegression() found? And should I give LogisticRegression other parameters?
              <div class="gmail_quote">
                <div dir="ltr" class="gmail_attr">On Sat, Jun 8, 2019 at
                  11:34 AM Eric J. Van der Velden <<a
                    href="mailto:ericjvandervelden@gmail.com"
                    target="_blank" rel="noreferrer"
                    moz-do-not-send="true">ericjvandervelden@gmail.com</a>>
                  wrote:<br>
                </div>
                <blockquote class="gmail_quote" style="margin:0px 0px
                  0px 0.8ex;border-left:1px solid
                  rgb(204,204,204);padding-left:1ex">
                  <div dir="ltr">
                    <div dir="ltr">
                      <div dir="ltr">Hello,
                        <div><br>
                        </div>
                        <div>I am learning sklearn from my book of
                          Geron. On page 137 he learns the model of
                          petal widths. </div>
                        <div><br>
                        </div>
                        <div>When I implements logistic regression
                          myself as I learned from my Coursera course or
                          from my book of Bishop I find that the
                          following parameters are found where the cost
                          function is minimal:</div>
                        <div><br>
                        </div>

    In [6219]: w
    Out[6219]:
    array([[-21.12563996],
           [ 12.94750716]])

I used gradient descent and Newton-Raphson; both give the same answer.

My question is: how can I see, after fit(), which parameters LogisticRegression() has found?

One other question: when I read the documentation page,
https://scikit-learn.org/stable/modules/linear_model.html#logistic-regression,
I see a different cost function than I read in the books.
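
If I write the docs' objective next to the books' version, the only differences I can see are the ±1 label convention and the regularization term (a sketch of the comparison, with s_i = expit(x_i^T w + c)):

    docs:   min_w  (1/2) w^T w + C \sum_i log(1 + exp(-t_i (x_i^T w + c))),   t_i in {-1, 1}
    books:  min_w  -\sum_i [ y_i log s_i + (1 - y_i) log(1 - s_i) ],          y_i in {0, 1}

Substituting t_i = 2 y_i - 1 makes the two sums identical, so the real difference is the (1/2) w^T w penalty, weighted by 1/C.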

Thanks.