<html>
<head>
<meta content="text/html; charset=UTF-8" http-equiv="Content-Type">
</head>
<body bgcolor="#FFFFFF" text="#000000">
<p>Hello Jacob,</p>
<p>Thanks a lot for your suggestions! I updated my <a
href="https://gist.github.com/kkatrio/9710b0a37f042a2df784cda7082d5523">proposal</a>.
I will add some minor details later today.<br>
</p>
<p>Regarding the codebase, I am thinking of editing
linear_model/sgd_fast.pyx for softmax and adding a new file,
perhaps linear_model/sgd_opt.pyx, for AdaGrad and Adam. I don't
know if you agree with that.<br>
</p>
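<p>To be concrete, this is roughly the kind of update I have in
mind. It is only a minimal NumPy sketch I wrote for myself, not
existing scikit-learn code, and the function names are my own; the
real implementation would of course follow the structure already in
sgd_fast.pyx rather than plain NumPy:</p>
<pre>
import numpy as np

def softmax(scores):
    # Subtract the row-wise max before exponentiating, for numerical stability.
    shifted = scores - scores.max(axis=1, keepdims=True)
    exp = np.exp(shifted)
    return exp / exp.sum(axis=1, keepdims=True)

def adagrad_step(W, grad, accum, lr=0.01, eps=1e-8):
    # Accumulate squared gradients and scale the step size per coordinate.
    accum += grad ** 2
    W -= lr * grad / (np.sqrt(accum) + eps)
    return W, accum

# Toy usage: one AdaGrad step on the softmax (multinomial logistic) loss.
rng = np.random.RandomState(0)
X = rng.randn(5, 3)                # 5 samples, 3 features
y = np.array([0, 2, 1, 0, 2])      # labels from 3 classes
W = np.zeros((3, 3))               # weights: n_features x n_classes
accum = np.zeros_like(W)

P = softmax(X @ W)                 # predicted class probabilities
Y = np.eye(3)[y]                   # one-hot encoded targets
grad = X.T @ (P - Y) / X.shape[0]  # gradient of the averaged softmax loss
W, accum = adagrad_step(W, grad, accum)
</pre>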
<p>I admit I am a total beginner in Cython, but I have time until
June to practice.<br>
</p>
If there is real interest in the project and time to mentor my
proposal, please let me know. Ideally I would prefer not to leave it
until the last day.<br>
<br>
Kind regards,<br>
Konstantinos<br>
<br>
<br>
<br>
<div class="moz-cite-prefix">On 30/03/2017 07:45 πμ, Jacob Schreiber
wrote:<br>
</div>
<blockquote
cite="mid:CA+ad8EvPn282uP2HaAvxQj_uj0E=d7czUsizvKB2HhjMBQrRqA@mail.gmail.com"
type="cite">
<div dir="ltr">Hi Konstantinos
<div><br>
</div>
<div>I likely won't be a mentor for the linear models project,
but I looked over your proposal and have a few suggestions. In
general it was a good write-up!</div>
<div><br>
</div>
<div>1. You should include some equations in the write-up,
mainly the softmax loss (which I think is a more common term
than multinomial logistic loss) and the AdaGrad update; the
standard forms are sketched below for reference.</div>
<div>2. You may want to indicate which files in the codebase
you'll be modifying, or if you'll be adding a new file. That
will show us you're familiar with our existing code.</div>
<div>3. You should allow more time for the Cython implementation
of these methods. It's not that easy to do, especially if you
don't have prior experience. You can easily lose a day or
two to a silly memory error that has nothing to do with whether
you understand the equations.</div>
<div>4. You might also want to implement Adam if time permits.
It's another popular optimizer. I'm not sure how widely it is
used in linear models, but I've seen it used effectively, and
once you have AdaGrad working it should be easier to implement
a second optimizer; its update is also sketched below.<br>
</div>
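<div><br>
</div>
<div>For concreteness, the standard textbook forms I have in mind
are roughly the following (just a sketch in LaTeX, with g_t the
gradient, eta the step size and epsilon a small constant):</div>
<pre>
% softmax (multinomial logistic) loss for one sample (x, y)
p_k = \frac{\exp(w_k^\top x)}{\sum_j \exp(w_j^\top x)}, \qquad
\ell(W; x, y) = -\log p_y

% AdaGrad update (per coordinate)
G_t = G_{t-1} + g_t^2, \qquad
w_{t+1} = w_t - \frac{\eta}{\sqrt{G_t} + \epsilon} \, g_t

% Adam update
m_t = \beta_1 m_{t-1} + (1 - \beta_1) g_t, \qquad
v_t = \beta_2 v_{t-1} + (1 - \beta_2) g_t^2
\hat m_t = \frac{m_t}{1 - \beta_1^t}, \qquad
\hat v_t = \frac{v_t}{1 - \beta_2^t}, \qquad
w_{t+1} = w_t - \eta \, \frac{\hat m_t}{\sqrt{\hat v_t} + \epsilon}
</pre>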
<div><br>
</div>
<div>Good luck!</div>
<div>Jacob</div>
</div>
<div class="gmail_extra"><br>
<div class="gmail_quote">On Mon, Mar 27, 2017 at 10:43 AM,
Konstantinos Katrioplas <span dir="ltr"><<a
moz-do-not-send="true"
href="mailto:konst.katrioplas@gmail.com" target="_blank">konst.katrioplas@gmail.com</a>></span>
wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0
.8ex;border-left:1px #ccc solid;padding-left:1ex">
<div bgcolor="#FFFFFF" text="#000000">
<p>Dear all,</p>
<p>here is a <a moz-do-not-send="true"
href="https://gist.github.com/kkatrio/9710b0a37f042a2df784cda7082d5523"
target="_blank">draft of my proposal </a>on improving
online learning for linear models with softmax and
AdaGrad.<br>
</p>
I look forward to your feedback,<br>
Konstantinos<br>
</div>
<br>
</blockquote>
</div>
<br>
</div>
<br>
</blockquote>
<br>
</body>
</html>