[scikit-learn] GSoC proposal - linear model

Jacob Schreiber jmschreiber91 at gmail.com
Fri Mar 31 19:19:26 EDT 2017


Hi Konstantinos

Thanks for the changes. You should go ahead and submit if you're happy with
the proposal; it's unlikely that the decision will come down to the details.

Jacob



On Fri, Mar 31, 2017 at 12:44 AM Konstantinos Katrioplas <
konst.katrioplas at gmail.com> wrote:

> Hello Jacob,
>
> Thanks a lot for your suggestions! I updated my proposal
> <https://gist.github.com/kkatrio/9710b0a37f042a2df784cda7082d5523>. I
> will add some minor details later today.
>
> Regarding the codebase, I am thinking of editing
> linear_model/sgd_fast.pyx for softmax and perhaps adding a new
> linear_model/sgd_opt.pyx for AdaGrad and Adam; I don't know whether you
> agree with that.
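>
> For illustration, here is a rough NumPy sketch of the two update rules I
> have in mind (the function names and sgd_opt.pyx itself are only tentative;
> the real code would of course be Cython hooked into the existing SGD loop):
>
> import numpy as np
>
> def adagrad_update(w, grad, accum, lr=0.01, eps=1e-8):
>     # accumulate squared gradients and scale each coordinate's step by them
>     accum += grad ** 2
>     w -= lr * grad / (np.sqrt(accum) + eps)
>     return w, accum
>
> def adam_update(w, grad, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
>     # exponential moving averages of the gradient and its square,
>     # with bias correction for the first steps (t starts at 1)
>     m = b1 * m + (1 - b1) * grad
>     v = b2 * v + (1 - b2) * grad ** 2
>     m_hat = m / (1 - b1 ** t)
>     v_hat = v / (1 - b2 ** t)
>     w -= lr * m_hat / (np.sqrt(v_hat) + eps)
>     return w, m, v
>
> The per-feature accumulators (accum, m, v) would be arrays of the same
> shape as the coefficients, carried across iterations.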
>
> I admit I am a total beginner in Cython, but I have time until June to
> practice.
> If there is real interest in the project and time to mentor my proposal,
> please let me know. Ideally I would prefer not to leave it until the last
> day.
>
> Kind regards,
> Konstantinos
>
>
>
>
> On 30/03/2017 07:45 AM, Jacob Schreiber wrote:
>
> Hi Konstantinos
>
> I likely won't be a mentor for the linear models project, but I looked
> over your proposal and have a few suggestions. In general it was a good
> write-up!
>
> 1. You should include some equations in the write-up, at a minimum the
> softmax loss (which I think is a more common term than multinomial logistic
> loss) and the AdaGrad update; a rough version of both is sketched just
> after this list.
> 2. You may want to indicate which files in the codebase you'll be
> modifying, or if you'll be adding a new file. That will show us you're
> familiar with our existing code.
> 3. You should allow more time for the Cython implementation of these
> methods. It's not that easy, especially if you don't have prior experience
> with it. You can easily lose a day or two to a silly memory error that has
> nothing to do with whether you understand the equations.
> 4. You might also want to implement Adam if time permits. It's another
> popular optimizer. I'm not sure how common it is for linear models, but
> I've seen it used effectively, and once you have AdaGrad working a second
> optimizer should be easier to add.
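>
> For reference, roughly what I mean by those two (in my own notation, so
> double-check it before it goes in the proposal): for a sample (x, y) with
> per-class weight vectors w_k, the softmax loss is
>
>     L(W; x, y) = -\log \frac{\exp(w_y^\top x)}{\sum_{k=1}^{K} \exp(w_k^\top x)}
>
> and the per-coordinate AdaGrad update, with learning rate \eta, accumulated
> squared gradients G, and a small \epsilon for numerical stability, is
>
>     G_{t,j} = G_{t-1,j} + g_{t,j}^2, \qquad
>     w_{t+1,j} = w_{t,j} - \frac{\eta}{\sqrt{G_{t,j}} + \epsilon}\, g_{t,j}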
>
> Good luck!
> Jacob
>
> On Mon, Mar 27, 2017 at 10:43 AM, Konstantinos Katrioplas <
> konst.katrioplas at gmail.com> wrote:
>
> Dear all,
>
> here is a draft of my proposal
> <https://gist.github.com/kkatrio/9710b0a37f042a2df784cda7082d5523> on
> improving online learning for linear models with softmax and AdaGrad.
> I look forward to your feedback,
> Konstantinos
>
>
> _______________________________________________
> scikit-learn mailing list
> scikit-learn at python.org
> https://mail.python.org/mailman/listinfo/scikit-learn
>