[scikit-learn] custom loss function

federico vaggi vaggi.federico at gmail.com
Wed Sep 13 18:25:26 EDT 2017


My bad, I read your question in the context of your second e-mail in this
thread, where you talked about custom loss functions and SVR.

On Wed, 13 Sep 2017 at 15:20 Thomas Evangelidis <tevang3 at gmail.com> wrote:

> I said that I want to build a Support Vector Regressor with the rbf kernel
> that minimizes my own loss function. I never mentioned classification or
> hinge loss.
>
> On 13 September 2017 at 23:51, federico vaggi <vaggi.federico at gmail.com>
> wrote:
>
>> You are confusing the kernel with the loss function. SVMs minimize a
>> well-defined hinge loss on a space that is implicitly defined by a kernel
>> mapping (or on the original feature space, if you use a linear kernel).
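For what it's worth, in scikit-learn the two are separate knobs: the loss is
fixed by the estimator class, while the kernel is just a constructor argument.
A minimal illustration (the parameter values are arbitrary):

    from sklearn.svm import SVC, SVR

    # SVC always minimizes the hinge loss, SVR the epsilon-insensitive loss;
    # the kernel only changes the space in which that loss is minimized.
    clf = SVC(kernel='linear', C=1.0)            # hinge loss, linear kernel
    reg = SVR(kernel='rbf', C=1.0, epsilon=0.1)  # eps-insensitive loss, rbf kernel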
>>
>> On Wed, 13 Sep 2017 at 14:31 Thomas Evangelidis <tevang3 at gmail.com>
>> wrote:
>>
>>> What about the SVM? At the end I use an SVR with the rbf kernel to
>>> combine multiple MLPRegressor predictions (a linear kernel is not good
>>> for this problem). Can I also implement an SVR with an rbf kernel in
>>> TensorFlow using my own loss function? So far I have found an example of
>>> an SVC with a linear kernel in TensorFlow, and nothing in Keras. My
>>> alternative option would be to train multiple SVRs and find, through
>>> cross-validation, the one that minimizes my custom loss function, but as
>>> I said in a previous message that would be a suboptimal solution, because
>>> each scikit-learn SVR would still be fit by minimizing the default loss.
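A rough sketch of that cross-validation fallback, in case it is useful: grid-search
the SVR hyperparameters and score each candidate with the custom loss. Here
my_loss is only a placeholder for the custom objective, and the parameter grid is
made up.

    from sklearn.svm import SVR
    from sklearn.model_selection import GridSearchCV
    from sklearn.metrics import make_scorer

    def my_loss(y_true, y_pred):
        # placeholder for the custom loss; lower is better
        return ((y_true - y_pred) ** 2).mean()

    scorer = make_scorer(my_loss, greater_is_better=False)
    grid = GridSearchCV(SVR(kernel='rbf'),
                        param_grid={'C': [0.1, 1, 10],
                                    'epsilon': [0.01, 0.1, 1.0]},
                        scoring=scorer, cv=5)
    # grid.fit(X_train, y_train)
    # Each candidate SVR is still fit with the default epsilon-insensitive
    # loss; the custom loss is only used to pick the best hyperparameters.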
>>>
>>> On 13 Sep 2017 at 20:48, "Andreas Mueller" <
>>> t3kcit at gmail.com> wrote:
>>>
>>>
>>>>
>>>> On 09/13/2017 01:18 PM, Thomas Evangelidis wrote:
>>>>
>>>> Thanks again for the clarifications Sebastian!
>>>>
>>>> Keras has a scikit-learn API with the KerasRegressor wrapper, which
>>>> exposes the same scikit-learn estimator interface as MLPRegressor:
>>>>
>>>> https://keras.io/scikit-learn-api/
>>>>
>>>> Is it possible to change the loss function in KerasRegressor? I don't
>>>> have time right now to experiment with the hyperparameters of new ANN
>>>> architectures. I urgently need to reproduce in Keras the results
>>>> obtained with MLPRegressor, using the set of hyperparameters I have
>>>> already optimized for my problem, and only later change the loss
>>>> function.
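If it helps, a minimal sketch of how this usually looks with Keras 2 on the
TensorFlow backend: the loss is set inside build_fn when the model is compiled,
so KerasRegressor trains with whatever objective you pass there. The network
shape, hyperparameter values and custom_loss below are placeholders, not your
tuned setup.

    from keras.models import Sequential
    from keras.layers import Dense
    from keras.wrappers.scikit_learn import KerasRegressor
    from keras import backend as K

    def custom_loss(y_true, y_pred):
        # placeholder objective, written with backend ops so it is differentiable
        return K.mean(K.square(y_true - y_pred), axis=-1)

    def build_model(n_features=10, n_hidden=100):
        model = Sequential()
        model.add(Dense(n_hidden, activation='relu', input_dim=n_features))
        model.add(Dense(1))
        model.compile(optimizer='adam', loss=custom_loss)
        return model

    reg = KerasRegressor(build_fn=build_model, epochs=200, batch_size=32,
                         verbose=0)
    # reg.fit(X_train, y_train); reg.predict(X_new)  # scikit-learn-style usage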
>>>>
>>>> I think using Keras is probably the way to go for you.
>>>>
>
>
> --
>
> ======================================================================
>
> Dr Thomas Evangelidis
>
> Post-doctoral Researcher
> CEITEC - Central European Institute of Technology
> Masaryk University
> Kamenice 5/A35/2S049,
> 62500 Brno, Czech Republic
>
> email: tevang at pharm.uoa.gr
>
>           tevang3 at gmail.com
>
>
> website: https://sites.google.com/site/thomasevangelidishomepage/
>