[scikit-learn] custom loss function
vaggi.federico at gmail.com
Wed Sep 13 17:51:40 EDT 2017
You are confusing the kernel with the loss function. SVMs minimize a
well-defined hinge loss on a space that is implicitly defined by a kernel
mapping (or in the original feature space, if you use a linear kernel).
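To make the separation concrete (an illustrative sketch, not from this thread): in scikit-learn you can decouple the two by approximating the rbf feature map explicitly with RBFSampler and then fitting a linear model with the loss of your choice on top, e.g. SGDRegressor with an epsilon-insensitive (SVR-like) loss. The data and hyperparameters below are placeholders.

```python
# Sketch: kernel and loss chosen independently via an explicit
# approximate rbf feature map plus a linear model with a chosen loss.
import numpy as np
from sklearn.kernel_approximation import RBFSampler
from sklearn.linear_model import SGDRegressor
from sklearn.pipeline import make_pipeline

rng = np.random.RandomState(0)
X = rng.rand(200, 5)
y = np.sin(X.sum(axis=1))

model = make_pipeline(
    # explicit (approximate) rbf kernel feature map
    RBFSampler(gamma=1.0, n_components=100, random_state=0),
    # linear model on top; the loss is swappable here
    SGDRegressor(loss="epsilon_insensitive", max_iter=1000, tol=1e-3),
)
model.fit(X, y)
pred = model.predict(X)
```

Swapping `loss=` on the SGDRegressor changes the objective without touching the (approximate) kernel, which is exactly the separation described above.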
On Wed, 13 Sep 2017 at 14:31 Thomas Evangelidis <tevang3 at gmail.com> wrote:
> What about the SVM? I use an SVR at the end to combine multiple
> MLPRegressor predictions using the rbf kernel (a linear kernel is not good
> for this problem). Can I also implement an SVR with an rbf kernel in
> TensorFlow using my own loss function? So far I have found an example of an
> SVC with a linear kernel in TensorFlow and nothing in Keras. My alternative
> option would be to train multiple SVRs and select, through cross-validation,
> the one that minimizes my custom loss function; but, as I said in a previous
> message, that would be suboptimal because each scikit-learn SVR would still
> minimize its fixed default loss internally.
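The cross-validation alternative described in the quoted message can be sketched in scikit-learn with make_scorer and GridSearchCV; `my_loss` below is a stand-in for the poster's actual custom loss, and the data and parameter grid are illustrative.

```python
# Sketch: select among rbf-kernel SVRs by cross-validating against a
# custom loss (the SVRs themselves still minimize their default objective).
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV
from sklearn.metrics import make_scorer

def my_loss(y_true, y_pred):
    # placeholder custom loss; substitute the real one
    return np.mean(np.abs(y_true - y_pred))

rng = np.random.RandomState(0)
X = rng.rand(100, 4)
y = rng.rand(100)

search = GridSearchCV(
    SVR(kernel="rbf"),
    param_grid={"C": [0.1, 1.0, 10.0], "gamma": ["scale", 0.1]},
    # greater_is_better=False flips the sign so lower loss ranks higher
    scoring=make_scorer(my_loss, greater_is_better=False),
    cv=3,
)
search.fit(X, y)
best = search.best_estimator_
```

This picks the configuration whose cross-validated custom loss is lowest, but, as the poster notes, each candidate SVR is still fit with its own default loss.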
> On 13. 9. 2017 at 20:48, "Andreas Mueller" <t3kcit at gmail.com> wrote:
>> On 09/13/2017 01:18 PM, Thomas Evangelidis wrote:
>> Thanks again for the clarifications Sebastian!
>> Keras has a scikit-learn API with the KerasRegressor, which implements the
>> scikit-learn MLPRegressor interface.
>> Is it possible to change the loss function in KerasRegressor? I don't
>> have time right now to experiment with the hyperparameters of new ANN
>> architectures. I urgently need to reproduce in Keras the results
>> obtained with MLPRegressor and the set of hyperparameters that I have
>> optimized for my problem, and only later change the loss function.
>> I think using keras is probably the way to go for you.
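A minimal sketch of what that could look like (assuming tf.keras; the architecture, data, and loss below are placeholders, not the poster's tuned MLPRegressor settings, and KerasRegressor would simply wrap a model built this way):

```python
# Sketch: compiling and fitting a Keras model with a user-defined loss.
import numpy as np
import tensorflow as tf

def custom_loss(y_true, y_pred):
    # example custom loss (mean absolute error); swap in your own
    return tf.reduce_mean(tf.abs(y_true - y_pred))

X = np.random.rand(64, 10).astype("float32")
y = np.random.rand(64, 1).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(10,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss=custom_loss)  # custom loss plugs in here
model.fit(X, y, epochs=2, verbose=0)
pred = model.predict(X, verbose=0)
```

Because `loss=` accepts any callable of `(y_true, y_pred)`, the same model can be recompiled with a different loss without changing the architecture or hyperparameters.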
>> scikit-learn mailing list
>> scikit-learn at python.org