[scikit-learn] MLPClassifier - Softmax activation function

Sebastian Raschka se.raschka at gmail.com
Wed Apr 18 15:26:44 EDT 2018

That's a good question, since the outputs would be scaled differently depending on whether the logistic sigmoid or the softmax is used in the output layer. I don't think you need to set anything, though: the "activation" parameter only applies to the hidden layers, and the softmax is, regardless of "activation", automatically used in the output layer for multi-class problems.
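One quick way to convince yourself of this (a small sketch, not from the docs themselves) is to fit an MLPClassifier on a 3-class dataset with different hidden-layer activations and check that the predicted probabilities always form a proper distribution, and that the fitted `out_activation_` attribute reports "softmax" either way:

```python
# Sketch: the hidden-layer "activation" does not change the output layer.
# With >2 classes, predict_proba rows always sum to 1 (softmax output).
import numpy as np
from sklearn.datasets import load_iris
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)  # 3 classes

for act in ("logistic", "relu"):
    clf = MLPClassifier(activation=act, max_iter=200, random_state=0)
    clf.fit(X, y)
    proba = clf.predict_proba(X)
    # each row is a probability distribution over the 3 classes
    print(act, clf.out_activation_, np.allclose(proba.sum(axis=1), 1.0))
```

Both runs should report "softmax" as the output activation, independent of the hidden-layer setting.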


> On Apr 18, 2018, at 3:15 PM, Daniel Baláček <daniel.balacek at gmail.com> wrote:
> Hello everyone
> I have a question regarding MLPClassifier in sklearn. In the documentation in section 1.17. Neural network models (supervised) - 1.17.2 Classification it is stated that  "MLPClassifier supports multi-class classification by applying Softmax as the output function."
> However, it is not clear how to apply the Softmax function.
> The way I think (or hope) this works is that if the parameter activation is set to activation='logistic', the Softmax function should be automatically applied whenever there are more than two classes. Is this right, or does one have to explicitly specify the use of the Softmax function in some way?
> I am sorry if this is a nonsense question. I am new to scikit-learn and machine learning in general and I was not sure about this one. Thank you for any answers in advance.
> With regards,
> D. B.
> _______________________________________________
> scikit-learn mailing list
> scikit-learn at python.org
> https://mail.python.org/mailman/listinfo/scikit-learn
