
Keras for Binary Classification

January 13th, 2016

So I didn’t get around to seriously playing with Keras (a powerful library for building fully-differentiable machine learning models, aka neural networks) until now, beyond running a few examples. And I was a bit surprised by how tricky it actually was for me to get a simple task running, despite (or maybe because of) all the docs already available.

The thing is, many of the “basic examples” gloss over exactly what the inputs and, mainly, the outputs look like, and that’s important. Especially since for me the archetypal simplest machine learning problem is binary classification, whereas in Keras the canonical task is categorical classification. Only after fumbling around for a few hours did I realize this fundamental rift.

The examples (besides LSTM sequence classification) silently assume that you want to classify into categories (e.g. to predict words etc.), not do a binary 1/0 classification. The consequence is that if you naively copy the example MLP at first, before learning to think about it, your model will never learn anything and, to add insult to injury, will always report an accuracy of 1.0.

So, there are a few important things you need to do to perform binary classification:

  • Pass output_dim=1 to your final Dense layer (this is the obvious one).
  • Use sigmoid activation instead of softmax – obviously, softmax on a single output will always normalize whatever comes in to 1.0.
  • Pass class_mode='binary' to model.compile() (this fixes the accuracy display, and possibly more; you also want to pass show_accuracy=True to model.fit()).
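The softmax point is worth seeing in numbers. Here is a minimal sketch (plain Python, no Keras needed) of why a softmax over a single output unit is useless for binary classification, while a sigmoid on the same score gives an actual probability:

```python
import math

def softmax(xs):
    """Softmax: exponentiate each score and normalize so outputs sum to 1."""
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def sigmoid(x):
    """Sigmoid: squash a single score into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# With a single output unit, softmax normalizes *any* score to 1.0:
print(softmax([-4.2]))  # [1.0]
print(softmax([13.7]))  # [1.0]

# A sigmoid on the same scores actually discriminates:
print(sigmoid(-4.2))  # ~0.015
print(sigmoid(13.7))  # ~1.0
```

This is also why the naively copied model reports accuracy 1.0: a one-unit softmax predicts 1 for every sample, no matter what the network has learned.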

Other lessons learned:

  • For some projects, my approach of first cobbling up an example from existing code and then thinking harder about it works great; for others, not so much…
  • In IPython, do not forget to reinitialize model = Sequential() in some of your cells – a lot of confusion ensues otherwise.
  • Keras is pretty awesome and powerful. Conceptually, I think I like NNBlocks' usage philosophy more (regarding how you build the model), but sadly that library is still very early in its development (I have filed a bunch of GitHub issues).

(Edit: After a few hours, I toned down this post a bit. It wasn’t meant at all as an attack on Keras, though someone might perceive it as such – just as a word of caution to fellow Keras newbies. And it shouldn’t take much to improve the Keras docs.)

  1. Gianni
    March 24th, 2016 at 13:17 | #1

    Hi, thanks for your suggestion. I followed it but got nothing good. My NN learns, but when I apply the trained model on the same data used for training (or on the test dataset, for that matter) I get all 1s or all 0s, depending on the model I built.
    Here’s my last attempt:
    https://www.kaggle.com/c/bnp-paribas-cardif-claims-management/forums/t/19124/anyone-tried-neural-networks-and-deep-learning/112841#post112841
    Can someone give me a hint?

  2. Michael
    July 3rd, 2016 at 18:33 | #2

    Thanks for this. When I changed softmax to sigmoid, everything worked.

  3. Walid Ahmed
    September 23rd, 2016 at 19:18 | #3

    Thanks man
    worked like a charm when I changed to Sigmoid

  4. Pavel
    February 27th, 2017 at 12:14 | #4

    Thank you for sharing this post, I was having a very similar issue!

    For the sake of future readers: if you are using Keras 1.2, read `class_mode='binary'` as `loss='binary_crossentropy'`.

  5. james
    February 3rd, 2018 at 11:57 | #5

    @Pavel
    thanks

