Submitted by Santhosh999 t3_z9dryt in deeplearning
suflaj t1_iyh2wil wrote
Well, for softmax you need at least 2 output neurons.
Santhosh999 OP t1_iykdg48 wrote
What about the loss function? Which one do I need to use?
suflaj t1_iykea48 wrote
Doesn't matter. Softmax is just a multidimensional sigmoid. For binary classification you can therefore use either 1 output with a sigmoid, or 2 outputs with a softmax. The only difference is that with a sigmoid you resolve the result as
is_fraud = result > 0.5
while with softmax you'd do
is_fraud = argmax(result) == 1
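A rough sketch of the two output heads in Keras (the framework, feature size, hidden layer, and variable names here are just assumptions for illustration; the softmax head is conventionally paired with categorical cross-entropy, though binary cross-entropy also runs once the labels are one-hot):

import tensorflow as tf

# made-up model body: 30 input features, one small hidden layer
inputs = tf.keras.Input(shape=(30,))
hidden = tf.keras.layers.Dense(16, activation="relu")(inputs)

# Option A: 1 output neuron + sigmoid, labels shaped (None, 1) with 0/1 values
out_a = tf.keras.layers.Dense(1, activation="sigmoid")(hidden)
model_a = tf.keras.Model(inputs, out_a)
model_a.compile(optimizer="adam", loss="binary_crossentropy")

# Option B: 2 output neurons + softmax, labels one-hot encoded, shaped (None, 2)
out_b = tf.keras.layers.Dense(2, activation="softmax")(hidden)
model_b = tf.keras.Model(inputs, out_b)
model_b.compile(optimizer="adam", loss="categorical_crossentropy")

# resolving predictions on a dummy batch
x = tf.random.uniform((4, 30))
is_fraud_a = model_a.predict(x) > 0.5                      # sigmoid: threshold at 0.5
is_fraud_b = tf.argmax(model_b.predict(x), axis=-1) == 1   # softmax: argmax over the 2 outputs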
Santhosh999 OP t1_iykgddd wrote
I am getting an error when I tried 2 neurons and a softmax activation function with binary cross-entropy loss.
ValueError: logits and labels must have the same shape ((None, 2) vs (None, 1))
suflaj t1_iyktdbc wrote
Well, you have to change your labels from being 1 element long to 2 elements long. If your labels are True or 1 and False or 0, you will need to change them to [0, 1] and [1, 0] respectively.
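A small sketch of that conversion, e.g. with Keras' to_categorical (the label array here is just an example):

import numpy as np
import tensorflow as tf

# example 0/1 labels, shape (4,)
y = np.array([0, 1, 1, 0])

# one-hot them so the label shape becomes (4, 2): 0 -> [1, 0], 1 -> [0, 1]
y_onehot = tf.keras.utils.to_categorical(y, num_classes=2)
print(y_onehot)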
Santhosh999 OP t1_iylsz52 wrote
Thanks for clearing my doubt. It is working now.