
mikef0x OP t1_iv0d8up wrote

import tensorflow as tf

model = tf.keras.models.Sequential()
model.add(tf.keras.layers.Conv2D(64, (3, 3), input_shape=(124, 124, 3)))
model.add(tf.keras.layers.MaxPooling2D((2, 2)))
model.add(tf.keras.layers.Conv2D(32, (3, 3)))
model.add(tf.keras.layers.MaxPooling2D((2, 2)))
model.add(tf.keras.layers.Conv2D(8, (3, 3)))
model.add(tf.keras.layers.Flatten())
model.add(tf.keras.layers.Dense(4, activation='softmax'))

So, I've done it like this. The loss is low now, but the accuracy seems too high :D I mean, at epoch 10 it's around 0.99.


Update: at epoch 20, accuracy is 1.0.
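
For context, a minimal sketch of watching held-out accuracy alongside training accuracy during fitting (the optimizer, loss choice, and the X_train/y_train names are assumptions, not taken from this thread):

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',  # assumes integer labels for the 4 classes
              metrics=['accuracy'])

history = model.fit(X_train, y_train,        # X_train / y_train are hypothetical placeholders
                    epochs=20,
                    validation_split=0.2)    # hold out 20% to compare val_accuracy against accuracy

If val_accuracy stays far below the training accuracy, that would point to overfitting rather than a genuinely good model.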

2

BlazeObsidian t1_iv0hlz1 wrote

Haha. It might be overfitting now. How does it perform on the test set? If the accuracy is above 90% on the test set, I would think it's a good model.

If the accuracy is bad on the test data, you would have to reduce the Conv layers and see if you can get more data.
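
For illustration, checking the held-out performance in Keras is roughly a one-liner (a sketch; X_test and y_test are placeholder names for the test data, and it assumes the model was compiled with an accuracy metric):

# Hypothetical evaluation on a held-out test set, reporting the metrics set at compile time.
test_loss, test_acc = model.evaluate(X_test, y_test)
print(f"test loss: {test_loss:.4f}, test accuracy: {test_acc:.4f}")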

Can you post the train vs. test loss and accuracy here?

1