
Constant-Cranberry29 OP t1_iwo6vm1 wrote

initial_learning_rate = 0.02
epochs = 50
decay = initial_learning_rate / epochs

def lr_time_based_decay(epoch, lr):
    return lr * 1 / (1 + decay * epoch)

history = model.fit(
    x_train,
    y_train,
    epochs=50,
    validation_split=0.2,
    batch_size=64,
    callbacks=[LearningRateScheduler(lr_time_based_decay, verbose=2)],
)


Hamster729 t1_iwo99fy wrote

That's a very odd-looking time-decay rule, and I'm almost certain it does not do what you expect it to do.
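The catch is that LearningRateScheduler hands your function the *current* learning rate each epoch, so lr * 1 / (1 + decay * epoch) keeps shrinking an already-shrunk value instead of rescaling the initial rate like the textbook initial_lr / (1 + decay * epoch) schedule. A rough sketch with your numbers (0.02 over 50 epochs) of where the two versions end up:

initial_learning_rate = 0.02
epochs = 50
decay = initial_learning_rate / epochs  # 0.0004

# what the posted callback does: rescales the *current* lr every epoch,
# so the 1 / (1 + decay * epoch) factors multiply together over time
compounded = initial_learning_rate
for epoch in range(epochs):
    compounded = compounded * 1 / (1 + decay * epoch)

# what time-based decay usually means: scale the *initial* lr each epoch
textbook = initial_learning_rate / (1 + decay * (epochs - 1))

print(compounded)  # roughly 0.012
print(textbook)    # roughly 0.0196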

Try:

def lr_time_based_decay(epoch, lr):
    return lr * 0.95

(also see my suggestion from the edit to my previous post)
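For reference, the suggested schedule dropped into your training call would look roughly like this (a sketch: it assumes model, x_train, and y_train are defined exactly as in your code, and that you're using the Keras callback import below):

from tensorflow.keras.callbacks import LearningRateScheduler

def lr_time_based_decay(epoch, lr):
    # multiply the current learning rate by a fixed factor every epoch
    return lr * 0.95

history = model.fit(
    x_train,
    y_train,
    epochs=50,
    validation_split=0.2,
    batch_size=64,
    callbacks=[LearningRateScheduler(lr_time_based_decay, verbose=2)],
)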


Constant-Cranberry29 OP t1_iwoc176 wrote

Still the same: even if I drop the abs, drop the normalization, and change the last layer to model.add(Dense(1, activation=None, use_bias=False)), it doesn't work.
