reduce lr on plateau with adam code example
Example 1: step-decay schedule via LearningRateScheduler (a fixed schedule, not ReduceLROnPlateau)
import math
from tensorflow.keras.callbacks import LearningRateScheduler

# Step decay: halve the learning rate every epochs_drop epochs.
def step_decay(epoch):
    initial_lrate = 0.1
    drop = 0.5
    epochs_drop = 10.0
    lrate = initial_lrate * math.pow(drop, math.floor((1 + epoch) / epochs_drop))
    return lrate

lrate = LearningRateScheduler(step_decay)
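To see what the schedule above actually produces, here is a quick standalone check (step_decay reimplemented without the Keras wrapper so it can be run directly). Note that because of the `(1 + epoch)` term, the first drop happens at epoch 9, one epoch earlier than `epochs_drop = 10` might suggest:

```python
import math

def step_decay(epoch, initial_lrate=0.1, drop=0.5, epochs_drop=10.0):
    # LR halves each time (1 + epoch) crosses a multiple of epochs_drop.
    return initial_lrate * math.pow(drop, math.floor((1 + epoch) / epochs_drop))

print(step_decay(0))   # 0.1   (epochs 0 through 8)
print(step_decay(9))   # 0.05  (floor(10 / 10) = 1, first drop)
print(step_decay(19))  # 0.025 (second drop)
```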
Example 2: the step-decay formula behind Example 1
lr = lr0 * drop^floor((1 + epoch) / epochs_drop)
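Both examples above implement step decay, which follows a fixed timetable. ReduceLROnPlateau (the technique the title names) works differently: it watches a metric and cuts the learning rate only when improvement stalls. A minimal pure-Python sketch of that plateau logic, simplified from what the Keras callback does (the parameter names `factor`, `patience`, and `min_lr` match Keras's ReduceLROnPlateau; the real callback adds extras like `cooldown` and `min_delta`):

```python
class ReduceLROnPlateauSketch:
    """Simplified sketch of plateau-based LR reduction (not the Keras class)."""

    def __init__(self, lr=1e-3, factor=0.1, patience=10, min_lr=0.0):
        self.lr = lr
        self.factor = factor      # multiply lr by this when a plateau is hit
        self.patience = patience  # epochs without improvement before reducing
        self.min_lr = min_lr      # never reduce below this floor
        self.best = float("inf")  # best monitored loss seen so far
        self.wait = 0             # epochs since the last improvement

    def on_epoch_end(self, monitored_loss):
        if monitored_loss < self.best:
            self.best = monitored_loss
            self.wait = 0
        else:
            self.wait += 1
            if self.wait >= self.patience:
                self.lr = max(self.lr * self.factor, self.min_lr)
                self.wait = 0
        return self.lr
```

In real Keras code you would instead pass the built-in callback alongside the Adam optimizer, e.g. `model.compile(optimizer="adam", ...)` and `model.fit(..., callbacks=[ReduceLROnPlateau(monitor="val_loss", factor=0.1, patience=10)])`.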