Keras: how to output learning rate onto tensorboard
According to the author of Keras, the proper way is to subclass the TensorBoard callback:
from keras import backend as K
from keras.callbacks import TensorBoard


class LRTensorBoard(TensorBoard):
    # add other arguments to __init__ if you need
    def __init__(self, log_dir, **kwargs):
        super().__init__(log_dir=log_dir, **kwargs)

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        logs.update({'lr': K.eval(self.model.optimizer.lr)})
        super().on_epoch_end(epoch, logs)
Then pass it as part of the callbacks argument to model.fit (credit Finncent Price):
model.fit(x=..., y=..., callbacks=[LRTensorBoard(log_dir="/tmp/tb_log")])
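For instance, a sketch pairing it with ReduceLROnPlateau, so the logged 'lr' series actually changes during training (the monitor/factor/patience values and the x_val/y_val names are illustrative):

from keras.callbacks import ReduceLROnPlateau

# Halve the learning rate whenever val_loss stalls for 3 epochs;
# LRTensorBoard then records each reduction under the 'lr' tag.
reduce_lr = ReduceLROnPlateau(monitor='val_loss', factor=0.5, patience=3)
model.fit(x=..., y=..., validation_data=(x_val, y_val),
          callbacks=[reduce_lr, LRTensorBoard(log_dir="/tmp/tb_log")])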
Note that with current nightly builds of TensorFlow (2.5, probably earlier versions too), learning rates set via a LearningRateSchedule are added to TensorBoard's logs automatically. The following solution is only necessary if you adapt the learning rate some other way, e.g. via the ReduceLROnPlateau or LearningRateScheduler (not to be confused with LearningRateSchedule) callbacks.
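For reference, a minimal sketch of the auto-logged case (the schedule parameters are illustrative, and this assumes one of the recent TF versions mentioned above):

import tensorflow as tf

# The learning rate is driven by a LearningRateSchedule, so recent
# TF versions log it to TensorBoard automatically; no extra callback
# is needed in this case.
schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=1e-3, decay_steps=1000, decay_rate=0.9)
optimizer = tf.keras.optimizers.Adam(learning_rate=schedule)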
While extending tf.keras.callbacks.TensorBoard is a viable option, I prefer composition over subclassing.
import tensorflow as tf


class LearningRateLogger(tf.keras.callbacks.Callback):
    def __init__(self):
        super().__init__()
        # Signal that this callback can handle raw tf tensors in
        # `logs`, so Keras does not convert them to numpy first.
        self._supports_tf_logs = True

    def on_epoch_end(self, epoch, logs=None):
        if logs is None or "learning_rate" in logs:
            return
        logs["learning_rate"] = self.model.optimizer.lr
This allows us to compose multiple similar callbacks, and use the logged learning rate in multiple other callbacks (e.g. if you add a CSVLogger it should also write the learning rate values to file).
Then in model.fit:
model.fit(
    callbacks=[
        LearningRateLogger(),
        # other callbacks that update `logs`
        tf.keras.callbacks.TensorBoard(path),
        # other callbacks that use updated logs, e.g. CSVLogger
    ],
    **kwargs
)
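One caveat: if the optimizer's learning rate is itself a LearningRateSchedule object, the lr attribute is the schedule rather than a scalar variable (although in that case recent TF versions log it automatically, per the note above). A hedged sketch of a variant that resolves the current value first; the class name is hypothetical, and it assumes the optimizer exposes the standard iterations counter, as stock Keras optimizers do:

import tensorflow as tf


class ResolvedLearningRateLogger(tf.keras.callbacks.Callback):
    """Sketch: like LearningRateLogger, but evaluates schedules."""

    def __init__(self):
        super().__init__()
        self._supports_tf_logs = True

    def on_epoch_end(self, epoch, logs=None):
        if logs is None or "learning_rate" in logs:
            return
        lr = self.model.optimizer.lr
        # A LearningRateSchedule is callable: evaluate it at the
        # optimizer's current step to get this epoch's scalar value.
        if isinstance(lr, tf.keras.optimizers.schedules.LearningRateSchedule):
            lr = lr(self.model.optimizer.iterations)
        logs["learning_rate"] = lr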