Using Tensorflow Huber loss in Keras
I was looking through the losses available in Keras. Apparently logcosh has the same properties as the Huber loss. More details of their similarity can be seen here.
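For example, something like this should work (a minimal sketch with a made-up toy model, assuming TF 2.x where tf.keras.losses.LogCosh is available):

import tensorflow as tf

# Hypothetical regression model, just to show the compile call.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(10,)),
    tf.keras.layers.Dense(1),
])

# log(cosh(x)) is roughly x**2 / 2 for small x and |x| - log(2) for large x,
# which is why it behaves much like the Huber loss.
model.compile(optimizer='adam', loss=tf.keras.losses.LogCosh())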
You can wrap TensorFlow's tf.losses.huber_loss in a custom Keras loss function and then pass it to your model. The reason for the wrapper is that Keras will only pass y_true, y_pred to the loss function, and you likely also want to use some of the many other parameters of tf.losses.huber_loss. So, you'll need some kind of closure like:
import tensorflow as tf

def get_huber_loss_fn(**huber_loss_kwargs):

    def custom_huber_loss(y_true, y_pred):
        return tf.losses.huber_loss(y_true, y_pred, **huber_loss_kwargs)

    return custom_huber_loss

# Later...
model.compile(
    loss=get_huber_loss_fn(delta=0.1)
    ...
)
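If you're on TF 2.x, tf.losses.huber_loss is gone; as far as I know the same closure pattern works with the functional tf.keras.losses.huber (I'm assuming it accepts a delta keyword, so treat this as a sketch rather than tested code):

import tensorflow as tf

def get_huber_loss_fn(**huber_loss_kwargs):
    def custom_huber_loss(y_true, y_pred):
        # Functional form of the Huber loss in TF 2.x; kwargs such as
        # delta are forwarded from the outer call.
        return tf.keras.losses.huber(y_true, y_pred, **huber_loss_kwargs)
    return custom_huber_loss

# Quick sanity check on constant tensors (eager mode).
loss_fn = get_huber_loss_fn(delta=0.1)
print(loss_fn(tf.constant([[0.0, 1.0]]), tf.constant([[0.5, 0.8]])))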
I came here with the exact same question. The accepted answer uses logcosh, which may have similar properties, but it isn't exactly the Huber loss. Here's how I implemented the Huber loss for Keras (note that I'm using Keras from TensorFlow 1.5).
import numpy as np
import tensorflow as tf

'''
' Huber loss.
' https://jaromiru.com/2017/05/27/on-using-huber-loss-in-deep-q-learning/
' https://en.wikipedia.org/wiki/Huber_loss
'''
def huber_loss(y_true, y_pred, clip_delta=1.0):
    error = y_true - y_pred
    cond = tf.keras.backend.abs(error) < clip_delta

    squared_loss = 0.5 * tf.keras.backend.square(error)
    linear_loss = clip_delta * (tf.keras.backend.abs(error) - 0.5 * clip_delta)

    return tf.where(cond, squared_loss, linear_loss)

'''
' Same as above but returns the mean loss.
'''
def huber_loss_mean(y_true, y_pred, clip_delta=1.0):
    return tf.keras.backend.mean(huber_loss(y_true, y_pred, clip_delta))
Depending on whether you want the per-element loss or its mean, use the corresponding function above.
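For instance (a minimal sketch with a hypothetical model; the loss functions are the ones defined above):

# Made-up regression model, just to show how to plug the loss in.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(1),
])

# Default clip_delta of 1.0 ...
model.compile(optimizer='adam', loss=huber_loss_mean)

# ... or wrap it to pick a different clip_delta.
model.compile(optimizer='adam',
              loss=lambda yt, yp: huber_loss_mean(yt, yp, clip_delta=0.5))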
How about:
loss=tf.keras.losses.Huber(delta=100.0)
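That's the built-in class in TF 2.x, so you can pass it straight to model.compile. As a rough check of what delta does, you can also call it directly:

import tensorflow as tf

h = tf.keras.losses.Huber(delta=100.0)
# With delta this large the loss stays in its quadratic region for typical
# errors, so here it's effectively half the mean squared error.
print(h(tf.constant([3.0, 0.0]), tf.constant([2.0, 1.0])).numpy())  # 0.5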