Reset TensorFlow optimizer
This question bothered me for quite a while too. It's actually straightforward: define an operation that re-initializes the optimizer's current state, which you can obtain via its variables() method, something like this:
import tensorflow as tf

optimizer = tf.train.AdamOptimizer(0.1, name='Optimizer')
reset_optimizer_op = tf.variables_initializer(optimizer.variables())
Whenever you need to reset the optimizer, run:
sess.run(reset_optimizer_op)
Official explanation of variables():
A list of variables which encode the current state of Optimizer. Includes slot variables and additional global variables created by the optimizer in the current default graph.
For AdamOptimizer, for example, this gives you the first and second moment estimates (slot names 'm' and 'v') of all trainable variables, as well as the beta1_power and beta2_power accumulators.
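To make it concrete what state is being wiped, here is a minimal NumPy sketch of Adam (the class name MiniAdam and its reset() method are my own, purely illustrative, not TensorFlow's implementation). The m and v arrays play the role of the 'm' and 'v' slots, and the step counter t drives the beta1_power / beta2_power bias corrections; resetting them makes the next update behave like step 1 again:

```python
import numpy as np

class MiniAdam:
    """Illustrative Adam sketch; mirrors the state TF stores per variable."""

    def __init__(self, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8):
        self.lr, self.beta1, self.beta2, self.eps = lr, beta1, beta2, eps
        self.m = None   # first-moment estimate (slot 'm')
        self.v = None   # second-moment estimate (slot 'v')
        self.t = 0      # step count; determines beta1_power / beta2_power

    def apply_gradient(self, param, grad):
        if self.m is None:
            self.m = np.zeros_like(param)
            self.v = np.zeros_like(param)
        self.t += 1
        self.m = self.beta1 * self.m + (1 - self.beta1) * grad
        self.v = self.beta2 * self.v + (1 - self.beta2) * grad ** 2
        # Bias correction using beta^t, i.e. the *_power accumulators.
        m_hat = self.m / (1 - self.beta1 ** self.t)
        v_hat = self.v / (1 - self.beta2 ** self.t)
        return param - self.lr * m_hat / (np.sqrt(v_hat) + self.eps)

    def reset(self):
        # Same effect as re-initializing optimizer.variables() in TF.
        self.m, self.v, self.t = None, None, 0
```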
In TensorFlow 2.x, e.g. with the Adam optimizer, you can reset it like this:
for var in optimizer.variables():
    var.assign(tf.zeros_like(var))
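A fuller TF2 sketch, under two assumptions worth flagging: the optimizer's slot variables are only created lazily after the first apply_gradients call, so you must take at least one step before there is anything to reset, and in newer Keras versions variables is a property rather than a method, which the callable() check below papers over:

```python
import tensorflow as tf

opt = tf.keras.optimizers.Adam(0.1)
w = tf.Variable([1.0, 2.0])

# Take one step so the optimizer lazily creates its state (m, v, step count).
with tf.GradientTape() as tape:
    loss = tf.reduce_sum(w ** 2)
grads = tape.gradient(loss, [w])
opt.apply_gradients(zip(grads, [w]))

# variables() is a method in older tf.keras, a property in newer Keras.
opt_vars = opt.variables() if callable(opt.variables) else opt.variables

# Reset: zero every optimizer variable, including the iteration counter.
for var in opt_vars:
    var.assign(tf.zeros_like(var))
```

After the loop, every optimizer variable reads as zeros, so subsequent training starts from a fresh Adam state while the model weights w are left untouched.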