Matrix norm in TensorFlow

So the Frobenius norm is a sum over an n×m matrix, but tf.norm lets you process several vectors and matrices in a batch.

To better understand, imagine you have a rank 3 tensor:

t = [[[2], [4], [6]], [[8], [10], [12]], [[14], [16], [18]]]

It can be seen as several matrices stacked along one direction, but the function can't figure out on its own which one. It could be either a batch of the following matrices:

[2, 4, 6], [8, 10, 12], [14, 16, 18]

or

[2, 8, 14], [4, 10, 16], [6, 12, 18]

So basically, axis tells tf.norm which dimensions to sum over when computing the Frobenius norm.

In your case, either [1, 2] or [-2, -1] will do the trick.
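A minimal sketch of the two choices on the rank-3 tensor above (assuming TF 2.x eager execution; the values 2..18 are the ones from this answer):

```python
import tensorflow as tf

# The rank-3 tensor from the answer, as floats: shape (3, 3, 1).
t = tf.constant([[[2.], [4.], [6.]],
                 [[8.], [10.], [12.]],
                 [[14.], [16.], [18.]]])

# Frobenius norm of each (3, 1) matrix in the batch. For this rank-3
# tensor, axis=[1, 2] and axis=[-2, -1] select the same two dimensions.
per_matrix = tf.norm(t, ord='fro', axis=[1, 2])        # shape (3,)
per_matrix_neg = tf.norm(t, ord='fro', axis=[-2, -1])  # same values

# Reducing over a single axis instead gives vector norms, not matrix
# norms -- here, one norm per column slice.
per_column = tf.norm(t, axis=1)                        # shape (3, 1)
```

The first batch entry, for instance, is sqrt(2² + 4² + 6²) = sqrt(56).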


Independent of the number of dimensions of the tensor,

tf.sqrt(tf.reduce_sum(tf.square(w)))

should do the trick.
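For example (assuming TF 2.x eager execution; `w` here is an arbitrary small matrix, not from the question):

```python
import tensorflow as tf

w = tf.constant([[1., 2.], [3., 4.]])

# Square every element, sum over all dimensions (reduce_sum with no
# axis reduces everything to a scalar), then take the square root:
# exactly the Frobenius norm, whatever the rank of w.
frob = tf.sqrt(tf.reduce_sum(tf.square(w)))  # sqrt(1 + 4 + 9 + 16)
```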


Negative indices are supported. Example: If you are passing a tensor that can be either a matrix or a batch of matrices at runtime, pass axis=[-2,-1] instead of axis=None to make sure that matrix norms are computed.

I just tested and [-2,-1] works.