What are some usual norms for matrices?

First of all, yes: the matrices form some sort of vector space. You can add any two matrices, and you can multiply matrices by a number, and you'll always get another matrix. In a sense, that's all you need for a set to be a vector space. Matrices have a little bit more structure too: for one, you can multiply two matrices together (which you can't generally do with vectors). Moreover, matrices are really linear maps. I'll get back to those in a minute.

There are three progressively more specialized kinds of matrix norms, each of which is useful in different circumstances.


Norms ("just" a norm):

Sometimes a norm is just a norm. Often, it's useful to think of a matrix as "a box of numbers" in the same way that you would think of a vector in $\Bbb R^n$ as a "list of numbers". A "matrix norm" by this definition is any function on the matrices that satisfies the usual rules that define a norm. In particular, for any matrices $A,B \in \Bbb R^{n \times m}$ and scalar $\alpha$, we need to have

  1. $\|A\| \geq 0$, with $\|A\| = 0 \iff A = 0$
  2. $\|\alpha A\| = |\alpha|\|A\|$
  3. $\|A + B\| \leq \|A\| + \|B\|$

You would use these norms any time you would use an ordinary norm. One reason we would need this kind of norm is to show that a function involving matrices is "continuous", or "differentiable". The usual example of this kind of norm is the "entrywise $p$-norm", which is given by $$ \|A\| = \left(\sum_{i=1}^n \sum_{j=1}^m |a_{ij}|^p\right)^{1/p} $$ for $1 \leq p < \infty$; for $p = \infty$, one takes $\|A\| = \max_{i,j} |a_{ij}|$ instead.
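If you like seeing this concretely, here's a small sketch (numpy and the particular matrix are my choices, nothing canonical) computing entrywise $p$-norms directly:

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [3.0,  0.5]])

def entrywise_norm(A, p):
    """Entrywise p-norm: treat A as a flat list of numbers."""
    return (np.abs(A) ** p).sum() ** (1.0 / p)

print(entrywise_norm(A, 1))    # sum of absolute entries
print(entrywise_norm(A, 2))    # same as np.linalg.norm(A, 'fro')
print(np.abs(A).max())         # the p = infinity case: max |a_ij|
```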

Every matrix norm can be thought of in this way, i.e. as a "general norm". However, sometimes we want our matrix norm to have a bit more structure.


Submultiplicative Norms (AKA "matrix norms")

We say that a matrix norm $\|\cdot\|$ is submultiplicative if, in addition to being a norm, it also satisfies the inequality $$ \|AB\| \leq \|A\| \cdot \|B\| $$ for any square matrices $A,B$ of the same size.

A lot of times, your everyday norm just won't cut it. For those occasions, submultiplicative norms tend to come in handy. These are useful for dealing with "polynomials" on matrices, since we have inequalities like $$ \|f(A)\| = \left\|\sum_{k}a_kA^k \right\| \leq \sum_k |a_k|\|A\|^k $$ (One caveat: the constant term contributes $|a_0|\,\|A^0\| = |a_0|\,\|I\|$, and $\|I\|$ can exceed $1$ for a general submultiplicative norm, so the bound as written is safe when $\|I\| = 1$, as for the operator norms below, or when $f$ has no constant term.) Notably, if the $a_k$ are non-negative, $\|f(A)\| \leq f(\|A\|)$, so that we have $\|e^A\| \leq e^{\|A\|}$ for instance.
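As a sanity check, here's a sketch verifying the polynomial bound in the induced $2$-norm (an operator norm discussed below, for which $\|I\| = 1$, so the constant term causes no trouble); the matrix and polynomial are arbitrary choices of mine:

```python
import numpy as np
from numpy.linalg import matrix_power, norm

A = np.array([[0.2, -0.7],
              [0.4,  0.1]])

# f(A) = I + 2A + 3A^2, with non-negative coefficients a_k = (1, 2, 3)
coeffs = [1.0, 2.0, 3.0]
fA = sum(a * matrix_power(A, k) for k, a in enumerate(coeffs))

lhs = norm(fA, 2)                                             # ||f(A)||
rhs = sum(a * norm(A, 2) ** k for k, a in enumerate(coeffs))  # f(||A||)

print(lhs, "<=", rhs)   # the submultiplicative bound in action
```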

Submultiplicative norms are also very useful for spectral (eigenvalue) analysis. In fact, we have some theorems involving $\rho(A)$, the spectral radius of $A$, and any submultiplicative norm (a quick numerical illustration follows the list):

  • $\|A\| \geq \rho(A)$
  • $\rho(A) = \lim_{k \to \infty} \|A^k\|^{1/k}$ (Gelfand's formula)
  • $\rho(A) = \inf_{\|\cdot\| \text{ is submult.}} \|A\|$
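Here is a quick numerical illustration of Gelfand's formula (a sketch; numpy and the particular $2 \times 2$ matrix are just my choices):

```python
import numpy as np

A = np.array([[0.5, 0.4],
              [0.1, 0.3]])

rho = max(abs(np.linalg.eigvals(A)))   # spectral radius

# Gelfand's formula: ||A^k||^(1/k) -> rho(A); the Frobenius norm is used
# here, but any submultiplicative norm gives the same limit.
for k in [1, 5, 20, 80]:
    Ak = np.linalg.matrix_power(A, k)
    print(k, np.linalg.norm(Ak, 'fro') ** (1.0 / k), "-> rho =", rho)
```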

The classic example of a submultiplicative norm is the Frobenius norm, AKA the entrywise $2$-norm, AKA the Schatten $2$-norm: $$ \|A\|_F = \sqrt{\sum_{i = 1}^n\sum_{j=1}^m |a_{ij}|^2} $$ This is probably the most commonly used of all matrix norms. It is particularly useful since it is the norm derived from the Frobenius inner product (AKA Hilbert-Schmidt inner product). That is, it turns out that taking the "dot product" of matrices is a useful thing to do, and the Frobenius norm is the norm that results from this dot product.
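To make the "dot product of matrices" idea concrete, here's a small sketch (numpy assumed, matrices arbitrary) checking that the Frobenius norm really does come from the Frobenius inner product $\langle A, B\rangle = \operatorname{tr}(A^TB)$:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])

# Frobenius inner product: <A, B> = sum_ij a_ij * b_ij = trace(A^T B)
inner = np.sum(A * B)
print(np.isclose(inner, np.trace(A.T @ B)))      # True

# The Frobenius norm is the norm induced by this inner product
print(np.isclose(np.sqrt(np.sum(A * A)), np.linalg.norm(A, 'fro')))  # True
```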

The Schatten norms (and other unitarily invariant norms) are also submultiplicative, and get a fair bit of use. The entrywise $p$-norms from earlier only happen to be submultiplicative when $1 \leq p \leq 2$; these are easy to compute, but tend not to give tight bounds.

Finally, we might want our norms to be nicer still.


Operator Norms (AKA "induced/derived norms")

Suppose $\|\cdot \|$ is a vector norm (on both $\Bbb R^n$ and $\Bbb R^m$). We define the corresponding operator norm on $\Bbb R^{m \times n}$ to be given by $$ \|A\| = \sup_{\|x\| \leq 1} \|Ax\| $$
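If the supremum feels abstract: the operator norm is just the largest factor by which $A$ can stretch a unit vector. Here's a crude Monte Carlo sketch of mine (an illustration only, not a method anyone should use in practice):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))

# Sample many unit vectors; the max of ||Ax|| over them is a lower bound
# that creeps up toward the true supremum.
xs = rng.standard_normal((100_000, 3))
xs /= np.linalg.norm(xs, axis=1, keepdims=True)  # normalize rows to ||x|| = 1
estimate = np.linalg.norm(xs @ A.T, axis=1).max()

print(estimate, "<=", np.linalg.norm(A, 2))      # true induced 2-norm
```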

Every operator norm is a submultiplicative norm. However, not every submultiplicative norm is an operator norm. Besides doing everything that the submultiplicative norms can do, operator norms are useful when you're thinking about how matrices act on vectors. In particular, with operator norms, we have the inequality $$ \|Av\| \leq \|A\|\cdot \|v\| $$ It also follows directly from the definition that, for every operator norm, the identity matrix $I$ satisfies $\|I\| = 1$. This fact turns out to have some useful consequences (e.g. inequalities involving the norm of a matrix's inverse).
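Incidentally, $\|I\| = 1$ gives a quick way to see that the Frobenius norm is not an operator norm, since $\|I\|_F = \sqrt{n} > 1$ when $n > 1$. A sketch (numpy assumed):

```python
import numpy as np

I = np.eye(5)
# Every induced norm of the identity is 1...
print(np.linalg.norm(I, 1), np.linalg.norm(I, 2), np.linalg.norm(I, np.inf))
# ...but the Frobenius norm of I is sqrt(5), so it can't be an operator norm.
print(np.linalg.norm(I, 'fro'))   # 2.236...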

Most of the norms that we have mentioned are not operator norms. The operator norm that you see most often is the one derived from the Euclidean norm ($2$-norm) on vectors. In particular, we have $$ \|A\|_2 = \sup_{\|x\| \leq 1} \|Ax\|_2 = \sigma_1(A) $$ That is, this norm is equal to the largest singular value of $A$. This norm also happens to coincide with the "Schatten $\infty$-norm", one of the Schatten-norms discussed above.
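If you want to convince yourself, here's a two-line check (numpy assumed, matrix arbitrary) that the induced $2$-norm is the top singular value:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 3))

sigma = np.linalg.svd(A, compute_uv=False)          # singular values, largest first
print(np.isclose(sigma[0], np.linalg.norm(A, 2)))   # True: ||A||_2 = sigma_1(A)
```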

A particularly useful property of this norm is that $\|A\|_2 = \rho(A)$ whenever $A$ happens to be normal (i.e. whenever $A^TA = AA^T$). Because of this property, $\|\cdot\|_2$ is sometimes called the "spectral norm".
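A quick sketch of this (the symmetric matrix below is just the easiest way I know to manufacture a normal matrix):

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((4, 4))
S = (B + B.T) / 2                                 # symmetric, hence normal

rho = max(abs(np.linalg.eigvals(S)))              # spectral radius
print(np.isclose(np.linalg.norm(S, 2), rho))      # True for normal matrices

# For a non-normal matrix we only get ||B||_2 >= rho(B), usually strictly:
print(np.isclose(np.linalg.norm(B, 2), max(abs(np.linalg.eigvals(B)))))
```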

Two other operator norms that are commonly used (especially in the context of numerical linear algebra) are the one derived from the $1$-norm ("taxicab norm") and the one derived from the $\infty$-norm ("max norm"). These are straightforward to compute; in particular, we have $$ \|A\|_1= \max_j \sum_{i=1}^m |A_{ij}|\\ \|A\|_{\infty}= \max_i \sum_{j=1}^n |A_{ij}| $$
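Both formulas (max absolute column sum and max absolute row sum, respectively) are one-liners to check against numpy's built-ins; a sketch with an arbitrary $2 \times 3$ example:

```python
import numpy as np

A = np.array([[ 1.0, -2.0,  3.0],
              [-4.0,  5.0, -6.0]])

col_sums = np.abs(A).sum(axis=0)   # ||A||_1   = max column sum
row_sums = np.abs(A).sum(axis=1)   # ||A||_inf = max row sum

print(np.isclose(col_sums.max(), np.linalg.norm(A, 1)))       # True
print(np.isclose(row_sums.max(), np.linalg.norm(A, np.inf)))  # True
```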