Why is this true for matrices? Linearly dependent columns $\implies$ not invertible

You should know that elementary column operations preserve whether the determinant is zero or nonzero. You should also know that if a column is all zeros, then the determinant is zero. All that remains is to convince yourself that if the columns are linearly dependent, then you can make one column all zeros by a sequence of elementary column operations.
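For concreteness, here is a small made-up example (not from the question): the third column below is the sum of the first two, so subtracting the first two columns from it produces a zero column, and hence the determinant is zero:
$$\begin{pmatrix}1&2&3\\4&5&9\\7&8&15\end{pmatrix}\;\xrightarrow{\;C_3\,\to\,C_3-C_1-C_2\;}\;\begin{pmatrix}1&2&0\\4&5&0\\7&8&0\end{pmatrix},\qquad \det = 0.$$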


Let $A_1, \ldots, A_n$ be the columns of $A$. If they're linearly dependent, then there are constants $c_i$ (not all zero) such that $$c_1\,A_1 + \cdots + c_n \,A_n=\vec0.$$ The trick is to note that if $\vec c = \left[\begin{array}{c} c_1\\ \vdots\\ c_n\end{array}\right]$, then the above equation says precisely that $A\vec c = \vec 0$, with $\vec c \not=\vec0$. (In general, $x_1\,A_1 + \cdots + x_n \,A_n=A\vec x$, for any vector $\vec x$.)

Now use the fact that if $A$ is invertible, then the only solution to $A\vec x = \vec0$ is $\vec x = \vec0$. Since $A\vec c = \vec 0$ with $\vec c \ne \vec0$, $A$ can't be invertible. $~\square$
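As a quick sanity check with a made-up matrix of this kind (third column equal to the sum of the first two), take $\vec c = (1, 1, -1)^T$:
$$\begin{pmatrix}1&2&3\\4&5&9\\7&8&15\end{pmatrix}\begin{pmatrix}1\\1\\-1\end{pmatrix}=1\begin{pmatrix}1\\4\\7\end{pmatrix}+1\begin{pmatrix}2\\5\\8\end{pmatrix}-1\begin{pmatrix}3\\9\\15\end{pmatrix}=\vec 0,$$
a nonzero solution of $A\vec x = \vec 0$, so this $A$ cannot be invertible.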


If the columns $\vec{c_1},\ldots,\vec{c_n}$ of $A$ are linearly dependent,

then $a_1\vec{c_1}+\cdots+a_n\vec{c_n}=\vec{0}$ for some scalars $a_1,\ldots, a_n$ (not all $0$).

Then $Av=\vec{0}$ where $v=\begin{pmatrix}a_1\\\vdots\\a_n\end{pmatrix}\ne\vec{0}$, so $A$ is not invertible

(since otherwise, multiplying by $A^{-1}$ would give a contradiction).
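To spell out that contradiction in one line (same notation as above): if $A^{-1}$ existed, then
$$v = Iv = (A^{-1}A)\,v = A^{-1}(Av) = A^{-1}\vec 0 = \vec 0,$$
contradicting $v\ne\vec 0$.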