Proof that columns of an invertible matrix are linearly independent
I would say that the textbook's proof is better because it proves exactly what needs to be proven without relying on facts about row operations. To see this, it may help to write out all of the definitions at work here, together with the facts each proof uses.
Definitions:
- $A$ is invertible if there exists a matrix $A^{-1}$ such that $AA^{-1} = A^{-1}A = I$
- The vectors $v_1,\dots,v_n$ are linearly independent if the only solution to $x_1v_1 + \cdots + x_n v_n = 0$ (with $x_i \in \Bbb R$) is $x_1 = \cdots = x_n = 0$.
Textbook Proof:
Fact: With $v_1,\dots,v_n$ referring to the columns of $A$, the equation $x_1v_1 + \cdots + x_n v_n = 0$ can be rewritten as $Ax = 0$. (This is true by the definition of matrix multiplication.)
Now, suppose that $A$ is invertible. We want to show that the only solution to $Ax = 0$ is $x = 0$ (and by the above fact, we'll have proven the statement).
Multiplying both sides by $A^{-1}$ gives us $$ Ax = 0 \implies A^{-1}Ax = A^{-1}0 \implies x = 0 $$ So the only $x$ with $Ax = 0$ is the vector $x = 0$, which is exactly what linear independence of the columns requires.
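As a quick sanity check of this argument, here is a small sympy sketch; the $2\times 2$ matrix is a hypothetical example chosen only for illustration.

```python
from sympy import Matrix, zeros

# Hypothetical invertible 2x2 matrix, used only for illustration.
A = Matrix([[2, 1],
            [1, 1]])
A_inv = A.inv()  # exists because det(A) = 1 != 0

# Mirroring the proof: multiply both sides of Ax = 0 by A^{-1},
# so x = A^{-1} * 0 = 0 is the unique solution.
x = A_inv * zeros(2, 1)
print(x)  # the zero column vector
```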
Your Proof:
Fact: With $v_1,\dots,v_n$ referring to the columns of $A$, the equation $x_1v_1 + \cdots + x_n v_n = 0$ can be rewritten as $Ax = 0$. (This is true by the definition of matrix multiplication.)
Fact: If $A$ is invertible, then $A$ is row-equivalent to the identity matrix.
Fact: If $R$ is the row-reduced version of $A$, then $R$ and $A$ have the same nullspace. That is, $Rx = 0$ and $Ax = 0$ have the same solutions.
From the above facts, we conclude that if $A$ is invertible, then $A$ is row-equivalent to $I$. Since $Ix = 0$ has only the trivial solution and $A$ has the same nullspace as $I$, the only solution to $Ax = 0$ is $x = 0$, so the columns of $A$ are linearly independent.
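The two row-reduction facts can also be checked concretely with sympy; again, the matrix below is a hypothetical example for illustration only.

```python
from sympy import Matrix, eye

# Hypothetical invertible matrix, for illustration only.
A = Matrix([[2, 1],
            [1, 1]])

# Row-reducing an invertible matrix yields the identity (the second fact).
R, pivots = A.rref()
print(R)  # the 2x2 identity matrix

# A and its row-reduced form share the same (trivial) nullspace (the third fact),
# so the columns of A are linearly independent.
print(A.nullspace())  # [] - only the zero vector solves Ax = 0
```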
It might be worth pointing out a fascinating proof, from the book *Algebra* by Michael Artin, of the following statement:
The columns of a square matrix $A$ are linearly independent if and only if $A$ is invertible.
The proof establishes the cycle of implications (a) $\implies$ (b) $\implies$ (c) $\implies$ (d) $\implies$ (a), which shows that all four conditions are equivalent.
(a) $A$ can be reduced to the identity by a sequence of elementary row operations.
(b) $A$ is a product of elementary matrices.
(c) $A$ is invertible.
(d) $AX = 0$ has only the trivial solution $X = 0$.
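All four conditions can be exhibited on a single concrete matrix. The sketch below uses a hypothetical $2\times 2$ example, with two specific elementary matrices found by hand for condition (b).

```python
from sympy import Matrix, eye

A = Matrix([[2, 1],
            [1, 1]])

# (a) A row-reduces to the identity.
R, _ = A.rref()
assert R == eye(2)

# (b) A is a product of elementary matrices: here, two row-addition matrices.
E1 = Matrix([[1, 1], [0, 1]])  # adds row 2 to row 1
E2 = Matrix([[1, 0], [1, 1]])  # adds row 1 to row 2
assert E1 * E2 == A

# (c) A is invertible.
A_inv = A.inv()
assert A_inv * A == eye(2)

# (d) AX = 0 has only the trivial solution: the nullspace is trivial.
assert A.nullspace() == []
```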