What is the usefulness of matrices?
I work in the field of applied math, so I will give you the point of view of an applied mathematician.
I do numerical PDEs. Basically, I take a differential equation (an equation whose solution is not a number but a function, and that involves the function and its derivatives) and, instead of finding an analytical solution, I try to find an approximation of the value of the solution at some points (think of a grid of points). It's a bit deeper than this, but that's not the point here. The point is that eventually I find myself having to solve a linear system of equations, which is usually of huge size (on the order of millions of unknowns). That is a pretty big number of equations to solve, I would say.
Where do matrices come into play? Well, as you know (or maybe not, I don't know), a linear system can be seen in matrix-vector form as
$$\text{A}\underline{x}=\underline{b}$$ where $\underline{x}$ contains the unknowns, $\text{A}$ contains the coefficients of the equations, and $\underline{b}$ contains the values of the right-hand sides of the equations. For instance, for the system
$$\begin{cases}2x_1+x_2=3\\4x_1-x_2=1\end{cases}$$ we have
$$\text{A}=\left[ \begin{array}{cc} 2 & 1\\ 4 & -1 \end{array} \right],\qquad \underline{x}= \left[\begin{array}{c} x_1\\ x_2 \end{array} \right]\qquad \underline{b}= \left[\begin{array}{c} 3\\ 1 \end{array} \right]$$
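If you want to see this in action on a computer, here is a minimal sketch (assuming Python with NumPy, which is my choice and not part of the question) that solves exactly the $2\times 2$ system above:

```python
import numpy as np

# Coefficient matrix and right-hand side of the 2x2 example above
A = np.array([[2.0,  1.0],
              [4.0, -1.0]])
b = np.array([3.0, 1.0])

# Solve A x = b with a general-purpose dense solver
x = np.linalg.solve(A, b)
print(x)  # the solution is x1 = 2/3, x2 = 5/3
```

Of course, for a $2\times 2$ system this is overkill; the interesting part starts when $n$ is in the millions.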
From what I have said so far, in this context matrices look like just a fancy and compact way to write down a system of equations, mere tables of numbers.
However, in order to solve this system fast, it is not enough to use a computer with a lot of RAM and/or a high clock rate (CPU). Of course, the more powerful the computer is, the faster you will get the solution. But sometimes, faster might still mean days (or more) if you tackle the problem in the wrong way, even if you are on a Blue Gene.
So, to reduce the computational cost, you have to come up with a good algorithm, a smart idea. But in order to do so, you need to exploit some property or some structure of your linear system. These properties are encoded somehow in the coefficients of the matrix A. Therefore, studying matrices and their properties is of crucial importance for improving the efficiency of linear solvers. Recognizing that the matrix enjoys a particular property might be crucial for developing a fast algorithm, or even for proving that a solution exists or that the solution has some nice property.
For instance, consider the linear system
$$\left[\begin{array}{cccc} 2 & -1 & 0 & 0\\ -1 & 2 & -1 & 0\\ 0 & -1 & 2 & -1\\ 0 & 0 & -1 & 2 \end{array} \right] \left[ \begin{array}{c} x_1\\ x_2\\ x_3\\ x_4 \end{array} \right]= \left[ \begin{array}{c} 1\\ 1\\ 1\\ 1 \end{array} \right]$$ which corresponds (in equation form) to
$$\begin{cases} 2x_1-x_2=1\\ -x_1+2x_2-x_3=1\\ -x_2+2x_3-x_4=1\\ -x_3+2x_4=1 \end{cases}$$
Just taking a quick look at the matrix, I can claim that this system has a solution and, moreover, the solution is non-negative (meaning that all the components of the solution are non-negative). I'm pretty sure you wouldn't be able to draw this conclusion just by looking at the system without trying to solve it. I can also claim that to solve this system you need only 25 operations (one operation being a single addition/subtraction/division/multiplication). If you construct a larger system with the same pattern (2 on the diagonal, -1 on the upper and lower diagonals) and put a right-hand side with only positive entries, I can still claim that the solution exists, that it is positive, and that the number of operations needed to solve it is only $8n-7$, where $n$ is the size of the system.
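To make that operation count less mysterious: the matrix above is tridiagonal (nonzero entries only on the main diagonal and the two diagonals next to it), and a solver that exploits this needs just one forward elimination sweep and one back substitution. Here is a rough sketch of such a solver (the Thomas algorithm, written in plain Python as my own illustration, not code from the answer), applied to the $4\times 4$ example:

```python
def solve_tridiagonal(sub, diag, sup, rhs):
    """Solve a tridiagonal system by forward elimination + back substitution.

    sub: sub-diagonal (length n-1), diag: main diagonal (length n),
    sup: super-diagonal (length n-1), rhs: right-hand side (length n).
    """
    n = len(diag)
    sup_new = [0.0] * (n - 1)
    rhs_new = [0.0] * n
    sup_new[0] = sup[0] / diag[0]
    rhs_new[0] = rhs[0] / diag[0]
    for i in range(1, n):
        denom = diag[i] - sub[i - 1] * sup_new[i - 1]
        if i < n - 1:
            sup_new[i] = sup[i] / denom
        rhs_new[i] = (rhs[i] - sub[i - 1] * rhs_new[i - 1]) / denom
    x = [0.0] * n
    x[-1] = rhs_new[-1]
    for i in range(n - 2, -1, -1):
        x[i] = rhs_new[i] - sup_new[i] * x[i + 1]
    return x

# The 4x4 system above: 2 on the diagonal, -1 above and below it, all-ones right-hand side
n = 4
print(solve_tridiagonal([-1.0] * (n - 1), [2.0] * n, [-1.0] * (n - 1), [1.0] * n))
# prints (approximately) [2.0, 3.0, 3.0, 2.0]: the solution exists and every component is positive
```

A general-purpose dense solver would cost on the order of $n^3$ operations; recognizing the tridiagonal structure brings the cost down to $8n-7$.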
Moreover, people have already pointed out other fields where matrices are important building blocks and play an important role. I hope this thread gave you an idea of why it is worth studying matrices. =)
Matrices are a useful way to represent, manipulate and study linear maps between finite-dimensional vector spaces (once you have chosen bases).
Matrices can also represent quadratic forms (this is useful, for example, in analysis to study Hessian matrices, which help us study the behavior of critical points).
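As a tiny illustration (my own sketch, assuming Python with NumPy; the function and numbers are made up for the example): the signs of the eigenvalues of the Hessian at a critical point tell you whether it is a minimum, a maximum, or a saddle.

```python
import numpy as np

# Made-up example: f(x, y) = x**2 - 3*x*y + y**2 has a critical point at (0, 0).
# Its Hessian there is the constant symmetric matrix below; if all eigenvalues
# are positive we have a minimum, if all are negative a maximum, and if the
# signs are mixed a saddle point (assuming none of them is zero).
H = np.array([[ 2.0, -3.0],
              [-3.0,  2.0]])

print(np.linalg.eigvalsh(H))  # eigenvalues -1 and 5: mixed signs, so (0, 0) is a saddle
```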
So, it's a useful tool of linear algebra.
Moreover, linear algebra is a crucial tool in math.
To convince yourself, note that there are a lot of linear problems you can study with little background in math. For example: systems of linear equations, some error-correcting codes (linear codes), linear differential equations, linear recurrence sequences...
I also think that linear algebra is a natural framework for quantum mechanics.
Graph Theory --loosely, the study of connect-the-dot figures-- uses matrices to encode adjacency and incidence structures. More than simply bookkeeping, however, the matrices have computational uses. From powers of the adjacency matrix, for a simple example, one can read off the number of walks of a given length between any two dots.
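For a concrete (and deliberately tiny) example of that last remark, here is a sketch of my own, assuming Python with NumPy: take the square, i.e. the cycle on four dots, and cube its adjacency matrix.

```python
import numpy as np

# Adjacency matrix of the 4-cycle 0-1-2-3-0 (a square): entry (i, j) is 1
# exactly when dots i and j are joined by an edge.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]])

walks = np.linalg.matrix_power(A, 3)  # entry (i, j) counts 3-step walks from i to j
print(walks[0, 1])                    # 4 ways to walk from dot 0 to dot 1 in 3 steps
```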
"Spectral" Graph Theory derives graph-theoretical information from matrix-theoretical results (specifically, "eigenvalues" and "eigenvectors" --by the way, the set of eigenvalues is the "spectrum" of a matrix, hence "spectral"-- which come from the linear map interpretation of matrices). My own work generates coordinates for "symmetric" geometric realizations of graphs --think Platonic and Archimedean solids-- from this kind of analysis of their adjacency matrices.