Row vector vs. Column vector
In one sense, you can say that a vector is simply an object with certain properties, and it is neither a row of numbers nor a column of numbers. But in practice, we often want to use a list of $n$ numeric coordinates to describe an $n$-dimensional vector, and we call this list of coordinates a vector. The general convention seems to be that the coordinates are listed in the format known as a column vector, which is (or at least, which acts like) an $n \times 1$ matrix.
This has the nice property that if $v$ is a vector and $M$ is a matrix representing a linear transformation, the product $Mv$, computed by the usual rules of matrix multiplication, is another vector (specifically, a column vector) representing the image of $v$ under that transformation.
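As a concrete sanity check (a minimal NumPy sketch, not essential to the argument; the specific matrix and numbers are just an example), representing $v$ as an $n \times 1$ array makes $Mv$ come out exactly as described:

```python
import numpy as np

# A 2x2 matrix M representing a linear transformation
# (here: rotation by 90 degrees counterclockwise)
M = np.array([[0, -1],
              [1,  0]])

# v written as a column vector, i.e. a 2x1 matrix
v = np.array([[3],
              [4]])

Mv = M @ v          # ordinary matrix multiplication
print(Mv.shape)     # (2, 1): the image is again a column vector
print(Mv)           # [[-4]
                    #  [ 3]]
```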
But we write mostly in a horizontal direction, and it is not always convenient to stack the coordinates of a vector vertically. If you're careful, you might write
$$ \langle x_1, x_2, \ldots, x_n \rangle^T $$
meaning the transpose of the row vector $\langle x_1, x_2, \ldots, x_n \rangle$; that is, we want the convenience of left-to-right notation but we make it clear that we actually mean a column vector (which is what you get when you transpose a row vector). If we're not being careful, however, we might just write $\langle x_1, x_2, \ldots, x_n \rangle$ as our "vector" and assume everyone will understand what we mean.
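In NumPy terms (again just an illustrative sketch), this convention is literally one transpose: type the coordinates left to right as a $1 \times n$ row, then transpose to get the column vector you actually mean:

```python
import numpy as np

row = np.array([[1, 2, 3]])  # a 1x3 row vector, typed left to right
col = row.T                  # its transpose: the 3x1 column vector we mean
print(col.shape)             # (3, 1)
```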
Occasionally we actually need the coordinates of a vector in row-vector format, in which case we can represent that by transposing a column vector. For example, if $u$ and $v$ are vectors (that is, column vectors), then the usual inner product of $u$ and $v$ can be written $u^T v$, evaluated as the product of a $1\times n$ matrix with an $n \times 1$ matrix. Note that if $u$ is a (column) vector, then $u^T$ really is a row vector and can (and should) legitimately be written as $\langle u_1, u_2, \ldots, u_n \rangle$.
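The identity $u^T v$ can be checked the same way (another small sketch with made-up components): a $1 \times n$ row times an $n \times 1$ column collapses to a $1 \times 1$ matrix holding the inner product:

```python
import numpy as np

# u and v as column vectors (3x1 matrices)
u = np.array([[1], [2], [3]])
v = np.array([[4], [5], [6]])

ip = u.T @ v                 # (1x3) @ (3x1): a 1x1 matrix
print(ip)                    # [[32]]

# the same inner product as a plain scalar
print(np.dot(u.ravel(), v.ravel()))  # 32
```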
This all works out quite neatly and conveniently when people are careful and precise in how they write things. At a deeper and more abstract level you can formalize these ideas as shown in another answer. (My answer here is relatively informal, intended merely to give a sense of why people think of the column vector as "the" representation of an abstract vector.)
When people are not careful and precise, it may help to tell yourself that the transpose of a written vector representation is what was intended in a given context, even though the person writing it neglected to indicate that.
In short: column vectors live in, say, $\mathbb{R}^n$, and row vectors live in the dual of $\mathbb{R}^n$, which is denoted $(\mathbb{R}^n)^* \cong \operatorname{Hom}(\mathbb{R}^n, \mathbb{R})$. Covectors are therefore linear mappings $\alpha : \mathbb{R}^n \to \mathbb{R}$. If one chooses a basis of $\mathbb{R}^n$ and the corresponding dual basis of $(\mathbb{R}^n)^*$, then for $v \in \mathbb{R}^n$ and $\alpha \in (\mathbb{R}^n)^*$ with representations:
$$\alpha = \sum_j \alpha_j \, (e^j)^* \quad \text{and} \quad v = \sum_i v^i \, e_i,$$
we get:
$$\alpha(v) = \alpha\Bigl(\sum_i v^i e_i\Bigr) = \sum_i v^i \, \alpha(e_i) = \sum_i v^i \sum_j \alpha_j \, (e^j)^*(e_i) = \sum_i \sum_j \alpha_j v^i \, \delta_i^j = \sum_k \alpha_k v^k = (\alpha_1, \cdots, \alpha_n) \cdot \begin{pmatrix} v^1 \\ \vdots \\ v^n \end{pmatrix}$$
Here $\alpha$ is a row vector and $v$ a column vector. Note that
$$(e^j)^*(e_i) = \delta_i^j = \begin{cases} 1 & i = j \\ 0 & i \ne j \end{cases}$$
is the link between a pair of dual bases. Using Einstein index notation (as usual) we have simply:
$$\alpha(v) = \alpha_k v^k$$
This is covariant and contravariant notation. The same story applies to $T_p M$ and $T_p^* M$, the tangent and cotangent spaces of a manifold $M$ at a point $p \in M$. But that's another story.
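The Einstein-summation form $\alpha(v) = \alpha_k v^k$ can be computed directly, for instance with `np.einsum` (a sketch with arbitrary numeric components, nothing more):

```python
import numpy as np

alpha = np.array([1.0, 2.0, 3.0])  # covector components alpha_k
v     = np.array([4.0, 5.0, 6.0])  # vector components v^k

# alpha_k v^k: contract over the single shared index k
print(np.einsum('k,k->', alpha, v))             # 32.0

# equivalently, row times column as a matrix product
print(alpha.reshape(1, -1) @ v.reshape(-1, 1))  # [[32.]]
```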