What is the vector corresponding to a two-particle product state $|\psi\rangle_1|\psi\rangle_2$?
Your prescription is incorrect. The state you've written down is the (external) direct sum of the two separate vector spaces, but independent quantum particles are described by states in the tensor product of their individual state spaces.
Thus, if you write each of the states as $|\psi_1\rangle = a|0\rangle + b|1\rangle$ and $|\psi_2\rangle = \alpha|0\rangle + \beta|1\rangle$, then the joint state of the two particles is the tensor product
\begin{align} |\psi\rangle & = |\psi_1\rangle \otimes |\psi_2\rangle \\ & = \left( a|0\rangle + b|1\rangle \right) \otimes \left( \alpha|0\rangle + \beta|1\rangle \right) \\ & = a\alpha |00\rangle + a\beta |01\rangle + b\alpha |10\rangle + b\beta |11\rangle. \end{align}

There is no canonical way to write down this vector as a column vector; more specifically, there is some freedom in how you order the states of the product basis $\{|00\rangle , |01\rangle, |10\rangle ,|11\rangle\}$. The ordering above is common in quantum information, where it's known as the 'computational basis', but other orderings are possible. If you do take that ordering, then $|\psi\rangle$ corresponds to the column vector
$$ |\psi\rangle \leftrightarrow \begin{pmatrix} a \\ b \end{pmatrix} \otimes \begin{pmatrix} \alpha \\ \beta \end{pmatrix} = \begin{pmatrix} a\alpha \\ a\beta \\ b\alpha \\ b\beta \end{pmatrix}. $$

Inner products in the tensor-product space work much as you'd expect, i.e. via
$$ \left< \vphantom{\sum} |u\rangle\otimes |v\rangle, |w\rangle\otimes |x\rangle \right> = \langle u|w\rangle \langle v|x\rangle, $$
which means that the computational basis is orthonormal. Among other things, this means that the tensor product of two unit-norm states also has unit norm:
\begin{align} \langle \psi|\psi \rangle & = |a\alpha|^2 + |a\beta|^2 + |b\alpha|^2 + |b\beta|^2 \\ & = \left( |a|^2+|b|^2\right)\left( |\alpha|^2+|\beta|^2\right) \\ & = 1\times 1 = 1. \end{align}
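As a sanity check, the column-vector correspondence and the unit-norm property can be verified numerically with NumPy's `kron` (the amplitude values below are arbitrary, chosen only so that each single-particle state is normalized):

```python
import numpy as np

# Two normalized single-qubit states (illustrative amplitudes)
a, b = 0.6, 0.8           # |psi_1> = a|0> + b|1>
alpha, beta = 0.8, 0.6j   # |psi_2> = alpha|0> + beta|1>

psi1 = np.array([a, b])
psi2 = np.array([alpha, beta])

# Kronecker product gives the column vector in the computational basis,
# ordered as |00>, |01>, |10>, |11>
psi = np.kron(psi1, psi2)
print(psi)                   # [a*alpha, a*beta, b*alpha, b*beta]
print(np.linalg.norm(psi))   # ≈ 1.0: product of two unit-norm states
```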
The tensor product of two 2-entry vectors is not simply a 4-entry vector obtained by stacking them, and it does not behave like one under operations as simple as addition. For example,
$$\left( \begin{array}{c} a \\ b \end{array} \right) \otimes \left( \begin{array}{c} c \\ d \end{array} \right) + \left( \begin{array}{c} a \\ b \end{array} \right) \otimes \left( \begin{array}{c} e \\ f \end{array} \right) = \left( \begin{array}{c} a \\ b \end{array} \right) \otimes \left( \begin{array}{c} c+e \\ d+f \end{array} \right) \neq \left( \begin{array}{c} a+a \\ b+b \end{array} \right) \otimes \left( \begin{array}{c} c+e \\ d+f \end{array} \right).$$
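The bilinearity identity above (and the failure of the naive entrywise reading) is easy to check numerically; the vectors below are arbitrary illustrative values:

```python
import numpy as np

u = np.array([1.0, 2.0])   # plays the role of (a, b)
v = np.array([3.0, 4.0])   # (c, d)
w = np.array([5.0, 6.0])   # (e, f)

lhs = np.kron(u, v) + np.kron(u, w)
rhs = np.kron(u, v + w)          # bilinearity: equal to lhs
wrong = np.kron(u + u, v + w)    # doubling the first factor doubles everything

print(np.allclose(lhs, rhs))     # True
print(np.allclose(lhs, wrong))   # False
```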
This gets even worse with the norm, since the correct norm is the product of the norms of the two factors:
$$ \left\| \left( \begin{array}{c} a \\ b \end{array} \right) \otimes \left( \begin{array}{c} c \\ d \end{array} \right) \right\| = \left\| \left( \begin{array}{c} a \\ b \end{array} \right) \right\| \cdot \left\| \left( \begin{array}{c} c \\ d \end{array} \right) \right\|. $$
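A quick numerical check of this product rule for norms, using `numpy.linalg.norm` with illustrative (partly complex) entries:

```python
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([3.0 + 1.0j, 4.0])

# Euclidean norm of the Kronecker product equals the product of the norms
lhs = np.linalg.norm(np.kron(u, v))
rhs = np.linalg.norm(u) * np.linalg.norm(v)
print(np.allclose(lhs, rhs))  # True
```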
There is actually a way of rewriting the tensor product of two vectors as a single vector, just not the one you used. The tensor product of our two spaces is four-dimensional, with a basis given, e.g., by
$$ \left( \begin{array}{c} 1 \\ 0 \end{array} \right) \otimes \left( \begin{array}{c} 1 \\ 0 \end{array} \right), \left( \begin{array}{c} 1 \\ 0 \end{array} \right) \otimes \left( \begin{array}{c} 0 \\ 1 \end{array} \right), \left( \begin{array}{c} 0 \\ 1 \end{array} \right) \otimes \left( \begin{array}{c} 1 \\ 0 \end{array} \right), \left( \begin{array}{c} 0 \\ 1 \end{array} \right) \otimes \left( \begin{array}{c} 0 \\ 1 \end{array} \right) .$$
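In coordinates, these four Kronecker products are exactly the standard basis vectors of the four-dimensional space, which a short NumPy check confirms:

```python
import numpy as np

e0 = np.array([1.0, 0.0])
e1 = np.array([0.0, 1.0])

# The four Kronecker products of the 2d basis vectors, in the order above,
# give the standard basis of the 4d product space (rows of the identity)
basis = [np.kron(x, y) for x in (e0, e1) for y in (e0, e1)]
print(np.array(basis))
```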
In this basis, the four-entry vector from your question correctly reads
$$ \left( \begin{array}{c} a \alpha \\ a \beta \\ b \alpha \\ b \beta \end{array} \right) $$
and now you can compute the norm the usual way:
$$ \sqrt{ |a \alpha|^2 + |a \beta|^2 + |b \alpha|^2 + |b \beta|^2 } = 1,$$
where the moduli matter because the amplitudes are in general complex.
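To see why the moduli are needed once amplitudes are complex, here is a small NumPy check with illustrative normalized values, comparing the correct norm against naively squaring the entries:

```python
import numpy as np

a, b = 1 / np.sqrt(2), 1j / np.sqrt(2)   # normalized, with b complex
alpha, beta = 0.6, 0.8                   # normalized, real

psi = np.kron(np.array([a, b]), np.array([alpha, beta]))

# Correct norm uses |.|^2 of each entry ...
norm = np.sqrt(sum(abs(x)**2 for x in psi))
# ... whereas squaring the complex entries directly gives a wrong answer
naive = np.sqrt(sum(x**2 for x in psi))

print(norm)   # ≈ 1.0
print(naive)  # ≈ 0 here: clearly not the norm
```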