Is this set of vectors linearly (in)dependent?

$\mathbb R^2$ has dimension $2$, so a set of $3$ vectors from $\mathbb R^2$ can never be linearly independent.

(In your case, the three vectors are even more dependent than they have to be, since they are all parallel.) For instance,

$$0v_1+2v_2-v_3=0.$$

In general, you can never have more than $k$ linearly independent vectors in a $k$-dimensional vector space.
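A quick numerical sanity check supports both the dimension argument and the explicit relation above. This is a sketch assuming numpy is available; the vectors are the ones from the question.

```python
import numpy as np

# The three vectors from the question, as columns of a 2x3 matrix.
v1 = np.array([-1, 2])
v2 = np.array([1, -2])
v3 = np.array([2, -4])
A = np.column_stack([v1, v2, v3])

# The rank is at most min(2, 3) = 2, so three columns can never be
# independent; here the vectors are all parallel, so the rank is even 1.
print(np.linalg.matrix_rank(A))  # 1

# Verify the explicit dependence relation 0*v1 + 2*v2 - v3 = 0.
print(0 * v1 + 2 * v2 - v3)      # [0 0]
```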


Although other answers have pointed out the fast track (more than $n$ vectors cannot be linearly independent in an $n$-dimensional space), I just wanted to help you complete your own approach, which was to solve

$$c_1 \begin{bmatrix} -1 \\ 2 \end{bmatrix} + c_2 \begin{bmatrix} 1 \\ -2 \end{bmatrix} + c_3 \begin{bmatrix} 2 \\ -4 \end{bmatrix} = 0.$$

Correct answer

As you noticed, this gives you two equations in three unknowns:

$$\begin{matrix} -c_1 &{} + c_2 &{} + 2c_3 &{} = {}& 0\\ 2 c_1 &{} - 2 c_2 &{} - 4 c_3 &{} = {}& 0 \end{matrix}$$

Since you have more variables than restrictions, if you can find any solution you will get infinitely many. For example, adding twice the first equation to the second gives

$$0 = 0.$$

Though this is true, it gives you absolutely no information: it means the second equation is just a multiple of the first, so any solution to one is also a solution to the other. All we have left, then, is the first equation, which I will rewrite as

$$c_1 = c_2 + 2c_3.$$

If we let $c_2 = a$ and $c_3 = b$, the equation tells us that any triple $(c_1, c_2, c_3)$ of the form $(a + 2b, a, b)$ (with $a, b \in \mathbb R$) solves the system. Setting $a = b = 0$ gives the trivial solution, but $(1, 1, 0)$ and $(13, 3, 5)$ are also possibilities. So you have shown that $$c_1 v_1+c_2 v_2+ c_3 v_3=0$$ does not have just the trivial solution; in fact, it has infinitely many, because $(a + 2b) v_1 + a v_2 + b v_3 = 0$ for every choice of $a$ and $b$. Hence the vectors are not independent.
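If you want to double-check the parametric family, here is a small sketch (again assuming numpy) that plugs $(c_1, c_2, c_3) = (a + 2b, a, b)$ into the combination for a few values of $a$ and $b$, including the $(13, 3, 5)$ example above:

```python
import numpy as np

v1, v2, v3 = np.array([-1, 2]), np.array([1, -2]), np.array([2, -4])

# Every (c1, c2, c3) = (a + 2b, a, b) should solve c1*v1 + c2*v2 + c3*v3 = 0.
for a, b in [(0, 0), (1, 0), (3, 5), (-2, 7)]:
    c1, c2, c3 = a + 2 * b, a, b
    combo = c1 * v1 + c2 * v2 + c3 * v3
    assert np.array_equal(combo, np.zeros(2)), (a, b)
print("all tested (a, b) give the zero vector")
```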

Old answer

As you noticed, this gives you two equations in three unknowns:

$$\begin{matrix} -c_1 &{} + c_2 &{} + c_3 &{} = {}& 0 & (!)\\ 2 c_1 &{} - 2 c_2 &{} - 4 c_3 &{} = {}& 0 \end{matrix}$$

(Note that the first equation, marked with (!), is incorrect: due to a typo in the original post, it has $c_3$ instead of $2c_3$. But I think the rest of this part shows an important technique, so I'm leaving it in for reference.)

Since you have more variables than restrictions, if you can find any solution you will get infinitely many. For example, adding twice the first equation to the second gives

$$-2c_3 = 0 \implies c_3 = 0.$$

Then substituting back into the first one, you get

$$-c_1 + c_2 = 0$$ or $$c_1 = c_2,$$

which is all the information you can squeeze out of those two equations.

This means that any triple $(c_1, c_2, c_3)$ of the form $(a, a, 0)$ (with $a \in \mathbb R$) solves the system. Setting $a = 0$ gives the trivial solution, but $(1, 1, 0)$ is also a possibility. So you have shown that $$c_1 v_1+c_2 v_2+ c_3 v_3=0$$ does not have just the trivial solution; in fact, it has infinitely many, because $a v_1 + a v_2 = 0$ for every $a$. Hence the vectors are not independent.
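The same check works for the family $(a, a, 0)$ found here (a sketch with numpy; note that this family also happens to solve the correct system, since $v_1 + v_2 = 0$ regardless of the typo):

```python
import numpy as np

v1, v2 = np.array([-1, 2]), np.array([1, -2])

# (c1, c2, c3) = (a, a, 0) reduces to a * (v1 + v2), which is zero since v1 = -v2.
for a in [0, 1, -4, 10]:
    assert np.array_equal(a * v1 + a * v2, np.zeros(2))
print("a*v1 + a*v2 = 0 for every tested a")
```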

The short way

In the two variations above, we have basically explored why a system with fewer equations than unknowns is underdetermined. Some of the equations may be multiples of one another, in which case you can remove the "duplicates". The $k$ independent equations in $n$ unknowns that remain fix $k$ of the variables, but leave you with an $(n - k)$-dimensional solution space. In the correct answer, $n = 3$ and $k = 1$, while in the old version I accidentally had $k = 2$ but was still left with one free constant, any value of which gives a solution.
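The count $n - k$ is exactly the dimension of the null space of the coefficient matrix. As a sketch (assuming scipy is available), `scipy.linalg.null_space` returns an orthonormal basis of that solution space, and for the correct system it indeed has $3 - 1 = 2$ basis vectors:

```python
import numpy as np
from scipy.linalg import null_space

# Coefficient matrix of the correct system; each row is one equation.
A = np.array([[-1, 1, 2],
              [2, -2, -4]])

ns = null_space(A)   # orthonormal basis for {c : A @ c = 0}
print(ns.shape[1])   # 2, i.e. n - k free parameters
```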

Of course, you might have seen this right away, since $v_1 = -v_2$ and $v_3 = 2v_2$, and either of these relations alone is sufficient.
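Both relations are immediate to verify, for instance with the same numpy setup as above:

```python
import numpy as np

v1, v2, v3 = np.array([-1, 2]), np.array([1, -2]), np.array([2, -4])
assert np.array_equal(v1, -v2) and np.array_equal(v3, 2 * v2)
```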