Prove that the union of three subspaces of $V$ is a subspace iff one of the subspaces contains the other two.
The statement is false as written; it fails over the field with two elements. Here is a counterexample:

Let $V=(\mathbb{Z}/2\mathbb{Z})^{2}$ over $F=\mathbb{Z}/2\mathbb{Z}$, and let $V_{1}$, $V_{2}$, $V_{3}$ be the subspaces spanned by $(1,0)$, $(0,1)$, and $(1,1)$, respectively. Then $V=V_{1}\cup V_{2}\cup V_{3}$ is certainly a subspace, but none of $V_{1},V_{2},V_{3}$ is contained in another.
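(A quick brute-force check, not part of the argument itself: the short Python sketch below enumerates the three lines and verifies that their union is all of $V$, hence a subspace, while none of them contains another.)

```python
# Brute-force check of the counterexample over F = Z/2Z (verification sketch only).
from itertools import product

F = (0, 1)                                    # the two elements of Z/2Z
V = set(product(F, repeat=2))                 # all four vectors of (Z/2Z)^2

def span(v):
    """The line spanned by v over Z/2Z, i.e. all scalar multiples of v."""
    return {tuple((c * x) % 2 for x in v) for c in F}

V1, V2, V3 = span((1, 0)), span((0, 1)), span((1, 1))
union = V1 | V2 | V3

assert union == V                             # the union is the whole space, hence a subspace
assert not any(A <= B                         # ...but no V_i is contained in another
               for A in (V1, V2, V3) for B in (V1, V2, V3) if A is not B)
print("union of V1, V2, V3 equals V:", union == V)
```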
You can usually count on fields of characteristic $2$ to give you counterexamples, and there are many similar counterexamples. In finite dimensions, I think all counterexamples can be constructed this way. My intuition tells me that there are infinite-dimensional counterexamples of other forms, but I have not checked this carefully.
EDIT: Here is a proof of the statement under the restriction $F\neq\mathbb{Z}/2\mathbb{Z}$:
Without loss of generality, we may assume the whole space $V$ is in fact $V_{1}+V_{2}+V_{3}$. Since $V_{1}\cup V_{2}\cup V_{3}$ is assumed to be a subspace containing each $V_{i}$, it contains their sum, so in fact $V=V_{1}\cup V_{2}\cup V_{3}$.
There exist $a,b\in F$ such that $a,b\neq 0$ and $a-b=1$ (take $a$ to be anything other than $0$ and $1$, and set $b=a-1$).
Assume that neither of $V_{1}$ and $V_{2}$ contains the other (otherwise this reduces to the two-subspace case). For any $u\in V_{1}\setminus(V_{1}\cap V_{2})$, take an arbitrary $w\in V_{2}\setminus(V_{1}\cap V_{2})$ (it exists precisely because neither of $V_{1}$, $V_{2}$ contains the other). Then $au+w$ lies in neither $V_{1}$ nor $V_{2}$: if it were in $V_{1}$, then since $au\in V_{1}$ we would have $w\in V_{1}$, hence $w\in V_{1}\cap V_{2}$, a contradiction; if it were in $V_{2}$, then $au\in V_{2}$, and since $a\neq 0$ this gives $u\in V_{1}\cap V_{2}$, again a contradiction. Hence $au+w\in V_{3}$. The same argument shows $bu+w\in V_{3}$. Therefore $u=(au+w)-(bu+w)\in V_{3}$ (recall $a-b=1$), and so $V_{1}\setminus(V_{1}\cap V_{2})\subset V_{3}$. By symmetry, $V_{2}\setminus(V_{1}\cap V_{2})\subset V_{3}$.

Now for any $v\in V_{1}\cap V_{2}$, pick a $w\in V_{2}\setminus(V_{1}\cap V_{2})\subset V_{3}$. Then $w+v\notin V_{1}\cap V_{2}$ (otherwise $w=(w+v)-v\in V_{1}\cap V_{2}$). But $w+v\in V_{2}$, hence $w+v\in V_{2}\setminus(V_{1}\cap V_{2})\subset V_{3}$, and thus $v=(w+v)-w\in V_{3}$. Hence $V_{1}\cap V_{2}\subset V_{3}$, and therefore $V_{1},V_{2}\subset V_{3}$.
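(Not part of the argument, but if you want to see the key step numerically, here is a minimal Python sketch with hypothetical data: $F=\mathbb{Z}/5\mathbb{Z}$, $V_{1}=\operatorname{span}\{(1,0)\}$, $V_{2}=\operatorname{span}\{(0,1)\}$, $u=(1,0)$, $w=(0,1)$. For every $a\neq 0,1$ and $b=a-1$ it checks that $au+w$ and $bu+w$ avoid both $V_{1}$ and $V_{2}$, and that $(au+w)-(bu+w)=u$.)

```python
# Numerical check of the key step (hypothetical data over F = Z/5Z, for illustration only).
p = 5
F = range(p)

def scale(c, v): return tuple((c * x) % p for x in v)
def add(v, w):   return tuple((x + y) % p for x, y in zip(v, w))
def sub(v, w):   return tuple((x - y) % p for x, y in zip(v, w))

V1 = {scale(c, (1, 0)) for c in F}    # the line spanned by (1,0)
V2 = {scale(c, (0, 1)) for c in F}    # the line spanned by (0,1)
u, w = (1, 0), (0, 1)                 # u in V1 but not V2, w in V2 but not V1

for a in range(2, p):                 # any a other than 0 and 1...
    b = a - 1                         # ...and b = a - 1, so b != 0 and a - b = 1
    au_w, bu_w = add(scale(a, u), w), add(scale(b, u), w)
    assert au_w not in V1 | V2 and bu_w not in V1 | V2   # both must land in V3
    assert sub(au_w, bu_w) == u                          # hence u = (au+w)-(bu+w) is in V3
print("key step verified for a =", list(range(2, p)))
```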
Gina's answer is great, but I think we can clean it up a bit.
Let $U_1,U_2,U_3$ be subspaces of $V$ over a field $k\neq \mathbb{F}_2$.
$(\Leftarrow)$ Suppose that one of the subspaces contains the other two. Without loss of generality, assume $U_1\subset U_3$ and $U_2\subset U_3$. Then $U_1\cup U_2\cup U_3 = U_3$, and so $U_1\cup U_2\cup U_3$ is indeed a subspace of $V$.
$(\Rightarrow)$ Now suppose $U_1\cup U_2\cup U_3$ is a subspace. If $U_2$ contains $U_3$ (or vice versa), let $W = U_2 \cup U_3$, which is a subspace since it equals the larger of the two. Then applying the case of the union of two subspaces (you need to prove this case first) to the union $U_1\cup W = U_1\cup U_2\cup U_3$, which is a subspace by assumption, we have that either $U_1$ contains $W$ or $W$ contains $U_1$, showing that one of the three subspaces contains the other two, as desired. So assume $U_2$ and $U_3$ are such that neither contains the other; we will show that both are contained in $U_1$. Let \begin{equation*} x\in U_2\setminus U_3 ~~~ \text{and} ~~~ y\in U_3\setminus U_2, \end{equation*} and choose nonzero $a,b\in k$ such that $a-b = 1$ (such $a,b$ exist since we assume $k$ is not $\mathbb{F}_2$).
We claim that $ax + y$ and $bx + y$ are both in $U_1$. Note first that $ax+y$ and $bx+y$ lie in $U_1\cup U_2\cup U_3$, since the union is a subspace. To see that $ax + y\in U_1$, suppose not. Then either $ax + y\in U_2$ or $ax + y\in U_3$. If $ax + y\in U_2$, then we have $(ax + y) - ax = y\in U_2$, a contradiction. And if $ax +y \in U_3$, we have $(ax + y) - y = ax \in U_3$, so $x\in U_3$ since $a\neq 0$, another contradiction; hence $ax+y\in U_1$. Similarly for $bx + y$: if $bx + y\in U_2$, then $(bx + y) - bx = y \in U_2$, a contradiction, and if $bx + y\in U_3$, then $(bx + y) - y = bx \in U_3$, so $x\in U_3$ since $b\neq 0$, also a contradiction. Thus $bx + y\in U_1$ as well. Therefore \begin{equation*} (ax + y) - (bx + y) = (a-b)x = x \in U_1. \end{equation*} Since $x\in U_2\setminus U_3$ was arbitrary, we have $U_2\setminus U_3\subset U_1$. A similar argument shows that $x + ay$ and $x + by$ must be in $U_1$ as well, and hence \begin{equation*} (x + ay) - (x + by) = (a - b)y = y \in U_1, \end{equation*} and therefore $U_3\setminus U_2\subset U_1$. If $U_2\cap U_3=\{0\}$, then $U_2\cap U_3\subset U_1$ trivially and we're done, so assume otherwise.
Now for any $u\in U_2\cap U_3$, choose $v \in U_3\setminus U_2\subset U_1$. Then $u+v\not\in U_2$, for otherwise $(u+v)-u=v\in U_2$, a contradiction. But $u+v\in U_3$, so $u+v\in U_3\setminus U_2\subset U_1$, and hence so is $(u+v) - v = u$. In other words, if $u\in U_2\cap U_3$, then $u\in U_1$, and hence $U_2\cap U_3\subset U_1$. Combined with $U_2\setminus U_3\subset U_1$ and $U_3\setminus U_2\subset U_1$, this gives $U_2\subset U_1$ and $U_3\subset U_1$, as was to be shown. $\tag*{$\square$}$
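As a sanity check on the equivalence, here is a short Python sketch (mine, not part of the proof, assuming the concrete choice $k=\mathbb{F}_3$ and $V=\mathbb{F}_3^2$) that enumerates every subspace of $\mathbb{F}_3^2$ and confirms by brute force that a union of three of them is a subspace exactly when one of the three contains the other two:

```python
# Exhaustive check over k = F_3 and V = F_3^2 (verification sketch only).
from itertools import product, combinations_with_replacement

p = 3
F = range(p)
vectors = list(product(F, repeat=2))

def add(v, w):   return tuple((x + y) % p for x, y in zip(v, w))
def scale(c, v): return tuple((c * x) % p for x in v)

def span(v, w):
    """The subspace spanned by v and w."""
    return frozenset(add(scale(a, v), scale(b, w)) for a in F for b in F)

# every subspace of F_3^2 is spanned by at most two vectors
subspaces = {span(v, w) for v in vectors for w in vectors}

def is_subspace(S):
    return all(add(v, w) in S for v in S for w in S) and \
           all(scale(c, v) in S for c in F for v in S)

for U1, U2, U3 in combinations_with_replacement(subspaces, 3):
    union_is_subspace = is_subspace(U1 | U2 | U3)
    one_contains_rest = any(A >= B and A >= C for A, B, C in
                            ((U1, U2, U3), (U2, U1, U3), (U3, U1, U2)))
    assert union_is_subspace == one_contains_rest
print("checked all triples among", len(subspaces), "subspaces of F_3^2; the equivalence holds")
```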
This problem appears in the first chapter of Linear Algebra Done Right, by Axler. I personally think it's pretty challenging for so early in an introductory linear algebra book, but it's a great exercise. Lots of details to keep straight.
Gina gave an excellent answer. In fact, we can say more: if $V$ is a vector space over a field $F$ and $\{U_1,U_2,U_3,\cdots ,U_n\}$ is a finite collection of subspaces of $V$, where either $F$ is infinite or $n$ is at most the cardinality of $F$, then the union $U_1\cup U_2\cup\cdots\cup U_n$ is a subspace of $V$ if and only if one of the subspaces $U_1,U_2,U_3,\cdots ,U_n$ contains all the others. The proof is similar to the way one proves that a vector space over an infinite field cannot be a finite union of its own proper subspaces, arguing by contradiction and using the pigeonhole principle to derive the absurdity (imagine the elements of $F$ "flying" into the subspaces $U_1,U_2,U_3,\cdots ,U_n$).
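To make the cardinality condition concrete: $\mathbb{F}_q^{2}$ has exactly $q+1$ lines through the origin, and no collection of only $q$ proper subspaces can cover it, so any covering by proper subspaces needs more than $|F|$ of them. The short Python sketch below (my own check for the hypothetical case $q=3$, not part of the statement above) confirms this by brute force:

```python
# Brute-force illustration of the covering bound over F = F_3 (verification sketch only).
from itertools import product, combinations

p = 3
F = range(p)
V = set(product(F, repeat=2))

def span(v):
    return frozenset(tuple((c * x) % p for x in v) for c in F)

lines = {span(v) for v in V if v != (0, 0)}       # the lines through the origin
proper = lines | {frozenset({(0, 0)})}            # all proper subspaces of F_3^2

def min_cover(subspaces, space):
    """Smallest number of the given subspaces whose union is the whole space."""
    for k in range(1, len(subspaces) + 1):
        if any(set().union(*combo) == space for combo in combinations(subspaces, k)):
            return k

m = min_cover(proper, V)
assert len(lines) == p + 1 and m == p + 1         # q + 1 lines; q of them never suffice
print("F_3^2 has", len(lines), "lines; a cover by proper subspaces needs", m, "of them")
```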