The first non-trivial Schur functor
It looks like your construction of the Schur functor realizes $S^\lambda(V)$ as a quotient of $V^{\otimes d}$. Therefore, you should read the formulas above as identities in $S^\lambda(V)$. (Alternatively, you could express $S^\lambda(V)$ as a quotient of $V^{\otimes d}$ by an appropriate subspace. For example, $S^{(3)}(V)=V^{\otimes 3}/W$, where $W$ is the subspace spanned by $\{v_1\otimes v_2\otimes v_3-v_{\sigma(1)}\otimes v_{\sigma(2)}\otimes v_{\sigma(3)}\mid v_i\in V,\ \sigma\in S_3\}$; this quotient is, of course, $Sym^3(V)$.)
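If it helps, this quotient description is easy to sanity-check on a computer. Here is a quick `numpy` sketch (just a throwaway numerical check of the paragraph above, not part of the construction itself): it spans $W$ by the differences of basis tensors and their permutations, and confirms that $\dim\left(V^{\otimes 3}/W\right)=\binom{n+2}{3}=\dim Sym^3(V)$ for $V=\mathbb{C}^2$.

```python
import itertools
import numpy as np

n, d = 2, 3                       # V = C^2, third tensor power
basis = list(itertools.product(range(n), repeat=d))   # basis tensors e_{i1} (x) e_{i2} (x) e_{i3}
index = {b: k for k, b in enumerate(basis)}

# W is spanned by (basis tensor) - (the same tensor with its factors permuted).
rows = []
for b in basis:
    for sigma in itertools.permutations(range(d)):
        w = np.zeros(n ** d)
        w[index[b]] += 1
        w[index[tuple(b[sigma[k]] for k in range(d))]] -= 1
        rows.append(w)

# dim(V^{x3} / W) should equal C(n+2, 3) = dim Sym^3(V); for n = 2 this prints 4.
print(n ** d - np.linalg.matrix_rank(np.array(rows)))
```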
In the case of $S^{(2,1)}(V)$, you might observe that $x_2-x_1=\sigma(x_3-x_1)$ where $\sigma=(23)$. Therefore, $$v_1\otimes v_2\otimes v_3\otimes(x_2-x_1)=v_1\otimes v_3\otimes v_2\otimes(x_3-x_1).$$ Then, your equation for $\sigma=(132)$ gives the nontrivial relation \begin{align} v_3\otimes v_1\otimes v_2\otimes(x_3-x_1)&=v_1\otimes v_2\otimes v_3\otimes(x_3-x_2)\\ &=v_1\otimes v_2\otimes v_3\otimes(x_3-x_1)-v_1\otimes v_2\otimes v_3\otimes(x_2-x_1)\\ &=v_1\otimes v_2\otimes v_3\otimes(x_3-x_1)-v_1\otimes v_3\otimes v_2\otimes(x_3-x_1)\\ &=(v_1\otimes v_2\otimes v_3-v_1\otimes v_3\otimes v_2)\otimes(x_3-x_1) \end{align} (the second equality uses $x_3-x_2=(x_3-x_1)-(x_2-x_1)$, and the third uses the identity displayed above), which I would interpret as $$v_3\otimes v_1\otimes v_2=v_1\otimes v_2\otimes v_3-v_1\otimes v_3\otimes v_2$$ in $S^{(2,1)}(V)$. Also, note that your equation for $\sigma=(13)$ says that the tensor is anti-symmetric in the first and third components (so there is a wedge product in there).
Now, as for your question about realizing $S^{(2,1)}(V)$ as the kernel of the map $\bigwedge^2V\otimes V\to \bigwedge^3V$, note that there is an alternative definition of $S^\lambda(V)$ that realizes it as a subspace of $V^{\otimes d}$. Namely, $S^\lambda(V)=\operatorname{im}(c_\lambda)$, where $c_\lambda\in \mathbb{C}S_d$ is the Young symmetrizer associated to $\lambda$ (and a choice of Young tableau of shape $\lambda$). In the case of $\lambda=(2,1)$, $$c_\lambda=1+(12)-(13)-(321),$$ and $$c_\lambda(v_1\otimes v_2\otimes v_3)=v_1\otimes v_2\otimes v_3+v_2\otimes v_1\otimes v_3-v_3\otimes v_2\otimes v_1-v_3\otimes v_1\otimes v_2.$$ In this realization, it follows that $S^{(2,1)}(V)$ is the subspace of $V^{\otimes 3}$ spanned by vectors of the form $$v_1\otimes v_2\otimes v_3+v_2\otimes v_1\otimes v_3-v_3\otimes v_2\otimes v_1-v_3\otimes v_1\otimes v_2.$$ Note that these elements are anti-symmetric in the first and third components, so we can think of them as elements of $\bigwedge^2V\otimes V$ via the embedding $$(v_1\wedge v_3)\otimes v_2\mapsto v_1\otimes v_2\otimes v_3-v_3\otimes v_2\otimes v_1.$$ Under this identification, we have $$v_1\otimes v_2\otimes v_3+v_2\otimes v_1\otimes v_3-v_3\otimes v_2\otimes v_1-v_3\otimes v_1\otimes v_2=(v_1\wedge v_3)\otimes v_2+(v_2\wedge v_3)\otimes v_1.$$ It is clear that these elements are sent to $0$ under the map $\bigwedge^2V\otimes V\to\bigwedge^3V$, and they span the kernel.
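If you want to double-check this numerically, here is a small `numpy` sketch (my own quick verification, with an explicit choice of convention for how permutations act on tensor factors). For $V=\mathbb{C}^3$ it builds $c_\lambda$ as a $27\times 27$ matrix, checks that its image has dimension $8=\dim S^{(2,1)}(\mathbb{C}^3)$, and checks that the image is killed by total antisymmetrization $V^{\otimes 3}\to\bigwedge^3V$ (which is, up to a nonzero scalar, the map $\bigwedge^2V\otimes V\to\bigwedge^3V$ once the image is identified inside $\bigwedge^2V\otimes V$ as above). Since $\dim(\bigwedge^2V\otimes V)-\dim\bigwedge^3V=9-1=8$, this confirms that the image is the whole kernel.

```python
import itertools
import numpy as np

n = 3                                     # V = C^3
basis = list(itertools.product(range(n), repeat=3))
index = {b: k for k, b in enumerate(basis)}
N = len(basis)                            # 27

def perm(sigma):
    # Matrix of sigma acting on V^{x3} by permuting tensor factors;
    # sigma is given in one-line notation as the images of (1, 2, 3).
    P = np.zeros((N, N))
    for b in basis:
        P[index[tuple(b[sigma[k] - 1] for k in range(3))], index[b]] = 1
    return P

def sign(s):
    # Sign of a permutation of (0, 1, 2), computed by counting inversions.
    return (-1) ** sum(1 for i in range(3) for j in range(i + 1, 3) if s[i] > s[j])

# c_lambda = 1 + (12) - (13) - (321)
c = perm((1, 2, 3)) + perm((2, 1, 3)) - perm((3, 2, 1)) - perm((3, 1, 2))

# Total antisymmetrization V^{x3} -> wedge^3 V (up to scalar).
A = sum(sign(s) * perm(tuple(i + 1 for i in s)) for s in itertools.permutations(range(3)))

print(np.linalg.matrix_rank(c))           # 8 = dim S^{(2,1)}(C^3)
print(np.linalg.matrix_rank(A @ c))       # 0: the image lies in the kernel of the wedge map
```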
There is a more usable definition of the Schur functor for $\lambda \vdash n$ than the definition $S^\lambda V = V^{\otimes n} \otimes_{S_n} V^{\lambda}$. It appears on the Wikipedia page for the Schur functor, although you'll find a better exposition in the book "Young Tableaux" by William Fulton.
Let's take any vector space $V$, and your chosen partition $\lambda = (2, 1)$. I'm going to write out elements of $V^{\otimes 3}$ in a grid of shape $(2, 1)$, so rather than $T = v_1 \otimes v_2 \otimes v_3$ I will write $ T = \begin{Bmatrix} v_1 & v_3 \\ v_2 \end{Bmatrix}$ (choose some ordering of the boxes of the grid and stick to it; here I fill down the columns). Since this is a tensor product, it should obey the usual relations, like $$ \begin{Bmatrix} v_1 + w & v_3 \\ v_2 \end{Bmatrix} = \begin{Bmatrix} v_1 & v_3 \\ v_2 \end{Bmatrix} + \begin{Bmatrix} w & v_3 \\ v_2 \end{Bmatrix}$$ $$ \begin{Bmatrix} x v_1 & v_3 \\ v_2 \end{Bmatrix} = \begin{Bmatrix} v_1 & v_3 \\ x v_2 \end{Bmatrix} = \begin{Bmatrix} v_1 & xv_3 \\ v_2 \end{Bmatrix} = x \begin{Bmatrix} v_1 & v_3 \\ v_2 \end{Bmatrix}$$ and so on. So far I just have the space $V^{\otimes 3}$; I have not used the partition $\lambda$ yet.
Now, we impose some extra relations. We require alternating columns, which means that if the same vector appears twice in a column, the entire tensor is 0. Equivalently, whenever we switch two elements in a column, we negate the tensor. For $\lambda = (2, 1)$, this only gives us one extra relation:
$$ \text{Alternating Relation: } \begin{Bmatrix} v_1 & v_3 \\ v_2 \end{Bmatrix} = - \begin{Bmatrix} v_2 & v_3 \\ v_1 \end{Bmatrix}$$
The next relation is trickier to state. For each pair of adjacent columns $i-1$ and $i$ of $\lambda$, fix a subset $I$ of the boxes in the right column (column $i$). We then require that a tensor $T$ be equal to the sum over all exchanges of the set $I$ with a set of boxes of equal size in column $i - 1$, where an exchange swaps the elements in the chosen subsets, preserving their vertical order. This is called the exchange condition. In the case $\lambda = (2, 1)$ we only get one relation, since there is only one pair of adjacent columns and the right column consists of a single box. The relation is:
$$ \text{Exchange Relation: } \begin{Bmatrix} v_1 & v_3 \\ v_2 \end{Bmatrix} = \begin{Bmatrix} v_3 & v_1 \\ v_2 \end{Bmatrix} + \begin{Bmatrix} v_1 & v_2 \\ v_3 \end{Bmatrix}$$
This exactly describes the Schur functor $S^\lambda V$. Fulton shows that this is the same as the definition $S^\lambda V = V^{\otimes n} \otimes_{S_n} V^{\lambda}$, and in my opinion it's a much better definition to use "in anger", if you just want to scribble down some elements of the thing and see where they go under certain maps and so on.
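To show how usable this presentation is in practice, here is a quick `numpy` sketch (my own throwaway check, not from Fulton): it imposes exactly the alternating and exchange relations above on the basis tensors of $(\mathbb{C}^3)^{\otimes 3}$ and computes the dimension of the quotient. The answer comes out to $8$, which is indeed $\dim S^{(2,1)}(\mathbb{C}^3)$, the number of semistandard tableaux of shape $(2,1)$ with entries in $\{1,2,3\}$.

```python
import itertools
import numpy as np

n = 3                                     # dim V
# A grid {a c / b} is recorded as the index tuple (a, b, c), in the same order
# as v_1, v_2, v_3 above (down the first column, then the second column).
grids = list(itertools.product(range(n), repeat=3))
idx = {g: k for k, g in enumerate(grids)}
N = len(grids)                            # 27

rels = []
for a, b, c in grids:
    # Alternating relation:  {a c / b} + {b c / a} = 0
    r = np.zeros(N)
    r[idx[(a, b, c)]] += 1
    r[idx[(b, a, c)]] += 1
    rels.append(r)
    # Exchange relation:  {a c / b} - {c a / b} - {a b / c} = 0
    r = np.zeros(N)
    r[idx[(a, b, c)]] += 1
    r[idx[(c, b, a)]] -= 1
    r[idx[(a, c, b)]] -= 1
    rels.append(r)

# Dimension of the quotient of V^{x3} by all of the relations above.
print(N - np.linalg.matrix_rank(np.array(rels)))   # 8 = dim S^{(2,1)}(C^3)
```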
Finally, you can easily see the symmetric power $Sym^n V$ (the case $\lambda = (n)$) using this description: there are no alternating relations, since every column is a single box, and the exchange relations just say that adjacent cells commute. Similarly, the exterior power $\bigwedge^n V$ (the case $\lambda = (1, 1, \ldots, 1)$) comes straight out of the alternating relations; there are no exchange conditions, since there is only one column.
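The same kind of throwaway check illustrates both extreme cases (again for $\dim V = 3$): keeping only the "adjacent cells commute" exchange relations gives a quotient of dimension $10 = \dim Sym^3(\mathbb{C}^3)$, and keeping only the alternating relations gives dimension $1 = \dim \bigwedge^3 \mathbb{C}^3$.

```python
import itertools
import numpy as np

n = 3
cells = list(itertools.product(range(n), repeat=3))
idx = {g: k for k, g in enumerate(cells)}
N = len(cells)

def quotient_dim(rels):
    return N - np.linalg.matrix_rank(np.array(rels))

# lambda = (3): every column is a single box, so there are no alternating relations;
# each exchange relation just swaps two adjacent cells.
sym_rels = []
for a, b, c in cells:
    for swapped in [(b, a, c), (a, c, b)]:
        r = np.zeros(N)
        r[idx[(a, b, c)]] += 1
        r[idx[swapped]] -= 1
        sym_rels.append(r)
print(quotient_dim(sym_rels))   # 10 = dim Sym^3(C^3)

# lambda = (1, 1, 1): a single column, so there are no exchange relations;
# switching any two cells negates the tensor.
alt_rels = []
for a, b, c in cells:
    for swapped in [(b, a, c), (a, c, b), (c, b, a)]:
        r = np.zeros(N)
        r[idx[(a, b, c)]] += 1
        r[idx[swapped]] += 1
        alt_rels.append(r)
print(quotient_dim(alt_rels))   # 1 = dim of the third exterior power of C^3
```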