On "complexifying" vector spaces
If you didn't get any answers to your question, it is probably in part because what you write is confused. That is not entirely your fault, because the passage in the book you are using is quite confused. In this answer I will try mainly to remove confusion.
First of all, note that your "Adams" is J. Frank Adams, British mathematician (1930–1989), not Jeffrey D. Adams, a living American mathematician who also writes about Lie theory (whom I know well; I just wanted this to be clear to all).
The confusion is not just due to the unfortunate choice of denoting the quaternions by Q (not $\mathbb{Q}$: the book's typography is typewriter style, this being 1969, and lacks any font distinctions); it will in any case be clearer here to write them as $\mathbb{H}$. The author seems to be somewhat aware of the subtleties of vector spaces over skew fields like $\mathbb{H}$, but he didn't really understand them. Witness the following quotation from the previous page: "In the case $\Lambda=\mathbb{H}$, if we wish to write our matrices on the left, it will be prudent to arrange that $V$ is a right module over $\mathbb{H}$. Fortunately we can make any left module over $\mathbb{H}$ into a right module over $\mathbb{H}$, and vice versa, by the formula $qv=v\overline{q}$ ($q\in\mathbb{H}$, $v\in V$)." There is some justification for both parts of this quotation, but the way they are combined makes clear that the author just didn't get it. I'll explain.
Matrices represent linear maps; this means that applying them must commute with scalar multiplication of vectors (this is the part of the definition of linear maps that goes beyond just being additive). When scalars don't commute among themselves (as in $\mathbb{H}$), their commutation with linear maps is best reflected by writing the scalars on the opposite side of the vector from the linear maps, so that the linearity of $f$ is expressed by $f(vq)=f(v)q$. Writing $q$ on the left here (as $f(qv)=q f(v)$) would be very confusing, since it would suggest that this commutation contradicts the non-commutativity of $\mathbb{H}$: for instance, in the very simple case where $V=\mathbb{H}$ and $f:V\to V$ is given by a $1\times1$ matrix $(a)$, so that $f(v)=av$, scalar multiplication of $v\in\mathbb{H}$ by $q$ cannot be defined as $qv$, since that would mean $qav=qf(v)=f(qv)=aqv$, which just isn't true; instead, scalar multiplication of $v$ by $q$ must give $vq$, so that $avq=f(vq)=f(v)q=avq$ without problem. The main point to retain is that scalar multiplication is not a linear map (except for scalars in $\mathbb{R}$), and that more generally (non-real) multiples of linear maps are not linear maps (the author does say this implicitly at some point): the map $v\mapsto f(vq)=f(v)q$ is well defined, but it is not $\mathbb{H}$-linear.
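If it helps to see the sign obstruction concretely, here is a small numerical sanity check (mine, not anything from Adams), modelling quaternions as $2\times2$ complex matrices so that quaternion multiplication is just matrix multiplication:

```
import numpy as np

E = np.eye(2, dtype=complex)                    # 1
I = np.array([[1j, 0], [0, -1j]])               # i
J = np.array([[0, 1], [-1, 0]], dtype=complex)  # j
K = I @ J                                       # k

a, q, v = I, J, E + 2*I + 3*J    # arbitrary test quaternions
f = lambda w: a @ w              # the "1x1 matrix" map f(v) = a v

# right scalars commute with f: f(vq) = f(v)q, by associativity a(vq) = (av)q
print(np.allclose(f(v @ q), f(v) @ q))   # True
# left scalars do not: f(qv) != q f(v) in general, since aq != qa
print(np.allclose(f(q @ v), q @ f(v)))   # False
```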
The lesson should have been that if one wants to include $\mathbb{H}$ as a possible field of scalars in the discussion, then one had better write scalars on the right systematically. Instead the author uses the transfer of left-module to right-module structure to suggest that one can simply ignore the matter and keep writing scalars on the left. However, even for complex vector spaces this wreaks havoc: it amounts to saying that officially scalar multiplication is from the right, but that multiplying on the left by $q$ is to be interpreted as multiplication on the right by $\overline{q}$. But there is only one form of scalar multiplication. If you're used to writing scalar multiplication in complex vector spaces on the left, then this contradicts what you've always been doing: now multiplication by $z$ is by definition multiplication (on the other side) by $\overline{z}$, which just is not the same thing.
If however you had always written complex scalars on the right of vectors, then you could in principle maintain that writing them on the left is just a weird way to ask for multiplication by their complex conjugate. And this is what the transfer-of-structure argument is really about: if by mischance you have a space $V$ that comes equipped with a left $\mathbb{H}$-module structure whereas you need one with a right $\mathbb{H}$-module structure, then you can build such a module by taking a copy of $V$ and defining $vq$ in the new structure as $\overline{q}v$ computed in the old structure. However, this does not give a space with both a left $\mathbb{H}$-module and a right $\mathbb{H}$-module structure, since such structures should always commute with one another, while scalar multiplications in an $\mathbb{H}$-module do not commute among themselves. Writing $qv=v\overline{q}$ as an equation in a single space is just plain wrong: contemplate the "computation" $kv=(-i)(-j)v=(-j)vi=vij=vk=-kv$.
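Here is a small check of that last point (my own sketch, in the same $2\times2$ complex matrix model): the transferred left action $q\cdot v:=v\overline{q}$ is a perfectly good left module structure on its own, but it refuses to commute with the original right action, and the bogus "computation" above implicitly assumes that it does.

```
import numpy as np

E = np.eye(2, dtype=complex)
I = np.array([[1j, 0], [0, -1j]])               # i
J = np.array([[0, 1], [-1, 0]], dtype=complex)  # j
K = I @ J                                       # k
conj = lambda q: q.conj().T                     # quaternion conjugation

v = E + 2*I + 3*J + 4*K                         # arbitrary test "vector"
left = lambda q, w: w @ conj(q)                 # transferred left action q.v := v*conj(q)

# it is a genuine left action: (pq).v == p.(q.v)
print(np.allclose(left(I @ J, v), left(I, left(J, v))))   # True
# but it does not commute with the original right action: (j.v)i != j.(vi)
print(np.allclose(left(J, v) @ I, left(J, v @ I)))         # False
```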
My answer is getting too long, but the passage you cited shows that the author supposes, for an $\mathbb{H}$ vector space, the simultaneous existence of a left and a right module structure over $\mathbb{H}$, and there is no way to make sense of that. In the second part below I address what should have been said instead.
From the first part of this answer it should be clear that a vector space over $\mathbb{H}$ has just a right $\mathbb{H}$-module structure. The "structure map" $j$ is then just given by right multiplication by $j$, which is not an $\mathbb{H}$-linear or $\mathbb{C}$-linear map, nor $\mathbb{H}$ conjugate-linear, but it is $\mathbb{C}$ conjugate-linear: $j(v(a+bi))=v(a+bi)j=vaj+vbij=vja-vjbi=j(v)(a-bi)$ for $a,b\in\mathbb{R}$. Of course it also satisfies $j^2=-1$. Forget the rest of the first paragraph.
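For the simplest case $V=\mathbb{H}$, here is a quick check of these claims (a sketch of my own, again with quaternions as $2\times2$ complex matrices):

```
import numpy as np

E = np.eye(2, dtype=complex)
I = np.array([[1j, 0], [0, -1j]])               # i
J = np.array([[0, 1], [-1, 0]], dtype=complex)  # j
K = I @ J                                       # k

jmap = lambda v: v @ J                          # structure map: right multiplication by j

v = E + 2*I + 3*J + 4*K
a, b = 1.5, -0.5
z, zbar = a*E + b*I, a*E - b*I                  # z = a + bi and its conjugate

print(np.allclose(jmap(v @ z), jmap(v) @ zbar))  # True:  C conjugate-linear
print(np.allclose(jmap(v @ z), jmap(v) @ z))     # False: not C-linear
print(np.allclose(jmap(jmap(v)), -v))            # True:  j^2 = -1
```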
If one has a complex vector space $V$ equipped with a conjugate-linear map $j:V\to V$ satisfying $j^2=-1$, then one can make $V$ into a (right) $\mathbb{H}$ vector space by defining $v(z_1+z_2j)=vz_1+j(vz_2)$ for $z_1,z_2\in\mathbb{C}$ (so although written on the left, the function $j$ defines multiplication on the right by $j$); the associativity of this multiplication is checked using the description $(z_1+z_2j)(z_3+z_4j)=(z_1z_3-z_2\overline{z_4})+(z_2\overline{z_3}+z_1z_4)j$ of multiplication in $\mathbb{H}$. There is something slightly surprising here, in that we did not need to assume $V$ even-dimensional as a $\mathbb{C}$ vector space; the existence of a conjugate-linear map $j:V\to V$ satisfying $j^2=-1$ forces this (and the quaternions provide the simplest proof of this fact).
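The multiplication rule for $\mathbb{H}$ written as $\mathbb{C}+\mathbb{C}j$ is easy to get wrong by a conjugation or a sign, so here is a small check of the formula just used (my own sketch, same matrix model as before):

```
import numpy as np

E = np.eye(2, dtype=complex)
I = np.array([[1j, 0], [0, -1j]])               # i
J = np.array([[0, 1], [-1, 0]], dtype=complex)  # j

emb = lambda z: z.real*E + z.imag*I             # embed C in H
quat = lambda z1, z2: emb(z1) + emb(z2) @ J     # the quaternion z1 + z2*j

z1, z2, z3, z4 = 1+2j, 3-1j, -2+1j, 0.5+0.5j
lhs = quat(z1, z2) @ quat(z3, z4)
rhs = quat(z1*z3 - z2*np.conj(z4), z2*np.conj(z3) + z1*z4)
print(np.allclose(lhs, rhs))                    # True
```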
If $V$ has a $\mathbb{C}$-linear action of $G$, this action carries over to the $\mathbb{H}$ vector space provided $j$ commutes with the automorphisms given by the action of $G$. Adams includes this requirement in the definition of a structure map for a $G$-module. If $V$ is obtained from an $\mathbb{H}$ vector space by restricting the scalars to $\mathbb{C}$, then the structure map given by right multiplication by $j$ is precisely the additional information needed to recover the $\mathbb{H}$ vector space structure. This is not quaternionification of complex spaces (denoted $q$ by Adams), which produces a space of the same dimension over $\mathbb{H}$, not half the dimension; it is just saying that with some supplementary information added to the scalar-restricted space, one can recover the original space.
What is done is best described in category-theoretic terminology. Restriction of scalars defines functors from $\mathbb{H}$-spaces to $\mathbb{C}$-spaces and also from $\mathbb{C}$-spaces to $\mathbb{R}$-spaces ($c'$ respectively $r$ for Adams). The word "functor" here means that linear maps, such as those of the $G$-action, induce linear maps on the new space in a coherent way (compatible with their composition). In the opposite direction there are "extension of scalars" functors that to $V$ associate $V\otimes_{\mathbb{R}}\mathbb{C}$ (complexification of real spaces, $c$) respectively $V\otimes_{\mathbb{C}}\mathbb{H}$ (quaternionification of complex spaces, $q$). Note the opposite order with respect to Adams; this is essential to ensure that $q$ produces a $G$-space. These are not inverse operations to restriction of scalars (going back and forth doubles the dimension), but there is an "adjointness" that I will not go into. Another functor, from complex spaces to complex spaces, is what Adams calls $t$: it keeps the space but changes scalar multiplication to its complex conjugate. (Where Adams says "natural" (p.27) he should have said functorial.)
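To make $q$ concrete, here is a model of my own (not Adams' notation): write $qV=V\otimes_{\mathbb{C}}\mathbb{H}$ as pairs $(a,b)$ standing for $a\otimes1+b\otimes j$; the right $\mathbb{H}$-action is read off from the multiplication formula above, and one sees directly that a $\mathbb{C}$-linear map $g$ on $V$ (for instance the action of some $g\in G$) induces an $\mathbb{H}$-linear map on $qV$.

```
import numpy as np

def h_mult(p, q):
    # (z1 + z2 j)(z3 + z4 j) in H, written as a pair of complex numbers
    (z1, z2), (z3, z4) = p, q
    return (z1*z3 - z2*np.conj(z4), z2*np.conj(z3) + z1*z4)

def h_act(pair, q):
    # right action of q = z3 + z4 j on (a, b) standing for a(x)1 + b(x)j
    (a, b), (z3, z4) = pair, q
    return (a*z3 - b*np.conj(z4), a*z4 + b*np.conj(z3))

rng = np.random.default_rng(0)
a = rng.standard_normal(3) + 1j*rng.standard_normal(3)   # V = C^3 here
b = rng.standard_normal(3) + 1j*rng.standard_normal(3)
p, q = (1+2j, 3-1j), (-2+1j, 0.5+0.5j)
eq = lambda s, t: all(np.allclose(x, y) for x, y in zip(s, t))

# it is a right action: ((a,b).p).q == (a,b).(pq)
print(eq(h_act(h_act((a, b), p), q), h_act((a, b), h_mult(p, q))))   # True

# a C-linear map g on V induces q(g)(a,b) = (g a, g b), and this is H-linear
M = rng.standard_normal((3, 3)) + 1j*rng.standard_normal((3, 3))
qg = lambda pair: (M @ pair[0], M @ pair[1])
print(eq(qg(h_act((a, b), q)), h_act(qg((a, b)), q)))                # True
```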
The equations you list are to be interpreted as the existence of natural transformations; for instance $cr=1+t$ says there is a natural isomorphism between the functor "restriction to $\mathbb{R}$ followed by extension to $\mathbb{C}$" and the functor that associates to $V$ the space $V\oplus tV$. This means these spaces are isomorphic, in a way that is compatible (i.e., commutes) with the $\mathbb{C}$-linear maps induced by the functors, for instance those of the $G$-action.
Consider the $\mathbb{C}$-space $crV=V_{\mathbb{R}}\otimes_{\mathbb{R}}\mathbb{C}$. This is isomorphic to $V\oplus V$ (just consider dimensions), but not in a natural way. To simplify, consider just embedding $V$ into $crV$. Sending each $v\in V$ to $v\otimes1$ is not even $\mathbb{C}$-linear, since $(v\otimes1)i=v\otimes{i}\neq vi\otimes1$. One can choose a basis $B$ of $V$, map each $b\in B$ to $b\otimes1$ and extend by $\mathbb{C}$-linearity (in particular $bi\mapsto b\otimes{i}$), which defines a $\mathbb{C}$-linear embedding, but it is not natural: for instance the action of $g\in G$ sends $v\otimes z$ to $g(v)\otimes z$, and this does not commute with the map just defined (think of the simple case where $g$ just multiplies by a complex number). What does work is sending $v\mapsto v\otimes1-vi\otimes{i}$: multiplying the result by $i$ gives $v\otimes{i}-vi\otimes{i^2}=vi\otimes1-vi^2\otimes{i}$, which is the image of $vi$. Up to a complex scalar this is the only natural embedding of $V$ into $crV$. However one can embed $tV$ naturally into $crV$ by sending $v\mapsto v\otimes1+vi\otimes{i}$, and combining the two gives a natural transformation from $V\oplus tV$ to $crV$ that is in fact an isomorphism.
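For those who like to see the computation run, here is a sketch of the same verification for $V=\mathbb{C}^n$ (my own model, not from Adams): $crV$ is written as pairs $(a,b)$ standing for $a\otimes1+b\otimes i$, with complex scalars acting on the second tensor factor only.

```
import numpy as np

def scal(pair, w):
    # multiply a(x)1 + b(x)i by the complex scalar w = x + iy, acting on the C factor
    (a, b), (x, y) = pair, (w.real, w.imag)
    return (x*a - y*b, y*a + x*b)

phi = lambda v: (v, -1j*v)   # v -> v(x)1 - vi(x)i
psi = lambda v: (v, 0*v)     # v -> v(x)1
tau = lambda v: (v, 1j*v)    # v -> v(x)1 + vi(x)i

rng = np.random.default_rng(1)
v = rng.standard_normal(3) + 1j*rng.standard_normal(3)
w = 2.0 - 1.5j
eq = lambda s, t: all(np.allclose(x, y) for x, y in zip(s, t))

print(eq(phi(v*w), scal(phi(v), w)))           # True:  phi is C-linear
print(eq(psi(v*w), scal(psi(v), w)))           # False: v -> v(x)1 is not
print(eq(tau(v*w), scal(tau(v), np.conj(w))))  # True:  tau embeds tV (conjugate-linear)

# phi is natural: it commutes with any C-linear map g on V, since cr(g)(a,b) = (g a, g b)
M = rng.standard_normal((3, 3)) + 1j*rng.standard_normal((3, 3))
print(eq(phi(M @ v), (M @ phi(v)[0], M @ phi(v)[1])))   # True
```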
The other cases can be considered similarly, but I should warn that $qc'=2$ seems to be a serious error to me: I think there is no nonzero natural transformation at all from the identity functor of $\mathbb{H}$-spaces to $qc'$ (recall that $qc'V=V_{\mathbb{C}}\otimes_{\mathbb{C}}\mathbb{H}$), let alone from the functor that associates $V\oplus V$ to the $\mathbb{H}$-space $V$.
Added: I found out I was mistaken in the previous paragraph. There does indeed exist a natural isomorphism from $V\oplus V$ to $qc'V$, namely the one that sends $(v,w)$ to $v\otimes1-vj\otimes j+wi\otimes1+wk\otimes j$. Working over a skew field is quite subtle: although the image of such a natural transformation must of course be all of $qc'V$, of quaternionic dimension $2$, the set of possibilities for such a natural transformation is only a two-dimensional real space (as I argued myself, one cannot multiply linear maps by a quaternionic scalar), and I had not realised this. So there is not much variation possible in the above expression.
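Here is a numerical check of this corrected claim (a sketch of my own, for $V=\mathbb{H}$ itself, with the same $2\times2$ matrix model and with $qc'V$ written as pairs $(a,b)$ standing for $a\otimes1+b\otimes j$):

```
import numpy as np

E = np.eye(2, dtype=complex)
I = np.array([[1j, 0], [0, -1j]])               # i
J = np.array([[0, 1], [-1, 0]], dtype=complex)  # j
K = I @ J                                       # k
emb = lambda z: np.real(z)*E + np.imag(z)*I     # C sitting inside H

def act(pair, q):
    # right action of q = z3 + z4 j on (a, b) standing for a(x)1 + b(x)j
    (a, b), (z3, z4) = pair, q
    return (a @ emb(z3) - b @ emb(np.conj(z4)), a @ emb(z4) + b @ emb(np.conj(z3)))

Phi = lambda v, w: (v + w @ I, -v @ J + w @ K)  # (v,w) -> v(x)1 - vj(x)j + wi(x)1 + wk(x)j

v = E + 2*I + 3*J + 4*K
w = -E + I - 2*J + 0.5*K
eq = lambda s, t: all(np.allclose(x, y) for x, y in zip(s, t))

# right H-linearity, checked on the generators i and j of H over C
print(eq(Phi(v @ I, w @ I), act(Phi(v, w), (1j, 0))))   # True
print(eq(Phi(v @ J, w @ J), act(Phi(v, w), (0, 1))))    # True

# naturality: an H-linear map on V = H is left multiplication g(v) = a v
a = 2*E - I + J + 3*K
print(eq(Phi(a @ v, a @ w), (a @ Phi(v, w)[0], a @ Phi(v, w)[1])))   # True
```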
One more remark about "structure maps" $j$ with $j^2=1$ (still conjugate-linear). Such a map does not "remember" that the $G$-space was (or could have been) obtained by restriction of scalars from $\mathbb{H}$, but rather that it was (or could have been) obtained by extension of scalars from $\mathbb{R}$: it is given by complex conjugation on the right-hand tensor factor of $V=W\otimes_{\mathbb{R}}\mathbb{C}$, and it permits recovering the "original" space $W$ as its set of fixed points.
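Concretely (a small sketch of my own, for $W=\mathbb{R}^n$, so that $W\otimes_{\mathbb{R}}\mathbb{C}=\mathbb{C}^n$): the structure map is entrywise complex conjugation, and its fixed points recover $W$.

```
import numpy as np

j = np.conj                      # the structure map, here with j^2 = +1

rng = np.random.default_rng(2)
v = rng.standard_normal(4) + 1j*rng.standard_normal(4)
z = 1.0 + 2.0j

print(np.allclose(j(z*v), np.conj(z)*j(v)))   # True: conjugate-linear
print(np.allclose(j(j(v)), v))                # True: j^2 = +1
w = rng.standard_normal(4)                    # a vector coming from W = R^n
print(np.allclose(j(w), w))                   # True: W lies in the fixed points
```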