Prove that $f=(x+i)^{10}+(x-i)^{10}$ has all real roots

$(x+i)^{10}+(x-i)^{10}=0$ immediately implies $|x+i|=|x-i|$, which is equivalent to $x \in \mathbb R$ (the real axis is the perpendicular bisector of the segment from $i$ to $-i$).
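
In case the first implication is not immediate, one way to see it is to take moduli on both sides of $(x+i)^{10}=-(x-i)^{10}$:
$$|x+i|^{10}=\left|-(x-i)^{10}\right|=|x-i|^{10}\;\Longrightarrow\;|x+i|=|x-i|,$$
i.e. $x$ is equidistant from $-i$ and $i$, which is exactly the perpendicular-bisector condition above.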


$(x+i)^{10} = -(x-i)^{10} = i^{10}(x-i)^{10} = (i(x-i))^{10}$, hence $\left(\dfrac{x+i}{ix+1}\right)^{10} = 1$. Can you continue?
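
One way to continue from here (a sketch): the last equation says $\dfrac{x+i}{ix+1}=\omega$ for some tenth root of unity $\omega$. Solving for $x$,
$$x+i=\omega(ix+1)\;\Longrightarrow\;x=\frac{\omega-i}{1-i\omega},$$
where the denominator does not vanish because $\omega=-i$ is not a tenth root of unity. Since $|\omega|=1$ we have $\bar\omega=1/\omega$, so
$$\bar x=\frac{\bar\omega+i}{1+i\bar\omega}=\frac{1+i\omega}{\omega+i}=\frac{\omega-i}{1-i\omega}=x,$$
the last equality holding because $(1+i\omega)(1-i\omega)=1+\omega^2=(\omega+i)(\omega-i)$. Thus every root $x$ equals its own conjugate and is therefore real.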


Let $a+ib$ be a root of $f$, where $a,b\in \mathbb R$. Then $$f(a+ib)=0$$

$$\implies (a+i(b+1))^{10}+(a+i(b-1))^{10}=0$$

Expand the two binomials, separate the real and imaginary parts, and set each of them equal to $0$. Solving these equations gives $b=0$, so every root of $f$ is real.
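
If you want to avoid the full expansion, a quicker route to the same conclusion is to take moduli of the two terms in the last display: the equation forces $|a+i(b+1)|^{10}=|a+i(b-1)|^{10}$, i.e.
$$a^2+(b+1)^2=a^2+(b-1)^2\;\Longrightarrow\;4b=0\;\Longrightarrow\;b=0,$$
so every root of $f$ is indeed real.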