How to find a matrix $X$ such that $X+X^2+X^3 = \begin{bmatrix} 1&2005\\ 2006&1 \end{bmatrix}$?

There's no such $X$, even with rational entries. If there were, $X$ would have an eigenvalue $\lambda$ that is either rational or a quadratic irrationality (i.e., of degree at most $2$ over $\mathbb{Q}$). Now if $\lambda$ is an eigenvalue of $X$, then $\lambda + \lambda^2 + \lambda^3$ is an eigenvalue of $\begin{bmatrix} 1&2005\\ 2006&1 \end{bmatrix}$. Those eigenvalues are the roots $x = 1 \pm \sqrt{2005\cdot 2006}$ of $(x-1)^2 = 2005 \cdot 2006$, so $\lambda$ must satisfy $(\lambda^3+\lambda^2+\lambda-1)^2 = 2005 \cdot 2006$. The polynomial $(\lambda^3+\lambda^2+\lambda-1)^2 - 2005 \cdot 2006$ turns out to be irreducible over $\mathbb{Q}$, so every one of its roots has degree $6$, and none can be an eigenvalue of a $2 \times 2$ matrix with rational entries, QED.
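The irreducibility claim is easy to check by machine; here's a quick sketch using sympy (assuming it's installed):

```python
from sympy import symbols, Poly

x = symbols('x')

# The degree-6 polynomial whose roots are the only candidates
# for an eigenvalue of X:
#   (x^3 + x^2 + x - 1)^2 - 2005*2006
p = Poly((x**3 + x**2 + x - 1)**2 - 2005 * 2006, x)

# Irreducible over Q means every root has degree 6 over Q, so no root
# can be an eigenvalue of a 2x2 rational matrix (degree at most 2).
print(p.degree(), p.is_irreducible)  # 6 True
```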


If an integer matrix solution $X$ existed, then reducing modulo $2$ and writing $X=Y+I$, the equation $$ (Y+I)+(Y^2+I)+(Y^3+Y^2+Y+I) = \begin{bmatrix} 1&1\\ 0&1 \end{bmatrix} $$ would be solvable over $GF(2)$. The left side simplifies to $Y^3 + I$ (since $2Y^2 = 2Y = 0$ and $3I = I$ mod $2$), so this means $$ Y^3 = \begin{bmatrix} 0&1\\ 0&0 \end{bmatrix}. $$ But this is impossible: $(Y^3)^2 = Y^6 = 0$, so $Y$ itself is nilpotent, and the square of every $2\times2$ nilpotent matrix must vanish, forcing $Y^3 = Y^2 \cdot Y = 0$. Therefore the matrix equation in the OP's question has no integer matrix solution.
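Since there are only $2^4 = 16$ matrices over $GF(2)$, the impossibility can also be confirmed by brute force; a small sketch:

```python
from itertools import product

def matmul2(A, B):
    """Multiply two 2x2 matrices, reducing entries mod 2."""
    return tuple(
        tuple(sum(A[i][k] * B[k][j] for k in range(2)) % 2 for j in range(2))
        for i in range(2)
    )

# The required value of Y^3 over GF(2).
target = ((0, 1), (0, 0))

# Enumerate all 16 matrices Y over GF(2) and look for Y^3 = target.
solutions = []
for a, b, c, d in product((0, 1), repeat=4):
    Y = ((a, b), (c, d))
    if matmul2(matmul2(Y, Y), Y) == target:
        solutions.append(Y)

print(len(solutions))  # 0, as the nilpotency argument predicts
```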