Find a prime that divides $14^7+14^2+1$
We can factor the polynomial $$x^7+x^2+1=(x^2+x+1)\times (x^5-x^4+x^2-x+1)$$
Letting $x=14$ shows that $14^2+14+1=211$, which is prime, divides $14^7+14^2+1$. You would still have to prove it is the least prime factor, but at least you can now work with a smaller number.
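As a sanity check on the arithmetic (a minimal Python sketch, not part of the argument itself), one can verify directly that $211$ divides $14^7+14^2+1$ and that $211$ is prime:

```python
# Verify that 211 = 14^2 + 14 + 1 divides 14^7 + 14^2 + 1,
# and that 211 is prime, by direct computation.

def is_prime(n: int) -> bool:
    """Trial-division primality test, sufficient for small n."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

N = 14**7 + 14**2 + 1   # 105413701
p = 14**2 + 14 + 1      # 211, the value of x^2 + x + 1 at x = 14

print(N % p == 0, is_prime(p))  # True True
```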
The reason we can get one factor immediately is that $7 \equiv 1 \pmod 3.$ As a result, both primitive cube roots of unity are roots of $x^7 + x^2 + 1$: if $\omega^3 = 1$ but $\omega \neq 1,$ then $\omega^7 + \omega^2 + 1 = \omega + \omega^2 + 1 = 0.$ Hence $(x - \omega)(x- \omega^2) = x^2 + x + 1$ must divide the polynomial $x^7 + x^2 + 1.$
If this seems uncomfortable, just note that $x^2 + x + 1$ is the minimal polynomial of $\omega$ over $\mathbb Q,$ so it must divide any rational polynomial having $\omega$ as a root. Furthermore, Gauss's lemma on content tells us that the quotient polynomial has integer coefficients, not just rational ones.
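The displayed factorization can also be confirmed by multiplying coefficient lists; here is a short sketch in plain Python, with polynomials stored as lists of coefficients in ascending degree:

```python
# Multiply two polynomials given as coefficient lists (ascending degree)
# and check that (x^2 + x + 1)(x^5 - x^4 + x^2 - x + 1) = x^7 + x^2 + 1.

def poly_mul(a, b):
    """Convolution of coefficient lists: c[k] = sum_i a[i] * b[k - i]."""
    c = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            c[i + j] += ai * bj
    return c

quadratic = [1, 1, 1]                 # x^2 + x + 1
quintic   = [1, -1, 1, 0, -1, 1]      # x^5 - x^4 + x^2 - x + 1
target    = [1, 0, 1, 0, 0, 0, 0, 1]  # x^7 + x^2 + 1

print(poly_mul(quadratic, quintic) == target)  # True
```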
Similarly, the polynomial $x^{141} + x^{93} + x^{82} + x^{44} + 1$ is divisible by $x^4 + x^3 + x^2 + x + 1$: consider a primitive fifth root of unity $\zeta$ and note that the exponents $141, 93, 82, 44, 0$ run through all five residue classes $\pmod 5,$ so $\zeta^{141} + \zeta^{93} + \zeta^{82} + \zeta^{44} + 1 = \zeta + \zeta^3 + \zeta^2 + \zeta^4 + 1 = 0.$
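The same mechanism can be tested numerically (a small sketch using Python's exact big integers): the exponents cover every residue class mod $5$, and since the divisibility holds in $\mathbb Z[x]$, plugging in any integer $x$ must give an integer divisibility as well.

```python
# The exponents of x^141 + x^93 + x^82 + x^44 + 1 cover all residues mod 5,
# so every primitive fifth root of unity is a root, and
# x^4 + x^3 + x^2 + x + 1 divides the polynomial in Z[x].
# Check both claims with exact integer arithmetic.

exponents = [141, 93, 82, 44, 0]
print(sorted(e % 5 for e in exponents))  # [0, 1, 2, 3, 4]

def big_poly(x: int) -> int:
    return x**141 + x**93 + x**82 + x**44 + 1

def small_poly(x: int) -> int:
    return x**4 + x**3 + x**2 + x + 1

# Polynomial divisibility over Z implies divisibility of integer values.
print(all(big_poly(x) % small_poly(x) == 0 for x in range(2, 10)))  # True
```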