Fast evaluation of polynomials
If the polynomial is given as $\alpha_0x^0+\dots+\alpha_nx^n$ and you know nothing a priori about the $\alpha_i$'s, then you can't do better than Horner's scheme, which takes $n$ additions and $n$ multiplications. If you know the polynomial is sparse and are given the list of nonzero coefficients, you can evaluate the individual terms using repeated squaring; this takes about $k$ additions and $O(k\log n)$ multiplications, where $k$ is the number of nonzero terms. Other information about the polynomial may also help in principle, such as symmetries in the coefficient list.
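A minimal sketch of both schemes (the function names are mine; `terms` for the sparse case is a list of `(exponent, coefficient)` pairs, and Python's built-in `**` on integers already uses a binary-powering method):

```python
def horner(coeffs, x):
    """Evaluate a_0 + a_1*x + ... + a_n*x^n with coeffs = [a_0, ..., a_n],
    using n additions and n multiplications."""
    acc = 0
    for a in reversed(coeffs):
        acc = acc * x + a
    return acc

def sparse_eval(terms, x):
    """Evaluate a sparse polynomial given its k nonzero terms as
    (exponent, coefficient) pairs: about k additions and O(k log n)
    multiplications via repeated squaring."""
    return sum(c * x**e for e, c in terms)
```

For example, `horner([1, 2, 3], 2)` computes $1 + 2\cdot 2 + 3\cdot 2^2 = 17$, and `sparse_eval([(0, 1), (100, 2)], 2)` computes $1 + 2\cdot 2^{100}$ without touching the 99 zero coefficients.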
The simplest version of this question is: what is the quickest way to evaluate $x^n$? For $n = 2^k$, $k$ repeated squarings is obviously best, but for more complicated $n$ I believe that finding the optimum is very hard -- see Knuth, vol. 2, for (much) more on these so-called addition chains.
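The standard square-and-multiply (binary) method is a sketch of the non-optimal but simple baseline; it uses about $\log_2 n$ squarings plus one extra multiplication per 1-bit of $n$, which is not always the shortest addition chain (e.g. for $n=15$ the binary method uses 6 multiplications but a chain through $x^3$ uses 5):

```python
def power(x, n):
    """Compute x**n for integer n >= 0 by repeated squaring:
    square for each bit of n, multiply into the result for each 1-bit."""
    result = 1
    while n > 0:
        if n & 1:          # this bit of n is set: fold current square in
            result *= x
        x *= x             # square for the next bit
        n >>= 1
    return result
```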
The cheapest way of finding the value of a polynomial, given unlimited preprocessing resources, is to look up the precalculated value in a table. However, if you know you are going to need values at several successive points, you might try a method like the one devised by Charles Babbage: finite differences. Namely, store the value of the polynomial at the point $x$ together with its differences of orders $1$ through $n$ (which play a role similar to derivatives), and then use $n$ additions to derive the value and differences at the point $x+1$. If you need to loop through successive integers, this gets each value with $O(n)$ additions per evaluation point.
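A sketch of the difference method under the assumption that the polynomial has degree $n$ (function names are mine): precompute the table of forward differences at the starting point from $n+1$ values, then each `step` advances the whole table one integer using only $n$ additions, with `diffs[0]` always holding the current value.

```python
def difference_table(p, x0, n):
    """Build [p(x0), Δp(x0), Δ²p(x0), ..., Δⁿp(x0)] from the values
    p(x0), ..., p(x0 + n), where Δp(x) = p(x+1) - p(x)."""
    vals = [p(x0 + i) for i in range(n + 1)]
    diffs = []
    for _ in range(n + 1):
        diffs.append(vals[0])
        vals = [b - a for a, b in zip(vals, vals[1:])]
    return diffs

def step(diffs):
    """Advance the table from x to x+1 in place: each entry absorbs the
    next-higher difference, costing n additions total."""
    for i in range(len(diffs) - 1):
        diffs[i] += diffs[i + 1]
    return diffs
```

For $p(x) = x^2$ starting at $0$, the table is `[0, 1, 2]`; successive calls to `step` make `diffs[0]` run through $1, 4, 9, \dots$ (for a degree-$n$ polynomial the order-$n$ difference is constant, which is why the table never needs refreshing).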
(Of course, needing random or real access to the polynomial will require something different, but you might find storing derivative values useful for evaluating the polynomial at nearby points, especially if multiplication is expensive.)
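A sketch of that last idea, under the assumption that you have precomputed the scaled derivatives $p^{(k)}(x_0)/k!$ at a center $x_0$ (the function name is mine): evaluation near $x_0$ is then just Horner's scheme in the small shift $h = x - x_0$, which can be numerically friendlier when $h$ is small.

```python
def taylor_eval(scaled_derivs, x0, x):
    """Evaluate p(x) = sum_k (p^(k)(x0)/k!) * (x - x0)**k by Horner's
    scheme in the shift h = x - x0; scaled_derivs[k] = p^(k)(x0)/k!."""
    h = x - x0
    acc = 0
    for d in reversed(scaled_derivs):
        acc = acc * h + d
    return acc
```

For $p(x) = x^2$ centered at $x_0 = 1$, the scaled derivatives are `[1, 2, 1]`, and `taylor_eval([1, 2, 1], 1, 3)` recovers $3^2 = 9$.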
Gerhard "Email Me About System Design" Paseman, 2011.07.04