On the functional square root of $x^2+1$
Introduce a new coordinate system with a fixed point of $f(x)=x^2+1$ as origin, e.g. the point $\omega:=e^{\pi i/3}$. Writing $x=\omega+\xi$ with a new independent variable $\xi$, one has $\phi(\omega+\xi)=\omega+\psi(\xi)$, where $\phi$ is the sought half-iterate (so $\phi(\phi(x))=x^2+1$) and $\psi$ is a new unknown function with $\psi(0)=0$. In a neighbourhood of $\xi=0$ this function $\psi$ satisfies the functional equation $\psi(\psi(\xi))=2\omega\xi+\xi^2$. Now you can recursively determine the Taylor coefficients of $\psi$. If you are lucky, the resulting formal series actually defines an analytic function in a neighbourhood of $\xi=0$.
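Here is a minimal numerical sketch of that recursion (not part of the original answer): a plain truncated power-series composition, with the helper name `compose` and the truncation order `N` chosen only for illustration. Comparing the linear terms gives $a_1^2=2\omega$; every later coefficient $a_n$ then enters only linearly, with factor $a_1+a_1^n$.

```python
import cmath

N = 8                                        # truncation order: psi ~ a[1]*xi + ... + a[N]*xi^N
omega = cmath.exp(1j * cmath.pi / 3)         # fixed point of f(x) = x^2 + 1

def compose(a, b, order):
    """Coefficients (degrees 0..order) of a(b(xi)) for series with zero constant term."""
    res = [0j] * (order + 1)
    bk = [0j] * (order + 1)
    bk[0] = 1 + 0j                           # running power b(xi)^k, starting with b^0 = 1
    for k in range(1, order + 1):
        new = [0j] * (order + 1)             # multiply bk by b, truncating at 'order'
        for i in range(order + 1):
            if bk[i] != 0:
                for j in range(1, order + 1 - i):
                    new[i + j] += bk[i] * b[j]
        bk = new
        for n in range(1, order + 1):
            res[n] += a[k] * bk[n]
    return res

rhs = [0j] * (N + 1)                         # right-hand side 2*omega*xi + xi^2
rhs[1] = 2 * omega
rhs[2] = 1 + 0j

a = [0j] * (N + 1)                           # Taylor coefficients of psi
a[1] = cmath.sqrt(2 * omega)                 # from a[1]^2 = 2*omega (principal branch)
for n in range(2, N + 1):
    # With a[n] still 0, the degree-n coefficient of psi(psi(xi)) collects only a[1..n-1];
    # a[n] itself enters linearly with factor a[1] + a[1]**n.
    low = compose(a, a, N)[n]
    a[n] = (rhs[n] - low) / (a[1] + a[1] ** n)

for n in range(1, N + 1):
    print(f"a[{n}] = {a[n]:.8f}")
```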
Look also at this answer:
https://mathoverflow.net/questions/17605/how-to-solve-ffx-cosx/44727#44727
In short, the analytic solution is
$$f^{[1/2]}(x)=\phi(x)=\sum_{m=0}^{\infty} \binom {1/2}m \sum_{k=0}^m\binom mk(-1)^{m-k}f^{[k]}(x)$$
$$f^{[1/2]}(x)=\lim_{n\to\infty}\binom {1/2}n\sum_{k=0}^n\frac{1/2-n}{1/2-k}\binom nk(-1)^{n-k}f^{[k]}(x)$$
$$f^{[1/2]}(x)=\lim_{n\to\infty}\frac{\sum_{k=0}^{n} \frac{(-1)^k f^{[k]}(x)}{(1/2-k)k!(n-k)!}}{\sum_{k=0}^{n} \frac{(-1)^k }{(1/2-k) k!(n-k)!}}$$
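As a rough numerical illustration (not from the linked answer), the expression under the limit in the last formula can be evaluated directly for moderate $n$. For $f(x)=x^2+1$ the iterates $f^{[k]}(x)$ grow far too quickly for this to be useful numerically, so the sketch below uses $f=\cos$, the function treated in the linked MathOverflow question; the helper name `half_iterate_newton` is only illustrative.

```python
from math import cos, factorial

def half_iterate_newton(f, x, n):
    """Partial expression under the limit in the last formula:
    a ratio of two sums over k = 0..n built from the iterates f^[k](x)."""
    iterates = [x]                       # f^[0](x), f^[1](x), ..., f^[n](x)
    for _ in range(n):
        iterates.append(f(iterates[-1]))
    num = sum((-1) ** k * iterates[k] / ((0.5 - k) * factorial(k) * factorial(n - k))
              for k in range(n + 1))
    den = sum((-1) ** k / ((0.5 - k) * factorial(k) * factorial(n - k))
              for k in range(n + 1))
    return num / den

# Partial values for f = cos at x = 1; whether and how fast these converge
# depends on f, as discussed in the linked answer.
for n in (5, 10, 20):
    print(n, half_iterate_newton(cos, 1.0, n))
```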
In the same way you can find not only the iterative square root but an iterative root of any order.
(Update: Oh sorry; I'm new here and didn't notice the much more substantial link to MathOverflow. But perhaps this is interesting for some newbie anyway...)
If the function is a polynomial (or power series) with constant term $\ne0$, this is difficult, but sometimes a meaningful solution can be given.
If the function is as above but the constant term is zero, then it is just a matter of relatively simple manipulation of the formal power series/polynomial to construct a "compositional" (or "iterative") root.
There is a very good treatment in L. Comtet, "Advanced Combinatorics", around pages 130-150 (I don't have it at hand).
Also the keywords "Bell matrix" and "Carleman matrix" are helpful: these matrices translate function composition/iteration into matrix products/matrix powers. (The matrix notation just encodes the required formal power-series manipulations.) This is well established for functions $f(x)= ax + bx^2 + cx^3 +\cdots$.
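A minimal generic sketch of this (not tied to any particular reference): build a truncated Carleman matrix for a series without constant term and check that composition of series corresponds to the product of their matrices. The example series $f(x)=2x+x^2$ is arbitrary.

```python
import numpy as np

def carleman(coeffs, N):
    """Truncated Carleman matrix M with M[j, k] = coefficient of x^k in f(x)^j,
    for f(x) = coeffs[1]*x + coeffs[2]*x^2 + ... (no constant term), 0 <= j, k <= N."""
    M = np.zeros((N + 1, N + 1))
    M[0, 0] = 1.0                                   # f^0 = 1
    M[1, :len(coeffs)] = coeffs                     # row 1: f itself
    for j in range(2, N + 1):                       # f^j = f^(j-1) * f, truncated at degree N
        for i in range(N + 1):
            if M[j - 1, i] != 0:
                for k in range(1, N + 1 - i):
                    M[j, i + k] += M[j - 1, i] * M[1, k]
    return M

N = 6
f = [0.0, 2.0, 1.0]              # example: f(x) = 2x + x^2
M = carleman(f, N)

# Composition corresponds to matrix multiplication, so f(f(x)) is encoded in row 1 of M @ M.
print((M @ M)[1])                # expect [0, 4, 6, 4, 1, 0, 0], since f(f(x)) = 4x + 6x^2 + 4x^3 + x^4
```

Iteration then corresponds to matrix powers, and a half-iterate to (one branch of) a matrix square root of $M$, whose row $1$ holds the coefficients of $f^{[1/2]}$.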
If the constant term does not vanish, $f(x) = K + ax + bx^2 + cx^3 +\cdots$, then things are usually much more difficult. But there is one workaround, which is, I think, again well established:
Rewrite $f(x)$ as $g(x-x_0)+x_0$, i.e. set $g(x):=f(x+x_0)-x_0$ with $x_0$ a fixed point of $f$, so that $g(x)$ has no constant term, and then apply the above-mentioned procedures (solving for the iterative square root and the like) to $g(x)$.
Example: denote the iterative square root of some function $f(x)$ by $f^{[1/2]}(x)$; then solve
$$f^{[1/2]}(x) = g^{[1/2]}(x-x_0)+x_0$$
where you must first find $x_0$.
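A small sketch of this conjugation for $f(x)=x^2+1$ (assuming the fixed point $x_0=e^{i\pi/3}$ used in the update below): it only checks that $g(x):=f(x+x_0)-x_0$ has no constant term and that $f(x)=g(x-x_0)+x_0$.

```python
import cmath

def f(x):
    return x * x + 1

x0 = cmath.exp(1j * cmath.pi / 3)          # fixed point: x0^2 + 1 = x0

def g(x):                                   # g(x) = f(x + x0) - x0 = 2*x0*x + x^2
    return f(x + x0) - x0

print(abs(f(x0) - x0))                      # ~0: x0 really is a fixed point
print(abs(g(0)))                            # ~0: the constant term of g vanishes
z = 0.3 - 0.2j                              # arbitrary test point
print(abs(f(z) - (g(z - x0) + x0)))         # ~0: f(x) = g(x - x0) + x0
```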
[update 2] I have tried to generate that power series for $g^{[1/2]}(x)$, which then has complex coefficients. I got $$ g^{[1/2]}(x) \approx \sqrt{2 x_0}\, x + (0.20412415 - 0.22379688 i) x^2 + (0.050024298 + 0.048042380 i) x^3 + (-0.022112213 + 0.028383580 i) x^4 + (-0.023623808 - 0.010393981 i) x^5 + (0.00074679924 - 0.021136421 i) x^6 + O(x^7)$$ where $f^{[1/2]}(x) = g^{[1/2]}(x-x_0) + x_0 $, the fixed point is $x_0 = \exp(i \pi /3 ) \approx 1/2 + 0.86602540 i $, and $\sqrt{2 x_0}=1.2247449 + 0.70710678 i $. The range of convergence is small; however, using Euler summation (of complex orders) I could reproduce $f(x)$ for $x$ between $0$ and $1$ by applying the half-iterate twice, to about 6 to 8 correct digits. [end update 2]
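For what it's worth, the quoted coefficients can be plugged in directly to check the construction numerically near the fixed point. A sketch using only the six terms given above, so the accuracy is limited by the truncation and by the distance from $x_0$:

```python
import cmath

x0 = cmath.exp(1j * cmath.pi / 3)                       # fixed point of f(x) = x^2 + 1
# Coefficients of g^{[1/2]} as quoted above (a[1] = sqrt(2*x0)); higher terms are truncated.
a = [0,
     cmath.sqrt(2 * x0),
     0.20412415 - 0.22379688j,
     0.050024298 + 0.048042380j,
     -0.022112213 + 0.028383580j,
     -0.023623808 - 0.010393981j,
     0.00074679924 - 0.021136421j]

def g_half(x):                                          # truncated series for g^{[1/2]}
    return sum(a[k] * x ** k for k in range(1, 7))

def f_half(x):                                          # f^{[1/2]}(x) = g^{[1/2]}(x - x0) + x0
    return g_half(x - x0) + x0

x = x0 + 0.1                                            # stay close to the fixed point
print(f_half(f_half(x)))                                # should be close to ...
print(x * x + 1)                                        # ... f(x) = x^2 + 1
```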
For a very basic introduction you may try my small essay at
go.helms-net.de/math/tetdocs
and look for the article "continuous functional iteration".