Is there a closed form formula for the recursive sequence: $x_n = x_{n-1} + \alpha\sqrt{x_{n-1}}$

The bonus question turns out to be easier to solve than the actual problem. Note that the recurrence can be rewritten as

$$x_{n+1}-x_n=\alpha\sqrt{x_n}$$

Consider a variation on this:

$$\frac{dy}{dt}=\underbrace{\lim_{h\to0}{y(t+h)-y(t)\over h}}_{\approx~y(t+1)-y(t)}=\alpha\sqrt{y(t)}$$

This ODE is separable: integrating $\frac{dy}{\sqrt y}=\alpha\,dt$ gives $2\sqrt{y(t)}=\alpha t+2\sqrt{y(0)}$, i.e.

$$y(t)=\left(\frac{\alpha t}2+\sqrt{y(0)}\right)^2$$

Thus, writing $\beta = x_0$ for the starting value, the solution to your problem is approximately given by

$$x_n\approx\left(\frac{\alpha n}2+\sqrt\beta\right)^2$$

Note that:

$$y(t+1)-y(t)=\alpha\left(\frac{\alpha t}2+\sqrt{y(0)}\right)+\frac{\alpha^2}4$$

$$\alpha\sqrt{y(t)}=\frac{\alpha^2t}2+\alpha\sqrt{y(0)}$$

That is,

$$y(t+1)-y(t)=\alpha\sqrt{y(t)}+\frac{\alpha^2}4$$

So the continuous solution overshoots the recurrence by only the constant $\frac{\alpha^2}4$ per step, which is negligible once $x_n$ is large; the approximation is therefore quite close.
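As a quick numerical sanity check (a minimal sketch; the values of $\alpha$, $\beta$, and $n$ below are arbitrary illustrative choices), one can iterate the recurrence and compare it with the closed-form approximation:

```python
import math

def iterate(x0, alpha, n):
    """Iterate x_{k+1} = x_k + alpha*sqrt(x_k), n times, starting from x0."""
    x = x0
    for _ in range(n):
        x += alpha * math.sqrt(x)
    return x

alpha, beta, n = 0.5, 2.0, 100   # arbitrary illustrative values
exact = iterate(beta, alpha, n)
approx = (alpha * n / 2 + math.sqrt(beta)) ** 2

print(exact, approx)
```

Since the continuous solution gains an extra $\alpha^2/4$ per step, the formula slightly overshoots the true sequence, but the relative error stays small.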

Thus, the inverse sequence $z_n$ (the approximate number of steps needed for the sequence to reach the value $n$) is given by

$$z_n\approx\frac2\alpha\left(\sqrt n-\sqrt\beta\right)$$
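The inverse formula can be checked numerically as well (a sketch with arbitrary parameter values): pick a target value, predict the index from the formula, and count the actual number of steps:

```python
import math

alpha, beta = 0.5, 2.0        # arbitrary illustrative values
target = 10_000.0

# predicted index at which the sequence reaches `target`
n_pred = (2 / alpha) * (math.sqrt(target) - math.sqrt(beta))

# actual number of steps of the recurrence
x, n = beta, 0
while x < target:
    x += alpha * math.sqrt(x)
    n += 1

print(n, n_pred)
```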


Some notes on the asymptotic behavior of the sequence in question.

  • You are iterating the function $f(x) = x + \alpha \sqrt{x}$. For positive $\alpha$ and $x_0 > 0$, this produces an increasing sequence, and it is not hard to see that it grows to infinity.
  • If you define $g(x) = \sqrt{x^2 + \alpha x}$, you can verify that $f(x^2) = g(x)^2$.
  • Defining $y_n = \sqrt{x_n}$, you can see that $y_n = g(y_{n-1})$.

Consider the expansion (valid for large $x$)

$$g(x) = \sqrt{x^2 + \alpha x} = x \sqrt{1 + \frac{\alpha }{x}} = x \left( 1 + \frac{\alpha}{2x} - \frac{\alpha^2}{8x^2} + o(x^{-2})\right)$$ $$= x + \frac{\alpha}{2} - \frac{\alpha^2}{8x} + o(x^{-1})$$
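One can confirm numerically that the remainder after the three explicit terms is indeed $o(x^{-1})$, i.e. that $x$ times the remainder tends to $0$ (the value of $\alpha$ below is arbitrary):

```python
import math

alpha = 0.5   # arbitrary illustrative value

def g(x):
    return math.sqrt(x * x + alpha * x)

# x * remainder should shrink as x grows, since the remainder is o(1/x)
scaled = [x * (g(x) - (x + alpha / 2 - alpha**2 / (8 * x)))
          for x in (1e2, 1e3, 1e4)]
print(scaled)
```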

We can thus infer that, for large $n$

$$y_n \approx \frac{\alpha n}{2}$$

Feeding this back into the asymptotic expansion: the correction term $\frac{\alpha^2}{8 y_n} \approx \frac{\alpha}{4n}$ sums to roughly $\frac{\alpha}{4}\log n$, so we can show further that:

$$y_n \approx \frac{\alpha}{4}(2n - \log n)$$

Deriving extra terms in this expansion is possible, but not necessarily enlightening.

One can then infer that, for large $n$,

$$x_n \approx \frac{\alpha^2}{16}\left(2n - \log n\right)^2$$
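A quick numerical experiment (a sketch with arbitrary $\alpha$ and $x_0$) shows how tight this asymptotic is:

```python
import math

alpha, x0 = 0.5, 2.0   # arbitrary illustrative values
n = 100_000

x = x0
for _ in range(n):
    x += alpha * math.sqrt(x)

asymptotic = (alpha**2 / 16) * (2 * n - math.log(n))**2
print(x / asymptotic)   # ratio close to 1 for large n
```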

This does not really touch on the dependence on $\beta$, which is harder to broach by studying asymptotics.


Here is a proof that $0\leq x_n \leq c (n+1)^2$ for $c = \max\{\frac{\alpha^2}{4}, x_0\}$, assuming $x_0>0$.

Proof (induction): Suppose $0\leq x_n \leq c(n+1)^2$ is true for some nonnegative integer $n$ (it clearly holds for $n=0$). We prove it also holds for $n+1$. Clearly $x_{n+1}\geq 0$. Then, we have
\begin{align}
x_{n+1} &= x_n + \alpha \sqrt{x_n} \\
&\leq c(n+1)^2 + \alpha c^{1/2}(n+1) \quad \mbox{[since it holds for $n$]}\\
&\leq (c^{1/2}(n+1) + \frac{\alpha}{2})^2 \\
&=(c^{1/2}n + c^{1/2} + \frac{\alpha}{2})^2 \\
&\overset{(a)}{\leq}(c^{1/2}n + 2c^{1/2})^2\\
&=c(n+2)^2
\end{align}
where (a) holds because $c^{1/2} + \frac{\alpha}{2} \leq 2c^{1/2}$ (since $c$ was chosen to ensure $c \geq \frac{\alpha^2}{4}$). $\Box$
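The bound is easy to confirm numerically (a sketch; the parameter values are arbitrary, and the assertion simply restates the claim):

```python
import math

alpha, x0 = 1.5, 0.3   # arbitrary values with x0 > 0
c = max(alpha**2 / 4, x0)

x = x0
for n in range(1000):
    assert 0 <= x <= c * (n + 1)**2   # the claimed bound for x_n
    x += alpha * math.sqrt(x)
```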


EDIT: Here is a bound with a coefficient $\alpha^2/4$ regardless of $x_0$ (along the lines of the discussion in the comments below).

Claim: $x_n \leq g(n+b)^2$ for all $n \in \{0, 1, 2, ...\}$, with $g=\alpha^2/4$ and $b = \frac{2\sqrt{x_0}}{\alpha}$.

Proof (induction): Suppose the claim holds for some $n \in \{0, 1, 2, \dots\}$ (it indeed holds for $n=0$). We prove it for $n+1$. We have:
\begin{align}
x_{n+1} &= x_n + \alpha \sqrt{x_n} \\
&\leq g(n+b)^2 + \alpha g^{1/2}(n+b) \\
&= g(n+b)^2 + 2g(n+b) \quad \mbox{[since $\alpha = 2g^{1/2}$]}\\
&= g(n+1+b)^2 - g\\
&\leq g(n+1+b)^2
\end{align}
$\Box$
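This bound too can be verified numerically (a sketch with arbitrary parameter values; a tiny tolerance absorbs floating-point rounding in the base case $g b^2 = x_0$):

```python
import math

alpha, x0 = 1.5, 10.0         # arbitrary illustrative values
g_const = alpha**2 / 4        # the claim's constant g
b = 2 * math.sqrt(x0) / alpha

x = x0
for n in range(1000):
    assert x <= g_const * (n + b)**2 + 1e-9   # the claimed bound for x_n
    x += alpha * math.sqrt(x)
```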