Solving for $a$ in this equation $\sin^a(a) = b$

In the same spirit as amcalde's answer, let us stay in the real domain, which implies $a\in[2k \pi,(2k+1)\pi]$ (as has been pointed out by Quang Hoang in a comment) and $b\in[0,1]$.

Limiting the analysis to the case $k=0$ and considering $f(x)= \sin^x(x)$, you can easily notice that, starting at $x=0$ $(f(0)=1)$, it decreases and reaches a minimum close to $x_1\approx 0.398576$ $(f_1=f(x_1)\approx 0.68575)$, then increases up to $x_2=\frac \pi 2$ $(f(x_2)=1)$ and finally decreases to $0$ at $x_3=\pi$ $(f(x_3)=0)$.

So, depending on the value of $b$, the equation $$\sin^x(x)=b$$ can have

  • two solutions if $b=1$ (namely $0$ and $x_2$)
  • three solutions if $f_1<b<1$
  • two solutions if $b=f_1$
  • one solution if $b<f_1$

As said, only numerical methods (such as Newton's) will solve the problem; the question is then to find a reasonable starting value.

Just as amcalde answered, look at the plot, make a guess $x_0$ for the root(s) you want, and run Newton iterations $$x_{n+1}=x_n-\frac{\sin ^{x_n}(x_n)-b}{\sin ^{x_n}(x_n) (x_n \cot (x_n)+\log (\sin (x_n)))}$$ until the required accuracy is reached.

For illustration purposes, consider $b=0.8$. The plot shows roots close to $0.1$, $0.9$, and $2.0$.

Applying the method for the first root, the iterates will be $$x_1=0.0944109$$ $$x_2=0.0945459$$ $$x_3=0.0945460$$ which is the solution to six significant figures.

Applying the method for the second root, the iterates will be $$x_1=0.892865$$ $$x_2=0.892846$$

Applying the method for the third root, the iterates will be $$x_1=2.03211$$ $$x_2=2.03102$$

As you see, the solutions are obtained quite fast.
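For the record, here is a minimal sketch of this iteration in Python; the helper name `newton_sin_pow`, the tolerance and the iteration cap are my own choices, not part of the original procedure.

    import math

    def newton_sin_pow(b, x0, tol=1e-10, max_iter=50):
        """Newton iterations for sin(x)**x = b, starting from the guess x0."""
        x = x0
        for _ in range(max_iter):
            s = math.sin(x)
            if s <= 0:        # the iterate has left the region where sin(x) > 0; give up
                break
            f = s**x - b
            fp = s**x * (x / math.tan(x) + math.log(s))   # derivative of sin(x)**x
            x_new = x - f / fp
            if abs(x_new - x) < tol:
                return x_new
            x = x_new
        return x

    # the three roots of sin(x)**x = 0.8, starting from the guesses 0.1, 0.9, 2.0 read off the plot
    for x0 in (0.1, 0.9, 2.0):
        print(newton_sin_pow(0.8, x0))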

Edit

After Quang Hoang's good point, I had a look at the case $k>0$, which is quite interesting. For any value of $b\in[0,1]$, it seems that there are always two roots, one on each side of $(4k+1)\frac \pi 2$. The curves become sharper and sharper peaks as $k$ increases; they look like Gaussian curves, apart from some skewness which numerical calculations easily reveal. At mid-height, corresponding to $b=0.5$, the distance between the two roots is more or less $2 \sqrt{\frac{\log (2)}{1+k\pi}}$.

The problem then becomes simpler than for the case $k=0$ since, for $x\in[2k \pi,(2k+1)\pi]$, we can approximate very accurately $$\sin^x(x)\approx \exp \left(-(1+k\pi) \left(x-(4k+1)\frac{\pi}{2} \right)^2\right)$$ from which quite good estimates can be generated for each of the two roots: $$x_{\pm}=(4k+1)\frac{\pi}{2}\pm \sqrt{\frac{-\log(b)}{1+k\pi}}$$ For example, still using $b=0.8$ and $k=1$, the above approximation gives the estimates $x_-=7.62186$ and $x_+=8.08610$, while the solutions are $7.61304$ and $8.08781$. To some extent, the formula also applies to the estimates of the two largest solutions when $k=0$ (for the case considered above, it gives the estimates $1.09842$ and $2.04318$).
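As a quick check of these numbers, the little sketch below just evaluates the $x_\pm$ formula (the function name is mine); the results can then be polished with the Newton sketch given earlier.

    import math

    def gaussian_estimates(b, k):
        """Starting values for the two roots in [2k*pi, (2k+1)*pi], k >= 1."""
        center = (4 * k + 1) * math.pi / 2
        half_width = math.sqrt(-math.log(b) / (1 + k * math.pi))
        return center - half_width, center + half_width

    print(gaussian_estimates(0.8, 1))   # ~ (7.62186, 8.08610), as quoted above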

On the other hand, the case $k<0$ seems to be a nightmare. At first it did not seem that any solutions exist; according to Quang Hoang's comment, this is wrong: there are indeed two solutions if $b>1$.

Edit

For the case of $k=0$ and $f_1<b<1$ (that is, $0.68575<b<1$), initial (very crude) estimates can be obtained as follows:

  • For the first root, $$x=x_1 \left(1-\sqrt{\frac{b-f_1}{1-f_1}}\right)$$ This was obtained by fitting the function with a quadratic polynomial passing through the points $(0,1)$ and $(x_1,f_1)$, with a horizontal tangent at $(x_1,f_1)$.

  • For the second root,$$x=\frac{\pi}{2}- \sqrt{-\log(b)}$$

  • For the third root,$$x=\frac{\pi}{2}+ \sqrt{-\log(b)}$$

For the case $b=0.8$, this will generate the estimates $0.16$, $1.10$ and $2.04$, which seem to be quite reasonable; starting with these values, Newton's method converges in very few iterations.
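If it helps, here is a small sketch of these three crude estimates (the constants are the $x_1$ and $f_1$ quoted earlier; the function name is mine):

    import math

    X1, F1 = 0.398576, 0.68575   # location and value of the minimum of sin(x)**x on (0, pi/2)

    def crude_estimates_k0(b):
        """Crude starting values for the three roots when k = 0 and f1 < b < 1."""
        first  = X1 * (1 - math.sqrt((b - F1) / (1 - F1)))
        second = math.pi / 2 - math.sqrt(-math.log(b))
        third  = math.pi / 2 + math.sqrt(-math.log(b))
        return first, second, third

    print(crude_estimates_k0(0.8))   # ~ (0.16, 1.10, 2.04)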

For the case where $k=0$ and $0<b<f_1$, the estimate of the solution is $$x=\pi -(\pi -x_4) \sqrt{\frac{b}{f_1}}$$ This has been obtained by approximating the function with a quadratic polynomial passing through the points $( x_4,f_1)$ and $( \pi ,0)$, with a horizontal tangent at the end point $(\pi,0)$ ($x_4 \approx 2.14647$ corresponds to the largest solution of $\sin^x(x)=f_1$).

Let us try Newton's method for $b=0.1$. The successive iterates are then $$x_0=2.76158$$ $$x_1=2.69242$$ $$x_2=2.70099$$ $$x_3=2.70112$$
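Again as a sketch (the function name is mine, the constants are those quoted above), the starting value used here is simply

    import math

    X4, F1 = 2.14647, 0.68575   # largest solution of sin(x)**x = f1, and f1 itself

    def estimate_k0_small_b(b):
        """Starting value for the single root when k = 0 and 0 < b < f1."""
        return math.pi - (math.pi - X4) * math.sqrt(b / F1)

    print(estimate_k0_small_b(0.1))   # ~ 2.76158, the x0 used above

and feeding this value to the Newton sketch given earlier should reproduce the iterates listed above.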

Edit

Another case to consider is when $b$ is a very small number. The approximation given by the tails of the approximating Gaussian curves is then not very good. In such a case, what is proposed is to use as starting values

  • for the smallest root, $x=2k\pi+b^{\frac 1{2k\pi}}$
  • for the largest root, $x=(2k+1)\pi-b^{\frac 1{2k\pi}}$

For illustration purposes, consider $k=1$ and $b=10^{-4}$. The estimates are then $6.51406$ and $9.19390$ while the solutions are $6.39888$ and $9.20013$.
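As a sketch (the function name is mine), these starting values are

    import math

    def small_b_estimates(b, k):
        """Starting values for the two roots in [2k*pi, (2k+1)*pi] when b is very small."""
        eps = b ** (1.0 / (2 * k * math.pi))
        return 2 * k * math.pi + eps, (2 * k + 1) * math.pi - eps

    print(small_b_estimates(1e-4, 1))   # ~ (6.51406, 9.19390)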

Edit

When we generate an estimate $x_0$, it must satisfy $x_0\in[2k \pi,(2k+1)\pi]$. With the Gaussian approximation, the value of the function is not zero at the bounds. So, with a minor sacrifice on the value of the maximum of the curve, I suggest using instead $$\sin^x(x)\approx \exp \left(-(1+k\pi) \left(x-(4k+1)\frac{\pi}{2} \right)^2\right)-\exp \left({-\frac{1}{4} \pi ^2 (1+k\pi )}\right)$$ which gives the estimates $$x_\pm= \left(2 k+\frac{1}{2}\right)\pi\pm\sqrt{-\frac{\log \left(b+e^{-\frac{1}{4} \pi ^2 (1+k\pi)}\right)}{1+k\pi }}$$ This also works fine for small values of $b$.

Finally, in order to introduce some skewness in the Gaussian, the function is defined separately for the left and right sides according to $$\sin^x(x)\approx \exp \left(-\alpha \left(x-(4k+1)\frac{\pi}{2} \right)^2\right)-\exp \left({-\frac{\alpha}{4} \pi ^2 }\right)$$ using $\alpha=\frac 34+k\pi$ for the left side and $\alpha=\frac 54+k\pi$ for the right side of the curve.

For illustration purposes, consider again $k=1$ and $b=10^{-4}$. The estimates are then $6.35931$ and $9.28798$, while the solutions are $6.39888$ and $9.20013$.
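For completeness, a sketch of these last estimates (the function name is mine; taking $\alpha=1+k\pi$ on both sides recovers the symmetric formula of the previous edit):

    import math

    def skewed_estimates(b, k):
        """Starting values from the shifted, skewed Gaussian approximation."""
        center = (2 * k + 0.5) * math.pi
        roots = []
        for alpha, sign in ((0.75 + k * math.pi, -1.0), (1.25 + k * math.pi, +1.0)):
            offset = math.sqrt(-math.log(b + math.exp(-alpha * math.pi**2 / 4)) / alpha)
            roots.append(center + sign * offset)
        return tuple(roots)

    print(skewed_estimates(1e-4, 1))   # ~ (6.35931, 9.28798)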


Looking at the plot of this function, you probably want $b$ to be in the range $[0,1]$ and $a\in[0,\pi]$. The best you can do in the general case is to use Newton's method. Set $$f(x) = \sin^x(x) - b,$$then $$f'(x) = \sin^x(x)\left(x\cot(x) + \log(\sin(x))\right)$$

If, given $b$, I had to numerically approximate this, I would

1) Graph $f(x)$ and look for a zero.

2) Select a starting input to Newton's method that is close to your zero.

3) Iterate until you have the desired convergence, keeping track of the iterates so your solution does not diverge. But it looks like a nice enough function that you probably needn't worry in this range.
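If you prefer not to hand-roll Newton's method, a possible alternative (assuming SciPy is available) is to bracket each root from the plot and let a safeguarded solver such as scipy.optimize.brentq do the work, which also takes care of the divergence worry in step 3:

    import math
    from scipy.optimize import brentq

    b = 0.8
    f = lambda x: math.sin(x) ** x - b

    # brackets read off the plot for the three roots of sin(x)**x = 0.8 on (0, pi)
    for lo, hi in ((0.05, 0.3), (0.5, 1.2), (1.6, 3.1)):
        print(brentq(f, lo, hi))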