Asymptotic inverses of asymptotic functions

This isn't meant to be comprehensive, but here is a fairly straightforward result that is also general enough to handle your question about $\pi(x)$. I'll state it in a way that might feel a bit backwards at first, but doing so keeps the statement concise by leaning on the existing notion of uniform continuity.

Let $f: \mathbb R^+ \to \mathbb R^+$ be a strictly increasing function with $\lim_{x\to\infty} f(x) = \infty$, and define the conjugate function $g(x) := \log f(\exp(x))$ (which also strictly increases to infinity). If $g^{-1}$ is uniformly continuous on some interval $[a,\infty)$, then $y \sim f(x)$ implies $x \sim f^{-1}(y)$.
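
One bookkeeping identity is worth recording, since the proof below will use it: the inverse of the conjugate is the conjugate of the inverse. Unwinding the definition,
$$t = g(x) = \log f(e^x) \iff e^t = f(e^x) \iff e^x = f^{-1}(e^t) \iff x = \log f^{-1}(e^t),$$
so $g^{-1}(t) = \log f^{-1}(\exp(t))$.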

If $f(x) = x^n$, then the conjugate is just the linear function $nx$, which is of course uniformly continuous. A more interesting case is $f(x) = x/\log x$: its conjugate is $x - \log x$, whose derivative $1 - 1/x$ is at least $1/2$ on $[2,\infty)$, so its inverse is Lipschitz there (with constant $2$), hence uniformly continuous. In fact it's not hard to see that this property is preserved under multiplication and division by slowly-growing functions like $\log x$, $\log \log x$, or even $\exp(\sqrt{\log x})$, so it applies to most of the counting functions you'll encounter in Hardy and Wright.
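
To see the mechanism numerically, here is a minimal sketch in Python (the names `f` and `f_inverse` are mine, and the inverse is computed by bisection since there is no closed form): we feed in a $y$ with $y/f(x) = 1 + 1/\log x \to 1$, so $y \sim f(x)$, and watch the recovered ratio $f^{-1}(y)/x$ drift toward $1$.

```python
import math

def f(x):
    # f(x) = x / log x; strictly increasing for x > e
    return x / math.log(x)

def f_inverse(y, lo=3.0, hi=1e30):
    # Invert the increasing function f on [3, 1e30] by bisection,
    # using the geometric midpoint since the bracket spans many decades.
    for _ in range(200):
        mid = math.sqrt(lo * hi)
        if f(mid) < y:
            lo = mid
        else:
            hi = mid
    return lo

for x in (1e4, 1e8, 1e12, 1e16):
    y = (1 + 1 / math.log(x)) * f(x)   # y/f(x) -> 1, i.e. y ~ f(x)
    print(f"x = {x:.0e}   f_inverse(y)/x = {f_inverse(y) / x:.4f}")
```

The ratio creeps toward $1$ only logarithmically slowly, which is the same sluggishness you see in $\pi(x)\log x/x \to 1$; the theorem promises convergence, not speed.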

This result might be considered "obvious" for a fixed function $f$, so it could easily be taken for granted. But the general principle is not hard to prove, either: assume the conditions hold and $y \sim f(x)$, so $y/f(x) \to 1$ as $x\to\infty$. Taking logs (using continuity of $\log$ at $1$), we get $\log y - \log f(x) \to 0$; writing $X = \log x$ and $Y = \log y$, this says $Y - g(X) \to 0$. Applying uniform continuity of $g^{-1}$ now gives $g^{-1}(Y) - g^{-1}(g(X)) = g^{-1}(Y) - X \to 0$, which by the identity $g^{-1}(Y) = \log f^{-1}(e^Y)$ above unravels to $\log f^{-1}(y) - \log x \to 0$; exponentiating, $f^{-1}(y)/x \to 1$, in other words $x \sim f^{-1}(y)$.
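
For completeness, here is the uniform-continuity step written out in $\varepsilon$-$\delta$ form: for every $\varepsilon > 0$ there is a $\delta > 0$ such that
$$|u - v| < \delta \implies |g^{-1}(u) - g^{-1}(v)| < \varepsilon \qquad (u, v \ge a),$$
and we apply this with $u = Y$ and $v = g(X)$. Once $x$ is large enough that $Y \ge a$, $g(X) \ge a$ (both tend to infinity), and $|Y - g(X)| < \delta$, we conclude $|g^{-1}(Y) - X| < \varepsilon$.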