An unpleasant measure theory/functional analysis problem
@supinf showed us a nice family of examples. I think the proof can be simplified. Suppose $1/2<p<1$ and let $f(x)=x^{-p}$ on $(0,1)$ (and $f=0$ outside, as in @supinf's example). The substitution $x=y/n^2$ gives
$$\tag 1 n\int^1_0 x^{-p}\sin(n^2 x)\, dx = n^{2p-1}\int_0^{n^2} y^{-p}\sin(y)\, dy.$$
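(Not part of the argument, but here is a quick numerical sanity check of $(1)$; the sample value $p=0.7$ and the values of $n$ below are arbitrary choices of mine.)

```python
# Numerical sanity check of identity (1).  The sample value p = 0.7 and the
# values of n are arbitrary choices for illustration.
import numpy as np
from scipy.integrate import quad

p = 0.7

for n in (3, 5, 10):
    # Left-hand side of (1): n * int_0^1 x^{-p} sin(n^2 x) dx.
    lhs, _ = quad(lambda x: x ** (-p) * np.sin(n ** 2 * x), 0, 1, limit=500)
    lhs *= n
    # Right-hand side of (1): n^{2p-1} * int_0^{n^2} y^{-p} sin(y) dy.
    rhs, _ = quad(lambda y: y ** (-p) * np.sin(y), 0, n ** 2, limit=500)
    rhs *= n ** (2 * p - 1)
    print(f"n = {n:2d}:  lhs = {lhs:.6f},  rhs = {rhs:.6f}")
```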
The improper integral $\int_0^{\infty} y^{-p}\sin(y)\, dy$ converges (near $0$ the integrand behaves like $y^{1-p},$ and the tail converges by the Dirichlet test). If we show that it is positive, then the integral on the right-hand side of $(1)$ stays bounded away from $0$ for large $n,$ and $(1)\to \infty$ on the order of $n^{2p-1}.$ Now $\int_0^{\infty} y^{-p}\sin(y)\, dy$ equals
$$\sum_{k=0}^{\infty}\int_0^{\pi}\sin y\left(\frac{1}{(2\pi k+y)^p} - \frac{1}{(2\pi k+\pi +y)^p}\right)\,dy.$$
Each of the integrands here is positive on $(0,\pi),$ hence so is the sum, and we're done.
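For what it's worth, a small numerical check of the paired series, again with the sample value $p=0.7$ of my choosing: every term comes out positive and the partial sums settle at a positive value.

```python
# Numerical check of the paired series: each term is positive and the partial
# sums approach a positive limit.  p = 0.7 is an arbitrary sample value.
import numpy as np
from scipy.integrate import quad

p = 0.7

def term(k):
    """int_0^pi sin(y) * [ (2*pi*k + y)^{-p} - (2*pi*k + pi + y)^{-p} ] dy."""
    val, _ = quad(lambda y: np.sin(y) * ((2 * np.pi * k + y) ** (-p)
                                         - (2 * np.pi * k + np.pi + y) ** (-p)), 0, np.pi)
    return val

terms = [term(k) for k in range(2000)]
print(all(t > 0 for t in terms))   # every term is positive
print(sum(terms))                  # partial sum approximating int_0^infty y^{-p} sin(y) dy
```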
I think that for $\alpha\in (\tfrac12,1)$ the function $$ f(x) = \begin{cases} x^{-\alpha} &\text{if}\;x\in(0,1) \\ 0 &\text{if}\;x\not\in (0,1) \end{cases} $$ satisfies $$ \varphi_n(f)\to\infty \qquad(n\to\infty). $$ The proof requires some estimates, since the integral is probably too difficult to calculate exactly.
Let $n\geq 3$ be fixed. For $k\geq 0$, we define $$ I_k := \int_{k\pi n^{-2}}^{(k+1)\pi n^{-2}} f(x)(n\sin(n^2 x)) \mathrm dx. $$ It is clear that $\varphi_n(f) = \sum_{k\geq 0} I_k$ and that $I_k=0$ for large $k$ (namely once $k\pi n^{-2}\geq 1$, because $f$ vanishes there). It is also easy to see that $I_k\geq0$ if $k$ is even and $I_k\leq0$ if $k$ is odd, since $\sin(n^2 x)$ has the sign $(-1)^k$ on the $k$-th interval.
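Here is a small numerical sketch of this decomposition; the values $\alpha=0.75$ and $n=7$ are arbitrary sample choices, not part of the argument.

```python
# Numerical illustration of the decomposition phi_n(f) = sum_k I_k and of the
# sign pattern of the I_k.  alpha = 0.75 and n = 7 are arbitrary sample values.
import numpy as np
from scipy.integrate import quad

alpha, n = 0.75, 7

def f(x):
    return x ** (-alpha) if 0.0 < x < 1.0 else 0.0

def I(k):
    a, b = k * np.pi / n ** 2, (k + 1) * np.pi / n ** 2
    val, _ = quad(lambda x: f(x) * n * np.sin(n ** 2 * x), a, b)
    return val

K = int(np.ceil(n ** 2 / np.pi))            # I_k = 0 once k*pi/n^2 >= 1
I_vals = [I(k) for k in range(K + 1)]

print(all(v >= 0 for v in I_vals[0::2]))    # even-indexed terms are >= 0
print(all(v <= 0 for v in I_vals[1::2]))    # odd-indexed terms are <= 0
print(sum(I_vals))                          # this sum is phi_n(f)
```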
Due to the monotonicity of $f$ we have $ I_k+I_{k+1}\geq 0 $ if $k$ is even: shifting the variable in $I_{k+1}$ by $\pi n^{-2}$ and using $\sin(n^2 x+\pi)=-\sin(n^2 x)$ gives $$ I_k+I_{k+1} = \int_{k\pi n^{-2}}^{(k+1)\pi n^{-2}} \bigl(f(x)-f(x+\pi n^{-2})\bigr)\, n\sin(n^2 x)\, \mathrm dx \geq 0, $$ since $\sin(n^2 x)\geq 0$ on this interval and $f(x)\geq f(x+\pi n^{-2})$.
Let us derive a lower estimate for $I_0+I_1$. For $k=0,1,2,3$ we define the integrals $$ J_k := \int_{k\pi n^{-2}/2}^{(k+1)\pi n^{-2}/2} f(x)(n |\sin(n^2 x)|) \mathrm dx. $$ It is clear that $I_0=J_0+J_1$ and $I_1=-J_2-J_3$. For $k=0,1,2,3$, on the $k$-th interval of integration we have $f(x)\geq ((k+1)\pi n^{-2}/2)^{-\alpha}$, and $n|\sin(n^2 x)|$ has the same integral over every quarter period, namely $\int_0^{\pi n^{-2}/2} n \sin(n^2 x)\,\mathrm dx = 1/n$. Therefore we have the lower estimate $$ J_k \geq \left((k+1)\pi n^{-2}/2\right)^{-\alpha}\cdot\frac1n = ((k+1)\pi/2)^{-\alpha}\, n^{2\alpha-1}. $$ On the other hand, for $k=1,2,3$ we have $f(x)\leq (k\pi n^{-2}/2)^{-\alpha}$ on the $k$-th interval, which gives the upper estimate $$ J_k \leq \left(k\pi n^{-2}/2\right)^{-\alpha}\cdot\frac1n = (k\pi/2)^{-\alpha}\, n^{2\alpha-1}. $$ Combining these estimates, $$ I_0+I_1 = J_0+J_1-J_2-J_3 \geq n^{2\alpha-1} \left( (\pi/2)^{-\alpha} +(2\pi/2)^{-\alpha}-(2\pi/2)^{-\alpha}-(3\pi/2)^{-\alpha}\right) = C\, n^{2\alpha-1}, $$ where $C := (\pi/2)^{-\alpha}-(3\pi/2)^{-\alpha}>0$.
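A quick numerical check of this lower bound, again with the sample value $\alpha=0.75$ and a few values of $n$ of my own choosing: the ratio $(I_0+I_1)/n^{2\alpha-1}$ indeed stays above $C$.

```python
# Checking the lower bound I_0 + I_1 >= C * n^{2*alpha - 1} numerically.
# alpha = 0.75 and the values of n are arbitrary sample choices.
import numpy as np
from scipy.integrate import quad

alpha = 0.75
C = (np.pi / 2) ** (-alpha) - (3 * np.pi / 2) ** (-alpha)   # the constant from the estimate above

for n in (3, 10, 30, 100):
    h = np.pi / n ** 2
    # I_0 and I_1 over the first two half-period intervals (both lie inside (0, 1) for n >= 3).
    I0, _ = quad(lambda x: x ** (-alpha) * n * np.sin(n ** 2 * x), 0, h, limit=200)
    I1, _ = quad(lambda x: x ** (-alpha) * n * np.sin(n ** 2 * x), h, 2 * h, limit=200)
    ratio = (I0 + I1) / n ** (2 * alpha - 1)
    print(f"n = {n:3d}:  (I_0 + I_1)/n^(2*alpha-1) = {ratio:.4f}   (C = {C:.4f})")
```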
To summarize: since $\sum_{k\geq 2} I_k \geq 0$ (the remaining terms pair up into the non-negative sums $I_k+I_{k+1}$ with $k\geq 2$ even), we have $$ \varphi_n(f) \geq I_0+I_1 \geq C n^{2\alpha-1} \to \infty \quad (n\to\infty). $$
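And, just as a sanity check of the conclusion (sample value $\alpha=0.75$, a few $n$ of my choosing), the ratio $\varphi_n(f)/n^{2\alpha-1}$, computed by summing the $I_k$, indeed stays above a positive constant:

```python
# Numerical check of the conclusion: phi_n(f) / n^{2*alpha - 1} stays bounded away
# from zero.  alpha = 0.75 and the values of n are arbitrary sample choices; phi_n(f)
# is evaluated by summing the I_k over the half-period intervals used above.
import numpy as np
from scipy.integrate import quad

alpha = 0.75

def f(x):
    return x ** (-alpha) if 0.0 < x < 1.0 else 0.0

def phi(n):
    h = np.pi / n ** 2
    K = int(np.ceil(1.0 / h))                # intervals beyond index K lie outside (0, 1)
    total = 0.0
    for k in range(K + 1):
        val, _ = quad(lambda x: f(x) * n * np.sin(n ** 2 * x), k * h, (k + 1) * h)
        total += val
    return total

for n in (3, 5, 10, 20):
    v = phi(n)
    print(f"n = {n:2d}:  phi_n(f) = {v:.4f},  phi_n(f)/n^(2*alpha-1) = {v / n ** (2 * alpha - 1):.4f}")
```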