Prove that the entropy of a distribution decreases under a certain operation
Consider the transformation $p_i\to\frac{p_ia_i^x}{\sum_kp_ka_k^x}$ for $x\in[0,1]$. This continuously transforms one probability distribution into the other. If we can show that the derivative of the entropy with respect to $x$ at $x=0$ is negative, it follows that it is negative for all $x$: stopping at some $x_0$ yields a distribution $q_i\propto p_ia_i^{x_0}$ that satisfies the same ordering hypothesis as the $p_i$ (a product of positive ascending sequences is again ascending), and continuing the path beyond $x_0$ is the same as restarting it from $q$ at $x=0$.
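As a quick numerical sanity check of this setup (not part of the proof), here is a short sketch that evaluates the entropy along the path for some made-up ascending $p_i$ and $a_i$; the specific values are illustrative only:

```python
import numpy as np

def tilted_entropy(p, a, x):
    """Entropy of the distribution q_i proportional to p_i * a_i**x."""
    w = p * a**x
    q = w / w.sum()
    return -np.sum(q * np.log(q))

# Illustrative ascending sequences (assumed for the example, not from the problem).
p = np.array([0.1, 0.2, 0.3, 0.4])
a = np.array([1.0, 2.0, 3.0, 4.0])

xs = np.linspace(0.0, 1.0, 11)
H = np.array([tilted_entropy(p, a, x) for x in xs])
print(np.round(H, 4))                 # entropy along the path
print(bool(np.all(np.diff(H) < 0)))   # expected: True, entropy decreases in x
```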
With $\frac{\mathrm d}{\mathrm dx}(f(x)\log f(x))=(1+\log f(x))f'(x)$ and $\frac{\mathrm d}{\mathrm dx}\frac{p_ia_i^x}{\sum_kp_ka_k^x}=\frac{p_ia_i^x}{\sum_kp_ka_k^x}\left(\log a_i-\frac{\sum_kp_ka_k^x\log a_k}{\sum_kp_ka_k^x}\right)$, we have
\begin{eqnarray} && \left.\frac{\mathrm d}{\mathrm dx}\left(-\sum_i\frac{p_ia_i^x}{\sum_kp_ka_k^x}\log\frac{p_ia_i^x}{\sum_kp_ka_k^x}\right)\right|_{x=0} \\ &=& \left.-\sum_i\left(1+\log\frac{p_ia_i^x}{\sum_kp_ka_k^x}\right)\frac{p_ia_i^x}{\sum_kp_ka_k^x}\left(\log a_i-\frac{\sum_kp_ka_k^x\log a_k}{\sum_kp_ka_k^x}\right)\right|_{x=0} \\ &=& -\sum_i\left(1+\log p_i\right)p_i\left(\log a_i-\sum_kp_k\log a_k\right) \\ &=& -\sum_ip_i\log p_i\left(\log a_i-\sum_kp_k\log a_k\right) \\ &=& -\left(E\left[\log p_i\log a_i\right]-E\left[\log p_i\right]E\left[\log a_i\right]\right) \\ &=&-\operatorname{Cov}(\log p_i,\log a_i)\;, \end{eqnarray}
where the term coming from the $1$ in $(1+\log p_i)$ drops out because $\sum_ip_i\left(\log a_i-\sum_kp_k\log a_k\right)=0$, and all expectations are taken with respect to the distribution $p$.
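If it helps, the final identity can be checked numerically by comparing a finite-difference derivative of the entropy at $x=0$ with $-\operatorname{Cov}(\log p_i,\log a_i)$; the $p_i$ and $a_i$ below are again only illustrative:

```python
import numpy as np

p = np.array([0.1, 0.2, 0.3, 0.4])   # illustrative ascending probabilities
a = np.array([1.0, 2.0, 3.0, 4.0])   # illustrative ascending a_i

def entropy_at(x):
    w = p * a**x
    q = w / w.sum()
    return -np.sum(q * np.log(q))

h = 1e-6
numeric = (entropy_at(h) - entropy_at(-h)) / (2 * h)   # dH/dx at x = 0
analytic = -(np.sum(p * np.log(p) * np.log(a))
             - np.sum(p * np.log(p)) * np.sum(p * np.log(a)))  # -Cov(log p, log a)
print(numeric, analytic)   # the two agree to numerical precision
```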
This covariance is non-negative, since the $p_i$ and the $a_i$ (and hence their logarithms) are both in ascending order, and it is strictly positive unless one of the two sequences is constant. This in fact proves the more general result that the entropy is reduced as long as the logarithms of the $p_i$ and the $a_i$ are positively correlated.
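To make the sign of the covariance explicit, one can use the standard pairwise identity (expectations with respect to $p$):
$$\operatorname{Cov}(\log p_i,\log a_i)=\tfrac12\sum_{i,j}p_ip_j\left(\log p_i-\log p_j\right)\left(\log a_i-\log a_j\right),$$
in which every summand is non-negative when the $p_i$ and the $a_i$ are similarly ordered.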