Do you know an elegant proof of this expression for a Schur function?
I would like to suggest an interpretation using supersymmetric functions. These are symmetric functions that are symmetric in two sets of variables $\{x_i\}$ and $\{y_j\}$ separately. They satisfy the property that setting a single variable $x_i$ equal to $z$ and a single variable $y_j$ equal to $z$ gives a result independent of $z$, in addition to several other conditions (see Chapter 1, Section 3, Exercise 23 of Macdonald's book). Note that since the result is independent of $z$, it is the same as just setting $x_i = y_j = 0$, which amounts to considering the same supersymmetric function in a smaller set of variables.
A concrete description of supersymmetric functions is as follows. The supersymmetric functions are the image of the map $\Lambda \to \Lambda \otimes \Lambda$ given by $(\mathrm{Id} \otimes S) \circ \Delta$, where $\Lambda$ is the ring of symmetric functions, $\Delta$ is the usual comultiplication, and $S$ is the usual antipode (which sends the power sum $p_i$ to $-p_i$). The variables in the second factor (i.e. $\{y_j\}$) are the "super variables". Because the map is injective, the supersymmetric functions are abstractly isomorphic to $\Lambda$, but they are convenient for performing certain specialisation tricks. Let me give an example: $$ \sum_{r=0}^\infty e_r(x/y) t^r = \frac{\prod_i(1+x_i t)}{\prod_j (1+y_j t)}. $$ These are the elementary supersymmetric functions. You can see that if you set a single $x_i = z$ and a single $y_j = z$, then a factor of $1+zt$ cancels between the numerator and the denominator, giving an expression independent of $z$ as stated above. Another example is that $p_i(x/y) = p_i(x) - p_i(y)$.
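To connect the Hopf-algebraic definition with this generating function, take $e_2$ as a small worked example: $\Delta e_2 = e_2 \otimes 1 + e_1 \otimes e_1 + 1 \otimes e_2$ and $S(e_b) = (-1)^b h_b$, so $$ e_2(x/y) = e_2(x) - e_1(x)\,h_1(y) + h_2(y), $$ which is exactly the coefficient of $t^2$ in $\prod_i(1+x_i t)\cdot\sum_b (-1)^b h_b(y) t^b = \prod_i(1+x_i t)\big/\prod_j(1+y_j t)$.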
Now we're in a situation where we can explain the identity. Clearly $s_{\mu}(x) = s_\mu(x,1/1)$ by the specialisation property of supersymmetric functions (here $(x,1)/1$ means we adjoin an $x$ variable equal to $1$ and take a single $y$ variable equal to $1$). But the right hand side can be obtained by first specialising one $x$ variable to $1$, then passing from $\Lambda$ to supersymmetric functions, then setting a single $y_j=1$ and all other $y$'s to zero (the fact that setting $x_i=1$ commutes with passing to supersymmetric functions isn't totally obvious, but I omit the verification for the sake of brevity). The upshot of the first step is that $s_\mu(x,1) = \sum_\lambda s_\lambda(x) s_{\mu/\lambda}(1)$, and $s_{\mu/\lambda}(1)$ is zero unless $\mu/\lambda$ is a horizontal strip, in which case it equals $1$. This gives the outer sum in your expression. Then we recognise that the second part of the operation replaces $p_i(x)$ with $p_i(x/1) = p_i(x) - p_i(1) = p_i(x)-1$, which recovers your expression, where I have taken for granted that $$ s_\lambda(x) = \sum_{\alpha \vdash |\lambda|} \frac{\chi_\alpha^\lambda}{z_\alpha}\prod_i p_i^{a_i}, $$ with $a_i$ the multiplicity of the part $i$ in $\alpha$.
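As a quick check of the resulting identity $s_\mu(x) = \sum_{\lambda} s_\lambda(x)\big|_{p_i \to p_i - 1}$, where $\lambda$ runs over partitions with $\mu/\lambda$ a horizontal strip, take $\mu = (2)$. The relevant $\lambda$ are $(2)$, $(1)$ and the empty partition, and $$ \left[\tfrac12(p_1-1)^2 + \tfrac12(p_2-1)\right] + (p_1-1) + 1 = \tfrac12 p_1^2 + \tfrac12 p_2 = s_2, $$ as expected.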
Here is another argument. Write $n = |\mu|$, and let $\chi^{\mu/k}$ denote the skew character of the symmetric group $\mathfrak{S}_{n-k}$ corresponding to the skew shape $\mu/(k)$, where $(k)$ is the one-row partition. Then by Pieri's rule, $$ \sum_{\mu/\lambda\ \mathrm{is\ a\ horizontal\ strip}} \sum_{\alpha\vdash|\lambda|}\frac{\chi^\lambda_\alpha}{z_\alpha} \prod_i p_i^{a_i} = \sum_k \sum_{\alpha\vdash n-k} \frac{\chi^{\mu/k}_\alpha}{z_\alpha}p_\alpha = \sum_k s_{\mu/k}. $$ Let $s_k^\perp$ denote the linear operator taking $s_\lambda$ to $s_{\lambda/k}$. Let $\psi$ denote the linear operator taking $f$ to $\sum_k s_k^\perp f$. Let $\theta$ denote the algebra automorphism taking $p_i$ to $p_i-1$.
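For what it's worth, the Pieri step above is easy to test in SageMath. Here is a minimal sketch, assuming a Sage environment; the shape $\mu=(4,2,1)$ and the helper `is_horizontal_strip` are just for illustration, and since the inner sums over $\alpha$ are exactly the power-sum expansions of $s_\lambda$ and $s_{\mu/k}$, it suffices to compare Schur expansions.

```python
from sage.all import SymmetricFunctions, QQ, Partitions

Sym = SymmetricFunctions(QQ)
s = Sym.schur()

def is_horizontal_strip(mu, lam):
    # mu/lam is a horizontal strip iff lam fits inside mu with
    # mu_{i+1} <= lam_i <= mu_i for every row i
    mu, lam = list(mu), list(lam)
    if len(lam) > len(mu):
        return False
    lam += [0] * (len(mu) - len(lam))
    return all(mu[i + 1] <= lam[i] <= mu[i] for i in range(len(mu) - 1)) \
        and lam[-1] <= mu[-1]

mu = [4, 2, 1]
n = sum(mu)

# Left-hand side: sum of s_lambda over lambda with mu/lambda a horizontal strip.
lhs = sum(s(lam) for m in range(n + 1) for lam in Partitions(m)
          if is_horizontal_strip(mu, lam))

# Right-hand side: sum over k of the skew Schur functions s_{mu/k}.
rhs = s(mu) + sum(s(mu).skew_by(s[k]) for k in range(1, n + 1))

print(lhs == rhs)  # True
```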
I claim that $\psi$ and $\theta$ are inverses (so in fact $\psi$ is an algebra automorphism). Since $\theta$ is invertible (its inverse sends $p_i$ to $p_i+1$) and the $p_\lambda$ span $\Lambda$, by linearity it suffices to show that $$ \psi p_\lambda = \prod_i (p_{\lambda_i}+1),\ \ \ (*) $$ for then applying the algebra map $\theta$ to both sides gives $\theta\psi p_\lambda = p_\lambda$, i.e. $\theta\psi = \mathrm{Id}$ and hence $\psi = \theta^{-1}$. Now it is not hard to see that $s_k^\perp p_\lambda$ is equal to $\sum p_\nu$, where $\nu$ runs over the partitions obtained from $\lambda$ by removing a set of parts (regarding equal parts as distinguishable) summing to $k$. From this (*) is immediate.
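If it helps, here is a quick SageMath sanity check of (*); the test partition $\lambda=(3,2,2)$ is arbitrary, and `psi` implements $\sum_k s_k^\perp$ via Sage's `skew_by`.

```python
from sage.all import SymmetricFunctions, QQ, prod

Sym = SymmetricFunctions(QQ)
s = Sym.schur()
p = Sym.powersum()

def psi(f):
    # psi(f) = sum_k s_k^perp f; only 0 <= k <= deg(f) can contribute
    return f + sum(f.skew_by(s[k]) for k in range(1, f.degree() + 1))

lam = [3, 2, 2]
lhs = psi(p(lam))
rhs = prod(p[i] + 1 for i in lam)  # prod_i (p_{lambda_i} + 1)
print(lhs == rhs)  # True
```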
We therefore have $$ \left.\sum_k s_{\mu/k}\right|_{p_i\to p_i-1} = \theta\psi s_\mu = s_\mu, $$ as desired.
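Finally, the same kind of SageMath check applies to the displayed identity, with $\theta$ implemented by substituting $p_i \to p_i - 1$ in the power-sum expansion; the shape $\mu = (3,3,1)$ is again just a test case.

```python
from sage.all import SymmetricFunctions, QQ, prod

Sym = SymmetricFunctions(QQ)
s = Sym.schur()
p = Sym.powersum()

def theta(f):
    # the algebra map p_i -> p_i - 1, applied to the power-sum expansion of f
    return sum(c * prod(p[i] - 1 for i in lam)
               for lam, c in p(f).monomial_coefficients().items())

mu = [3, 3, 1]
n = sum(mu)

# sum_k s_{mu/k}, followed by the substitution p_i -> p_i - 1
total = s(mu) + sum(s(mu).skew_by(s[k]) for k in range(1, n + 1))
print(theta(total) == s(mu))  # True
```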