Conditional expectation of random variable given a sum
Let $S_n = \sum\limits_{i=1}^n X_i$.
Since the random variables $X_1,\ldots,X_n$ are independent and identically distributed, the conditional expectations $\mathsf E(X_j\mid S_n)$ are all equal; this is a matter of symmetry. $$\begin{align}\mathsf E(X_j\mid S_n) &= \tfrac 1n\sum_{i=1}^n\mathsf E(X_i\mid S_n) &&\text{symmetry, }\forall j\in\{1..n\}\\[1ex] & = \tfrac 1n\mathsf E\Big(\sum_{i=1}^n X_i\;\Big\vert\; S_n\Big) && \text{linearity of expectation}\\[1ex] &= \tfrac 1n \mathsf E(S_n\mid S_n) && \text{by definition of } S_n\\[1ex] & = \tfrac 1n S_n &&\text{since }\mathsf E(S_n\mid S_n)=S_n \\[2ex]\therefore\quad\mathsf E\Big(X_j\;\Big\vert\; \sum_{i=1}^n X_i\Big) & = \tfrac 1n\sum_{i=1}^n X_i&&\text{when }{(X_j)}_{j\in\{1..n\}}\text{ are iid.} \end{align}$$ That is all you need.
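As a quick numerical sanity check of $\mathsf E(X_j\mid S_n)=S_n/n$, the sketch below (a Monte Carlo estimate, with an arbitrary choice of exponential $X_i$ and an arbitrary conditioning window; any i.i.d. law would do) estimates $\mathsf E(X_1\mid S_n\approx s)$ and compares it with $s/n$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 4, 1_000_000

# n i.i.d. Exponential(1) samples per trial (any i.i.d. law works here)
x = rng.exponential(1.0, size=(trials, n))
s = x.sum(axis=1)

# Approximate conditioning on S_n = target by keeping samples where
# S_n falls in a narrow window around the target value.
target, width = 5.0, 0.1
mask = np.abs(s - target) < width

cond_mean_x1 = x[mask, 0].mean()   # estimate of E(X_1 | S_n ≈ target)
print(cond_mean_x1, target / n)    # the two numbers should nearly agree
```

The estimate lands close to $s/n = 1.25$, up to Monte Carlo noise and the small bias from the finite window.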
Let $P(x_1,x_2,\ldots,x_n)$ be the $n$-variate probability density function of $X_1,X_2,\ldots,X_n$. Then the conditional expectation is
$$E_j(S)\equiv\mathrm{E}\Big(X_j\;\Big\vert\;\textstyle\sum_{i}X_i=S\Big)=\frac{\displaystyle\int x_jP(x_1,x_2,\ldots,x_n)\,\delta(x_1+\cdots+x_n-S)\,d^nx}{\displaystyle\int P(x_1,x_2,\ldots,x_n)\,\delta(x_1+\cdots+x_n-S)\,d^nx}.$$
Let's assume $P$ is invariant under cyclic permutations of its arguments, $P(x_1,x_2,\ldots,x_n)=P(x_2,x_3,\ldots,x_n,x_1)$, which holds in particular when the $X_i$ are i.i.d. Then relabeling the integration variables gives
$$E_1(S)=E_2(S)=\cdots=E_n(S).$$
Also, since the delta function forces $x_1+\cdots+x_n=S$ inside both integrals, summing the numerators over $j$ yields
$$\sum_jE_j(S)=S.$$
Therefore $E_1(S)=E_2(S)=\cdots=E_n(S)=S/n.$