Are we guaranteed that the harmonic series minus infinitely many random terms always converges?

For any such sequence $(a_n)$, the corresponding sequence $(b_n)$ given by $b_n=1-a_n$ is just as likely. So according to your intuition, the series $$\sum_{n=1}^\infty b_n\frac{1}{n}$$ should converge almost surely. But then $$\sum_{n=1}^\infty \frac{1}{n} = \sum_{n=1}^\infty a_n\frac{1}{n} + \sum_{n=1}^\infty b_n\frac{1}{n}$$ would converge too!
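This decomposition can be checked numerically. A minimal Python sketch (the number of terms $N$ and the seed are arbitrary choices), taking the $a_n$ to be independent fair coin flips:

```python
import random

random.seed(0)
N = 100_000
a = [random.randint(0, 1) for _ in range(N)]  # a_n: independent fair coin flips

# Partial sums of the two complementary random sub-series and of H_N itself.
sum_a = sum(an / n for n, an in enumerate(a, start=1))
sum_b = sum((1 - an) / n for n, an in enumerate(a, start=1))
harmonic = sum(1 / n for n in range(1, N + 1))

# By construction the two sub-series recombine into the full harmonic sum,
# so at least one of them must grow without bound.
print(sum_a, sum_b, sum_a + sum_b, harmonic)
```

Both partial sums come out large (each close to $H_N/2 \approx 6$ here), and their sum reproduces $H_N$ exactly, as the symmetry argument requires.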


Your series is almost surely divergent (i.e., it diverges with probability $1$). Let $X_n = a_n/n$ be the $n$th term. Then $$ E[X_n]=\frac{1}{2n} $$ and $$ {\text{Var}}[X_n]=E[X_n^2]-E[X_n]^2=\frac{1}{4n^2}. $$ The variance of the partial sums is therefore $$ {\text{Var}}\left[\sum_{i=1}^{n}X_i\right]=\frac{1}{4}\sum_{i=1}^{n}\frac{1}{i^2}\rightarrow\frac{\pi^2}{24}; $$ the expectation value of the partial sums, on the other hand, is $$ E\left[\sum_{i=1}^{n}X_i\right]=\frac{1}{2}\sum_{i=1}^{n}\frac{1}{i}=\frac{1}{2}H_n \sim \frac{1}{2}\log n. $$ So the sequence of partial sums diverges logarithmically with probability $1$.
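Both moments are easy to check by Monte Carlo. A sketch (the choices of $N$, the trial count, and the seed are arbitrary):

```python
import math
import random

random.seed(1)
N = 10_000      # terms per partial sum
TRIALS = 1_000  # independent samples of the partial sum

samples = []
for _ in range(TRIALS):
    # One realisation of sum_{n<=N} a_n / n with a_n ~ Bernoulli(1/2).
    samples.append(sum(1 / n for n in range(1, N + 1) if random.random() < 0.5))

mean = sum(samples) / TRIALS
var = sum((s - mean) ** 2 for s in samples) / TRIALS

H_N = sum(1 / n for n in range(1, N + 1))
print(mean, H_N / 2)           # sample mean vs H_N / 2
print(var, math.pi ** 2 / 24)  # sample variance vs pi^2 / 24
```

The sample mean tracks $\frac12 H_N$ while the sample variance stays near $\pi^2/24 \approx 0.41$: the partial sums drift off to infinity with only bounded fluctuations around that drift.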


As I'm getting some downvotes, I wanted to add a few steps to show that this argument is sound. I have random variables $S_n$ such that $\mu_n = E[S_n]\rightarrow\infty$ and $\sigma_n^2={\text{Var}}[S_n]\rightarrow \sigma^2 < \infty$. I claim that $S_n$ is unbounded with probability $1$.

If $S_n$ is bounded, then for some $x\in\mathbb{R}$ and some $N\in\mathbb{N}$, $S_n \le x$ for all $n\ge N$. In particular, for any sequence of indices $n_i\rightarrow\infty$ and any sequence of bounds $A_i\rightarrow\infty$, $$ S_n {\text { bounded}} \implies S_{n_1} < A_1 \vee S_{n_2} < A_2 \vee \ldots, $$ and hence $$ {\text{Pr}}[S_n{\text { bounded}}] \le {\text{Pr}}[S_{n_1}<A_1] + {\text{Pr}}[S_{n_2}<A_2] + \ldots. $$

Let $M>0$ be a fixed, large number. Because $\mu_n\rightarrow\infty$ and $\sigma^2_n\rightarrow\sigma^2$, we can choose indices $n_i\rightarrow\infty$ large enough that $\mu_{n_i} > 2^{i/2+1} M \sigma_{n_i}$, and set $A_i=\mu_{n_i} - 2^{i/2} M\sigma_{n_i}$ (note that then $A_i > \mu_{n_i}/2 \rightarrow \infty$, as required). By Chebyshev's inequality, $$ {\text{Pr}}[S_{n_i} < \mu_{n_i} - 2^{i/2} M\sigma_{n_i}] \le {\text{Pr}}\left[\frac{(S_{n_i}-\mu_{n_i})^2}{\sigma^2_{n_i}} > 2^{i} M^2\right] \le \frac{1}{2^{i}M^2}, $$ so $$ {\text{Pr}}[S_n{\text { bounded}}] \le \sum_{i=1}^{\infty}\frac{1}{2^{i}M^2} = \frac{1}{M^2}. $$

Since $M$ was arbitrary, we conclude that ${\text{Pr}}[S_n{\text { bounded}}]=0$, and hence that $S_n$ converges with probability $0$ as well. In our case, we need only the additional fact that $S_n$ is monotonically increasing to conclude that $S_n$ a.s. diverges to infinity (since that's the only other option).
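The Chebyshev step can be illustrated numerically. This sketch (the values of $N$, the trial count, the deviation $k=3$, and the seed are all arbitrary choices) estimates the lower-tail probability ${\text{Pr}}[S_N < \mu_N - k\sigma_N]$ and compares it with the bound $1/k^2$:

```python
import math
import random

random.seed(2)
N = 2_000
TRIALS = 2_000
k = 3.0

mu = 0.5 * sum(1 / n for n in range(1, N + 1))                      # E[S_N] = H_N / 2
sigma = math.sqrt(0.25 * sum(1 / n ** 2 for n in range(1, N + 1)))  # sd of S_N

# Count how often a simulated partial sum falls k standard deviations below its mean.
hits = sum(
    sum(1 / n for n in range(1, N + 1) if random.random() < 0.5) < mu - k * sigma
    for _ in range(TRIALS)
)

print(hits / TRIALS, 1 / k ** 2)  # empirical tail vs Chebyshev bound
```

The empirical tail is far below the Chebyshev bound $1/9$ (the distribution of $S_N$ is close to normal, so the true tail is tiny), which is why the geometric sum over $i$ above converges so comfortably.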


Why would it ever converge?

Taking the expectation,

$$E\left(\sum_{n=1}^\infty a_n\frac{1}{n}\right)=\sum_{n=1}^\infty E(a_n)\frac{1}{n}=\frac12\sum_{n=1}^\infty \frac{1}{n}.$$

Even if the argument isn't rigorous (none of these series converges), it should be enough to cast doubt on the claim.
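The heuristic can be made concrete with exact arithmetic: the expected partial sum after $N$ terms is $H_N/2$, which grows without bound. A small sketch using Python's `fractions` (the sample values of $N$ are arbitrary):

```python
from fractions import Fraction

def expected_partial_sum(N):
    """Exact E[sum_{n<=N} a_n / n] = H_N / 2 for a_n ~ Bernoulli(1/2)."""
    return sum(Fraction(1, n) for n in range(1, N + 1)) / 2

# The expected partial sums grow like (1/2) log N, without bound.
for N in (10, 100, 1000):
    print(N, float(expected_partial_sum(N)))
```

For instance $E$ of the first ten terms is exactly $\frac{7381}{5040} \approx 1.46$, and the values keep climbing as $N$ grows.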


In the harmonic series, removing all but every millionth term is not enough to restore convergence, because $\sum\frac1{1000000n}=\frac1{1000000}\sum\frac1n$.
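A quick numerical illustration of the rescaling (the step size and cutoff values are arbitrary choices): the partial sums of the thinned series are exactly $H_m/10^6$, so they still grow without bound, just very slowly.

```python
STEP = 1_000_000

# Partial sums of sum 1/(STEP * n): exactly H_m / STEP, a rescaled harmonic series.
for m in (10, 1000, 100_000):
    thinned = sum(1 / (STEP * n) for n in range(1, m + 1))
    H_m = sum(1 / n for n in range(1, m + 1))
    print(m, thinned, H_m / STEP)
```

Multiplying the thinned partial sums back by $10^6$ recovers the usual harmonic growth ($H_{100000} \approx 12.09$), confirming that the factor $\frac{1}{1000000}$ changes the scale but not the divergence.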