Is there a series where the terms tend to zero faster than the harmonic series but it still diverges?
There is no slowest divergent series. Let me take this to mean that given any sequence $a_n$ of positive numbers converging to zero whose series diverges, there is a sequence $b_n$ that converges to zero faster and whose series also diverges, where "faster" means that $\lim b_n/a_n=0$. In fact, given any sequences of positive numbers $(a_{1,n}), (a_{2,n}),\dots$ with each $\sum_n a_{i,n}=\infty$ and $\lim_n a_{i+1,n}/a_{i,n}=0$, there is $(a_n)$ with $\sum a_n=\infty$ and $\lim_n a_n/a_{i,n}=0$ for all $i$.
To see this, given a divergent series of positive terms $a_1,a_2,\dots$, first define $b_1=a_1,b_2=a_2,\dots,b_k=a_k$ until $a_1+\dots+a_k>1$. Second, let $b_{k+1}=a_{k+1}/2,b_{k+2}=a_{k+2}/2,\dots,b_n=a_n/2$ until $a_{k+1}+\dots+a_n>2$, etc. That is, we proceed recursively: if we have defined $b_1,\dots,b_m$ with $b_m=a_m/2^r$ and $b_1+\dots+b_m>r+1$, let $b_{m+1}=a_{m+1}/2^{r+1},\dots,b_l=a_l/2^{r+1}$ until $a_{m+1}+\dots+a_l>2^{r+1}$. Each block then contributes more than $1$ to $\sum b_i$, so $\sum b_i=\infty$, while $b_i/a_i=1/2^r\to0$.
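Here is a small numerical sketch of that block-halving construction (the function name and the choice $a_n=1/n$ are mine, for illustration; the argument itself needs no such choice):

```python
def slower_divergent(a, blocks):
    """Block-halving construction: in block r the terms of the divergent
    series a(1), a(2), ... are divided by 2**r, and the block ends once
    its undivided a-terms sum past 2**r, so every block contributes more
    than 1 to the b-series.  Returns (number of terms used, sum of b)."""
    b_sum, n = 0.0, 1
    for r in range(blocks):
        block = 0.0
        while block <= 2 ** r:
            block += a(n)
            b_sum += a(n) / 2 ** r
            n += 1
    return n - 1, b_sum

# Illustration with the harmonic series a(n) = 1/n: three blocks already
# push the b-sum past 3, while the ratio b_n/a_n has dropped to 1/4.
count, s = slower_divergent(lambda n: 1.0 / n, blocks=3)
print(count, s)
```

Running more blocks drives the sum past any bound while the ratio $b_n/a_n=2^{-r}$ keeps shrinking, which is exactly the claim.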
Similarly, given $(a_{1,n}),(a_{2,n}),\dots$, with each $(a_{k+1,n})$ converging to zero faster than $(a_{k,n})$, and all of them diverging, let $a_i=a_{1,i}$ for $i\le n_1$, where $a_{1,1}+\dots+a_{1,n_1}>1$; then $a_i=a_{2,i}$ for $n_1<i\le n_2$, where we ask both that $a_{2,n_1+1}+\dots+a_{2,n_2}>1$ and that for any $k>n_2/2$ we have $a_{2,k}/a_{1,k}<1/2$, etc. That is, if we have defined $n_k$, we let $a_i=a_{k+1,i}$ for $n_k<i\le n_{k+1}$, where $n_{k+1}$ is chosen so that $a_{k+1,n_k+1}+\dots+a_{k+1,n_{k+1}}>1$ and for all $l>n_{k+1}/2$ we have $a_{k+1,l}/a_{i,l}<1/2^{k+1}$ for all $i<k+1$. Then the series $\sum a_i$ diverges, and the sequence $(a_i)$ converges to $0$ faster than each of the sequences $(a_{k,n})$. In modern language, we would say that there are no $(\omega,0)$-gaps in a certain partial order.
We can modify the above slightly so that given any sequences $(a_{i,n})$ with $\sum_n a_{i,n}<\infty$ and $\lim_n a_{i+1,n}/a_{i,n}=\infty$ for all $i$, we can find $(a_n)$ with $\sum_n a_n<\infty$ and $\lim_n a_n/a_{i,n}=\infty$, so there is no fastest convergent series, and not even considering a sequence of faster and faster convergent series is enough. (In modern terms, there is no $(0,\omega)$-gap.)
What we cannot do in general is, given $(a_{i,n})$ with all $\sum_n a_{i,n}=\infty$, find $(a_n)$ with $\sum a_n=\infty$ and $a_n/a_{i,n}\to0$ for all $i$, if the $(a_{i,n})$ are not ordered so that $(a_{i+1,n})$ converges to zero faster than $(a_{i,n})$. For example, take $a_n=1/n$ if $n$ is odd and $a_n=1/n^2$ if $n$ is even, and $b_n=1/n^2$ if $n$ is odd and $b_n=1/n$ if $n$ is even. If $c_n$ converges to zero faster than both, then eventually $c_n<\min(a_n,b_n)=1/n^2$, so $\sum c_n$ converges by comparison. (These exceptions can typically be fixed by requiring monotonicity of the sequences, which is how these results are usually presented in the literature.)
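A quick numerical sanity check of that obstruction (the cutoff $N=10^4$ is an arbitrary choice of mine): the termwise minimum of the two interleaved sequences is exactly $1/n^2$, so anything eventually below both is dominated by a convergent series.

```python
import math

# Interleaved sequences from the counterexample above.
a = lambda n: 1.0 / n if n % 2 == 1 else 1.0 / n ** 2
b = lambda n: 1.0 / n ** 2 if n % 2 == 1 else 1.0 / n
N = 10 ** 4

# At every n the smaller of the two is 1/n^2 ...
assert all(min(a(n), b(n)) == 1.0 / n ** 2 for n in range(1, N + 1))

# ... whose partial sums stay below pi^2/6, so any c_n eventually below
# both a_n and b_n has a convergent series by comparison.
bound = sum(min(a(n), b(n)) for n in range(1, N + 1))
print(bound < math.pi ** 2 / 6)
```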
Note that the argument I gave is completely general, no matter what series are involved. For specific series, of course, nice "formulas" are possible. For example, given $a_n=1/n$, we can let $b_n=1/(n\log n)$ for $n>1$, or $c_n=1/(n\log n\log\log n)$ for $n\ge 3$, and so on. And we can then "diagonalize" against all these sequences as indicated above.
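Numerically, each extra logarithmic factor slows the growth of the partial sums dramatically. Here is a small illustration (the cutoffs $10^3$ and $10^5$ are arbitrary choices of mine), comparing tail sums that the integral test predicts behave like $\log N$, $\log\log N$, and $\log\log\log N$:

```python
import math

N1, N2 = 10 ** 3, 10 ** 5
# Tail sums over the same range for 1/n, 1/(n log n), 1/(n log n log log n).
d1 = sum(1.0 / n for n in range(N1, N2))
d2 = sum(1.0 / (n * math.log(n)) for n in range(N1, N2))
d3 = sum(1.0 / (n * math.log(n) * math.log(math.log(n))) for n in range(N1, N2))

# All three series diverge, but each tail grows more slowly than the last.
print(d1 > d2 > d3 > 0)
```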
By the way, the first person to seriously study the boundary between convergence and divergence was Paul du Bois-Reymond. He proved a version of the result I just showed above: no "decreasing" sequence of divergent series "exhausts" the divergent series, in that we can always find one diverging with terms going to zero faster than the terms of any of them. A nice account of some of his work can be found in the book Orders of Infinity by Hardy. Du Bois-Reymond's work was extended by Hadamard and others. What Hadamard proved is that given $(a_i)$ and $(b_i)$ with $\sum a_i=\infty$, $\sum b_i<\infty$, and $b_i/a_i\to 0$, we can find $(c_i),(d_i)$ with $c_i/a_i\to0$, $b_i/d_i\to 0$, $d_i/c_i\to0$, $\sum c_i=\infty$, $\sum d_i<\infty$. More generally:
If we have two sequences of series, $(a_{1,n}), (a_{2,n}),\dots$ and $(b_{1,n}),(b_{2,n}),\dots$, such that
- Each $(a_{i+1,n})$ converges to zero faster than the previous one,
- Each $(b_{i+1,n})$ converges to zero slower than the previous one,
- Each $(a_{i,n})$ converges to zero slower than all the $(b_{j,n})$,
- Each $\sum_n a_{i,n}$ diverges, and
- Each $\sum_n b_{i,n}$ converges,
then we can find sequences $(c_n),(d_n)$, "in between", with one series converging and the other diverging.
In modern language, we say that there are no $(\omega,\omega)$-gaps, and similarly, there are no $(\omega,1)$- or $(1,\omega)$-gaps. This line of research led to some of Hausdorff's deep results in set theory, such as the existence of so-called $(\omega_1,\omega_1)$- or Hausdorff gaps. What Hausdorff proved is that this "interpolation" process, which can be iterated countably many times, cannot in general be carried out $\omega_1$ times, where $\omega_1$ is the first uncountable ordinal.
I had wondered about this too a long time ago and then came across this. The series
$$\sum_{n=3}^{\infty}\frac{1}{n\ln n (\ln\ln n)}=\infty$$
diverges, and this can be proven very easily by the integral test. But here is the kicker: this series requires a googolplex of terms before the partial sum exceeds $10$. Talk about slow! It is only natural that if a single logarithm slows the divergence this much, taking the logarithm of the logarithm slows it even further.
Here is another one. This series
$$\sum_{n=3}^{\infty}\frac{1}{n\ln n (\ln\ln n)^2}=38.43...$$
actually converges, by the exact same integral test. But it converges so slowly that the series requires $10^{3.14\times10^{86}}$ terms before attaining two-digit accuracy. So, using these iterated logarithms, you can come up with series that converge or diverge "arbitrarily slowly".
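Both claims come from the same substitution $u=\ln\ln x$, $du=dx/(x\ln x)$, in the integral test; writing the antiderivatives out makes the slow growth explicit:

$$\int_3^N\frac{dx}{x\ln x\,\ln\ln x}=\ln\ln\ln N-\ln\ln\ln 3\to\infty,\qquad \int_3^\infty\frac{dx}{x\ln x\,(\ln\ln x)^2}=\frac{1}{\ln\ln 3}<\infty.$$

The divergent integral grows like a triple logarithm, which is why the partial sums need inputs of googolplex size to reach even modest values.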
Reference: Zwillinger, D. (Ed.). CRC Standard Mathematical Tables and Formulae, 30th ed. Boca Raton, FL: CRC Press, 1996.
If we use the notion of a partial sum:
$$S_n = \sum_{k=1}^n a_k$$
you are asking for a series whose partial sums diverge more slowly than those of $\sum_{k=1}^n 1/k$ as $n\to\infty$. For this to happen, we need a series that still diverges but satisfies $a_k < 1/k$ for all $k>c$, for some cutoff $c$.
Look at $a_k = \frac{1}{k\ln k}$. Let's apply the integral test to show that this diverges, using the substitution $u = \ln k$, so that $du = dk/k$:
$$\int \frac{dk}{k\ln k}=\int \frac{du}{u}=\ln u = \ln {\ln k}$$
Putting in limits of $2$ and $\infty$ shows that this indeed diverges. And for $k>c=e$, we have that $1/(k\ln k) < 1/k$.
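As a numerical sanity check (the cutoff $N=10^5$ is an arbitrary choice of mine), the partial sums of $1/(k\ln k)$ stay within a bounded distance of the antiderivative $\ln\ln N$ from the integral test, while still growing without bound:

```python
import math

N = 10 ** 5
partial = sum(1.0 / (k * math.log(k)) for k in range(2, N + 1))
# The integral test compares the sum with ln ln N - ln ln 2; since the
# terms are decreasing, the difference stays bounded (below the first
# term, 1/(2 ln 2)) as N grows.
estimate = math.log(math.log(N)) - math.log(math.log(2))
print(0 <= partial - estimate < 1)
```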