Unbiased estimate of the covariance
Additional comment, after some thought, following an exchange of comments with @MichaelHardy:
His answer closely parallels the usual demonstration that $E(S^2) = \sigma^2$ and is easy to follow. However, the proof below, in abbreviated notation that I hope is not too cryptic, may be more direct.
$$(n-1)S_{xy} = \sum(X_i-\bar X)(Y_i - \bar Y) = \sum X_i Y_i -n\bar X \bar Y = \sum X_i Y_i - \frac{1}{n}\sum X_i \sum Y_i.$$
Hence,
$$(n-1)E(S_{xy}) = E\left(\sum X_i Y_i\right) - \frac{1}{n}E\left(\sum X_i \sum Y_i\right)\\ = n\mu_{xy} - \frac{1}{n}[n\mu_{xy} + n(n-1)\mu_x \mu_y]\\ = (n-1)[\mu_{xy}-\mu_x\mu_y] = (n-1)\sigma_{xy}.$$
So the expectation of the sample covariance $S_{xy}$ is the population covariance $\sigma_{xy} = \operatorname{Cov}(X,Y),$ as claimed.
Note that $\operatorname{E}(\sum X_i \sum Y_i)$ has $n^2$ terms: the $n$ terms with $i=j$ have $\operatorname{E}(X_iY_i) = \mu_{xy},$ and the $n(n-1)$ terms with $i\neq j$ have $\operatorname{E}(X_iY_j) = \mu_x\mu_y$ by independence.
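For anyone who wants a numerical sanity check of the identity $E(S_{xy}) = \sigma_{xy},$ here is a quick Monte Carlo sketch in Python/NumPy. The bivariate normal distribution and its parameters are arbitrary illustrative choices, not part of the proof:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 5, 200_000
rho = 0.6                       # population covariance sigma_xy (unit variances)
cov = [[1.0, rho], [rho, 1.0]]

# reps independent samples of size n from a bivariate normal
xy = rng.multivariate_normal([0.0, 0.0], cov, size=(reps, n))
x, y = xy[..., 0], xy[..., 1]
xc = x - x.mean(axis=1, keepdims=True)
yc = y - y.mean(axis=1, keepdims=True)

s_unbiased = (xc * yc).sum(axis=1) / (n - 1)  # S_xy with the (n-1) divisor
s_biased = (xc * yc).sum(axis=1) / n          # same statistic with divisor n

print(s_unbiased.mean())  # close to sigma_xy = 0.6
print(s_biased.mean())    # close to (n-1)/n * 0.6 = 0.48
```

The divisor-$n$ version averages to $\frac{n-1}{n}\sigma_{xy},$ which is exactly the factor the $(n-1)$ divisor removes.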
Let $\mu=\operatorname{E}(X)$ and $\nu = \operatorname{E}(Y).$ Then \begin{align} & \sum_{i=1}^n (X_i - \bar X)(Y_i-\bar Y) \\[10pt] = {} & \sum_{i=1}^n \Big( (X_i - \mu) + (\mu - \bar X)\Big) \Big((Y_i - \nu) + (\nu - \bar Y)\Big) \\[10pt] = {} & \left( \sum_i (X_i-\mu)(Y_i-\nu) \right) + \left( \sum_i (X_i-\mu)(\nu - \bar Y) \right) \\ & {} +\left( \sum_i (\mu-\bar X)(Y_i - \nu) \right) + \left( \sum_i(\mu-\bar X)(\nu - \bar Y) \right). \end{align}
The expected value of the first of the four terms above is $$ \sum_{i=1}^n \operatorname{E}\big( (X_i-\mu)(Y_i-\nu) \big) = \sum_{i=1}^n \operatorname{cov}(X_i,Y_i) = n\operatorname{cov}(X,Y). $$ The expected value of the second term is \begin{align} & \sum_i -\operatorname{cov}(X_i, \bar Y) = \sum_i - \operatorname{cov}\left(X_i, \frac {Y_1+\cdots+Y_n} n \right) \\[10pt] = {} & -n\operatorname{cov}\left( X_1, \frac{Y_1+\cdots+Y_n} n \right) = - \operatorname{cov}(X_1, Y_1+\cdots +Y_n) \\[10pt] = {} & -\operatorname{cov}(X_1,Y_1) + 0 + \cdots + 0 = -\operatorname{cov}(X,Y). \end{align} The third term is similarly that same number.
The fourth term is \begin{align} & \sum_i \overbrace{\operatorname{cov}(\bar X,\bar Y)}^{\text{No ``} i \text{'' appears here.}} = n \operatorname{cov}(\bar X, \bar Y) = n \operatorname{cov}\left( \frac 1 n \sum_i X_i, \frac 1 n \sum_i Y_i \right) \\[10pt] = {} & n \cdot \frac 1 {n^2} \Big( \, \underbrace{\cdots + \operatorname{cov}(X_i, Y_j) + \cdots}_{n^2\text{ terms}} \, \Big). \end{align} This last sum is over all pairs of indices $i$ and $j$. But the covariances are $0$ except the ones in which $i=j$. Hence there are just $n$ nonzero terms, and we have $$ n\cdot \frac 1 {n^2} \left( \sum_i \operatorname{cov} (X_i,Y_i) \right) = n\cdot \frac 1 {n^2} \cdot n \operatorname{cov}(X,Y) = \operatorname{cov}(X,Y). $$
I leave the rest as an exercise.
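The key fact behind the fourth term, $\operatorname{cov}(\bar X, \bar Y) = \operatorname{cov}(X,Y)/n,$ is easy to check numerically. A small simulation sketch (the bivariate normal and its parameters are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 4, 400_000
rho = 0.5                       # cov(X, Y), with unit variances
cov = [[1.0, rho], [rho, 1.0]]

xy = rng.multivariate_normal([0.0, 0.0], cov, size=(reps, n))
xbar = xy[..., 0].mean(axis=1)  # sample mean of each X-sample
ybar = xy[..., 1].mean(axis=1)  # sample mean of each Y-sample

# empirical cov(Xbar, Ybar) across the replications
c = np.cov(xbar, ybar)[0, 1]
print(c)  # close to rho / n = 0.125
```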
Adding to the post above by @BruceET, here is an expansion of the last term (it may be useful for someone):
$E\left[\left(\sum\limits_{i=1}^{n}X_i\right)\left(\sum\limits_{j=1}^{n}Y_j\right)\right]$ $=E\left[\sum\limits_{i=1}^{n}\sum\limits_{j=1}^{n}X_i Y_j\right]$ $=E\left[\sum\limits_{i=1}^{n}X_i Y_i+\sum\limits_{\substack{i,j=1\\ i\neq j}}^{n}X_i Y_j\right]$ $=E\left[\sum\limits_{i=1}^{n}X_i Y_i\right]+E\left[\sum\limits_{\substack{i,j=1\\ i\neq j}}^{n}X_i Y_j\right]$
$=\sum\limits_{i=1}^{n}E\left[X_i Y_i\right]+\sum\limits_{\substack{i,j=1\\ i\neq j}}^{n}E\left[X_i Y_j\right]$ (by linearity of expectation)
$=n\mu_{XY}+n(n-1)\mu_X\mu_Y$ (since $X_i \perp \!\!\! \perp Y_j$ for $i \neq j$, so $E\left[X_i Y_j\right]=E\left[X_i\right]E\left[Y_j\right]=\mu_X\mu_Y$)
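This expansion can also be checked by simulation. A sketch in Python/NumPy, with nonzero means so both terms contribute (the bivariate normal and the parameter values are arbitrary illustrative choices; here $\mu_{XY} = \operatorname{Cov}(X,Y) + \mu_X\mu_Y$):

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps = 3, 400_000
mu_x, mu_y, rho = 1.0, 2.0, 0.7   # arbitrary means and covariance
cov = [[1.0, rho], [rho, 1.0]]

xy = rng.multivariate_normal([mu_x, mu_y], cov, size=(reps, n))
prod = xy[..., 0].sum(axis=1) * xy[..., 1].sum(axis=1)  # (sum X_i)(sum Y_j)

mu_xy = rho + mu_x * mu_y  # E(X_i Y_i) = Cov(X,Y) + mu_X mu_Y
expected = n * mu_xy + n * (n - 1) * mu_x * mu_y
print(prod.mean())  # close to expected = 20.1
print(expected)
```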