What is the expected value of the maximum of a sample of size $n$ from a geometric distribution?

Here's an answer: $\sum_{i=1}^n \binom{n}{i} (-1)^{i+1} \frac{1}{1-q^i}$, where $q = 1-p$. Maybe someone else can help you simplify that further.

The argument: Let $Y = \max\{X_1, X_2, \ldots, X_n\}$, where the $X_i$ are $n$ iid draws from the geometric distribution on $\{1, 2, 3, \ldots\}$ (the number of trials up to and including the first success).

Since $Y$ takes values in the positive integers, $E[Y] = \sum_{k=0}^{\infty} P(Y>k) = \sum_{k=0}^{\infty} P(X_1 > k \text{ or } X_2 > k \text{ or } \cdots \text{ or } X_n > k)$.
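
(The tail-sum identity used here holds for any random variable with values in $\{0, 1, 2, \ldots\}$: write $E[Y] = \sum_{j=1}^{\infty} j\,P(Y=j)$, expand $j$ as $\sum_{k=0}^{j-1} 1$, and swap the order of summation.)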

Applying the principle of inclusion-exclusion and the fact that the $X_i$ are iid, we have

$P(X_1 > k \text{ or } X_2 > k \text{ or } \cdots \text{ or } X_n > k)$

$= \binom{n}{1} P(X_1 > k) - \binom{n}{2} P(X_1 > k)^2 + \cdots + (-1)^{n+1} \binom{n}{n} P(X_1 > k)^n.$

Since $P(X_1 > k) = q^k$, this sum becomes $\sum_{i=1}^n \binom{n}{i} (-1)^{i+1} q^{ki}$.

Thus

$E[Y] = \sum_{k=0}^{\infty} \sum_{i=1}^n \binom{n}{i} (-1)^{i+1} q^{ki} = \sum_{i=1}^n \binom{n}{i} (-1)^{i+1} \sum_{k=0}^{\infty} (q^i)^k$

$= \sum_{i=1}^n \binom{n}{i} (-1)^{i+1} \frac{1}{1-q^i}.$
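
As a quick sanity check, here is a minimal Monte Carlo sketch in Python. The parameters ($p = 0.3$, $n = 5$, the trial count, and the seed) are arbitrary illustrative choices, not part of the derivation; note that numpy's geometric sampler uses the same $\{1, 2, \ldots\}$ convention as the argument above.

```python
import numpy as np
from math import comb

# Illustrative parameters (arbitrary choices, not fixed by the derivation).
p, n, trials = 0.3, 5, 200_000
q = 1 - p

# Closed form derived above: E[Y] = sum_{i=1}^n C(n,i) (-1)^{i+1} / (1 - q^i).
exact = sum(comb(n, i) * (-1) ** (i + 1) / (1 - q ** i) for i in range(1, n + 1))

# Monte Carlo estimate: numpy's geometric is supported on {1, 2, ...},
# matching the convention P(X > k) = q^k used in the argument.
rng = np.random.default_rng(0)
samples = rng.geometric(p, size=(trials, n))
estimate = samples.max(axis=1).mean()

print(f"closed form:  {exact:.4f}")
print(f"Monte Carlo:  {estimate:.4f}")
```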


Bennett Eisenberg's paper "On the expectation of the maximum of IID geometric random variables" (Statistics & Probability Letters 78 (2008), 135-143) gives both the answer above (already accepted by the OP) and the infinite-sum alternative you get if you follow Michael Lugo's comment on my first answer. However, the author also says, "these expressions are not so useful" and "There is no... simple expression for... the expected value of the maximum of $n$ IID geometric random variables."

The point of his paper is "to use simple Fourier analysis to show that $E(M_n^*) - \sum_{k=1}^n \frac{1}{\lambda k}$ is very close to 1/2 not only for moderate values of $\lambda$, but also for relatively small values of $n$ and that this difference is logarithmically summable to 1/2 for all values of $\lambda$." Here, $E(M_n^*)$ is the expectation requested by the OP, and $\lambda$ is defined by $q = 1-p = e^{-\lambda}$. (This $\lambda$ is the rate parameter of the corresponding exponential distribution.) The practical value of Eisenberg's result, of course, is that $\sum_{k=1}^n \frac{1}{k}$ can be closely approximated by $\log n + \gamma$, so $E(M_n^*) \approx \frac{\log n + \gamma}{\lambda} + \frac{1}{2}$.
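
To illustrate how quickly that approximation kicks in, here is a small sketch (again with the arbitrary illustrative choice $p = 3/10$) comparing the exact alternating sum with $\frac{\log n + \gamma}{\lambda} + \frac{1}{2}$. Rational arithmetic is used for the exact sum because the alternating binomial terms cancel catastrophically in floating point for large $n$.

```python
from fractions import Fraction
from math import comb, log

p = Fraction(3, 10)             # illustrative choice; q = 7/10
q = 1 - p
lam = -log(float(q))            # q = e^{-lambda}
gamma = 0.5772156649015329      # Euler-Mascheroni constant

for n in (5, 20, 100):
    # Exact alternating sum, in exact rational arithmetic to avoid the
    # severe floating-point cancellation among the binomial terms.
    exact = float(sum(Fraction((-1) ** (i + 1) * comb(n, i)) / (1 - q ** i)
                      for i in range(1, n + 1)))
    # Eisenberg: E(M_n^*) ~ H_n / lambda + 1/2, with H_n ~ log n + gamma.
    approx = (log(n) + gamma) / lam + 0.5
    print(f"n = {n:3d}:  exact = {exact:.4f}   approx = {approx:.4f}")
```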