Deriving inverse of Hilbert matrix
Thanks everyone for the answers offered! After chasing down the various links, I came across a very similar comment by Deane Yang from 1991 (!) that offered an elegant outline of a proof. I felt it would be nice to flesh out the details. This proof doesn't use "Cholesky machinery", and it shows that the entries of the inverse are integers without requiring the entries to be known explicitly.
First, we note that $$H_{ij} = \frac{1}{i+j-1} = \int_{0}^{1}x^{i+j-2}\,dx = \int_{0}^{1}x^{i-1}x^{j-1}\,dx.$$ We can treat $\int_{0}^{1}p(x)q(x)\,dx$ as an inner product, $\langle p, q \rangle$, on the vector space $P_{n-1}(x)$ of polynomials of degree $< n$.
$H_n$, the $n \times n$ Hilbert matrix, corresponds to the Gramian matrix of the set of vectors $\{ 1,x,x^2,...,x^{n-1} \} $.
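As a quick sanity check (not part of the proof), here is a short Python sketch using exact rational arithmetic: it forms the Gramian of the monomials $1, x, \dots, x^{n-1}$ under this inner product and confirms it is the Hilbert matrix.

```python
from fractions import Fraction

def inner(p, q):
    """<p, q> = integral_0^1 p(x) q(x) dx, for coefficient lists (lowest degree first).

    Uses the fact that x^k integrates to 1/(k+1) over [0, 1]."""
    return sum(Fraction(a * b, i + j + 1)
               for i, a in enumerate(p) for j, b in enumerate(q))

n = 5
# monomial basis 1, x, ..., x^{n-1}: the i-th vector has a single 1 in slot i
mono = [[0] * i + [1] for i in range(n)]
H = [[inner(mono[i], mono[j]) for j in range(n)] for i in range(n)]

# the Gramian of the monomials is exactly the Hilbert matrix H_ij = 1/(i+j-1)
assert all(H[i][j] == Fraction(1, i + j + 1) for i in range(n) for j in range(n))
```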
Next, the Shifted Legendre Polynomials are given by $$\tilde{P_n}(x) = (-1)^n \sum_{k=0}^n {n \choose k} {n+k \choose k} (-x)^k$$
We note that these are polynomials with integer coefficients. They also have the property that $$\int_{0}^{1} \tilde{P_m}(x) \tilde{P_n}(x)dx = {1 \over {2n + 1}} \delta_{mn}$$
Also, $\tilde{P_m}(x)$ is a polynomial of degree $m$, so $\{ \tilde{P_0}(x), \tilde{P_1}(x), ... \tilde{P_{n-1}}(x) \}$ form an alternative basis of $P_{n-1}(x)$.
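The integrality of the coefficients and the orthogonality relation can both be verified directly from the explicit formula. A small Python sketch (exact arithmetic, stdlib only; the helper names are mine):

```python
from fractions import Fraction
from math import comb

def shifted_legendre(n):
    """Coefficient list (lowest degree first) of
    P~_n(x) = (-1)^n sum_k C(n,k) C(n+k,k) (-x)^k."""
    return [(-1) ** (n + k) * comb(n, k) * comb(n + k, k) for k in range(n + 1)]

def inner(p, q):
    """<p, q> = integral_0^1 p(x) q(x) dx for coefficient lists."""
    return sum(Fraction(a * b, i + j + 1)
               for i, a in enumerate(p) for j, b in enumerate(q))

# integer coefficients, and <P~_m, P~_n> = delta_mn / (2n + 1)
for m in range(5):
    assert all(isinstance(c, int) for c in shifted_legendre(m))
    for n in range(5):
        expected = Fraction(1, 2 * n + 1) if m == n else 0
        assert inner(shifted_legendre(m), shifted_legendre(n)) == expected
```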
The change of basis matrix, $P$, (from the standard basis to this new basis) has as its $j$-th column the coefficients of $\tilde{P}_{j-1}(x)$. Reading off the coefficient of $x^{i-1}$ from the explicit formula above (i.e. replacing $n$ by $j-1$ and $k$ by $i-1$, so that the sign is $(-1)^{(j-1)+(i-1)} = (-1)^{i+j}$), we get $$ P_{ij} = (-1)^{i+j} {j-1 \choose i-1} {i+j-2 \choose i-1}.$$
The Gramian matrix under this change of basis is $P^T H P$. Since the $\tilde{P_i}$'s are orthogonal, this matrix is diagonal, with $$ (P^T H P)_{ii} = \langle \tilde{P}_{i-1}, \tilde{P}_{i-1} \rangle = {1 \over 2i-1}. $$
Let this diagonal matrix be $D$. Since it is diagonal, its inverse is given by $$(D^{-1})_{ii} = \frac{1}{D_{ii}} = 2i - 1.$$
Then $P^T H P = D$, so $$ H = (P^T)^{-1} D P^{-1} \qquad \text{and} \qquad H^{-1} = P D^{-1} P^T. $$
Since $P$, $P^T$ and $D^{-1}$ are all integer matrices, $H^{-1}$ is an integer matrix. $ \blacksquare $
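The whole factorization can be checked numerically. The sketch below (exact rationals, 0-based indices, so $P_{ij} = (-1)^{i+j}\binom{j}{i}\binom{i+j}{i}$) builds $P$ and $D^{-1}$, forms $P D^{-1} P^T$, and confirms it is an integer matrix that inverts $H$:

```python
from fractions import Fraction
from math import comb

n = 4
H = [[Fraction(1, i + j + 1) for j in range(n)] for i in range(n)]

# change-of-basis matrix: column j holds the coefficients of P~_j
# (math.comb(j, i) is 0 when i > j, so P is upper triangular)
P = [[(-1) ** (i + j) * comb(j, i) * comb(i + j, i) for j in range(n)]
     for i in range(n)]
Dinv = [[(2 * i + 1) if i == j else 0 for j in range(n)] for i in range(n)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

Pt = [list(row) for row in zip(*P)]          # transpose of P
Hinv = matmul(matmul(P, Dinv), Pt)           # H^{-1} = P D^{-1} P^T

# integer entries by construction (products and sums of integers) ...
assert all(isinstance(x, int) for row in Hinv for x in row)
# ... and it really is the inverse of H
I = matmul(H, Hinv)
assert all(I[i][j] == (1 if i == j else 0) for i in range(n) for j in range(n))
```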
A simple proof is in the paper "Fibonacci numbers and orthogonal polynomials" by Christian Berg.
It suffices to prove Schechter's formula for the inverse of a Cauchy matrix cited in Wikipedia (see link in Faisal's comment). Writing $A(x)=\prod_j (x-x_j)$ and $B(y)=\prod_j (y-y_j)$, we need to check $\sum_j b_{ij}a_{jk}=\delta_{ik}$, i.e. $$ \frac{A(y_i)}{B'(y_i)}\sum_j \frac{f(x_j)}{A'(x_j)}=-\delta_{ik}, $$ where $f(t)=B(t)/((t-y_i)(t-y_k))$. By Lagrange interpolation on the points $x_1,\dots,x_n$, the inner sum is the coefficient of $t^{n-1}$ in the interpolating polynomial of $f$ at these points. If $i\ne k$, then $f$ is just a polynomial of degree $n-2$, so it equals its own interpolant and this coefficient is $0$. If $i = k$, denote by $F$ the corresponding Lagrange polynomial $F(x)=\sum_j f(x_j) \frac{A(x)}{(x-x_j)A'(x_j)}$; we are searching for the coefficient of $x^{n-1}$ in $F$. We have $F(x_j)=f(x_j)$, so $F(x)(x-y_i)-\prod_{m\ne i}(x-y_m)$ vanishes for $x=x_1,x_2,\dots,x_n$. So $F(x)(x-y_i)-\prod_{m\ne i}(x-y_m)=cA(x)$, and we find $c$ by substituting $x=y_i$.
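The Lagrange-interpolation fact doing the work here is that $\sum_j f(x_j)/A'(x_j)$ is the leading (degree $n-1$) coefficient of the interpolant of $f$ at the nodes $x_1,\dots,x_n$; in particular it recovers the $t^{n-1}$-coefficient of any polynomial $f$ of degree $\le n-1$. A minimal sketch checking this on arbitrary nodes (exact arithmetic; the polynomials chosen are just examples):

```python
from fractions import Fraction

def poly_eval(coeffs, x):
    """Evaluate a polynomial given as a coefficient list (lowest degree first)."""
    return sum(Fraction(c) * x ** k for k, c in enumerate(coeffs))

nodes = [Fraction(k) for k in (0, 1, 3, 4)]   # arbitrary distinct nodes, n = 4

def A_prime(xj):
    """A'(x_j) where A(x) = prod_m (x - x_m)."""
    prod = Fraction(1)
    for xm in nodes:
        if xm != xj:
            prod *= (xj - xm)
    return prod

# degree n-1 example: f(t) = 7t^3 - 5t + 2; the sum gives its t^{n-1}-coefficient
f = [2, -5, 0, 7]
s = sum(poly_eval(f, xj) / A_prime(xj) for xj in nodes)
assert s == 7

# degree n-2 example: coefficient of t^{n-1} is 0, as used in the i != k case
g = [1, 4, -3]
assert sum(poly_eval(g, xj) / A_prime(xj) for xj in nodes) == 0
```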