"If $1/a + 1/b = 1 /c$ where $a, b, c$ are positive integers with no common factor, $(a + b)$ is the square of an integer"

Let $\gcd(a,b) = g$, and write $a = a'g$ and $b = b'g$ (so that $\gcd(a', b') = 1$). The equation $\frac1a + \frac1b = \frac1c$ is the same as $c(a+b) = ab$; substituting and dividing through by $g$ gives $$c(a' + b') = a'b'g.$$

Now, as $(a' + b')$ divides $a'b'g$ but is relatively prime to both $a'$ and $b'$ (any common divisor of $a'+b'$ and $a'$ would also divide $b'$, and vice versa), it must divide $g$. Similarly, as $g$ divides $c(a'+b')$ but is relatively prime to $c$ (note that $\gcd(g, c) = \gcd(\gcd(a,b), c) = \gcd(a, b, c) = 1$), it must divide $(a' + b')$. Since $g$ and $(a' + b')$ are positive and divide each other, we have $$(a' + b') = g,$$ and therefore $$(a + b) = g(a' + b') = g^2.$$
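
As a concrete illustration (a sanity check, not part of the proof): take $a=3$, $b=6$, $c=2$, so that $\frac13+\frac16=\frac12$ and $\gcd(3,6,2)=1$. Here $g=\gcd(3,6)=3$, $a'=1$, $b'=2$, and indeed $a'+b'=3=g$, so $a+b=9=3^2$.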


The hypothesis may be written as $(a+b)c = ab$, which is equivalent to $(a-c)(b-c)=c^2$ (expand: $ab-(a+b)c+c^2 = c^2$). Note that $a > c$ and $b > c$, since $\frac1a < \frac1c$ and $\frac1b < \frac1c$, so both factors are positive.

Let $p$ be a prime factor of $c$. Then $p$ divides $a-c$ or $b-c$, but it cannot divide both: otherwise $p$ would divide $a$, $b$ and $c$, contradicting $\gcd(a,b,c)=1$. Say $p \mid a-c$, and let $p^k$ be the exact power of $p$ dividing $c$; then the full power $p^{2k}$ dividing $c^2$ must divide $a-c$. Since every prime factor of $a-c$ or of $b-c$ divides $(a-c)(b-c)=c^2$ and hence $c$, every prime occurs to an even power in each factor, so both are squares: $a-c=u^2$, $b-c=v^2$. Then $u^2v^2=c^2$ gives $c=uv$, so $a+b-2c=u^2+v^2$ and $a+b=u^2+v^2+2uv=(u+v)^2$.
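
As a concrete illustration: $\frac1{20}+\frac15=\frac14$ with $\gcd(20,5,4)=1$. Here $a-c=16=4^2$ and $b-c=1=1^2$, so $u=4$, $v=1$, $c=uv=4$, and $a+b=25=(4+1)^2$.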

The argument above shows that all solutions of $$ \frac 1 a + \frac 1 b = \frac 1 c $$ with $\gcd(a,b,c)=1$ are given by $$\frac1{u(u+v)}+\frac1{v(u+v)}=\frac1{uv}$$ with $\gcd(u,v)=1$, i.e. $a = u(u+v)$, $b = v(u+v)$, $c = uv$.
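
If one wants to check this computationally, here is a minimal sketch in Python (the search bounds and variable names are arbitrary choices, not part of the argument): it enumerates coprime pairs $(u,v)$, builds the parametrized triple, and verifies the equation, the coprimality condition, and that $a+b$ is a perfect square.

```python
from math import gcd, isqrt

# Enumerate coprime pairs (u, v) and form the parametrized triple
#   a = u*(u+v), b = v*(u+v), c = u*v.
for u in range(1, 30):
    for v in range(1, 30):
        if gcd(u, v) != 1:
            continue
        a, b, c = u * (u + v), v * (u + v), u * v
        assert c * (a + b) == a * b        # equivalent to 1/a + 1/b = 1/c
        assert gcd(gcd(a, b), c) == 1      # the triple has no common factor
        assert isqrt(a + b) ** 2 == a + b  # a + b is a perfect square
print("all checks passed")
```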


The hypothesis may be rewritten as $$c(a+b) = ab.$$ Let $p$ be a prime factor of $a+b$.

Assume that $p\mid c$. If $p\mid a$, then $p\mid (a+b)-a = b$, which would make $p$ a common factor of $a$, $b$ and $c$; thus $p$ and $a$ are coprime. The same argument shows that $p$ and $b$ are coprime, hence $p$ and $ab$ are coprime. But $c(a+b)=ab$ and $p\mid c$ together give $p \mid ab$, a contradiction. Thus $p$ is coprime to $c$.

Now denote by $\nu(x)$ the $p$-adic order of an integer $x$. Taking $p$-adic orders in $c(a+b) = ab$ and using $\gcd(p, c) = 1$, we get $$\nu(a+b) = \nu(ab) = \nu(a) + \nu(b).$$ In particular $\nu(a) \leq \nu(a+b)$, so $p^{\nu(a)}$ divides both $a+b$ and $a$, hence $p^{\nu(a)} \mid (a+b) - a = b$, i.e. $\nu(a) \leq \nu(b)$. The same argument shows $\nu(b) \leq \nu(a)$, hence $$\nu(a) = \nu(b)$$ and therefore $$\nu(a+b) = 2\nu(a) \in 2\mathbb{N}.$$

As this holds for every prime factor of $a+b$, $a+b$ is a square.
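
As a concrete illustration: $\frac14+\frac1{12}=\frac13$ with $\gcd(4,12,3)=1$. The only prime factor of $a+b=16$ is $p=2$, which is indeed coprime to $c=3$; here $\nu(a)=\nu(4)=2$, $\nu(b)=\nu(12)=2$, and $\nu(a+b)=\nu(16)=4=\nu(a)+\nu(b)$, an even number.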