Proving Hölder's Inequality
Suppose $\displaystyle\int_a^b \left|f(x)\right|^p d\alpha\neq 0$ and $\displaystyle\int_a^b \left|g(x)\right|^q d\alpha\neq 0$; otherwise, if, say, $\displaystyle\int_a^b \left|f(x)\right|^p d\alpha=0$, then $f\equiv 0$ a.e., both sides of the inequality vanish, and Hölder's inequality holds trivially (the case for $g$ is symmetric).
Recall Young's inequality: if $p,q>1$ satisfy $\frac{1}{p}+\frac{1}{q}=1$, then $uv\leq\frac{u^p}{p}+\frac{v^q}{q}$ for all $u,v\geq 0$. Applying it with $u=\displaystyle\frac{|f(x)|}{(\int_a^b \left|f(x)\right|^p d\alpha)^{\frac{1}{p}}}$ and $v=\displaystyle\frac{|g(x)|}{(\int_a^b \left|g(x)\right|^q d\alpha)^{\frac{1}{q}}}$, we have $$\frac{|f(x)|}{(\int_a^b \left|f(x)\right|^p d\alpha)^{\frac{1}{p}}}\cdot\frac{|g(x)|}{(\int_a^b \left|g(x)\right|^q d\alpha)^{\frac{1}{q}}}\leq\frac{1}{p}\frac{|f(x)|^p}{\int_a^b \left|f(x)\right|^p d\alpha}+\frac{1}{q}\frac{|g(x)|^q}{\int_a^b \left|g(x)\right|^q d\alpha}.$$ Integrating from $a$ to $b$ with respect to $\alpha$, we obtain $$\frac{\int_a^b|f(x)||g(x)|d\alpha}{(\int_a^b \left|f(x)\right|^p d\alpha)^{\frac{1}{p}}(\int_a^b \left|g(x)\right|^q d\alpha)^{\frac{1}{q}}}\leq\frac{1}{p}+\frac{1}{q}=1,$$ which implies $$ \tag{1}\int_a^b|f(x)||g(x)|d\alpha\leq\left(\int_a^b \left|f(x)\right|^p d\alpha \right)^{1/p} \left(\int_a^b \left|g(x)\right|^q d\alpha \right)^{1/q}.$$ The inequality we want to prove now follows from $(1)$ and the triangle inequality for integrals, $$\left|\int_a^b f(x)g(x)d\alpha\right|\leq\int_a^b|f(x)||g(x)|d\alpha.$$
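As a sanity check (not part of the proof), inequality $(1)$ can be spot-checked numerically in its discrete form, with finite sums in place of the Stieltjes integrals (counting measure). The helper name `discrete_holder_gap` is invented for this sketch:

```python
import random

def discrete_holder_gap(f_vals, g_vals, p):
    """RHS minus LHS of Hölder's inequality for finite sums (counting measure)."""
    q = p / (p - 1)  # conjugate exponent: 1/p + 1/q = 1
    lhs = sum(abs(x * y) for x, y in zip(f_vals, g_vals))
    rhs = (sum(abs(x) ** p for x in f_vals) ** (1 / p)
           * sum(abs(y) ** q for y in g_vals) ** (1 / q))
    return rhs - lhs

# Spot-check on random data: the gap should never be (meaningfully) negative.
rng = random.Random(1)
for _ in range(100):
    n = rng.randint(1, 20)
    f_vals = [rng.uniform(-5, 5) for _ in range(n)]
    g_vals = [rng.uniform(-5, 5) for _ in range(n)]
    assert discrete_holder_gap(f_vals, g_vals, rng.uniform(1.1, 10)) >= -1e-9
print("Hölder spot-check passed")
```

Note that the gap closes exactly when $|f|^p$ and $|g|^q$ are proportional, which is the known equality case.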
I think the following might be a way to come up with the proof of Hölder's inequality.
First, it is easy to show that $$f(x)=-\log x$$ is a convex function on $(0,\infty)$. (Recall that a twice-differentiable function $f$ is convex if and only if $\operatorname{dom} f$ is convex and its Hessian is positive semidefinite, $${ \nabla }^{ 2 }f(x)\succeq 0$$ for all $x\in\operatorname{dom} f$; here $f''(x)=1/x^{2}>0$.)
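The convexity claim can also be spot-checked numerically against the definition (an illustration only, not a proof; the function name below is made up for this sketch):

```python
import math
import random

def is_convex_spot_check(f, lo, hi, trials=1000, seed=0):
    """Randomly test f(t*a + (1-t)*b) <= t*f(a) + (1-t)*f(b) on (lo, hi)."""
    rng = random.Random(seed)
    for _ in range(trials):
        a, b = rng.uniform(lo, hi), rng.uniform(lo, hi)
        t = rng.random()
        if f(t * a + (1 - t) * b) > t * f(a) + (1 - t) * f(b) + 1e-12:
            return False  # found a violation of the convexity inequality
    return True

def neg_log(x):
    return -math.log(x)

print(is_convex_spot_check(neg_log, 0.01, 10.0))  # -log(x) passes the check
```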
Then, by the definition of a convex function, $$f(\theta a+(1-\theta )b)\le \theta f(a)+(1-\theta )f(b)$$ for all $a,b\in\operatorname{dom} f$ and $0\le \theta \le 1$.
This gives $$-\log(\theta a+(1-\theta )b)\le -\theta \log(a)-(1-\theta )\log(b)$$ for $a,b> 0$.
Next, multiplying both sides by $-1$ and taking the exponential (an increasing function) yields the weighted AM-GM inequality $${ u }^{ \theta }{ v }^{ 1-\theta }\le \theta u+(1-\theta )v$$ for $u,v\ge 0$ (written here with $u,v$ to avoid a clash with the integration limits $a,b$; the case $u=0$ or $v=0$ is immediate, since the left-hand side is then $0$).
Applying this with $$u=\frac { { \left| f(x) \right| }^{ p } }{ \int _{ a }^{ b }{ { \left| f(x) \right| }^{ p }d\alpha } },\qquad v=\frac { { \left| g(x) \right| }^{ q } }{ \int _{ a }^{ b }{ { \left| g(x) \right| }^{ q }d\alpha } },\qquad \theta =\frac{1}{p}\ \left(\text{so that } 1-\theta=\frac{1}{q}\right)$$
yields $$\frac { \left| f(x) \right| }{ { (\int _{ a }^{ b }{ { \left| f(x) \right| }^{ p }d\alpha } ) }^{ \frac { 1 }{ p } } } \cdot \frac { \left| g(x) \right| }{ { (\int _{ a }^{ b }{ { \left| g(x) \right| }^{ q }d\alpha } ) }^{ \frac { 1 }{ q } } } \le \frac { 1 }{ p } \frac { \left| f(x) \right| ^{ p } }{ \int _{ a }^{ b }{ { \left| f(x) \right| }^{ p }d\alpha } } +\frac { 1 }{ q } \frac { \left| g(x) \right| ^{ q } }{ \int _{ a }^{ b }{ { \left| g(x) \right| }^{ q }d\alpha } } $$
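The pointwise weighted AM-GM step used above can be spot-checked numerically (illustration only; `weighted_am_gm_gap` is a helper name invented for this sketch):

```python
import random

def weighted_am_gm_gap(u, v, theta):
    """(theta*u + (1-theta)*v) - u**theta * v**(1-theta); nonnegative for u, v >= 0."""
    return theta * u + (1 - theta) * v - u ** theta * v ** (1 - theta)

# Spot-check the inequality on random nonnegative inputs.
rng = random.Random(2)
for _ in range(1000):
    gap = weighted_am_gm_gap(rng.uniform(0, 10), rng.uniform(0, 10), rng.random())
    assert gap >= -1e-12
print("weighted AM-GM spot-check passed")
```

Equality holds exactly when $u=v$, which matches the equality case of the convexity inequality for $-\log x$.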
Finally, integrating from $a$ to $b$ with respect to $\alpha$ yields Hölder's inequality.
Reference: Stephen Boyd and Lieven Vandenberghe, *Convex Optimization*, Chapter 3, p. 78.