Proof of nonnegativity of KL divergence using Jensen's inequality
This follows from a rather trivial generalization of Jensen's inequality:

Let $f,g:\mathbb{R} \to \mathbb{R}$ with $f$ convex. Then $E[f(g(X))] \ge f(E[g(X)])$.
The proof is simple: apply Jensen's inequality to the random variable $Y=g(X)$. Notice that no convexity condition (indeed, no condition at all) is required of the function $g$. But also notice that it is only the (convex) function $f$ that "goes outside the expectation" in the inequality.
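As a quick sanity check of this generalization, here is a small numerical sketch (my own illustration, not part of the original argument): a convex $f$ and an arbitrary $g$ that is neither convex nor concave, with the empirical mean standing in for the expectation.

```python
import math
import random

# Empirical check of E[f(g(X))] >= f(E[g(X)]) for convex f and arbitrary g.
# f(y) = y**2 is convex; g(x) = sin(x) is neither convex nor concave.
random.seed(0)
xs = [random.gauss(0.0, 1.0) for _ in range(100_000)]

f = lambda y: y ** 2
g = math.sin

lhs = sum(f(g(x)) for x in xs) / len(xs)  # E[f(g(X))]
rhs = f(sum(g(x) for x in xs) / len(xs))  # f(E[g(X)])
print(lhs >= rhs)
```

For this particular $f$ the inequality holds exactly for every sample, since the mean of squares minus the squared mean is the (nonnegative) empirical variance.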
In your case, take $f(x) = \log(x)$ (concave, so the inequality is reversed) and $g(x)=q(x)/p(x)$. (Further: don't let the fact that $q$ and $p$ in $g(x)=q(x)/p(x)$ are densities confuse you; that does not matter at all.)
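Spelling out the final step (the expectation is taken under $p$, the inequality reverses because $\log$ is concave, and $q$ integrates to one):

```latex
-D_{\mathrm{KL}}(p \,\|\, q)
  = E_p\!\left[\log \frac{q(X)}{p(X)}\right]
  \le \log E_p\!\left[\frac{q(X)}{p(X)}\right]
  = \log \int p(x)\,\frac{q(x)}{p(x)}\,dx
  = \log \int q(x)\,dx
  = \log 1 = 0,
```

hence $D_{\mathrm{KL}}(p \,\|\, q) \ge 0$.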