Proving that a symmetric matrix is positive definite iff all eigenvalues are positive
If $A$ is symmetric and has positive eigenvalues, then, by the spectral theorem for symmetric matrices, there is an orthogonal matrix $Q$ such that $A = Q^\top \Lambda Q$, with $\Lambda = \text{diag}(\lambda_1,\dots,\lambda_n)$. If $x$ is any nonzero vector, then $y := Qx \ne 0$ (because $Q$ is invertible) and
$$
x^\top A x = x^\top (Q^\top \Lambda Q) x = (x^\top Q^\top) \Lambda (Q x) = y^\top \Lambda y = \sum_{i=1}^n \lambda_i y_i^2 > 0
$$
since all the $\lambda_i$ are positive and $y \ne 0$, so at least one term $\lambda_i y_i^2$ is strictly positive.
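If it helps to see this concretely, here is a quick NumPy sanity check (purely illustrative, not part of the proof; the random construction is my own choice): it builds $A = Q^\top \Lambda Q$ from a random orthogonal $Q$ and positive $\lambda_i$, then confirms that $x^\top A x = \sum_i \lambda_i y_i^2 > 0$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build A = Q^T Lambda Q from a random orthogonal Q and positive eigenvalues,
# mirroring the argument above (illustrative construction).
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
lam = rng.uniform(0.5, 2.0, size=4)              # all lambda_i > 0
A = Q.T @ np.diag(lam) @ Q                       # symmetric, eigenvalues lam

x = rng.standard_normal(4)                       # a generic nonzero x
y = Q @ x                                        # y = Qx is nonzero too
print(x @ A @ x > 0)                             # True
print(np.isclose(x @ A @ x, np.sum(lam * y**2))) # True: x^T A x = sum lambda_i y_i^2
```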
Conversely, suppose that $A$ is positive definite and that $Ax = \lambda x$, with $x \ne 0$. WLOG, we may assume that $x^\top x = 1$. Thus, $$0 < x^\top Ax = x^\top (\lambda x) = \lambda x^\top x = \lambda, $$ as desired.
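The converse can be sanity-checked the same way (again just an illustration, assuming NumPy; the test matrix $B^\top B + I$ is simply one convenient way to manufacture a positive definite example): for such an $A$, `np.linalg.eigh` should return only positive eigenvalues, and each eigenvalue should equal $x^\top A x$ for its unit eigenvector $x$.

```python
import numpy as np

rng = np.random.default_rng(1)

# B^T B + I is symmetric positive definite for any real B (illustrative choice).
B = rng.standard_normal((4, 4))
A = B.T @ B + np.eye(4)

w, V = np.linalg.eigh(A)            # eigenvalues w, unit eigenvectors as columns of V
print(np.all(w > 0))                # True: positive definiteness forces lambda > 0
x = V[:, 0]                         # a unit eigenvector, so x^T x = 1
print(np.isclose(x @ A @ x, w[0]))  # True: lambda = x^T A x, as in the computation above
```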
This is a (sketch of a) proof for the case when the symmetric matrix $A$ is real. Let $u_1, \ldots, u_n$ be linearly independent, and in fact orthonormal, eigenvectors corresponding to the positive eigenvalues $\lambda_1, \ldots, \lambda_n$ of the real symmetric matrix $A$ (such an orthonormal eigenbasis always exists; see the clarification at the end). Also, let $z = c_1 u_1 + \cdots + c_n u_n$ be an arbitrary real $n \times 1$ vector with $z \neq \vec 0$, i.e., with not all $c_i$ equal to zero. Thus, we have:
$$
\begin{aligned}
z^T A z &= (c_1 u_1^T + \cdots + c_n u_n^T)\, P D P^{-1} (c_1 u_1 + \cdots + c_n u_n)\\
&= \begin{pmatrix} c_1 \|u_1\|_2^2 & c_2 \|u_2\|_2^2 & \cdots & c_n \|u_n\|_2^2 \end{pmatrix}
\begin{pmatrix} \lambda_1 & 0 & \cdots & 0\\ 0 & \lambda_2 & \cdots & 0\\ \vdots & \vdots & \ddots & \vdots\\ 0 & 0 & \cdots & \lambda_n \end{pmatrix}
\begin{pmatrix} c_1 \|u_1\|_2^2 \\ c_2 \|u_2\|_2^2 \\ \vdots \\ c_n \|u_n\|_2^2 \end{pmatrix}\\
&= \lambda_1 c_1^2 + \cdots + \lambda_n c_n^2,
\end{aligned}
$$
which is positive, since all the $\lambda_i$ are positive and not all $c_i$ are zero; here we also used the normalization $\|u_i\|_2^2 = u_i^T \cdot u_i = 1$, which is what collapses the middle expression to $\lambda_1 c_1^2 + \cdots + \lambda_n c_n^2$.
I think you can fill in the details.
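If you want to check the expansion numerically, here is a short NumPy sketch (illustrative only; `np.linalg.eigh` conveniently returns an orthonormal eigenbasis, which is exactly the normalization used above, and the test matrix is again my own $B^T B + I$ construction):

```python
import numpy as np

rng = np.random.default_rng(2)

B = rng.standard_normal((4, 4))
A = B.T @ B + np.eye(4)                  # a symmetric positive definite example

lam, U = np.linalg.eigh(A)               # columns of U: orthonormal eigenvectors u_i
c = rng.standard_normal(4)               # arbitrary coefficients c_i, not all zero
z = U @ c                                # z = c_1 u_1 + ... + c_n u_n

print(np.isclose(z @ A @ z, np.sum(lam * c**2)))  # True: z^T A z = sum lambda_i c_i^2
```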
Just to clear things up:
Because $A$ is a real symmetric matrix, it can be written in the form $$A = P \cdot D \cdot P^{-1} = P \cdot D \cdot P^T,$$ where the columns of $P$ are the right eigenvectors of $A$ and the rows of $P^{-1}\,(= P^T)$ are the left eigenvectors. Thus, if the $u_i$'s are the right eigenvectors, then the $u_i^T$'s are the left eigenvectors of $A$. Since $P$ is orthogonal, i.e. $P^T P = I$, it holds that $$u_i^T \cdot u_j = \begin{cases} 1, & i = j\\ 0, & i \neq j. \end{cases}$$
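Numerically, this is the decomposition `np.linalg.eigh` returns: the columns of $P$ come out orthonormal, so $P^T P = I$ and $A = P D P^T$. A quick check (again just an illustration, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(3)

B = rng.standard_normal((4, 4))
A = B.T @ B                                   # a real symmetric example

d, P = np.linalg.eigh(A)                      # columns of P: right eigenvectors u_i
print(np.allclose(P.T @ P, np.eye(4)))        # True: u_i^T u_j = 1 if i = j, else 0
print(np.allclose(A, P @ np.diag(d) @ P.T))   # True: A = P D P^T = P D P^{-1}
```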