Show that the determinant of $A$ is equal to the product of its eigenvalues
Suppose that $\lambda_1, \ldots, \lambda_n$ are the eigenvalues of $A$. Then the $\lambda$s are also the roots of the characteristic polynomial, i.e.
$$\begin{array}{rcl} \det (A-\lambda I)=p(\lambda)&=&(-1)^n (\lambda - \lambda_1 )(\lambda - \lambda_2)\cdots (\lambda - \lambda_n) \\ &=&(-1) (\lambda - \lambda_1 )(-1)(\lambda - \lambda_2)\cdots (-1)(\lambda - \lambda_n) \\ &=&(\lambda_1 - \lambda )(\lambda_2 - \lambda)\cdots (\lambda_n - \lambda) \end{array}$$
The first equality follows from the factorization of a polynomial given its roots; the leading (highest-degree) coefficient $(-1)^n$ comes from the product of the diagonal entries $(a_{11}-\lambda)(a_{22}-\lambda)\cdots(a_{nn}-\lambda)$, the only term in the determinant expansion that contributes $\lambda^n$.
Now, by setting $\lambda$ to zero (simply because it is a variable) we get on the left side $\det(A)$, and on the right side $\lambda_1 \lambda_2\cdots\lambda_n$, that is, we indeed obtain the desired result
$$ \det(A) = \lambda_1 \lambda_2\cdots\lambda_n$$
So the determinant of the matrix is equal to the product of its eigenvalues.
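As a quick numerical sanity check of this identity, here is a short NumPy sketch; the matrix is an arbitrary example of my own choosing:

```python
import numpy as np

# An arbitrary symmetric example matrix (any square matrix works;
# symmetry just keeps the eigenvalues real for clean printing).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])

eigenvalues = np.linalg.eigvals(A)

# det(A) and the product of the eigenvalues agree,
# up to floating-point rounding.
print(np.linalg.det(A))      # 18.0 (approximately)
print(np.prod(eigenvalues))  # 18.0 (approximately)
```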
I am a beginning Linear Algebra learner and this is just my humble opinion.
One idea presented above is the following:
Suppose that $\lambda_1, \ldots, \lambda_n$ are the eigenvalues of $A$.
Then the $\lambda$s are also the roots of the characteristic polynomial, i.e.
$$\det(A-\lambda I)=(\lambda_1-\lambda)(\lambda_2-\lambda)\cdots(\lambda_n-\lambda).$$
Now, by setting $\lambda$ to zero (simply because it is a variable) we get on the left side $\det(A)$, and on the right side $\lambda_1\lambda_2\ldots \lambda_n$, that is, we indeed obtain the desired result
$$\det(A)=\lambda_1\lambda_2\cdots\lambda_n.$$
I don't think that this works in general, but only in the case when $\det(A) = 0$.
This is because, when we write down the characteristic equation, we use the relation $\det(A - \lambda I) = 0$. Following the same logic, the only case where $\det(A - \lambda I) = \det(A) = 0$ is when $\lambda = 0$. The relation $\det(A - \lambda I) = 0$ must hold even in the special case $\lambda = 0$, which implies $\det(A) = 0$.
UPDATED POST
Here I propose a way to prove the theorem for the $2 \times 2$ case. Let $A$ be a $2 \times 2$ matrix.
$$ A = \begin{pmatrix} a_{11} & a_{12}\\ a_{21} & a_{22}\\\end{pmatrix}$$
The idea is to use a certain property of determinants,
$$ \begin{vmatrix} a_{11} + b_{11} & a_{12} \\ a_{21} + b_{21} & a_{22}\\\end{vmatrix} = \begin{vmatrix} a_{11} & a_{12}\\ a_{21} & a_{22}\\\end{vmatrix} + \begin{vmatrix} b_{11} & a_{12}\\b_{21} & a_{22}\\\end{vmatrix}$$
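As an aside, this additivity-in-a-column property is easy to confirm symbolically; here is a minimal SymPy sketch (the symbol names are mine, purely for illustration):

```python
import sympy as sp

# Symbolic matrix entries; the names are arbitrary.
a11, a12, a21, a22, b11, b21 = sp.symbols('a11 a12 a21 a22 b11 b21')

lhs = sp.Matrix([[a11 + b11, a12],
                 [a21 + b21, a22]]).det()
rhs = (sp.Matrix([[a11, a12], [a21, a22]]).det()
       + sp.Matrix([[b11, a12], [b21, a22]]).det())

print(sp.expand(lhs - rhs))  # 0, confirming the property
```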
Let $\lambda_1$ and $\lambda_2$ be the two eigenvalues of the matrix $A$. (The eigenvalues can be distinct or repeated, real or complex; it doesn't matter.)
The two eigenvalues $\lambda_1$ and $\lambda_2$ must satisfy the following condition:
$$\det (A - \lambda I) = 0, $$ where $\lambda$ is an eigenvalue of $A$.
Therefore, $$\begin{vmatrix} a_{11} - \lambda & a_{12} \\ a_{21} & a_{22} - \lambda\\\end{vmatrix} = 0 $$
Therefore, using the property of determinants given above (which holds for rows as well as for columns), I will decompose the determinant into parts.
$$\begin{vmatrix} a_{11} - \lambda & a_{12} \\ a_{21} & a_{22} - \lambda\\\end{vmatrix} = \begin{vmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} - \lambda\\\end{vmatrix} - \begin{vmatrix} \lambda & 0 \\ a_{21} & a_{22} - \lambda\\\end{vmatrix}= \begin{vmatrix} a_{11} & a_{12} \\ a_{21} & a_{22}\\\end{vmatrix} - \begin{vmatrix} a_{11} & a_{12} \\ 0 & \lambda \\\end{vmatrix}-\begin{vmatrix} \lambda & 0 \\ a_{21} & a_{22} - \lambda\\\end{vmatrix}$$
The final determinant can be further reduced.
$$ \begin{vmatrix} \lambda & 0 \\ a_{21} & a_{22} - \lambda\\\end{vmatrix} = \begin{vmatrix} \lambda & 0 \\ a_{21} & a_{22} \\\end{vmatrix} - \begin{vmatrix} \lambda & 0\\ 0 & \lambda\\\end{vmatrix} $$
Substituting this back, we have
$$ \begin{vmatrix} a_{11} - \lambda & a_{12} \\ a_{21} & a_{22} - \lambda\\\end{vmatrix} = \begin{vmatrix} a_{11} & a_{12} \\ a_{21} & a_{22}\\\end{vmatrix} - \begin{vmatrix} a_{11} & a_{12} \\ 0 & \lambda \\\end{vmatrix} - \begin{vmatrix} \lambda & 0 \\ a_{21} & a_{22} \\\end{vmatrix} + \begin{vmatrix} \lambda & 0\\ 0 & \lambda\\\end{vmatrix} = 0 $$
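To see the polynomial hiding in this expression, each of the four $2 \times 2$ determinants can be evaluated directly, giving
$$\det(A) - a_{11}\lambda - a_{22}\lambda + \lambda^2 = 0, \qquad \text{i.e.} \qquad \lambda^2 - (a_{11} + a_{22})\lambda + \det(A) = 0.$$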
For a polynomial equation $$ a_{n}\lambda^n + a_{n-1}\lambda^{n-1} + \cdots + a_{1}\lambda + a_{0} = 0,$$ Vieta's formulas give the product of the roots as $(-1)^n a_{0}/a_{n}$. The polynomial in $\lambda$ above is monic of degree $2$ (the $\lambda^2$ term has coefficient $1$), so the product of its roots is simply the constant term.
From the decomposed determinant, the only term which doesn't involve $\lambda$ would be the first term
$$ \begin{vmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \\\end{vmatrix} = \det (A) $$
Therefore, the product of the roots, i.e., the product of the eigenvalues of $A$, equals the determinant of $A$.
I am having difficulty generalizing this proof idea to the $n$ by $n$ case, though, as it is complex and time-consuming for me.
The approach I would use is to decompose the matrix into three matrices based on its eigenvalues (i.e., diagonalize it as $A = PDP^{-1}$, with the eigenvalues on the diagonal of $D$).
Then you know that $\det(AB) = \det(A)\det(B)$, and that $\det(A^{-1}) = \dfrac{1}{\det(A)}$.
You can probably fill in the rest of the details from the article, depending on how rigorous your proof needs to be.
Edit: I just realized this won't work on all matrices (only on diagonalizable ones), but it might give you an idea of an approach.
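For what it's worth, here is a minimal NumPy sketch of this approach under the assumption that $A$ is diagonalizable, so that $A = PDP^{-1}$ with the eigenvalues on the diagonal of $D$; the example matrix and variable names are mine:

```python
import numpy as np

# An arbitrary diagonalizable example matrix.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Eigendecomposition: columns of P are eigenvectors, D is diagonal.
eigenvalues, P = np.linalg.eig(A)
D = np.diag(eigenvalues)

# det(A) = det(P) * det(D) * det(inv(P)) = det(D),
# and det(D) is the product of the diagonal entries,
# i.e. the product of the eigenvalues.
print(np.linalg.det(A))      # 10.0 (approximately)
print(np.prod(eigenvalues))  # 10.0 (approximately)
```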