Operator norm is equal to max eigenvalue
If $x$ is a unit eigenvector corresponding to eigenvalue $\lambda$, then $\|Ax\| = \|\lambda x\| = |\lambda|\,\|x\| = |\lambda|$, so $|\lambda| \le \|A\|$. So the inequality you stated should be reversed.
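For a quick numerical sanity check of this bound, here is a minimal sketch using numpy (the matrix size and random seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))   # a generic square matrix, not symmetric

op_norm = np.linalg.norm(A, 2)    # operator norm = largest singular value
eigs = np.linalg.eigvals(A)       # eigenvalues, possibly complex

# every eigenvalue is bounded in absolute value by the operator norm
assert all(abs(lam) <= op_norm + 1e-12 for lam in eigs)
print(max(abs(eigs)), op_norm)    # typically strictly less for a non-normal matrix
```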
If $A$ is diagonalizable with respect to an orthonormal basis $\{v_1, v_2\}$ with eigenvalues $\lambda_1$ and $\lambda_2$, then for any $x = c_1 v_1 + c_2 v_2$ we have $Ax = c_1 \lambda_1 v_1 + c_2 \lambda_2 v_2$, so $\frac{\|Ax\|^2}{\|x\|^2} = \frac{c_1^2 \lambda_1^2 + c_2^2 \lambda_2^2}{c_1^2 + c_2^2} \le \max\{\lambda_1^2, \lambda_2^2\}$, with equality when $x$ is an eigenvector for the eigenvalue of largest absolute value. This situation occurs whenever $A$ is symmetric, or more generally normal ($A^\top A = A A^\top$).
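To illustrate the equality in the symmetric case, here is a small numpy check (again just a sketch; symmetrizing a random matrix is one way to produce a test case):

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((3, 3))
S = (B + B.T) / 2                           # symmetric, hence normal

op_norm = np.linalg.norm(S, 2)              # operator norm
max_abs_eig = np.abs(np.linalg.eigvalsh(S)).max()

# for a symmetric matrix the operator norm equals max |eigenvalue|
assert abs(op_norm - max_abs_eig) < 1e-10
```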
An example with strict inequality is $A = \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}$: both eigenvalues are $0$, but the operator norm is $1$ (take $x = e_2$, so that $Ax = e_1$). The inequality stays strict for any nonzero matrix similar to this one, since similarity preserves the eigenvalues while a nonzero matrix has positive norm.
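The nilpotent example can be checked directly:

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])        # the nilpotent Jordan block above

print(np.linalg.eigvals(A))       # [0. 0.] -- both eigenvalues are zero
print(np.linalg.norm(A, 2))       # 1.0    -- but the operator norm is 1
```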
These facts generalize to square matrices of any size.
It is not true that $\|A\| \leq \max\{|\lambda|\}$ in general. The reverse inequality always holds: $Ax = \lambda x$ with $x \neq 0$ implies $|\lambda|\,\|x\| = \|Ax\| \leq \|A\|\,\|x\|$, so $|\lambda| \leq \|A\|$ for every eigenvalue $\lambda$. Equality holds if $A$ is symmetric (or, more generally, normal).