Augmented Lagrangian
I'll assume $f$ is convex, closed, and proper; the last two conditions ensure that the convex conjugate of $f$, which appears below, is well behaved.
As you said, the Lagrangian is $L(x,y) = f(x) + \langle y, Ax - b \rangle$. The dual function is \begin{align*} g(y) &= \inf_x \, L(x,y) \\ &= \inf_x \left[ f(x) + \langle A^T y, x \rangle \right] - \langle y, b \rangle \\ &= - \sup_x \left[ \langle -A^T y, x \rangle - f(x) \right] - \langle y, b \rangle \\ &= - f^*(-A^T y) - \langle y, b \rangle. \end{align*} Here $f^*(z) = \sup_x \left[ \langle z, x \rangle - f(x) \right]$ is the convex conjugate of $f$.
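As a quick sanity check (my example, not from the question), take $f(x) = \tfrac12 \|x\|_2^2$. Then $f^*(z) = \tfrac12 \|z\|_2^2$, so \begin{equation*} g(y) = -\tfrac12 \|A^T y\|_2^2 - \langle y, b \rangle, \end{equation*} a smooth concave quadratic. The smoothness is no accident: this $f$ is strictly (indeed strongly) convex, which, as noted below, is what makes $f^*$ differentiable.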
The dual problem, expressed as a minimization problem (that is, minimizing $-g(y)$ over $y$), is \begin{equation*} \text{minimize} \quad f^*(-A^T y) + \langle y, b \rangle. \end{equation*}
Note that the dual function might not be differentiable! That is a key point. To guarantee that $f^*$ is differentiable, we essentially need $f$ to be strictly convex, which is often not the case. This often prevents us from simply solving the dual problem with gradient ascent (equivalently, gradient descent on the minimization form above).
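A standard example: if $f(x) = \|x\|_1$, its conjugate is the indicator function of the unit $\ell_\infty$-ball, \begin{equation*} f^*(z) = \begin{cases} 0 & \text{if } \|z\|_\infty \le 1, \\ +\infty & \text{otherwise,} \end{cases} \end{equation*} so the dual objective is not differentiable; it is not even finite everywhere.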
So what can we do if the dual function is not differentiable? How can we minimize a nondifferentiable function? One option is the proximal point method, which does not require the objective function to be differentiable.
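Concretely, applying the proximal point method to the minimization form of the dual problem produces the iteration \begin{equation*} y^{k+1} = \operatorname*{arg\,min}_y \left[ f^*(-A^T y) + \langle y, b \rangle + \frac{1}{2\rho} \|y - y^k\|_2^2 \right], \end{equation*} where $\rho > 0$ is a parameter (my notation). The quadratic term makes each subproblem strongly convex, so the iteration is well defined even though the objective is nonsmooth.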
If you work out the details, you will find that the augmented Lagrangian method (also known as the method of multipliers) is exactly what you get when you solve the dual problem with the proximal point method.
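For reference, here is what working out the details gives; this is the standard method-of-multipliers form, with $\rho$ matching the proximal step above. Define the augmented Lagrangian \begin{equation*} L_\rho(x,y) = f(x) + \langle y, Ax - b \rangle + \frac{\rho}{2} \|Ax - b\|_2^2. \end{equation*} Writing $-g(y)$ as a supremum over $x$, swapping the $\min$ over $y$ with the $\sup$ over $x$, and minimizing over $y$ in closed form, the proximal point iteration above becomes \begin{align*} x^{k+1} &\in \operatorname*{arg\,min}_x \; L_\rho(x, y^k), \\ y^{k+1} &= y^k + \rho \left( A x^{k+1} - b \right), \end{align*} which is precisely the augmented Lagrangian iteration.

To make this concrete, here is a minimal numerical sketch of that iteration on a toy problem, minimizing $\tfrac12 \|x\|_2^2$ subject to $Ax = b$; the problem data, function names, and parameter choices are all my own invention for illustration:

```python
import numpy as np

def augmented_lagrangian(A, b, rho=1.0, iters=100):
    """Method of multipliers for: minimize 0.5*||x||^2  subject to  Ax = b.

    For this particular f, the x-update has a closed form: setting the
    gradient of L_rho(x, y) to zero gives (I + rho*A^T A) x = A^T (rho*b - y).
    """
    m, n = A.shape
    x = np.zeros(n)
    y = np.zeros(m)                      # dual variable (multiplier)
    H = np.eye(n) + rho * A.T @ A        # Hessian of the x-subproblem
    for _ in range(iters):
        x = np.linalg.solve(H, A.T @ (rho * b - y))  # x^{k+1} = argmin_x L_rho(x, y^k)
        y = y + rho * (A @ x - b)                    # y^{k+1} = y^k + rho*(A x^{k+1} - b)
    return x, y

# Toy data: this computes the minimum-norm solution of Ax = b.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))
b = rng.standard_normal(3)
x, y = augmented_lagrangian(A, b)
print("constraint residual:", np.linalg.norm(A @ x - b))  # should be near 0
```

Note that the $x$-update only has a closed form here because of the simple quadratic objective; in general each iteration requires its own inner minimization.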
Vandenberghe's 236c notes are a good source for this material.