What is the purpose of $\frac{1}{\sigma \sqrt{2 \pi}}$ in $\frac{1}{\sigma \sqrt{2 \pi}}e^{-\frac{(x - \mu)^2}{2\sigma^2}}$?
If you consider every possible outcome of some event, you should expect the total probability to be $1$, not $\sigma\sqrt{2\pi}$, which is the area under the unnormalised curve $e^{-(x-\mu)^2/(2\sigma^2)}$. The constant rescales the distribution so that it conforms with the usual convention that the probabilities of all outcomes sum to one.
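As a quick numerical sketch of this claim (the values $\mu=1.5$ and $\sigma=2$ are arbitrary illustrations, not taken from the question), one can integrate the curve with and without the constant:

```python
# Sketch: without the 1/(sigma*sqrt(2*pi)) factor the curve has area
# sigma*sqrt(2*pi); with the factor the area is 1.
import numpy as np
from scipy.integrate import quad

mu, sigma = 1.5, 2.0  # arbitrary example parameters

def kernel(x):
    """Unnormalised Gaussian kernel exp(-(x - mu)^2 / (2 sigma^2))."""
    return np.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

area_kernel, _ = quad(kernel, -np.inf, np.inf)
print(area_kernel)                  # ~ 5.0133
print(sigma * np.sqrt(2 * np.pi))   # ~ 5.0133, i.e. sigma * sqrt(2*pi)

def pdf(x):
    """Full normal p.d.f.: the kernel divided by sigma*sqrt(2*pi)."""
    return kernel(x) / (sigma * np.sqrt(2 * np.pi))

area_pdf, _ = quad(pdf, -np.inf, np.inf)
print(area_pdf)                     # ~ 1.0
```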
It is doing that, but observe that you are also stretching the curve in the horizontal direction by the same factor (inside the exponential). Say $\sigma>1$: the prefactor $1/\sigma$ decreases the area by a factor of $\sigma$, but replacing $x$ by $x/\sigma$ increases it by the same factor, so the two effects cancel (the shift by $\mu$ does not change the area, of course).
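To spell out the substitution behind this comment (a small worked step added for clarity): with $u=(x-\mu)/\sigma$ we have $dx=\sigma\,du$, so $$\int_{-\infty}^{\infty}e^{-\frac{(x-\mu)^2}{2\sigma^2}}\,dx=\sigma\int_{-\infty}^{\infty}e^{-\frac{u^2}{2}}\,du=\sigma\sqrt{2\pi},$$ and dividing by $\sigma\sqrt{2\pi}$ therefore gives total area $1$ for every choice of $\mu$ and $\sigma$.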
As you have correctly stated, the p.d.f. of the normal distribution is given by $$f(x\mid\mu,\sigma^2)=\frac1{\sigma\sqrt{2\pi}}\exp\left(-\frac12\left(\frac{x-\mu}\sigma\right)^2\right)$$ where the parameter space is $\mathit\Theta=\{(\mu,\sigma^2)\in\Bbb R^2:\sigma^2>0\}$. This is essentially saying that the mean can be any value on the real line, and the variance any value on the positive real line.
Now consider the simple case where $\mu=0$ and $\sigma^2=1$. Then the standard normal distribution has p.d.f. $$f(x)=\frac1{\sqrt{2\pi}}\exp\left(-\frac12x^2\right).$$ If we integrate this over the interval $(-\infty,\infty)$, we get $1$. This is always the case by definition: for any p.d.f. $f$ with support $\mathit X$ (in this instance $\mathit X=\Bbb R$), $$\int_{\mathit X}f(x)\,dx=1.$$ That is, the probabilities of $x$ lying in the various regions of $\mathit X$ sum to $1$. In fact, the constant that makes this happen is so important in statistics (especially Bayesian statistics) that it is given a name: the normalising constant.
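For completeness, here is the standard computation (not spelled out above) showing where the value $\sqrt{2\pi}$ itself comes from; one squares the Gaussian integral and switches to polar coordinates: $$\left(\int_{-\infty}^{\infty}e^{-x^2/2}\,dx\right)^2=\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}e^{-(x^2+y^2)/2}\,dx\,dy=\int_0^{2\pi}\!\int_0^{\infty}e^{-r^2/2}\,r\,dr\,d\theta=2\pi,$$ so $\int_{-\infty}^{\infty}e^{-x^2/2}\,dx=\sqrt{2\pi}$, which is exactly the factor the normalising constant has to cancel.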
A further example is the Beta distribution, with p.d.f. $$f(x\mid\alpha,\beta)=\frac{x^{\alpha-1}(1-x)^{\beta-1}}{\text B(\alpha,\beta)},\qquad 0<x<1,$$ where $1/\text B(\alpha,\beta)$ is the normalising constant.
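The same kind of check works for the Beta example (again only an illustrative sketch; the shape parameters $\alpha=2$, $\beta=5$ are arbitrary, and SciPy's `beta` function supplies $\text B(\alpha,\beta)$):

```python
# Sketch: integrating x^(a-1) * (1-x)^(b-1) over (0, 1) gives B(a, b),
# so dividing by B(a, b) yields total probability 1.
from scipy.integrate import quad
from scipy.special import beta as beta_fn

a, b = 2.0, 5.0  # arbitrary example shape parameters

def unnormalised(x):
    """Beta kernel x^(a-1) * (1-x)^(b-1), without the normalising constant."""
    return x ** (a - 1) * (1 - x) ** (b - 1)

area, _ = quad(unnormalised, 0, 1)
print(area)                 # ~ 0.0333... = 1/30
print(beta_fn(a, b))        # ~ 0.0333..., i.e. B(2, 5)

def pdf(x):
    """Beta(a, b) p.d.f.: the kernel divided by B(a, b)."""
    return unnormalised(x) / beta_fn(a, b)

print(quad(pdf, 0, 1)[0])   # ~ 1.0
```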