Is $e^x=\exp(x)$ and why?
If one first defines $$e:=\lim\limits_{n\to\infty} \left(1+\frac 1n\right)^n$$ and then defines a real-valued function $$\exp:\mathbb R\rightarrow\mathbb R,~x\mapsto\exp(x):=e^x,$$ then $e^x=\exp(x)$ holds by definition, but one still has to show $\displaystyle e^x=\sum\limits_{k=0}^\infty \frac{x^k}{k!}$. This involves finding the derivative of $\exp$ without the (easy) power series approach, and one also needs to say beforehand what $e^x$ means for $x\in\mathbb R\setminus\mathbb Q$.
If one instead keeps $$ e:=\lim\limits_{n\to\infty} \left(1+\frac 1n\right)^n$$ but defines $$\displaystyle\exp(x):=\sum\limits_{k=0}^\infty \frac{x^k}{k!}$$ via its power series, then $e^x=\exp(x)$ does not hold by definition and needs to be proved. So either way there is a result that has to be shown.
Of course there is no right or wrong when it comes to definitions, and there is more than one way to define $e=2.7182\dots$; another approach uses the nested intervals $\left[\left(1+\frac{1}{n}\right)^n,\left(1+\frac{1}{n}\right)^{n+1}\right]_{n\in\mathbb N}$, whose intersection is $\{e\}$. Likewise, $\exp$ can also be defined as the unique solution of $y'=y$ with $y(0)=1$, among other possibilities. The trouble with different definitions of the same object is that one has to prove they are equivalent.
For this answer I assume that one defines $e$ as the limit of the sequence $\displaystyle a_n=\left(1+\frac 1n\right)^n$ and $\displaystyle\exp(x)=\sum\limits_{k=0}^\infty \frac{x^k}{k!}$. I choose this route because, for one thing, that's how I learnt it, but I also think it has advantages over defining $\exp(x)=e^x$: we can directly apply the theory of power series to obtain properties of the exponential function (monotonicity, derivative, etc.). In the following proof I will use several properties of $\exp$, such as $\displaystyle\exp(-x)=\frac{1}{\exp(x)}$, without proving them.
We have $$e:=\lim\limits_{n\to\infty} \left(1+\frac{1}{n}\right)^n$$ and further define $$\exp:\mathbb R\rightarrow\mathbb R,~x\mapsto \exp(x):=\sum\limits_{k=0}^\infty \frac{x^k}{k!}.$$
$\exp(x)$ is well-defined because the series $\displaystyle\sum\limits_{k=0}^\infty\frac{x^k}{k!}$ is absolutely convergent for every $x\in\mathbb R$ (this can be shown via the ratio test). We want to show: $$\lim\limits_{x\to\infty} \left(1+\frac{a}{x}\right)^x=e^a=\exp(a)$$ for all $a\in\mathbb R$.
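(For completeness, the ratio test computation behind the well-definedness claim: for fixed $x\neq 0$ the ratio of consecutive terms is $$\left|\frac{x^{k+1}/(k+1)!}{x^{k}/k!}\right|=\frac{|x|}{k+1}\xrightarrow[k\to\infty]{}0<1,$$ so the series converges absolutely for every $x\in\mathbb R$.)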
First we take a look at $\exp(n)$ with $n\in\mathbb Z$ and show:
$$\exp(n)=e^n.$$
Using the binomial theorem we first have $$\left(1+\frac 1n\right)^n=\sum\limits_{k=0}^{n}\binom{n}{k}\left(\frac{1}{n}\right)^k=\sum\limits_{k=0}^n \frac{1}{k!}\underbrace{\frac{n\cdot (n-1)\cdot \dots \cdot(n-k+1)}{n\cdot n\cdot \dots \cdot n}}_{\leq 1}\leq\sum\limits_{k=0}^n\frac{1}{k!}\leq\exp(1)$$ and thus we have $e=\lim\limits_{n\to\infty}\left(1+\frac{1}{n}\right)^n\leq\exp(1)$.
Now for $n>m$ we get $$\left(1+\frac{1}{n}\right)^n=\sum\limits_{k=0}^n\binom{n}{k}\frac{1}{n^k}>\sum\limits_{k=0}^m\binom{n}{k}\frac{1}{n^k}=\sum\limits_{k=0}^m\frac{1}{k!}\cdot 1 \cdot \left(1-\frac{1}{n}\right)\cdot\dots\cdot\left(1-\frac{k-1}{n}\right).$$
On the right-hand side we have a fixed number of $m+1$ terms, each a product of at most $m+1$ factors, so we can let $n\to\infty$ term by term and get: $$e\geq \sum\limits_{k=0}^m\frac{1}{k!}$$ and therefore $$e\geq\lim\limits_{m\to\infty}\sum\limits_{k=0}^m\frac{1}{k!}=\exp(1).$$ As we have $e\leq\exp(1)$ and $e\geq\exp(1)$ we conclude: $e=\exp(1)$.
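As a quick numerical sanity check (an illustration only, not part of the proof), one can compare the partial sums $\sum_{k=0}^m\frac{1}{k!}$ with the sequence $\left(1+\frac{1}{n}\right)^n$; both approach the same value $e=\exp(1)$, the series much faster than the sequence:

```python
import math

# Partial sums of the series sum_{k=0}^m 1/k! ...
for m in [5, 10, 15]:
    s = sum(1 / math.factorial(k) for k in range(m + 1))
    print(f"sum_(k<={m:2d}) 1/k!      = {s:.12f}")

# ... versus the sequence (1 + 1/n)^n
for n in [10**2, 10**4, 10**6]:
    print(f"(1 + 1/n)^n, n = {n:<7d} = {(1 + 1/n) ** n:.12f}")

print(f"math.e (reference)       = {math.e:.12f}")
```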
Via induction we now get $e^n=\exp(n)$ for all $n\in\mathbb N$; in the inductive step we use $$\exp(n+1)=\exp(n)\cdot\exp(1)=e^n\cdot e^1=e^{n+1}.$$ For $n\in\mathbb Z,~n<0$ we then use $$\exp(n)=\left(\exp(-n)\right)^{-1}=\left(e^{-n}\right)^{-1}=e^n.$$ This proves our first statement. $\square$
I included this statement and its proof as it needs only the functional equation $\exp(x+y)=\exp(x)\exp(y)$; using more properties of $\exp$ would of course make this much easier.
For all $x\in\mathbb R$ we have: $\displaystyle e^x=\exp(x)$.
For this we use that $\exp:\mathbb R\rightarrow (0,\infty)$ is bijective with $\ln$ as its inverse function. We further use the definition of an arbitrary exponential function: for $a>0$ the function $$\exp_a:\mathbb R\rightarrow (0,\infty),~x\mapsto a^x:=\exp(x\cdot\ln(a))$$ is well-defined.
Thus we get: $$e^x=\exp(x\cdot\ln(e))=\exp(x).$$ This proves our statement. $\square$
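As a small numerical illustration of the definition $a^x:=\exp(x\cdot\ln(a))$ (using Python's built-in `math.exp` and `math.log` as stand-ins for $\exp$ and $\ln$), the result agrees with the power operator up to floating point rounding:

```python
import math

# a**x versus exp(x * ln(a)) for a few sample values with a > 0.
for a, x in [(2.0, 0.5), (10.0, -1.7), (math.e, 3.25)]:
    print(f"a={a:.5f}, x={x:+.2f}:  a**x = {a ** x:.12f},  exp(x*ln a) = {math.exp(x * math.log(a)):.12f}")
```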
We have now shown that $\exp(x)=e^x$, but this only gives us $$\exp(x)=e^x=\left(\lim\limits_{n\to\infty}\left(1+\frac{1}{n}\right)^n\right)^x,$$ so we still have to prove that $$\lim\limits_{x\to\infty} \left(1+\frac{a}{x}\right)^x=e^a,~a\in\mathbb R.$$ For $a\in\mathbb R$ we define a function $F_a$ via $$F_a: D\rightarrow \mathbb R,~F_a(x)=x\ln\left(1+\frac{a}{x}\right)=\ln\left(\left(1+\frac{a}{x}\right)^x\right).$$ As $\frac{a}{x}\rightarrow 0$ for $x\to\infty$ we can choose $D=(\alpha,\infty)\subseteq (0,\infty)$ with $\alpha>\max(0,-a)$, so that $1+\frac{a}{x}>0$ on $D$ and $F_a$ is well-defined. Thus we get: $$\left(1+\frac{a}{x}\right)^{x}=e^{F_a(x)}.$$
We write $\displaystyle F_a(x)=\frac{\ln\left(1+\frac{a}{x}\right)}{\frac{1}{x}}$ and with $\lim\limits_{x\to\infty}\ln\left(1+\frac{a}{x}\right)=\ln(1)=0=\lim\limits_{x\to\infty}\frac{1}{x}$ and $\displaystyle\frac{d}{dx}\frac 1x=-\frac{1}{x^2}\neq 0$ for $x\in D$ we can apply L'Hospital's rule: $$\lim\limits_{x\to\infty} F_a(x)=\lim\limits_{x\to\infty}\frac{-\frac{a}{x^2}\cdot\frac{1}{1+\frac{a}{x}}}{-\frac{1}{x^2}}=\lim\limits_{x\to\infty} \frac{a}{1+\frac{a}{x}}=a.$$ As $\exp$ is continuous we finally get: $$\lim\limits_{x\to\infty}\left(1+\frac{a}{x}\right)^x=\exp\left(\lim\limits_{x\to\infty}F_a(x)\right)=e^a. \blacksquare$$
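Again purely as a numerical illustration, the convergence $\left(1+\frac{a}{x}\right)^x\to e^a$ can be observed directly:

```python
import math

# (1 + a/x)^x for growing x, compared with exp(a).
for a in [-1.0, 0.5, 2.0]:
    for x in [1e1, 1e3, 1e6]:
        print(f"a={a:+.1f}, x={x:>9.0f}: (1 + a/x)^x = {(1 + a / x) ** x:.8f}")
    print(f"a={a:+.1f},  exp(a)           = {math.exp(a):.8f}")
```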
A few more words on the definition of $\exp$ as the unique solution of $y'=y,~y(0)=1$:
Let $I\subseteq\mathbb R$ be an interval, let $\alpha\in\mathbb R$ and let $f:I\rightarrow\mathbb R$. Then the following holds:
$f$ is differentiable with $f'=\alpha f$ if and only if there exists $c\in\mathbb R$ with $f(x)=ce^{\alpha x}$ for all $x\in I$.
$"\Leftarrow"$ If $f(x)=ce^{\alpha x}$ we obviously have $f'(x)=\alpha ce^{\alpha x}=\alpha f(x)$.
$"\Rightarrow"$ Let $g:I\rightarrow\mathbb R,~x\mapsto e^{-\alpha x}f(x)$, then $g$ is differentiable with $$g'(x)=-\alpha e^{-\alpha x}f(x)+e^{-\alpha x} f'(x)=0,$$ thus there exists $c\in\mathbb R$ with $g(x)=c=e^{-\alpha x}f(x) \Leftrightarrow f(x)=ce^{\alpha x}$.
Now for $\alpha=1$ and the initial condition $y(0)=1$ we get $c=1$ from $f(0)=c\cdot e^0=c$; thus we have shown that this definition of $\exp$ is equivalent as well.
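To see this characterization "in action", here is a crude numerical sketch (an illustration only, assuming a simple explicit Euler scheme is accurate enough for small step sizes): integrating $y'=y$, $y(0)=1$ numerically reproduces $\exp$, and in fact Euler's method with $N$ steps on $[0,x]$ gives exactly $\left(1+\frac{x}{N}\right)^N$, the very limit this whole discussion started from.

```python
import math

def euler_exp(x, steps=1_000_000):
    """Explicit Euler scheme for y' = y, y(0) = 1 on [0, x]."""
    h = x / steps
    y = 1.0
    for _ in range(steps):
        y += h * y          # one Euler step: y_{n+1} = y_n * (1 + h)
    return y                # note: this equals (1 + x/steps)**steps

for x in [0.5, 1.0, 2.0]:
    print(f"x={x}: Euler = {euler_exp(x):.8f}, exp(x) = {math.exp(x):.8f}")
```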
Usually one defines $\exp x=\displaystyle \sum_{n=0}^\infty \dfrac{x^n}{n!}$.
It is easy to show that this series converges for all $x$ (even for complex values) and that it satisfies the differential equation with initial condition: $$y'=y,\quad y(0)=1.$$ Now this differential equation implies the functional equation: $$\exp(x+y)=\exp x\cdot\exp y,$$ which in turn implies that for all $n\in\mathbf N$, $\;\exp(nx)=(\exp x)^n$, whence for any rational $\dfrac pq$, $\;\exp\Bigl(\dfrac pqx\Bigr)=\bigl(\exp x\bigr)^{\tfrac pq}$.
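A short sketch of why the differential equation forces the functional equation: for fixed $y$ the function $h(x)=\exp(x+y)\exp(-x)$ satisfies, by the product and chain rules together with $\exp'=\exp$, $$h'(x)=\exp(x+y)\exp(-x)-\exp(x+y)\exp(-x)=0,$$ so $h$ is constant with value $h(0)=\exp(y)$. Since $\exp(x)\exp(-x)$ is constant as well, with value $\exp(0)^2=1$, multiplying by $\exp x$ yields $\exp(x+y)=\exp x\cdot\exp y$.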
Now set $\mathrm e=\exp 1=\displaystyle\sum_{n=0}^{\infty}\frac1{n!}$. What precedes proves that for any rational $r$, we have: $$\exp r = \mathrm e^r,$$ and we can define $\mathrm e^x$, for $x\in\mathbf C\smallsetminus \mathbf Q$, as:
$$\mathrm e^x\overset{\text{def}}{=} \exp x\quad\text {if}\enspace x\in\mathbf C\smallsetminus \mathbf Q.$$