Are there continuous functions that are the same on an interval but differ in at least one other point?
Define the real functions $f$ and $g$ thus: $$ f(x) = \begin{cases} \exp\Big(-\frac{1}{(x - 1)^2}\Big)\ &\text{if } x > 1 \\ 0\ &\text{if } x \in [-1, 1] \\ \exp\Big(-\frac{1}{(x + 1)^2}\Big)\ &\text{if } x < -1 \end{cases} $$ and $$ g(x) = 0\quad \text{for all } x \in \mathbb{R} $$
By construction, $f$ and $g$ are both $0$ on $[-1, 1]$ but they differ in value everywhere else.
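If you want a quick numerical sanity check of that claim, here is a small Python sketch (the names `f` and `g` simply mirror the definitions above):

```python
import math

def f(x):
    # piecewise definition from above
    if x > 1:
        return math.exp(-1.0 / (x - 1) ** 2)
    elif x < -1:
        return math.exp(-1.0 / (x + 1) ** 2)
    else:  # x in [-1, 1]
        return 0.0

def g(x):
    return 0.0  # identically zero

# agree everywhere on [-1, 1] ...
assert all(f(x) == g(x) for x in [-1.0, -0.5, 0.0, 0.5, 1.0])
# ... but disagree at sample points outside it
assert all(f(x) != g(x) for x in [-3.0, -1.5, 1.5, 3.0])
```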
Obviously $g$ is infinitely differentiable, as it is a constant function.
You can also check that $f$ is infinitely differentiable at $x = -1$ and $x = 1$ by computing the appropriate right- and left-hand limits of $\frac{d^nf(x)}{dx^n}$ ($n \in \mathbb{Z}_+$) inductively. Let us compute, for example, $\lim\limits_{x \to 1^+}\frac{df(x)}{dx}$: for $x > 1$, \begin{align*} 0 < \frac{df(x)}{dx} &= \frac{2(x - 1)^{-3}}{\exp\big(\frac{1}{(x - 1)^2}\big)} \\ &= \frac{2y^3}{\exp\big(y^2\big)} \quad\text{ where }y = \frac{1}{x - 1} > 0 \\ &= \frac{2}{\frac{1}{y^3}\sum\limits_{k = 0}^\infty \frac{y^{2k}}{k!}} \quad\text{ divide numerator and denominator by }y^3\text{ and use }e^z = \sum\limits_{k = 0}^\infty\frac{z^k}{k!} \\ &= \frac{2}{\frac{1}{y^3}\frac{y^0}{0!} + \frac{1}{y^3}\frac{y^2}{1!} + \frac{1}{y^3}\frac{y^4}{2!} + \sum\limits_{k = 3}^\infty \frac{y^{2k - 3}}{k!}} \\ &< \frac{2}{\frac{1}{y^3} + \frac{1}{y} + \frac{1}{2}y} \end{align*} As $x \to 1^+$, we get $y = \frac{1}{x - 1} \to \infty$, so the right-hand fraction $\frac{2}{\frac{1}{y^3} + \frac{1}{y} + \frac{1}{2}y} \to 0$. Hence, by the Squeeze Theorem, $\frac{df(x)}{dx} \to 0$ as $x \to 1^+$.
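For a purely numerical illustration of that limit (not a substitute for the squeeze argument), one can plug values of $x$ slightly above $1$ into the closed form $\frac{df(x)}{dx} = 2(x-1)^{-3}e^{-1/(x-1)^2}$:

```python
import math

def df(x):
    # closed form of f'(x) for x > 1: 2 (x - 1)^(-3) * exp(-1/(x - 1)^2)
    return 2.0 * (x - 1) ** -3 * math.exp(-1.0 / (x - 1) ** 2)

for x in [1.5, 1.1, 1.01, 1.001]:
    print(x, df(x))
# the printed values shrink rapidly towards 0 as x approaches 1 from the right
```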
That was a long calculation, but take my word for it: it can be repeated inductively to show that $\lim\limits_{x \to 1^+}\frac{d^nf(x)}{dx^n} = 0$ for all $n \in \mathbb{Z}_+$. At all other points, i.e. on $(-\infty, -1) \cup (-1, 1) \cup (1, \infty)$, $f$ is infinitely differentiable because exponentials and constant functions are infinitely differentiable.
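In outline, the induction can be organized as follows (details omitted): for $x > 1$, every derivative has the form $$ f^{(n)}(x) = P_n\!\Big(\frac{1}{x - 1}\Big)\exp\Big(-\frac{1}{(x - 1)^2}\Big) $$ for some polynomial $P_n$ (here $P_1(y) = 2y^3$), since differentiating such an expression produces another expression of the same form with $P_{n+1}(y) = 2y^3 P_n(y) - y^2 P_n'(y)$. The same series estimate as above shows that $y^m e^{-y^2} \to 0$ as $y \to \infty$ for every fixed $m$, so each $f^{(n)}(x) \to 0$ as $x \to 1^+$.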
Bonus Fact:
Both $\frac{d^n f(x)}{dx^n}$ and $\frac{d^n g(x)}{dx^n}$ also have the same value $0$ on $[-1, 1]$ for all positive integers $n$!
Sure. In fact, there is a whole class of functions specifically made to do something that effectively implies exactly what you are looking for: they are called bump functions. A bump function is a smooth function (differentiable everywhere, arbitrarily many times) with compact support, meaning (almost) that it is zero everywhere except on a compact set; with the real numbers as domain, a compact set basically means a closed, bounded (i.e. contained within an interval) subset, such as a closed interval. This compact set outside of which the function is zero is called the "support". The trick is to exploit the "everywhere else zero"-ness, as that gives you what you're after.
Namely, any two different (i.e. not equal) bump functions with the same supporting interval $[a, b]$ will be smooth, zero on any interval outside $[a, b]$, and yet different, because they differ somewhere inside $[a, b]$. More generally, given any two different bump functions whatsoever, you just have to find an interval lying outside both of their support sets, which is always possible because both supports are bounded.
A simple example of such a bump function is
$$\mathrm{bump}: \mathbb{R} \rightarrow \mathbb{R},\ \ \ \ \mathrm{bump}(x) := \begin{cases} e^{-\frac{1}{1 - x^2}},\ \mbox{when $x$ is in $(-1, 1)$}\\ 0,\ \mbox{otherwise} \end{cases}$$
Then consider just $\mathrm{bump}(x)$ and a scaled copy of it, say $\mathrm{bump2}(x) := 2 \cdot \mathrm{bump}(x)$. We now have $\mathrm{bump}(x) = \mathrm{bump2}(x)$ when, say, $x \in [10, 11]$, since they are both zero there. Yet they are clearly not equal when $x$ is in $(-1, 1)$.
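Here is a small Python sketch of that comparison, with `bump` and `bump2` named as in the text:

```python
import math

def bump(x):
    # e^(-1/(1 - x^2)) on (-1, 1), and 0 everywhere else
    if -1 < x < 1:
        return math.exp(-1.0 / (1 - x ** 2))
    return 0.0

def bump2(x):
    return 2.0 * bump(x)

# equal on [10, 11], where both are identically zero ...
assert all(bump(x) == bump2(x) for x in [10.0, 10.5, 11.0])
# ... yet different inside (-1, 1)
assert bump(0.0) != bump2(0.0)
```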
ADD: It has been asked how one can do this without explicitly constructing the bump function. The above is just to show (not completely rigorously) that bump functions exist; indeed, we can also argue abstractly. Now let $\mathrm{bump}(x)$ be an arbitrary bump function, and let its support set be $\mathrm{supp}[\mathrm{bump}]$. That is,
$$\mathrm{supp}[\mathrm{bump}] := \mathrm{cl}\left(\{ x \in \mathbb{R} : \mathrm{bump}(x) \ne 0 \}\right)$$
(n.b. "cl" means to take the closure; basically this includes all "endpoints" of regions in which it is nonzero, even if it is zero at those endpoints - e.g. the support of the just-given-explicitly bump function is $[-1, 1]$, not $(-1, 1)$. This is a bit of technicality that was wrapped earlier when I said "almost" in "meaning (almost)" above.)
Since the support set is bounded and closed, it has a maximum and minimum (largest and smallest element): assign $M := \mathrm{max}\ \mathrm{supp}[\mathrm{bump}]$. Now consider the interval $\mathrm{ext\ ival} := [M+1, M+2]$. If $x \in \mathrm{ext\ ival}$, then it is clearly not in the support set, but rather to the right of it. Thus $\mathrm{bump}(x) = 0$ there. Now set $\mathrm{bump2}(x) := 2 \cdot \mathrm{bump}(x)$ as before (if you want even more generality, just replace $2$ with an arbitrary vertical rescaling coefficient $a$ that is not $0$ or $1$). Congrats, you now have two bump functions that are unequal but equal on the external interval $\mathrm{ext\ ival}$.
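And here is a minimal Python sketch of that recipe, instantiated with the explicit bump function from above (so $M = 1$; the names `ext_ival` and `a` follow the text):

```python
import math

def bump(x):
    # the explicit example above; its support is [-1, 1], so M = 1
    if -1 < x < 1:
        return math.exp(-1.0 / (1 - x ** 2))
    return 0.0

M = 1.0                          # max of the support of this particular bump
ext_ival = (M + 1.0, M + 2.0)    # the external interval [M + 1, M + 2]
a = 2.0                          # any rescaling coefficient other than 0 or 1

def bump2(x):
    return a * bump(x)

lo, hi = ext_ival
# equal on ext_ival, since both vanish to the right of the support ...
assert all(bump(x) == bump2(x) for x in [lo, (lo + hi) / 2, hi])
# ... but unequal as functions, since they differ inside the support
assert bump(0.0) != bump2(0.0)
```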