Is the matrix square root uniformly continuous?
The cheapest way is to write some representation of the square root in terms of functions whose continuity is obvious. Note that the square root is homogeneous of degree $1/2$, so it suffices to show that if $\|A-B\|\le 1$, then $\|A^{1/2}-B^{1/2}\|\le C$. Now consider the function $$ f(x)=\int_0^1\left[1-\frac1{1+tx}\right]t^{-3/2}\,dt\,. $$ Making the obvious change of variable $tx=s$, we get $$ f(x)=x^{1/2}\int_0^x\frac{s}{1+s}s^{-3/2}\,ds=Kx^{1/2}-x^{1/2}\int_x^\infty\frac{s}{1+s}s^{-3/2}\,ds=Kx^{1/2}+g(x)\,. $$ Note that $|g(x)|\le 2$ for all $x>0$. Thus, $$ \|KA^{1/2}-f(A)\|\le 2 $$ for an arbitrary positive definite self-adjoint $A$. Now it will suffice to show that $f$ is "operator Lipschitz", but that is obvious since $$ f(A)-f(B)=\int_0^1(1+tB)^{-1}(A-B)(1+tA)^{-1}t^{-1/2}\,dt $$ and $\|(1+tX)^{-1}\|\le 1$ for any positive definite self-adjoint $X$ and $t\ge 0$. (The resolvent identity $X^{-1}-Y^{-1}=X^{-1}(Y-X)Y^{-1}$ has been used here, of course.)
In fact, we can say much more: every $\alpha$-Hölder continuous function $F$ ($0<\alpha<1$) is operator Hölder continuous on the space of self-adjoint matrices. The Lipschitz case is more subtle, however, and is not fully resolved yet. I know neither how curious you are about all this stuff, nor how much you know already, so I'm stopping here.
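For the curious, here is a quick numerical sanity check of the $\frac12$-Hölder bound (illustration only, not a proof): for random positive semidefinite matrices, the ratio $\|A^{1/2}-B^{1/2}\|_{op}/\|A-B\|_{op}^{1/2}$ stays bounded by a modest constant.

```python
# Sanity check of 1/2-Hölder continuity of the matrix square root (illustration,
# not a proof): the ratio ||A^{1/2}-B^{1/2}||_op / ||A-B||_op^{1/2} stays bounded.
import numpy as np
from scipy.linalg import sqrtm

rng = np.random.default_rng(0)
ratios = []
for _ in range(1000):
    X = rng.standard_normal((5, 5))
    Y = rng.standard_normal((5, 5))
    A, B = X @ X.T, Y @ Y.T                         # positive semidefinite
    num = np.linalg.norm(sqrtm(A) - sqrtm(B), 2)    # operator (spectral) norm
    den = np.linalg.norm(A - B, 2) ** 0.5
    ratios.append(num / den)

print(max(ratios))   # bounded, consistent with 1/2-Hölder continuity
```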
$\newcommand{\id}{\operatorname{Id}}$
This is merely a more detailed version of fedja's answer:
Lemma 1:
Let $f$ be a real function defined on the positive reals. Assume $|f(x)| \le C$ for every $x >0$. Then $\|f(A)\|_{op} \le C$ for every $A \in \operatorname{Psym}_n$.
Proof:
First, we note that $f$ can be extended to the cone of symmetric positive definite matrices via the spectral calculus, since their eigenvalues are strictly positive. Because every such matrix is orthogonally diagonalizable and the operator norm is invariant under conjugation by orthogonal matrices, it is enough to prove the statement for diagonal positive definite matrices:
Let $A=\operatorname{diag}(\sigma_1,...,\sigma_n)$. Then:
$$ f(A)=\operatorname{diag}(f(\sigma_1),...,f(\sigma_n)),$$ thus
$$ \|f(A)\|_{op} = \max_i|f(\sigma_i)| \le C.$$
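As an illustration (the helper `apply_to_spectrum` and the choice $f(x)=\frac1{1+x}$ below are just examples, not part of the argument), Lemma 1 can be checked numerically:

```python
# A minimal sketch of Lemma 1 via the spectral calculus: f(A) is obtained by
# applying f to the eigenvalues of A in an orthonormal eigenbasis, so
# ||f(A)||_op = max_i |f(sigma_i)| <= sup_{x>0} |f(x)|.
import numpy as np

def apply_to_spectrum(f, A):
    """Apply a scalar function f to a symmetric positive definite matrix A."""
    sigma, Q = np.linalg.eigh(A)              # A = Q diag(sigma) Q^T
    return Q @ np.diag(f(sigma)) @ Q.T

rng = np.random.default_rng(1)
X = rng.standard_normal((4, 4))
A = X @ X.T + np.eye(4)                       # positive definite

f = lambda x: 1.0 / (1.0 + x)                 # example bounded by C = 1 on (0, inf)
print(np.linalg.norm(apply_to_spectrum(f, A), 2))   # <= 1, as Lemma 1 predicts
```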
Lemma 2:
It is enough to prove that $\|A-B\|_{op} \le 1 \Rightarrow \|A^{1/2}-B^{1/2}\|_{op}\le C$.
Proof:
Indeed, let $A,B \in \operatorname{Psym}_n$ with $A \neq B$ (the case $A=B$ is trivial). Define $\lambda=\|A-B\|_{op}>0$, and let $\tilde A = \frac{1}{\lambda}A$, $\tilde B= \frac{1}{\lambda}B$. Note that $\sqrt{\tilde A}=\frac{1}{\sqrt \lambda} \sqrt{A}$, $\sqrt{\tilde B}=\frac{1}{\sqrt \lambda} \sqrt{B}$, and that $\|\tilde A- \tilde B\|_{op}=1$. Thus, by our assumption,
$$ \frac{1}{\sqrt{\lambda}}\|A^{1/2}-B^{1/2}\| = \|\tilde A^{1/2}-\tilde B^{1/2}\|\le C$$
Thus, $$ \|A^{1/2}-B^{1/2}\| \le C \|A-B\|^{\frac{1}{2}}$$
So, the matrix square root is $\frac{1}{2}$-Hölder on $\operatorname{Psym}_n$ and, in particular, uniformly continuous.
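A small numerical illustration of the rescaling step in Lemma 2 (sanity check only): with $\lambda=\|A-B\|_{op}$, the normalized matrices are at distance exactly $1$, and taking square roots commutes with the scaling.

```python
# Illustration of the rescaling in Lemma 2: with lam = ||A - B||_op, the
# normalized matrices A/lam, B/lam are at distance exactly 1, and the square
# root is homogeneous of degree 1/2, i.e. sqrt(A/lam) = sqrt(A)/sqrt(lam).
import numpy as np
from scipy.linalg import sqrtm

rng = np.random.default_rng(2)
X, Y = rng.standard_normal((4, 4)), rng.standard_normal((4, 4))
A, B = X @ X.T + np.eye(4), Y @ Y.T + np.eye(4)

lam = np.linalg.norm(A - B, 2)
print(np.linalg.norm(A / lam - B / lam, 2))                  # 1.0
print(np.allclose(sqrtm(A / lam), sqrtm(A) / np.sqrt(lam)))  # True
```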
We now carry out fedja's computation in detail. Consider $$ f(x)=\int_0^1\left[1-\frac1{1+tx}\right]t^{-3/2}\,dt\,. $$ Making the obvious change of variable $tx=s$, we get $$ f(x)=x^{1/2}\int_0^x\frac{s}{1+s}s^{-3/2}\,ds=x^{1/2}\left(\int_0^\infty\frac{s}{1+s}s^{-3/2}\,ds-\int_x^\infty\frac{s}{1+s}s^{-3/2}\,ds\right)=Kx^{1/2}+g(x)\,, $$
where $$ K= \int_0^\infty\frac{s}{1+s}s^{-3/2}\,ds=\int_0^1\frac{s}{1+s}s^{-3/2}\,ds+\int_1^\infty\frac{s}{1+s}s^{-3/2}\,ds\,. $$
The first integral is finite because its integrand equals $\frac{s^{-1/2}}{1+s}\le s^{-1/2}$, which is integrable near $0$, and the second is not greater than $\int_1^\infty s^{-3/2}\,ds =2< \infty$. Thus, $K < \infty$.
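As a purely illustrative side check, quadrature confirms the finiteness of $K$; in fact the substitution $s=u^2$ turns the integral into $\int_0^\infty \frac{2\,du}{1+u^2}$, so $K=\pi$, although only $K<\infty$ is needed below.

```python
# Numerical evaluation of K (illustration only), split at s = 1 as in the text.
import numpy as np
from scipy.integrate import quad

integrand = lambda s: s ** (-0.5) / (1.0 + s)
K1, _ = quad(integrand, 0, 1)        # integrable s^{-1/2} singularity at 0
K2, _ = quad(integrand, 1, np.inf)
print(K1 + K2, np.pi)                # both approximately 3.14159...
```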
Since $g(x)=-x^\frac{1}{2}\int_x^\infty\frac{s}{1+s}s^{-3/2}\,ds$ and $\frac{s}{1+s}\le 1$, $$|g(x)|\le x^\frac{1}{2}\int_x^\infty s^{-3/2}\,ds =2$$ for all $x>0$. Thus, by Lemma 1, $$ (**) \quad \|KA^{1/2}-f(A)\|_{op}=\|g(A)\|_{op}\le 2 $$ for an arbitrary positive definite self-adjoint $A$. Now it will suffice to show that $f$ is "operator Lipschitz", i.e. $\|f(A)-f(B)\|_{op} \le \tilde C\|A-B\|_{op}$.
Indeed, this would imply
$$ \|A^{\frac{1}{2}}-B^{\frac{1}{2}}\|_{op}\le \|A^{\frac{1}{2}}-\frac{1}{K}f(A)\|_{op} + \|\frac{1}{K}f(A)-\frac{1}{K}f(B)\|_{op} + \|\frac{1}{K}f(B)-B^{\frac{1}{2}}\|_{op} $$ $$ \le \frac{4}{K}+\frac{\tilde C}{K} \|A-B\|_{op}, $$ where $(**)$ has been applied twice, together with the Lipschitz estimate.
The last inequality holds for any $A,B \in \operatorname{Psym}_n$. Assuming $\|A-B\|_{op} \le 1$, it becomes:
$$ \|A^{\frac{1}{2}}-B^{\frac{1}{2}}\|_{op}\le \frac{4}{K}+\frac{\tilde C}{K}=:C. $$
This finishes the proof, by Lemma 2.
It remains to prove that $f$ is operator Lipschitz:
First, note that the integral defining $f$ commutes with the matrix functional calculus (both expressions act diagonally in an orthonormal eigenbasis of the argument). Thus,
$$ f(A)=\int_0^1 \left[\id-(\id+tA)^{-1}\right]t^{-3/2}\,dt,$$ so $$ f(A)-f(B)=\int_0^1 \left[(\id+tB)^{-1}-(\id+tA)^{-1}\right]t^{-3/2}\,dt=\int_0^1(\id+tB)^{-1}(A-B)(\id+tA)^{-1}t^{-1/2}\,dt $$
(where in the last step we have used the resolvent identity $X^{-1}-Y^{-1}=X^{-1}(Y-X)Y^{-1}$ with $X=\id+tB$ and $Y=\id+tA$).
Finally, we get
$$ \|f(A)-f(B)\|_{op} =\| \int_0^1(\id+tB)^{-1}(A-B)(\id+tA)^{-1}t^{-1/2}\,dt \|_{op} $$ $$\le \int_0^1 \|(\id+tB)^{-1}(A-B)(\id+tA)^{-1}t^{-1/2}\|_{op}\,dt$$ $$ \le \int_0^1 \|(\id+tB)^{-1}\|_{op}\|A-B\|_{op}\|(\id+tA)^{-1}\|_{op}t^{-1/2}\,dt \le \|A-B\|_{op} \int_0^1 t^{-1/2} \, dt =2\|A-B\|_{op} $$
(since $\|(\id+tX)^{-1}\|_{op}\le 1$ for any positive definite self-adjoint $X$ and $t\ge 0$; this follows from Lemma 1 applied to $x\mapsto\frac{1}{1+tx}$, or directly by checking diagonal matrices and then passing to all positive definite matrices by orthogonal diagonalization).
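To close, here is a hedged numerical check of the estimate $\|f(A)-f(B)\|_{op}\le 2\|A-B\|_{op}$ (illustration only; the helper `f_matrix` evaluates $f$ on the spectrum, which agrees with the matrix-valued integral above):

```python
# Sanity check of ||f(A)-f(B)||_op <= 2 ||A-B||_op on random positive definite
# matrices; f is applied to the eigenvalues, matching the integral formula above.
import numpy as np
from scipy.integrate import quad

def f_scalar(x):
    # f(x) = int_0^1 [1 - 1/(1+t*x)] t^{-3/2} dt (integrable t^{-1/2} behaviour at 0)
    val, _ = quad(lambda t: (1.0 - 1.0 / (1.0 + t * x)) * t ** (-1.5), 0, 1)
    return val

def f_matrix(A):
    sigma, Q = np.linalg.eigh(A)
    return Q @ np.diag([f_scalar(s) for s in sigma]) @ Q.T

rng = np.random.default_rng(3)
X, Y = rng.standard_normal((4, 4)), rng.standard_normal((4, 4))
A, B = X @ X.T + np.eye(4), Y @ Y.T + np.eye(4)

ratio = np.linalg.norm(f_matrix(A) - f_matrix(B), 2) / np.linalg.norm(A - B, 2)
print(ratio)   # <= 2, consistent with the estimate above
```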