Is $1 = [1] = [[1]] = [[[1]]] = \cdots$?
The question has evolved since my initial answer, so now I offer three:
Answer #1, to the question "Is there a difference between $3$ and $[3]$?":
The answer is yes.
$3$ is the number three.
$[3]$ is a nice wooden box, lined with velvet, holding the number three.
$[[3]]$ is a nice wooden box, lined with velvet, containing a smaller wooden box, which in turn holds the number three.
We have $2+3=5$, because we know how to add two numbers. We cannot calculate $[2]+3$, as we do not know how to add a number to a nice wooden box. It turns out that $[2]+[3]=[5]$, because we have special rules for adding two wooden boxes.
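To see the boxes in programming terms, here is a loose sketch of mine in MATLAB (the language used in a transcript further down), where cell arrays play the role of the velvet boxes. Note that MATLAB defines no addition for cells at all, so unlike the boxes above, even box + box fails until you unbox:

    c = {3};       % a cell: a genuine "box" holding the number three
    % c + 3        % error: no rule for adding a number to a box
    c{1} + 3       % unbox first, then add two numbers: gives 6

Square brackets in MATLAB, by contrast, do not create a box at all, which is the subject of the transcript below.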
Answer #2, to the question "Can we treat $3$ and $[3]$ as the same?":
We can, except that this will not be standard mathematics anymore. Other answers (and comments) have pointed out things that break. Not everything breaks, true.
We could, similarly, treat $3$ and $7$ as the same. This doesn't break all of mathematics, e.g. $2+2=4$ remains true; however, now $7-3=0$, and all sorts of craziness flows out. Following this to its natural conclusion ends up breaking a lot. To avoid this, we could just throw out the parts that break, leaving a new type of mathematics (considerably smaller now) in which $3=7$. This brings us to the third question and answer.
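To make the craziness concrete, one short derivation: subtracting $3$ from both sides of $3 = 7$ gives $$0 = 7 - 3 = 4,$$ so $n + 4 = n$ for every integer $n$, and what remains of integer arithmetic is at most arithmetic modulo $4$.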
Answer #3, to the question "Should we treat $3$ and $[3]$ as the same?":
The answer is no; we should not do this unless we have a clear benefit from doing so. Even if there is such a benefit, it would have to be weighed against the cost of what we give up with this new mathematics.
The only benefits I personally see are (a) a certain clarity of understanding, and (b) a certain simplicity in writing software implementations. However, (a) is a very small benefit for the price of setting $3 = [3]$, and in my view the supposed clarity is actually a categorical confusion. As for (b), the simplicity is similarly an illusion; all that is gained is the ability to generate crashing programs very easily.
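A sketch of the kind of crash (b) invites, in MATLAB (the variable names here are mine; the behavior is standard): because a $1 \times 1$ matrix is silently reinterpreted as a scalar, a dimension bug can hide until the shapes change.

    B = ones(3,3);
    a = ones(1,1);   % a matrix that merely happens to be 1-by-1
    a * B            % runs: silently treated as scalar * matrix
    A = ones(2,1);
    A * B            % error: inner dimensions 1 and 3 do not agree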
The ring of $1 \times 1$ real matrices is indeed isomorphic to the real numbers, but is not "the same thing". That said, it's often convenient and rarely confusing to identify them. For example, you can think of the dot product of two vectors (which is a real number) as the matrix product of a $1 \times n$ matrix with an $n \times 1$ matrix, which is a $1 \times 1$ matrix.
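In MATLAB notation, for instance (a quick check; the names u and v are mine):

    u = [1 2 3];    % 1-by-3 matrix (row vector)
    v = [4; 5; 6];  % 3-by-1 matrix (column vector)
    u * v           % matrix product: the 1-by-1 matrix [32]
    dot(u, v)       % dot product: the number 32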
That's ultimately a matter of convention. Many people who work professionally in numerical linear algebra think in those terms, and the world still survives. It's just a slight abuse of notation that is typically harmless. As you correctly identify, you have to be careful with scalar * matrix, and it's better to "flatten" block matrices: $$ \begin{bmatrix} \begin{bmatrix} 1\\2 \end{bmatrix}\\ \begin{bmatrix} 3\\4 \end{bmatrix} \end{bmatrix} = \begin{bmatrix} 1\\2\\3\\4 \end{bmatrix}. $$
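MATLAB's square brackets already flatten this way, since brackets denote concatenation rather than nesting (a quick check of mine):

    x = [[1; 2]; [3; 4]]   % same as [1; 2; 3; 4], a 4-by-1 column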
For instance, note that in MATLAB (probably the most popular language for matrix computations) scalars are automatically promoted to matrices. In the transcript below, the lines starting with >> are the ones I typed in; the rest is output.
    >> [2] == 2
    ans =
      logical
       1

    >> [[[2;4]]] + [[2;4]]
    ans =
         4
         8

    >> A = 3
    A =
         3

    >> A(1)
    ans =
         3

    >> B = [[3;4]]
    B =
         3
         4

    >> B(1)
    ans =
         3

    >> C = [1 2; 3 4]
    C =
         1     2
         3     4

    >> 2*C
    ans =
         2     4
         6     8

    >> C+2
    ans =
         3     4
         5     6

    >> B + 2
    ans =
         5
         6

    >> A = ones(0,2)
    A =
      0×2 empty double matrix

    >> B = ones(2, 1)
    B =
         1
         1

    >> A*B
    ans =
      0×1 empty double column vector