What does the M stand for in C# Decimal literal notation?
It means it's a decimal literal, as others have said. However, the origins are probably not those suggested in other answers. From the C# Annotated Standard (the ECMA version, not the MS version):
"The decimal suffix is M/m since D/d was already taken by double. Although it has been suggested that M stands for money, Peter Golde recalls that M was chosen simply as the next best letter in decimal."
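As a concrete illustration of why the suffix matters for decimal in particular, here is a minimal sketch (the variable names are just for illustration):

decimal price = 3.99m;  // OK: the m suffix makes this a decimal literal
// decimal bad = 3.99;  // error CS0664: a bare real literal is a double,
//                      // and double has no implicit conversion to decimal
double d = 3.99;        // fine without a suffix: real literals default to double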
A similar annotation mentions that early versions of C# included "Y" and "S" suffixes for byte and short literals respectively. They were dropped on the grounds of not being useful very often.
From the C# specification:
var f = 0f; // float
var d = 0d; // double
var m = 0m; // decimal
var u = 0u; // unsigned int
var l = 0l; // long
var ul = 0ul; // unsigned long
Note that the suffixes are case-insensitive, although the compiler warns about a lowercase l (as in 0l) because it is easily mistaken for the digit 1; prefer 0L.
M stands in for decimal because D, the obvious first choice, already denotes double. If you omit the suffix, a real literal is treated as a double.
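For a quick sketch of why the distinction matters in practice (the values here are arbitrary):

var x = 0.1;  // double: binary floating point, cannot represent 0.1 exactly
var y = 0.1m; // decimal: base-10 floating point, represents 0.1 exactly
Console.WriteLine(x + x + x == 0.3);  // False: binary rounding error accumulates
Console.WriteLine(y + y + y == 0.3m); // True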