Is Java's BigDecimal the closest data type corresponding to C#'s Decimal?

Is this really so?

They are similar but not identical. To be more specific: the Java version can represent every value that the C# version can, but the opposite is not true.

What's up with the "Big" prefix?

A Java BigDecimal can have arbitrarily many digits of precision and can therefore be arbitrarily large. If you want to make a BigDecimal with a thousand digits of precision, go right ahead.
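To see the "arbitrary precision" claim in action, here is a minimal sketch that computes 1/3 to a thousand significant digits, something no fixed-size decimal type can hold:

```java
import java.math.BigDecimal;
import java.math.MathContext;
import java.math.RoundingMode;

public class BigPrecision {
    public static void main(String[] args) {
        // Ask for 1000 significant digits of 1/3; BigDecimal happily obliges.
        BigDecimal third = BigDecimal.ONE.divide(
                new BigDecimal(3),
                new MathContext(1000, RoundingMode.HALF_UP));

        System.out.println(third.precision()); // 1000
    }
}
```

Note that without the MathContext, dividing 1 by 3 would throw an ArithmeticException, because the exact quotient has a non-terminating decimal expansion.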

By contrast, a C# decimal has a fixed size; it takes up 128 bits and gives you at most 28 or 29 significant decimal digits of precision.

To be more precise: both types give you numbers of the form

+/- someInteger / 10 ^ someExponent

In C#, someInteger is a 96-bit unsigned integer and someExponent is an integer between 0 and 28.

In Java, someInteger is an integer of arbitrary size and someExponent is a signed 32-bit integer.
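On the Java side you can inspect both parts of that representation directly: unscaledValue() is someInteger and scale() is someExponent. A small sketch:

```java
import java.math.BigDecimal;
import java.math.BigInteger;

public class Decompose {
    public static void main(String[] args) {
        // 123.45 = 12345 / 10^2, so unscaledValue is 12345 and scale is 2.
        BigDecimal d = new BigDecimal("123.45");
        System.out.println(d.unscaledValue()); // 12345
        System.out.println(d.scale());         // 2

        // Unlike C#'s 0..28 range, Java's scale is a signed 32-bit int,
        // so it can be negative: 1.2E+5 is stored as 12 * 10^4.
        BigDecimal e = new BigDecimal("1.2E+5");
        System.out.println(e.unscaledValue()); // 12
        System.out.println(e.scale());         // -4
    }
}
```

The negative scale is exactly the extra expressiveness mentioned above: Java can represent values (huge magnitudes, tiny fractions, extreme precision) that fall outside what C#'s decimal can encode.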


Yep - that's the corresponding type.

Since you are using Java after C#, don't be too surprised to find little nuances like this, or too upset when there is no easy way to do something that's "easy" to do in C#. The first thing that comes to my mind is int and int?: in Java you use int and Integer instead.
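A quick sketch of that int/Integer correspondence: the boxed Integer is a reference type and can be null, which is roughly what C#'s int? gives you, while the primitive int never can be.

```java
public class NullableInts {
    public static void main(String[] args) {
        // Roughly C#'s int?: Integer is a reference type, so null is allowed.
        Integer maybe = null;
        // int primitive = null;  // would not compile: int can never be null

        maybe = 42;          // autoboxing: int -> Integer
        int unboxed = maybe; // auto-unboxing; throws NullPointerException if maybe were null
        System.out.println(unboxed); // 42
    }
}
```

One nuance to watch for: unboxing a null Integer throws a NullPointerException at runtime, whereas C# forces you to handle the null case before reading an int?'s value.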

C# had the luxury of coming after Java, so lots of (what I subjectively see as) bad decisions have been fixed or streamlined. It also helps that C# was designed by Anders Hejlsberg (arguably one of the best programming language designers alive) and is regularly updated, unlike Java (you have probably witnessed all the things added to C# since 2000 - complete list).