How do I get the decimal places of a floating point number in Javascript?

For anyone wondering how to do this faster (without converting the number to a string), here's a solution:

function precision(a) {
  // Multiply by growing powers of 10 until rounding at that scale
  // reproduces the original value exactly.
  var e = 1;
  while (Math.round(a * e) / e !== a) e *= 10;
  // e is now a power of 10; log10(e) is the number of decimal places.
  return Math.log(e) / Math.LN10;
}

Edit: a more complete solution with edge cases covered:

function precision(a) {
  if (!isFinite(a)) return 0; // NaN and ±Infinity have no decimal places
  var e = 1, p = 0;
  // Count the powers of 10 needed before rounding preserves the value.
  while (Math.round(a * e) / e !== a) { e *= 10; p++; }
  return p;
}
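
For example, a few sample calls:

precision(1.5);      // 1
precision(0.123);    // 3
precision(100);      // 0
precision(NaN);      // 0
precision(Infinity); // 0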

One possible solution (depends on the application):

var precision = (12.3456 + "").split(".")[1].length; // 4 — throws a TypeError if the string form has no "."
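
A slightly more defensive sketch (the helper name precisionOf is just illustrative) that returns 0 instead of throwing when the string form has no decimal point, e.g. for integers or very small numbers rendered in exponential notation:

function precisionOf(n) {
  var parts = (n + "").split(".");
  // No "." part for integers, or for numbers stringified as e.g. "1e-7".
  return parts.length > 1 ? parts[1].length : 0;
}

precisionOf(12.3456); // 4
precisionOf(42);      // 0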

If by "precision" you mean "decimal places", then that's impossible because floats are binary. They don't have decimal places, and most values that have a small number of decimal places have recurring digits in binary, and when they're translated back to decimal that doesn't necessarily yield the original decimal number.

Any code that works with the "decimal places" of a float is liable to produce unexpected results on some numbers.
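
For example, a sum that mathematically has one decimal place:

0.1 + 0.2;                                // 0.30000000000000004
(0.1 + 0.2) === 0.3;                      // false
((0.1 + 0.2) + "").split(".")[1].length;  // 17, not 1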


There is no native function to determine the number of decimals. What you can do is convert the number to a string, find the position of the decimal separator "." and count the digits after it:

Number.prototype.getPrecision = function() {
    var s = this + "",
        d = s.indexOf('.') + 1; // 0 when there is no decimal point

    return !d ? 0 : s.length - d; // digits after the "."
};

(123).getPrecision() === 0;
(123.0).getPrecision() === 0;
(123.12345).getPrecision() === 5;
(1e3).getPrecision() === 0;
(1e-3).getPrecision() === 3;

But it's in the nature of floats to fool you: a value you expect to be exactly 1 may actually be stored as 0.9999999999999999 or something close to it. I'm not sure how well the above actually performs in real-life applications.
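
For instance, with the getPrecision method above:

(0.1 + 0.2).getPrecision(); // 17 — reports the stored value, not the intended 0.3
(1e-7).getPrecision();      // 0 — (1e-7 + "") is "1e-7", which has no "." at all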