Why can JavaScript handle timestamps beyond 2038?
32-bit PHP uses 32-bit signed integers, whose maximum value puts the last UNIX timestamp they can express in January 2038. That is widely known as the Y2K38 problem, and it affects virtually all 32-bit software that works with UNIX timestamps. Moving to 64 bits, or to libraries that use other timestamp representations (in the case of PHP, the DateTime class), solves this problem.
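As a rough sketch of where that boundary falls (run in any JS console; the ISO strings assume UTC output from toISOString):

const maxInt32Seconds = 2 ** 31 - 1;                  // 2147483647, the largest signed 32-bit value
new Date(maxInt32Seconds * 1000).toISOString();       // "2038-01-19T03:14:07.000Z" - the Y2K38 cutoff
new Date((maxInt32Seconds + 1) * 1000).toISOString(); // "2038-01-19T03:14:08.000Z" - one second later, no overflow in JS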
JavaScript doesn't have integers, only double-precision floats, whose range is vastly larger (but which in return lose exact integer precision beyond 2^53).
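A quick console sketch of that trade-off (exact console formatting varies by engine):

Number.MAX_VALUE                        // 1.7976931348623157e+308 - the float range dwarfs any 32-bit integer
Number.MAX_SAFE_INTEGER                 // 9007199254740991 (2^53 - 1), the largest integer kept exact
new Date(4102444800000).toISOString()   // "2100-01-01T00:00:00.000Z" - well past 2038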
It can. Try out new Date(8640000000000000)
Sat Sep 13 275760 03:00:00 GMT+0300 (Eastern European Summer Time)
Year 275760 is a bit beyond 2038 :)
Read the spec section 15.9.1.1
http://ecma-international.org/ecma-262/5.1/#sec-15.9.1.1
A Date object contains a Number indicating a particular instant in time to within a millisecond. Such a Number is called a time value. A time value may also be NaN, indicating that the Date object does not represent a specific instant of time.
Time is measured in ECMAScript in milliseconds since 01 January, 1970 UTC. In time values leap seconds are ignored. It is assumed that there are exactly 86,400,000 milliseconds per day. ECMAScript Number values can represent all integers from –9,007,199,254,740,992 to 9,007,199,254,740,992; this range suffices to measure times to millisecond precision for any instant that is within approximately 285,616 years, either forward or backward, from 01 January, 1970 UTC.
The actual range of times supported by ECMAScript Date objects is slightly smaller: exactly –100,000,000 days to 100,000,000 days measured relative to midnight at the beginning of 01 January, 1970 UTC. This gives a range of 8,640,000,000,000,000 milliseconds to either side of 01 January, 1970 UTC.
The exact moment of midnight at the beginning of 01 January, 1970 UTC is represented by the value +0.
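Those limits are easy to poke at in a console (a small sketch; anything outside the ±8,640,000,000,000,000 ms range produces an Invalid Date whose time value is NaN):

new Date(8640000000000000).toISOString()    // "+275760-09-13T00:00:00.000Z" - the maximum representable instant
new Date(-8640000000000000).toISOString()   // "-271821-04-20T00:00:00.000Z" - the minimum
new Date(8640000000000001)                  // Invalid Date - one millisecond too far
new Date(0).toISOString()                   // "1970-01-01T00:00:00.000Z" - the time value +0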
This implies that JS uses a UNIX-style timestamp.
Just a sidenote: Unix timestamps are seconds since 1970, while JS time is milliseconds since 1970. So a JS timestamp would stop fitting in a 32-bit int much earlier (but JS doesn't use a 32-bit int for this).
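A small sketch to make the seconds/milliseconds difference concrete (Date.now() returns the millisecond value; dividing by 1000 gives the classic Unix timestamp):

Date.now()                           // milliseconds since 1970-01-01 UTC, e.g. 1384440291042
Math.floor(Date.now() / 1000)        // seconds since 1970-01-01 UTC, i.e. a Unix timestamp
new Date(2147483647).toISOString()   // "1970-01-25T20:31:23.647Z" - 2^31 - 1 milliseconds runs out in January 1970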
JavaScript doesn't have integer numbers, only floating-point numbers (details can be found in the standards document).
That means you can represent some really large numbers, but at the cost of precision. A simple test is this:
i = 1384440291042
=> 1384440291042
i = 13844402910429
=> 13844402910429
i = 138444029104299
=> 138444029104299
i = 1384440291042999
=> 1384440291042999
i = 13844402910429999
=> 13844402910430000
i = 138444029104299999
=> 138444029104300000
i = 1384440291042999999
=> 1384440291043000000
i = 13844402910429999999
=> 13844402910430000000
As you can see, the number is not guaranteed to be kept exact. The outer limit of integer precision in JavaScript (where you actually get back the same value you put in) is 9007199254740992. That would be good up until 285428751-11-12T07:36:32+00:00 according to my conversion test :)
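As a cross-check (a small sketch using the spec limits quoted above), the Date range of ±8,640,000,000,000,000 ms sits comfortably inside that integer-exact window:

Number.isSafeInteger(8640000000000000)   // true - the Date limit is below 2^53 - 1
2 ** 53                                  // 9007199254740992
2 ** 53 + 1                              // 9007199254740992 again - exactness is gone past 2^53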
The simple answer is that JavaScript internally uses a larger data type than the long int (4 bytes, 32 bits) that is used for the C-style epoch ...