JavaScript ES6 computational/time complexity of collections
For anyone who is curious, I did a very quick benchmark:
const benchmarkMap = size => {
  console.time('benchmarkMap');
  var map = new Map();
  for (var i = 0; i < size; i++) map.set(i, i);      // write `size` entries
  for (var i = 0; i < size; i++) var x = map.get(i); // read them all back
  console.timeEnd('benchmarkMap');
}
const benchmarkObj = size => {
  console.time('benchmarkObj');
  var obj = {};
  for (var i = 0; i < size; i++) obj[i] = i;      // write `size` properties
  for (var i = 0; i < size; i++) var x = obj[i];  // read them all back
  console.timeEnd('benchmarkObj');
}
var size = 1000000;
benchmarkMap(size);
benchmarkObj(size);
I ran this a few times and got the following results (2017 MacBook Pro, 2.5 GHz i7 w/ 16 GB RAM):
benchmarkMap: 189.120ms
benchmarkObj: 44.214ms
benchmarkMap: 200.817ms
benchmarkObj: 38.963ms
benchmarkMap: 187.968ms
benchmarkObj: 41.633ms
benchmarkMap: 186.533ms
benchmarkObj: 35.850ms
benchmarkMap: 187.339ms
benchmarkObj: 44.515ms
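To probe the complexity itself rather than just the constant factors, one rough check (a sketch reusing the benchmarkMap function above) is to run it at growing sizes: with roughly O(1) access per operation the total time should scale about linearly with size, whereas O(n) access would make it grow roughly quadratically:

// Sketch: compare total times as size doubles; near-linear growth
// suggests ~O(1) per get/set, quadratic growth would suggest O(n).
[1e5, 2e5, 4e5, 8e5, 1.6e6].forEach(n => benchmarkMap(n));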
Right from that very paragraph you linked to:
Set objects must be implemented using [mechanisms] that, on average, provide access times that are sublinear on the number of elements in the collection.
You will find the same sentence for Maps, WeakMaps and WeakSets.
It looks like the ECMA spec mandates that the implementations (e.g. Set.prototype.has) are to use a linear time (O(n)) algorithm.
No:
The data structures used in this Set objects specification is only intended to describe the required observable semantics of Set objects. It is not intended to be a viable implementation model.
The observable semantics are mostly related to the predictable iteration order (which can still be implemented efficiently). It is indeed expected by the specification that an implementation uses a hash table or something similar with constant access, though trees (with logarithmic access complexity) are allowed as well.
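A quick illustration of those observable semantics, i.e. the guaranteed insertion-order iteration, which holds no matter which data structure the engine uses underneath:

const set = new Set();
set.add(3);
set.add(1);
set.add(2);
set.delete(1);
set.add(1);            // a re-added value goes to the end
console.log([...set]); // [3, 2, 1]: insertion order, not sorted order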