node.js splice too slow for more than 70000 items
V8 developer here. Removing (or inserting) array elements at the start (at array[0]) is generally very expensive, because all the remaining elements have to be moved. Essentially, what the engine has to do under the hood for every one of these .splice(0, 1) operations is:
for (var j = 0; j < a.length - 1; j++) {
  a[j] = a[j+1];
}
a.length = a.length - 1;
In some cases, V8 can employ a trick under the hood where the beginning of the object is moved instead; in the fast cases, you can see the amazing speedup that this trick provides. However, for technical reasons, this trick cannot be applied to arrays beyond a certain size. The resulting "slowdown" is actually the "true" speed of this very expensive operation.
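To see the asymmetry yourself, a quick and unscientific comparison along these lines illustrates it (the size and timing labels here are just for illustration; exact numbers depend on your machine and V8 version):

var N = 70000;

var a = [];
for (var i = 0; i < N; i++) a.push(i);
console.time('splice(0, 1) from the front');
while (a.length > 0) a.splice(0, 1);  // shifts all remaining elements each time
console.timeEnd('splice(0, 1) from the front');

var b = [];
for (var i = 0; i < N; i++) b.push(i);
console.time('pop() from the end');
while (b.length > 0) b.pop();         // no shifting needed
console.timeEnd('pop() from the end');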
If you want to delete array elements fast, delete them from the end (at array[array.length - 1]), e.g. using Array.pop(). If you want to delete all elements in one go, just set array.length = 0. If you need fast FIFO/"queue" semantics, consider taking inspiration from ring buffers: keep a "cursor" for the next element to be read/returned, and only shrink the array when there's a big chunk of elements to be freed up. Roughly:
function Queue() {
  this.enqueue = function(x) {
    this.array_.push(x);
  };
  this.dequeue = function() {
    var x = this.array_[this.cursor_++];
    // Free up space once half the array is unused.
    if (this.cursor_ > this.array_.length / 2) {
      this.array_.splice(0, this.cursor_);
      this.cursor_ = 0;
    }
    return x;
  };
  this.array_ = [];
  this.cursor_ = 0;
}
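It is used like any other queue; a quick usage sketch with the Queue above:

var q = new Queue();
for (var i = 0; i < 70000; i++) {
  q.enqueue(i);
}
var first = q.dequeue();   // 0
var second = q.dequeue();  // 1
// Each dequeue just advances the cursor; the backing array is only
// compacted occasionally, so the per-call cost stays small on average
// instead of shifting ~70,000 elements every time.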
Side note: It doesn't matter here, but for the record, to push 70,000 elements into your array, your loop should start at 0: for (var i = 0; i < 70000; i++) {...}. As written, you're only pushing 69,999 elements.
Side note 2: Rounding a double to an integer via "parseInt" is pretty slow, because it first formats the double as a string, then reads that string back as an integer. A faster way would be Math.floor(Math.random() * 10000). (For the purposes of this test, you could also simply push i.)
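To make the difference concrete, a small illustration (for non-negative values in this range both calls yield the same integer; only the work done differs):

var x = Math.random() * 10000;
var slow = parseInt(x);    // converts x to a string (e.g. "4398.27..."), then parses it back
var fast = Math.floor(x);  // truncates the double directly, no string round-trip
// slow === fast here; only the cost differs.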