Concatenate two (or n) streams
If you don't care about the ordering of data across the streams (chunks are interleaved in whatever order they arrive), a simple reduce operation works fine in plain Node.js!
const { PassThrough } = require('stream')

// s0..sN are the Readable streams you want to combine
const joined = [s0, s1, s2 /* , ..., sN */].reduce((pt, s, i, a) => {
  s.pipe(pt, { end: false })
  // Close the PassThrough only once every source has ended.
  // (readableEnded is the real flag; streams have no `.ended` property,
  // and calling pt.end() actually ends the stream, unlike emitting 'end'.)
  s.once('end', () => a.every(src => src.readableEnded) && pt.end())
  return pt
}, new PassThrough())
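A quick usage sketch, assuming the sources are file streams (the file names here are made up for illustration):

const fs = require('fs')

const s0 = fs.createReadStream('a.log')
const s1 = fs.createReadStream('b.log')
const s2 = fs.createReadStream('c.log')
// build `joined` from [s0, s1, s2] as above, then consume it like any readable:
joined.pipe(process.stdout)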
Cheers ;)
This can be done with vanilla Node.js:
import { PassThrough } from 'stream'

const merge = (...streams) => {
  const pass = new PassThrough()
  let waiting = streams.length
  for (const stream of streams) {
    // pipe() returns its destination, so there is no need to reassign `pass`
    stream.pipe(pass, { end: false })
    // End the PassThrough once the last source has ended; calling end()
    // (rather than emitting 'end' by hand) lets buffered data flush first.
    stream.once('end', () => --waiting === 0 && pass.end())
  }
  return pass
}
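Usage then looks like this (the file names are placeholders); as above, chunks from the sources are interleaved rather than strictly concatenated:

import fs from 'fs'

merge(
  fs.createReadStream('a.txt'),
  fs.createReadStream('b.txt')
).pipe(process.stdout)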
The combined-stream package concatenates streams. Example from the README:
var CombinedStream = require('combined-stream');
var fs = require('fs');
var combinedStream = CombinedStream.create();
combinedStream.append(fs.createReadStream('file1.txt'));
combinedStream.append(fs.createReadStream('file2.txt'));
combinedStream.pipe(fs.createWriteStream('combined.txt'));
I believe you have to append all streams at once. If the queue runs empty, the combinedStream automatically ends. See issue #5.
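The README also shows a lazy form where append takes a callback, so each stream is only created when the queue reaches it; this may help if you can't create all the streams up front (whether it avoids the early-end behavior is worth verifying against issue #5):

var CombinedStream = require('combined-stream');
var fs = require('fs');

var combinedStream = CombinedStream.create({ pauseStreams: false });
// The stream is not created until combined-stream asks for it.
combinedStream.append(function(next) {
  next(fs.createReadStream('file1.txt'));
});
combinedStream.append(function(next) {
  next(fs.createReadStream('file2.txt'));
});
combinedStream.pipe(fs.createWriteStream('combined.txt'));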
The stream-stream library is an alternative that has an explicit .end, but it's much less popular and presumably not as well-tested. It uses the streams2 API of Node 0.10 (see this discussion).
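Roughly, stream-stream is a writable stream of streams; a minimal sketch based on its README (treat the exact API as an assumption and check the current docs):

var ss = require('stream-stream');
var fs = require('fs');

var stream = ss();
stream.write(fs.createReadStream('file1.txt'));
stream.write(fs.createReadStream('file2.txt'));
// Unlike combined-stream, the concatenation only ends when you say so:
stream.end();
stream.pipe(fs.createWriteStream('combined.txt'));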
This can now be done easily using async iterators:
async function* concatStreams(readables) {
  for (const readable of readables) {
    // Drain each stream fully before moving on to the next one,
    // so the output preserves the order of the inputs.
    for await (const chunk of readable) {
      yield chunk
    }
  }
}
And you can use it like this:
const fs = require('fs')
const stream = require('stream')

const files = ['file1.txt', 'file2.txt', 'file3.txt']
// Calling the async generator returns an async iterable right away; no await needed.
const iterable = concatStreams(files.map(f => fs.createReadStream(f)))
// Convert the async iterable to a readable stream (Readable.from needs Node 12.3+).
const mergedStream = stream.Readable.from(iterable)
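Since mergedStream is a regular readable stream, you can pipe it wherever you need; for example, to a file (the output name is just an illustration):

const { pipeline } = require('stream')

pipeline(mergedStream, fs.createWriteStream('combined.txt'), err => {
  if (err) console.error('concatenation failed', err)
})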
More info regarding async iterators: https://2ality.com/2019/11/nodejs-streams-async-iteration.html