Top Related Projects
- node-glob: glob functionality for node.js
- Browserify: browser-side require() the node.js way
- got: 🌐 Human-friendly and powerful HTTP request library for Node.js
- request: 🏊🏾 Simplified HTTP request client.
- axios: Promise based HTTP client for the browser and node.js
- Commander.js: node.js command-line interfaces made easy
Quick Overview
event-stream is a Node.js library that provides a powerful set of tools for working with streams. It offers a collection of functions to manipulate, transform, and combine streams, making it easier to work with asynchronous data flows in Node.js applications.
Pros
- Simplifies complex stream operations
- Lightweight and easy to use
- Provides a wide range of stream utility functions
- Compatible with Node.js built-in streams
Cons
- Lack of recent updates (last major update was in 2018)
- Some security concerns due to a past incident involving malicious code
- Limited documentation and examples
- May have performance overhead for very large data sets
Code Examples
- Mapping stream data:
const es = require('event-stream');
es.map((data, callback) => {
  callback(null, data.toString().toUpperCase());
});
This example creates a stream that converts incoming data to uppercase (a full pipeline wiring it up is sketched after these examples).
- Filtering stream data:
const es = require('event-stream');
es.filterSync((data) => {
  return data.toString().length > 5;
});
This creates a stream that only allows data longer than 5 characters to pass through.
- Merging multiple streams:
const es = require('event-stream');
const fs = require('fs');

const stream1 = fs.createReadStream('file1.txt');
const stream2 = fs.createReadStream('file2.txt');
es.merge(stream1, stream2).pipe(process.stdout);
This example merges two file streams and pipes the result to stdout.
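As a minimal usage sketch (assuming you want to uppercase whatever is typed on stdin; the variable name is illustrative), the map stream from the first example can be wired into a pipeline like this:
const es = require('event-stream');

const upper = es.map((data, callback) => {
  callback(null, data.toString().toUpperCase());
});

process.stdin.pipe(upper).pipe(process.stdout);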
Getting Started
To use event-stream in your Node.js project:
- Install the package:
npm install event-stream
- Require the module in your code:
const es = require('event-stream');
- Use the various stream utilities provided by the library:
const myStream = es.through(function write(data) {
  this.emit('data', data.toString().toUpperCase());
});
process.stdin.pipe(myStream).pipe(process.stdout);
This example creates a simple through stream that converts input to uppercase and pipes it to stdout.
Competitor Comparisons
glob functionality for node.js
Pros of node-glob
- More focused functionality for file and directory matching
- Extensive documentation and examples
- Actively maintained with regular updates
Cons of node-glob
- Limited to file system operations, less versatile than event-stream
- Can be slow to traverse very large directory trees, particularly when using the synchronous API (glob.sync)
Code Comparison
node-glob:
const glob = require('glob');
glob('**/*.js', (err, files) => {
  console.log(files);
});
event-stream:
const es = require('event-stream');
es.readArray([1, 2, 3, 4])
  .pipe(es.map((data, callback) => {
    callback(null, data * 2);
  }))
  .pipe(es.writeArray((err, array) => {
    console.log(array); // [2, 4, 6, 8]
  }));
node-glob focuses on file system pattern matching, while event-stream provides a more general-purpose streaming API. node-glob is better suited for tasks involving file and directory searches, offering a simpler API for these operations. event-stream, on the other hand, excels in handling various types of data streams and transformations.
The code examples demonstrate the different use cases: node-glob for finding files matching a pattern, and event-stream for processing data streams. While both libraries are useful, they serve different purposes and are often used in different scenarios within Node.js applications.
browser-side require() the node.js way
Pros of Browserify
- Actively maintained with regular updates and a large community
- Comprehensive documentation and extensive ecosystem of plugins
- Supports bundling of non-JavaScript assets like CSS and images
Cons of Browserify
- Steeper learning curve for beginners compared to simpler stream-based tools
- Can be slower for large projects due to its bundling process
- Requires additional configuration for advanced use cases
Code Comparison
Event-stream example:
var es = require('event-stream');
var fs = require('fs');

es.pipeline(
  fs.createReadStream('input.txt'),
  es.split(),
  es.map(function (line, cb) { cb(null, line.toUpperCase()) }),
  fs.createWriteStream('output.txt')
);
Browserify example:
var browserify = require('browserify');
var fs = require('fs');

var b = browserify('main.js');
b.bundle().pipe(fs.createWriteStream('bundle.js'));
Key Differences
- Event-stream focuses on stream manipulation, while Browserify is primarily for bundling modules
- Browserify is more complex but offers greater flexibility for web development
- Event-stream is lightweight and easier to use for simple stream operations
Use Cases
- Event-stream: Data processing, file manipulation, and simple streaming tasks
- Browserify: Web application development, module bundling, and creating browser-compatible code from Node.js modules
🌐 Human-friendly and powerful HTTP request library for Node.js
Pros of got
- More actively maintained with frequent updates and bug fixes
- Comprehensive feature set including pagination, retries, and proxy support
- Better TypeScript support with built-in type definitions
Cons of got
- Larger package size and potentially higher memory footprint
- Steeper learning curve due to more complex API and configuration options
- May be overkill for simple HTTP requests
Code comparison
event-stream:
const es = require('event-stream');
const fs = require('fs');

es.pipeline(
  fs.createReadStream('input.txt'),
  es.split(),
  es.map((data, callback) => {
    callback(null, data.toUpperCase());
  }),
  fs.createWriteStream('output.txt')
);
got:
const got = require('got');

(async () => {
  const response = await got('https://api.example.com/data');
  console.log(response.body);
})();
Summary
got is a more feature-rich and actively maintained HTTP client library, while event-stream is a simpler stream manipulation library. got offers more advanced functionality for HTTP requests, but may be more complex to use. event-stream is lightweight and focuses on stream operations, making it suitable for simpler data processing tasks. The choice between the two depends on the specific requirements of your project and the complexity of the operations you need to perform.
🏊🏾 Simplified HTTP request client.
Pros of request
- More comprehensive HTTP client with support for various methods and options
- Better documentation and wider community adoption
- Built-in support for authentication, cookies, and proxies
Cons of request
- Larger package size and more dependencies
- Slightly steeper learning curve for basic use cases
- Not as lightweight or stream-focused as event-stream
Code comparison
event-stream:
var es = require('event-stream');
var fs = require('fs');

es.pipeline(
  fs.createReadStream('input.txt'),
  es.split(),
  es.map(function (line, cb) {
    cb(null, line.toUpperCase());
  }),
  fs.createWriteStream('output.txt')
);
request:
const request = require('request');
const fs = require('fs');

request('https://api.example.com/data')
  .on('error', function (err) {
    console.error(err);
  })
  .pipe(fs.createWriteStream('output.json'));
event-stream is focused on working with streams and transforming data, while request is primarily designed for making HTTP requests and handling responses. event-stream is more lightweight and flexible for general stream manipulation, whereas request provides a higher-level API for HTTP interactions with built-in features like authentication and proxy support. The choice between the two depends on the specific requirements of your project, with event-stream being better suited for stream processing tasks and request being more appropriate for HTTP-centric applications.
Promise based HTTP client for the browser and node.js
Pros of axios
- More comprehensive HTTP client with support for both browser and Node.js
- Extensive features including request/response interceptors and automatic transforms
- Active development with frequent updates and a large community
Cons of axios
- Larger bundle size due to more features
- Steeper learning curve for beginners compared to simpler stream-based APIs
- May be overkill for simple HTTP requests or streaming operations
Code comparison
event-stream:
var es = require('event-stream');
var fs = require('fs');

es.pipeline(
  fs.createReadStream('input.txt'),
  es.split(),
  es.map(function (line, cb) {
    cb(null, line.toUpperCase());
  }),
  fs.createWriteStream('output.txt')
);
axios:
const axios = require('axios');

axios.get('https://api.example.com/data')
  .then(response => {
    console.log(response.data);
  })
  .catch(error => {
    console.error(error);
  });
Summary
event-stream is a lightweight, stream-based utility for working with data, while axios is a full-featured HTTP client. event-stream excels at processing data streams efficiently, making it ideal for tasks like file parsing or real-time data manipulation. axios, on the other hand, provides a more comprehensive solution for making HTTP requests, handling responses, and managing complex API interactions. The choice between the two depends on the specific requirements of your project, with event-stream being more suitable for stream processing tasks and axios for broader HTTP client needs.
node.js command-line interfaces made easy
Pros of Commander.js
- Focused on command-line interface (CLI) applications, providing a more specialized toolset
- Extensive documentation and a large, active community
- Built-in support for command-line arguments parsing and help generation
Cons of Commander.js
- Limited to CLI applications, less versatile than Event-stream
- Steeper learning curve for developers new to CLI development
- Larger package size due to its comprehensive feature set
Code Comparison
Event-stream example:
var es = require('event-stream');

es.readArray([1, 2, 3, 4])
  .pipe(es.map(function (data, callback) {
    callback(null, data * 2);
  }))
  .pipe(es.writeArray(function (err, array) {
    console.log(array); // [2, 4, 6, 8]
  }));
Commander.js example:
const { program } = require('commander');

program
  .option('-d, --debug', 'output extra debugging')
  .option('-s, --small', 'small pizza size')
  .option('-p, --pizza-type <type>', 'flavour of pizza');

program.parse(process.argv);

const options = program.opts();
console.log(options);
Event-stream is a versatile streaming library for Node.js, while Commander.js specializes in building command-line interfaces. Event-stream excels in data manipulation and transformation, whereas Commander.js shines in parsing command-line arguments and generating help documentation for CLI applications.
README
EventStream
Streams are node's best and most misunderstood idea, and EventStream is a toolkit to make creating and working with streams easy.
Normally, streams are only used for IO, but in event stream we send all kinds of objects down the pipe. If your application's input and output are streams, shouldn't the throughput be a stream too?
The EventStream functions resemble the array functions, because Streams are like Arrays, but laid out in time, rather than in memory.
All the event-stream functions return instances of Stream.
event-stream creates 0.8 streams, which are compatible with 0.10 streams.
NOTE: I shall use the term "through stream" to refer to a stream that is writable and readable.
NOTE for Gulp users: Merge will not work for gulp 4. merge-stream should be used.
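A minimal gulpfile sketch for that case (assuming gulp 4 and the merge-stream package are installed; the task and path names are illustrative):
// gulpfile.js
const gulp = require('gulp');
const mergeStream = require('merge-stream');

function scripts() {
  // combine two source streams into one and write the result to dist/
  return mergeStream(
    gulp.src('src/a/*.js'),
    gulp.src('src/b/*.js')
  ).pipe(gulp.dest('dist'));
}

exports.scripts = scripts;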
simple example:
//pretty.js
if(!module.parent) {
  var es = require('event-stream')
  var inspect = require('util').inspect

  process.stdin                        //connect streams together with `pipe`
    .pipe(es.split())                  //split stream to break on newlines
    .pipe(es.map(function (data, cb) { //turn this async function into a stream
      cb(null
      , inspect(JSON.parse(data)))     //render it nicely
    }))
    .pipe(process.stdout)              // pipe it to stdout !
}
run it ...
curl -sS registry.npmjs.org/event-stream | node pretty.js
through (write?, end?)
Re-emits data synchronously. Easy way to create synchronous through streams.
Pass in optional write and end methods. They will be called in the context of the stream. Use this.pause() and this.resume() to manage flow. Check this.paused to see current flow state. (write always returns !this.paused.)
This function is the basis for most of the synchronous streams in event-stream.
es.through(function write(data) {
    this.emit('data', data)
    //this.pause()
  },
  function end () { //optional
    this.emit('end')
  })
map (asyncFunction)
Create a through stream from an asynchronous function.
var es = require('event-stream')
es.map(function (data, callback) {
  //transform data
  // ...
  callback(null, data)
})
Each map MUST call the callback. It may call back with data, with an error, or with no arguments:
- callback() drops this data. This makes the map work like filter. Note: callback(null, null) is not the same, and will emit null.
- callback(null, newData) turns data into newData.
- callback(error) emits an error for this item.
Note: if a callback is not called, map will think that the item is still being processed; every call must be answered or the stream will not know when to end. Also, if the callback is called more than once, every call but the first will be ignored. (A short sketch of these callback forms follows below.)
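A minimal sketch of the drop and transform forms (the input values are only illustrative):
var es = require('event-stream')

es.readArray([1, 2, 3, 4])
  .pipe(es.map(function (data, callback) {
    if (data % 2) return callback()   // no arguments: drop odd numbers (filter-like)
    callback(null, data * 10)         // (null, newData): transform even numbers
    // callback(err) would instead emit an 'error' event for this item
  }))
  .pipe(es.writeArray(function (err, array) {
    console.log(array) // [20, 40]
  }))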
mapSync (syncFunction)
Same as map, but the callback is called synchronously. Based on es.through.
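A minimal sketch (assuming line-oriented input on stdin, and that the synchronous function's return value is emitted as the transformed chunk):
var es = require('event-stream')

process.stdin
  .pipe(es.split())
  .pipe(es.mapSync(function (line) {
    return line.toUpperCase() + '\n'
  }))
  .pipe(process.stdout)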
flatmapSync (syncFunction)
Map elements nested.
var es = require('event-stream')
es.flatmapSync(function (data) {
  //transform data
  // ...
  return data
})
filterSync (syncFunction)
Filter elements.
var es = require('event-stream')
es.filterSync(function (data) {
  return data > 0
})
split (matcher)
Break up a stream and reassemble it so that each line is a chunk. matcher may be a String, or a RegExp.
Example, read every line in a file ...
fs.createReadStream(file, {flags: 'r'})
  .pipe(es.split())
  .pipe(es.map(function (line, cb) {
    //do something with the line
    cb(null, line)
  }))
split takes the same arguments as string.split, except it defaults to '\n' instead of ',', and the optional limit parameter is ignored (see String#split).
NOTE - Maintaining Line Breaks
If you want to process each line of the stream, transform the data, reassemble, and KEEP the line breaks, the example will look like this:
fs.createReadStream(file, {flags: 'r'})
  .pipe(es.split(/(\r?\n)/))
  .pipe(es.map(function (line, cb) {
    //do something with the line
    cb(null, line)
  }))
This technique is mentioned in the underlying documentation for the split npm package.
join (separator)
Create a through stream that emits separator between each chunk, just like Array#join.
(For legacy reasons, if you pass a callback instead of a string, join is a synonym for es.wait.)
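A minimal sketch (the array values are only illustrative), collecting the joined output with es.wait:
var es = require('event-stream')

es.readArray(['a', 'b', 'c'])
  .pipe(es.join(','))
  .pipe(es.wait(function (err, text) {
    console.log(text) // 'a,b,c'
  }))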
merge (stream1,...,streamN) or merge (streamArray)
concat → merge
Merges streams into one and returns it.
Incoming data will be emitted as soon as it arrives - no ordering will be applied (for example: data1 data1 data2 data1 data2 - where data1 and data2 are data from two streams).
Counts how many streams were passed to it and emits end only when all streams emitted end.
es.merge(
  process.stdout,
  process.stderr
).pipe(fs.createWriteStream('output.log'));
It can also take an Array of streams as input like this:
es.merge([
  fs.createReadStream('input1.txt'),
  fs.createReadStream('input2.txt')
]).pipe(fs.createWriteStream('output.log'));
replace (from, to)
Replace all occurrences of from with to. from may be a String or a RegExp.
Works just like string.split(from).join(to), but streaming.
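A minimal sketch (the file names are only illustrative):
var es = require('event-stream')
var fs = require('fs')

fs.createReadStream('input.txt')
  .pipe(es.replace('foo', 'bar'))   // every occurrence of 'foo' becomes 'bar'
  .pipe(fs.createWriteStream('output.txt'))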
parse
Convenience function for parsing JSON chunks. For newline separated JSON, use with es.split. By default it logs parsing errors by console.error; for another behaviour, transforms created by es.parse({error: true}) will emit error events for exceptions thrown from JSON.parse, unmodified.
fs.createReadStream(filename)
  .pipe(es.split()) //defaults to lines.
  .pipe(es.parse())
stringify
Convert javascript objects into lines of text. The text will have whitespace escaped and have a \n appended, so it will be compatible with es.parse.
objectStream
  .pipe(es.stringify())
  .pipe(fs.createWriteStream(filename))
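For instance (a minimal sketch where the object stream is built from an array; the objects are only illustrative):
var es = require('event-stream')

es.readArray([{ id: 1 }, { id: 2 }])
  .pipe(es.stringify())   // one JSON-encoded line per object
  .pipe(process.stdout)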
readable (asyncFunction)
Create a readable stream (that respects pause) from an async function. While the stream is not paused, the function will be polled with (count, callback), and this will be the readable stream.
es.readable(function (count, callback) {
  if(streamHasEnded)
    return this.emit('end')

  //...

  this.emit('data', data) //use this way to emit multiple chunks per call.

  callback() // you MUST always call the callback eventually.
             // the function will not be called again until you do this.
})
You can also pass the data and the error to the callback. You may only call the callback once; calling the same callback more than once will have no effect.
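A minimal sketch (assuming count is the number of times the function has been polled, starting at 0; the limit of five chunks is illustrative) that emits the numbers 0 through 4 and then ends:
var es = require('event-stream')

es.readable(function (count, callback) {
  if (count > 4) return this.emit('end') // stop after five chunks
  this.emit('data', count)               // count: how many times we have been polled
  callback()                             // ask to be polled again
})
  .pipe(es.stringify())
  .pipe(process.stdout)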
readArray (array)
Create a readable stream from an Array.
Just emit each item as a data event, respecting pause and resume.
var es = require('event-stream')
, reader = es.readArray([1,2,3])
reader.pipe(...)
If you want the stream to behave like a 0.10 stream, you will need to wrap it using the Readable.wrap() function. Example:
var s = new stream.Readable({objectMode: true}).wrap(es.readArray([1,2,3]));
writeArray (callback)
Create a writeable stream from a callback. All data events are stored in an array, which is passed to the callback when the stream ends.
var es = require('event-stream')
  , reader = es.readArray([1, 2, 3])
  , writer = es.writeArray(function (err, array){
      //array deepEqual [1, 2, 3]
    })

reader.pipe(writer)
pause ()
A stream that buffers all chunks when paused.
var ps = es.pause()
ps.pause() //buffer the stream, also do not allow 'end'
ps.resume() //allow chunks through
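A minimal sketch (the timing and values are only illustrative): while paused, the stream buffers incoming chunks and holds back 'end'; calling resume() releases everything downstream:
var es = require('event-stream')

var ps = es.pause()
ps.pause()                         // buffer incoming chunks, hold back 'end'

es.readArray([1, 2, 3]).pipe(ps)   // nothing reaches downstream yet

ps.pipe(es.writeArray(function (err, array) {
  console.log(array)               // [1, 2, 3], delivered only after resume()
}))

setTimeout(function () {
  ps.resume()                      // release the buffered chunks and 'end'
}, 100)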
duplex (writeStream, readStream)
Takes a writable stream and a readable stream and makes them appear as a readable writable stream.
It is assumed that the two streams are connected to each other in some way.
(This is used by pipeline and child.)
var cp = require('child_process')
var grep = cp.exec('grep Stream')

es.duplex(grep.stdin, grep.stdout)
child (child_process)
Create a through stream from a child process ...
var cp = require('child_process')
es.child(cp.exec('grep Stream')) // a through stream
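A minimal usage sketch (assuming grep is available on the system): pipe stdin through the child process and print the matching lines:
var cp = require('child_process')
var es = require('event-stream')

process.stdin
  .pipe(es.child(cp.exec('grep Stream'))) // only lines containing 'Stream' pass through
  .pipe(process.stdout)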
wait (callback)
Waits for the stream to emit 'end'. Joins the chunks of a stream into a single string or buffer. Takes an optional callback, which will be passed the complete string/buffer when it receives the 'end' event.
Also emits a single 'data' event.
readStream.pipe(es.wait(function (err, body) {
// have complete text here.
}))
Other Stream Modules
These modules are not included as a part of EventStream but may be useful when working with streams.
reduce (syncFunction, initial)
Like Array.prototype.reduce but for streams. Given a sync reduce function and an initial value, it will return a through stream that emits a single data event with the reduced value once the input stream ends.
var reduce = require("stream-reduce");
process.stdin.pipe(reduce(function(acc, data) {
return acc + data.length;
}, 0)).on("data", function(length) {
console.log("stdin size:", length);
});