Top Related Projects
- axios: Promise based HTTP client for the browser and node.js
- request: 🏊🏾 Simplified HTTP request client.
- got: 🌐 Human-friendly and powerful HTTP request library for Node.js
- superagent: Ajax for Node.js and browsers (JS HTTP client). Maintained for @forwardemail, @ladjs, @spamscanner, @breejs, @cabinjs, and @lassjs.
- node-fetch: A light-weight module that brings the Fetch API to Node.js
- bent: Functional JS HTTP client (Node.js & Fetch) w/ async await
Quick Overview
Undici is a high-performance HTTP/1.1 client for Node.js, with HTTP/2 support. It is designed to be a fast and efficient alternative to the built-in `http` and `https` modules, with a focus on simplicity and ease of use.
Pros
- High Performance: Undici is built on top of the `llhttp` library, which provides a fast and efficient HTTP parser, resulting in improved performance compared to the built-in `http` and `https` modules.
- Simplicity: Undici has a straightforward and intuitive API, making it easy to use and integrate into existing projects.
- Flexibility: Undici supports HTTP/1.1 and HTTP/2, allowing developers to choose the most appropriate protocol for their needs.
- Actively Maintained: The Undici project is actively maintained by a team of experienced developers, ensuring that it stays up-to-date and secure.
Cons
- Limited Ecosystem: Undici is a relatively new library, and as a result, the ecosystem of third-party modules and tools may be smaller compared to more established HTTP clients.
- Learning Curve: While the API is straightforward, developers who are used to the built-in `http` and `https` modules may need to invest some time in learning the Undici API.
- Dependency on `llhttp`: Undici's parsing performance depends on the `llhttp` library, which may introduce issues or limitations if there are any problems with `llhttp`.
- Lack of Widespread Adoption: Undici is not as widely adopted as some other HTTP client libraries, which may make it harder to find community support and resources.
Code Examples
Here are a few examples of how to use Undici in your Node.js projects:
Making a Simple GET Request:
```javascript
const { request } = require('undici');

async function fetchData() {
  const { statusCode, body } = await request('https://api.example.com/data');
  console.log(`Status Code: ${statusCode}`);
  console.log(`Response Body: ${await body.text()}`);
}

fetchData();
```
Sending a POST Request with JSON Data:
```javascript
const { request } = require('undici');

async function postData() {
  const { statusCode, body } = await request('https://api.example.com/data', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ name: 'John Doe', email: 'john@example.com' }),
  });
  console.log(`Status Code: ${statusCode}`);
  console.log(`Response Body: ${await body.text()}`);
}

postData();
```
Handling Redirects:
```javascript
const { request } = require('undici');

// undici's request() does not follow redirects by default, so a redirect
// can be handled manually by inspecting the status code and Location header.
async function followRedirects() {
  const { statusCode, headers, body } = await request('https://example.com');

  if (statusCode >= 300 && statusCode < 400 && headers.location) {
    await body.dump(); // discard the redirect body so the connection can be reused
    const next = await request(headers.location);
    console.log(`Final Status Code: ${next.statusCode}`);
    console.log(`Final Response Body: ${await next.body.text()}`);
  } else {
    console.log(`Status Code: ${statusCode}`);
    console.log(`Response Body: ${await body.text()}`);
  }
}

followRedirects();
```
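A related pitfall is that the `Location` header may be relative. Resolving it against the original request URL can be sketched with the WHATWG URL API (the helper name is hypothetical):

```javascript
// Hypothetical helper: resolve a (possibly relative) Location header
// against the URL of the request that produced the redirect.
function resolveRedirect(location, baseUrl) {
  return new URL(location, baseUrl).href;
}

console.log(resolveRedirect('/login', 'https://example.com/home'));
// https://example.com/login
```

Passing the resolved value to the follow-up request avoids failures when servers return relative redirects.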
Getting Started
To get started with Undici, you can install it using npm:
npm install undici
Once installed, you can import the `request` function from the `undici` module and use it to make HTTP requests. Here's a simple example:

```javascript
const { request } = require('undici');

async function fetchData() {
  const { statusCode, body } = await request('https://api.example.com/data');
  console.log(`Status Code: ${statusCode}`);
  console.log(`Response Body: ${await body.text()}`);
}

fetchData();
```
Competitor Comparisons
Promise based HTTP client for the browser and node.js
Pros of Axios
- Browser support: Axios works in both Node.js and browsers, making it versatile for full-stack development
- Interceptors: Allows modifying requests or responses before they are handled by `then` or `catch`
- Automatic request and response transformations, including JSON data
Cons of Axios
- Larger bundle size compared to Undici, which may impact performance in some scenarios
- Less focus on Node.js-specific optimizations, as it aims for cross-platform compatibility
- Slower development cycle and potentially longer time to adopt new Node.js features
Code Comparison
Axios:
```javascript
const axios = require('axios');

axios.get('https://api.example.com/data')
  .then(response => console.log(response.data))
  .catch(error => console.error(error));
```
Undici:
```javascript
const { request } = require('undici');

request('https://api.example.com/data')
  .then(({ body }) => body.json())
  .then(data => console.log(data))
  .catch(error => console.error(error));
```
Both libraries provide similar functionality for making HTTP requests, but Undici is more focused on Node.js performance and modern features. Axios offers a more familiar API and cross-platform support, while Undici provides better performance and lower overhead for Node.js applications.
🏊🏾 Simplified HTTP request client.
Pros of request
- Mature and widely adopted library with extensive documentation
- Supports callbacks natively, with promise wrappers available as add-on packages
- Large ecosystem of plugins and middleware
Cons of request
- No longer actively maintained (deprecated)
- Larger bundle size and slower performance compared to modern alternatives
- Lacks native Promise support and modern conveniences like async/await
Code comparison
request:
```javascript
const request = require('request');

request('https://api.example.com', (error, response, body) => {
  if (!error && response.statusCode === 200) {
    console.log(body);
  }
});
```
undici:
```javascript
import { request } from 'undici';

const { statusCode, body } = await request('https://api.example.com');
console.log(statusCode, await body.text());
```
Key differences
- undici is actively maintained and optimized for modern Node.js environments
- undici offers better performance and smaller bundle size
- undici uses native Promises and supports async/await syntax out of the box
- request has a simpler API for basic use cases but lacks modern features
- undici provides more low-level control over HTTP requests and connections
Both libraries serve similar purposes, but undici is generally recommended for new projects due to its active development, performance benefits, and alignment with modern JavaScript practices.
🌐 Human-friendly and powerful HTTP request library for Node.js
Pros of Got
- More feature-rich with built-in functionality like retries, pagination, and caching
- Extensive plugin ecosystem for additional capabilities
- Simpler API for common use cases, making it more beginner-friendly
Cons of Got
- Larger bundle size due to additional features
- Slightly slower performance compared to Undici
- Not as low-level or close to the HTTP specification as Undici
Code Comparison
Got:
```javascript
import got from 'got';

const response = await got('https://api.example.com/data');
console.log(response.body);
```
Undici:
```javascript
import { request } from 'undici';

const { body } = await request('https://api.example.com/data');
const data = await body.json();
console.log(data);
```
Summary
Got is a more feature-rich and user-friendly HTTP client, ideal for developers who need a wide range of built-in functionality and don't mind a larger bundle size. It's particularly suitable for projects that require advanced features like retries, caching, or pagination out of the box.
Undici, on the other hand, is a lightweight and performant HTTP client that closely follows the HTTP specification. It's better suited for projects that prioritize speed and small bundle size, or for developers who prefer a more low-level approach to HTTP requests.
The choice between Got and Undici depends on the specific needs of your project, balancing features and ease of use against performance and bundle size considerations.
Ajax for Node.js and browsers (JS HTTP client). Maintained for @forwardemail, @ladjs, @spamscanner, @breejs, @cabinjs, and @lassjs.
Pros of Superagent
- More mature and established project with a longer history
- Supports both Node.js and browser environments
- Extensive plugin ecosystem for additional functionality
Cons of Superagent
- Larger bundle size and more dependencies
- Slower performance compared to Undici
- Less focus on modern HTTP features and optimizations
Code Comparison
Superagent:
```javascript
const superagent = require('superagent');

superagent
  .get('https://api.example.com/data')
  .query({ limit: 10 })
  .end((err, res) => {
    if (err) return console.error(err);
    console.log(res.body);
  });
```
Undici:
```javascript
import { request } from 'undici';

const { body } = await request('https://api.example.com/data?limit=10');
const data = await body.json();
console.log(data);
```
Superagent offers a more chainable API with a callback-style approach, while Undici provides a more modern, Promise-based interface with better performance. Undici is designed specifically for Node.js and focuses on HTTP/1.1 and HTTP/2 optimizations, making it more efficient for server-side applications. However, Superagent's broader compatibility and extensive plugin system may be advantageous for projects requiring cross-platform support or specific middleware functionality.
A light-weight module that brings the Fetch API to Node.js
Pros of node-fetch
- Simpler API, closely mimicking the browser's Fetch API
- Wider browser compatibility and easier to use in isomorphic applications
- More established and mature project with a larger ecosystem of plugins and extensions
Cons of node-fetch
- Generally slower performance compared to Undici
- Less actively maintained, with fewer recent updates and improvements
- Lacks some advanced features present in Undici, such as connection pooling and HTTP/2 support
Code Comparison
node-fetch:
```javascript
import fetch from 'node-fetch';

const response = await fetch('https://api.example.com/data');
const data = await response.json();
```
Undici:
```javascript
import { request } from 'undici';

const { body } = await request('https://api.example.com/data');
const data = await body.json();
```
Summary
While node-fetch offers a familiar API and broader compatibility, Undici provides better performance and more advanced features. node-fetch is suitable for simpler applications or those requiring isomorphic code, while Undici is ideal for high-performance Node.js applications. The choice between the two depends on specific project requirements, performance needs, and developer preferences.
Functional JS HTTP client (Node.js & Fetch) w/ async await
Pros of bent
- Simpler API with a more straightforward approach to making HTTP requests
- Built-in support for JSON parsing and automatic retries
- Lightweight with minimal dependencies
Cons of bent
- Less actively maintained compared to Undici
- Limited feature set and customization options
- May not perform as well as Undici in high-throughput scenarios
Code Comparison
bent:
```javascript
import bent from 'bent'

const getJSON = bent('json')
const response = await getJSON('https://api.example.com/data')
```
Undici:
```javascript
import { request } from 'undici'

const { body } = await request('https://api.example.com/data')
const response = await body.json()
```
Summary
bent offers a simpler API and built-in features like JSON parsing and retries, making it easier to use for basic HTTP requests. However, Undici provides better performance, more active maintenance, and a wider range of features for advanced use cases. The choice between the two depends on the specific requirements of your project, with bent being suitable for simpler applications and Undici excelling in high-performance scenarios.
README
undici
An HTTP/1.1 client, written from scratch for Node.js.
Undici means eleven in Italian. 1.1 -> 11 -> Eleven -> Undici. It is also a Stranger Things reference.
How to get involved
Have a question about using Undici? Open a Q&A Discussion or join our official OpenJS Slack channel.
Looking to contribute? Start by reading the contributing guide.
Install
npm i undici
Benchmarks
The benchmark is a simple "getting data" example using 50 TCP connections with a pipelining depth of 10, running on Node 20.10.0.
| Tests | Samples | Result | Tolerance | Difference with slowest |
|---|---|---|---|---|
| undici - fetch | 30 | 3704.43 req/sec | ± 2.95 % | - |
| http - no keepalive | 20 | 4275.30 req/sec | ± 2.60 % | + 15.41 % |
| node-fetch | 10 | 4759.42 req/sec | ± 0.87 % | + 28.48 % |
| request | 40 | 4803.37 req/sec | ± 2.77 % | + 29.67 % |
| axios | 45 | 4951.97 req/sec | ± 2.88 % | + 33.68 % |
| got | 10 | 5969.67 req/sec | ± 2.64 % | + 61.15 % |
| superagent | 10 | 9471.48 req/sec | ± 1.50 % | + 155.68 % |
| http - keepalive | 25 | 10327.49 req/sec | ± 2.95 % | + 178.79 % |
| undici - pipeline | 10 | 15053.41 req/sec | ± 1.63 % | + 306.36 % |
| undici - request | 10 | 19264.24 req/sec | ± 1.74 % | + 420.03 % |
| undici - stream | 15 | 20317.29 req/sec | ± 2.13 % | + 448.46 % |
| undici - dispatch | 10 | 24883.28 req/sec | ± 1.54 % | + 571.72 % |
The benchmark is a simple "sending data" example using 50 TCP connections with a pipelining depth of 10, running on Node 20.10.0.
| Tests | Samples | Result | Tolerance | Difference with slowest |
|---|---|---|---|---|
| undici - fetch | 20 | 1968.42 req/sec | ± 2.63 % | - |
| http - no keepalive | 25 | 2330.30 req/sec | ± 2.99 % | + 18.38 % |
| node-fetch | 20 | 2485.36 req/sec | ± 2.70 % | + 26.26 % |
| got | 15 | 2787.68 req/sec | ± 2.56 % | + 41.62 % |
| request | 30 | 2805.10 req/sec | ± 2.59 % | + 42.50 % |
| axios | 10 | 3040.45 req/sec | ± 1.72 % | + 54.46 % |
| superagent | 20 | 3358.29 req/sec | ± 2.51 % | + 70.61 % |
| http - keepalive | 20 | 3477.94 req/sec | ± 2.51 % | + 76.69 % |
| undici - pipeline | 25 | 3812.61 req/sec | ± 2.80 % | + 93.69 % |
| undici - request | 10 | 6067.00 req/sec | ± 0.94 % | + 208.22 % |
| undici - stream | 10 | 6391.61 req/sec | ± 1.98 % | + 224.71 % |
| undici - dispatch | 10 | 6397.00 req/sec | ± 1.48 % | + 224.98 % |
Quick Start
```javascript
import { request } from 'undici'

const {
  statusCode,
  headers,
  trailers,
  body
} = await request('http://localhost:3000/foo')

console.log('response received', statusCode)
console.log('headers', headers)

for await (const data of body) { console.log('data', data) }

console.log('trailers', trailers)
```
Body Mixins
The `body` mixins are the most common way to format the request/response body. Mixins include:

- `.arrayBuffer()`
- `.blob()`
- `.json()`
- `.text()`

> [!NOTE]
> The body returned from `undici.request` does not implement `.formData()`.
Example usage:
```javascript
import { request } from 'undici'

const {
  statusCode,
  headers,
  trailers,
  body
} = await request('http://localhost:3000/foo')

console.log('response received', statusCode)
console.log('headers', headers)
console.log('data', await body.json())
console.log('trailers', trailers)
```
Note: Once a mixin has been called, the body cannot be reused. Calling additional mixins on `.body` (e.g. `.body.json(); .body.text()`) will result in a `TypeError: unusable` being thrown through the `Promise` rejection.

Should you need to access the `body` as plain text after using a mixin, the best practice is to use the `.text()` mixin first and then manually parse the text into the desired format.
For more information about their behavior, please reference the body mixin from the Fetch Standard.
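The text-first pattern can be sketched as a small helper (the name `readBody` is hypothetical; the argument is anything exposing a `.text()` mixin, such as an undici response body):

```javascript
// Hypothetical helper: consume the body exactly once via .text(),
// then try to parse the text as JSON without touching the body again.
async function readBody(body) {
  const text = await body.text() // the only mixin call; the body is now consumed
  try {
    return { text, json: JSON.parse(text) }
  } catch {
    return { text, json: null } // not JSON; the raw text is still available
  }
}
```

With an undici response this avoids the `TypeError: unusable` that a second mixin call would raise.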
Common API Methods
This section documents our most commonly used API methods. Additional APIs are documented in their own files within the docs folder and are accessible via the navigation list on the left side of the docs site.
undici.request([url, options]): Promise
Arguments:

- url `string | URL | UrlObject`
- options `RequestOptions`
  - dispatcher `Dispatcher` - Default: getGlobalDispatcher
  - method `String` - Default: `PUT` if `options.body`, otherwise `GET`

Returns a promise with the result of the `Dispatcher.request` method.

Calls `options.dispatcher.request(options)`.
See Dispatcher.request for more details, and request examples for examples.
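The documented method default can be sketched as a tiny helper (the name `resolveMethod` is hypothetical and only illustrates the rule):

```javascript
// Hypothetical sketch of the documented default: PUT when a body is
// present in the options, GET otherwise. An explicit method always wins.
function resolveMethod(options = {}) {
  if (options.method) return options.method
  return options.body ? 'PUT' : 'GET'
}
```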
undici.stream([url, options, ]factory): Promise
Arguments:

- url `string | URL | UrlObject`
- options `StreamOptions`
  - dispatcher `Dispatcher` - Default: getGlobalDispatcher
  - method `String` - Default: `PUT` if `options.body`, otherwise `GET`
- factory `Dispatcher.stream.factory`

Returns a promise with the result of the `Dispatcher.stream` method.

Calls `options.dispatcher.stream(options, factory)`.
See Dispatcher.stream for more details.
undici.pipeline([url, options, ]handler): Duplex
Arguments:

- url `string | URL | UrlObject`
- options `PipelineOptions`
  - dispatcher `Dispatcher` - Default: getGlobalDispatcher
  - method `String` - Default: `PUT` if `options.body`, otherwise `GET`
- handler `Dispatcher.pipeline.handler`

Returns: `stream.Duplex`

Calls `options.dispatch.pipeline(options, handler)`.
See Dispatcher.pipeline for more details.
undici.connect([url, options]): Promise
Starts two-way communications with the requested resource using HTTP CONNECT.
Arguments:

- url `string | URL | UrlObject`
- options `ConnectOptions`
  - dispatcher `Dispatcher` - Default: getGlobalDispatcher
- callback `(err: Error | null, data: ConnectData | null) => void` (optional)

Returns a promise with the result of the `Dispatcher.connect` method.

Calls `options.dispatch.connect(options)`.
See Dispatcher.connect for more details.
undici.fetch(input[, init]): Promise
Implements fetch.
- https://developer.mozilla.org/en-US/docs/Web/API/WindowOrWorkerGlobalScope/fetch
- https://fetch.spec.whatwg.org/#fetch-method
Basic usage example:
```javascript
import { fetch } from 'undici'

const res = await fetch('https://example.com')
const json = await res.json()
console.log(json)
```
You can pass an optional dispatcher to `fetch` as:
```javascript
import { fetch, Agent } from 'undici'

const res = await fetch('https://example.com', {
  // Mocks are also supported
  dispatcher: new Agent({
    keepAliveTimeout: 10,
    keepAliveMaxTimeout: 10
  })
})

const json = await res.json()
console.log(json)
```
request.body
A body can be of the following types:
- ArrayBuffer
- ArrayBufferView
- AsyncIterables
- Blob
- Iterables
- String
- URLSearchParams
- FormData
In this implementation of fetch, `request.body` also accepts `Async Iterables`, which are not present in the Fetch Standard.
```javascript
import { fetch } from 'undici'

const data = {
  async *[Symbol.asyncIterator]() {
    yield 'hello'
    yield 'world'
  },
}

await fetch('https://example.com', { body: data, method: 'POST', duplex: 'half' })
```
Besides text data and buffers, FormData can also utilize streams via Blob objects:
```javascript
import { openAsBlob } from 'node:fs'

const file = await openAsBlob('./big.csv')
const body = new FormData()
body.set('file', file, 'big.csv')

await fetch('http://example.com', { method: 'POST', body })
```
request.duplex
- `'half'`

In this implementation of fetch, `request.duplex` must be set if `request.body` is a `ReadableStream` or an `Async Iterable`. However, even though the value must be set to `'half'`, the connection is actually full duplex. For more detail, refer to the Fetch Standard.
response.body
Node.js has two kinds of streams: web streams, which follow the API of the WHATWG web standard found in browsers, and an older Node-specific streams API. `response.body` returns a readable web stream. If you would prefer to work with a Node stream, you can convert the web stream with `Readable.fromWeb()`.
```javascript
import { fetch } from 'undici'
import { Readable } from 'node:stream'

const response = await fetch('https://example.com')
const readableWebStream = response.body
const readableNodeStream = Readable.fromWeb(readableWebStream)
```
Specification Compliance
This section documents parts of the Fetch Standard that Undici does not support or does not fully implement.
Garbage Collection
The Fetch Standard allows users to skip consuming the response body by relying on garbage collection to release connection resources. Undici does not do the same. Therefore, it is important to always either consume or cancel the response body.
Garbage collection in Node is less aggressive and deterministic (due to the lack of clear idle periods that browsers have through the rendering refresh rate) which means that leaving the release of connection resources to the garbage collector can lead to excessive connection usage, reduced performance (due to less connection re-use), and even stalls or deadlocks when running out of connections.
```javascript
// Do
const headers = await fetch(url)
  .then(async res => {
    for await (const chunk of res.body) {
      // force consumption of the body
    }
    return res.headers
  })

// Do not
const headers = await fetch(url)
  .then(res => res.headers)
```
However, if you only want the headers, it might be better to use the `HEAD` request method, which obviates the need to consume or cancel the response body. See MDN - HTTP - HTTP request methods - HEAD for more details.
```javascript
const headers = await fetch(url, { method: 'HEAD' })
  .then(res => res.headers)
```
Forbidden and Safelisted Header Names
- https://fetch.spec.whatwg.org/#cors-safelisted-response-header-name
- https://fetch.spec.whatwg.org/#forbidden-header-name
- https://fetch.spec.whatwg.org/#forbidden-response-header-name
- https://github.com/wintercg/fetch/issues/6
The Fetch Standard requires implementations to exclude certain headers from requests and responses. In browser environments, some headers are forbidden so the user agent remains in full control over them. In Undici, these constraints are removed to give more control to the user.
undici.upgrade([url, options]): Promise
Upgrade to a different protocol. See MDN - HTTP - Protocol upgrade mechanism for more details.
Arguments:

- url `string | URL | UrlObject`
- options `UpgradeOptions`
  - dispatcher `Dispatcher` - Default: getGlobalDispatcher
- callback `(error: Error | null, data: UpgradeData) => void` (optional)

Returns a promise with the result of the `Dispatcher.upgrade` method.

Calls `options.dispatcher.upgrade(options)`.
See Dispatcher.upgrade for more details.
undici.setGlobalDispatcher(dispatcher)
- dispatcher `Dispatcher`
Sets the global dispatcher used by Common API Methods.
undici.getGlobalDispatcher()
Gets the global dispatcher used by Common API Methods.
Returns: Dispatcher
undici.setGlobalOrigin(origin)
- origin `string | URL | undefined`

Sets the global origin used in `fetch`.

If `undefined` is passed, the global origin will be reset. This will cause `Response.redirect`, `new Request()`, and `fetch` to throw an error when a relative path is passed.
```javascript
setGlobalOrigin('http://localhost:3000')

const response = await fetch('/api/ping')
console.log(response.url) // http://localhost:3000/api/ping
```
undici.getGlobalOrigin()
Gets the global origin used in `fetch`.
Returns: URL
UrlObject
- port `string | number` (optional)
- path `string` (optional)
- pathname `string` (optional)
- hostname `string` (optional)
- origin `string` (optional)
- protocol `string` (optional)
- search `string` (optional)
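As an illustration of how these fields compose, here is a hedged sketch using the WHATWG URL API (the helper name `formatUrlObject` is hypothetical and not part of undici):

```javascript
// Hypothetical sketch: build a URL string from a UrlObject-like value.
// Assumes either an origin, or protocol + hostname (+ optional port),
// plus an optional pathname and search string.
function formatUrlObject({ origin, protocol = 'http:', hostname, port, pathname = '/', search = '' }) {
  const base = origin ?? `${protocol}//${hostname}${port ? `:${port}` : ''}`
  return new URL(pathname + search, base).href
}

console.log(formatUrlObject({ origin: 'http://localhost:3000', pathname: '/ping' }))
// http://localhost:3000/ping
```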
Specification Compliance
This section documents parts of the HTTP/1.1 specification that Undici does not support or does not fully implement.
Expect
Undici does not support the `Expect` request header field. The request body is always immediately sent and the `100 Continue` response will be ignored.
Refs: https://tools.ietf.org/html/rfc7231#section-5.1.1
Pipelining
Undici will only use pipelining if configured with a `pipelining` factor greater than `1`.
Undici always assumes that connections are persistent and will immediately pipeline requests, without checking whether the connection is persistent. Hence, automatic fallback to HTTP/1.0 or HTTP/1.1 without pipelining is not supported.
Undici will immediately pipeline when retrying requests after a failed connection. However, Undici will not retry the first remaining requests in the prior pipeline and instead error the corresponding callback/promise/stream.
Undici will abort all running requests in the pipeline when any of them are aborted.
- Refs: https://tools.ietf.org/html/rfc2616#section-8.1.2.2
- Refs: https://tools.ietf.org/html/rfc7230#section-6.3.2
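Under those caveats, enabling pipelining is a one-line configuration (a sketch; it assumes undici's documented `pipelining` and `connections` Agent options, and requires a server that supports pipelining):

```javascript
import { Agent, setGlobalDispatcher } from 'undici'

// Opt in to HTTP/1.1 pipelining: a factor greater than 1 allows up to
// that many in-flight requests per connection. The abort and retry
// caveats above apply to the whole pipeline.
setGlobalDispatcher(new Agent({ connections: 50, pipelining: 10 }))
```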
Manual Redirect
Since it is not possible to manually follow an HTTP redirect on the server side, Undici returns the actual response instead of an `opaqueredirect` filtered one when invoked with a `manual` redirect. This aligns `fetch()` with the other implementations in Deno and Cloudflare Workers.
Refs: https://fetch.spec.whatwg.org/#atomic-http-redirect-handling
Workarounds
Network address family autoselection.
If you experience problems when connecting to a remote server that is resolved by your DNS servers to an IPv6 (AAAA record) address first, there is a chance that your local router or ISP has problems connecting to IPv6 networks. In that case undici will throw an error with code `UND_ERR_CONNECT_TIMEOUT`.

If the target server resolves to both IPv6 and IPv4 (A record) addresses and you are using a compatible Node version (18.3.0 and above), you can fix the problem by providing the `autoSelectFamily` option (supported by both `undici.request` and `undici.Agent`), which enables the family autoselection algorithm when establishing the connection.
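For example (a sketch; `autoSelectFamily` is the option name stated above, and the Agent here is applied globally):

```javascript
import { Agent, setGlobalDispatcher } from 'undici'

// Enable Happy Eyeballs-style address family autoselection
// (Node >= 18.3.0) for every request dispatched through this agent.
setGlobalDispatcher(new Agent({ autoSelectFamily: true }))
```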
Collaborators
- Daniele Belardi, https://www.npmjs.com/~dnlup
- Ethan Arrowood, https://www.npmjs.com/~ethan_arrowood
- Matteo Collina, https://www.npmjs.com/~matteo.collina
- Matthew Aitken, https://www.npmjs.com/~khaf
- Robert Nagy, https://www.npmjs.com/~ronag
- Szymon Marczak, https://www.npmjs.com/~szmarczak
- Tomas Della Vedova, https://www.npmjs.com/~delvedor
Releasers
- Ethan Arrowood, https://www.npmjs.com/~ethan_arrowood
- Matteo Collina, https://www.npmjs.com/~matteo.collina
- Robert Nagy, https://www.npmjs.com/~ronag
- Matthew Aitken, https://www.npmjs.com/~khaf
License
MIT