Top Related Projects
- http-parser: HTTP request/response parser for C
- uWebSockets: Simple, secure & standards compliant web server for the most demanding of applications
- llhttp: Port of http_parser to llparse
- picohttpparser: Tiny HTTP parser written in C (used in HTTP::Parser::XS et al.)
- undici: An HTTP/1.1 client, written from scratch for Node.js
- fast-json-stringify: 2x faster than JSON.stringify()
Quick Overview
http-parser is a lightweight, fast HTTP request/response parser written in C. It is designed for performance-critical applications and was the parser used by Node.js for many years (Node.js has since moved to llhttp). The parser is optimized for both speed and memory efficiency, making it suitable for high-performance web servers and other HTTP-based applications.
Pros
- Extremely fast and lightweight
- Low memory footprint
- Portable and easy to integrate into existing projects
- Supports both HTTP/1.0 and HTTP/1.1
Cons
- Limited to C language, requiring bindings for use in other languages
- Doesn't handle higher-level HTTP operations (e.g., routing, middleware)
- Requires manual memory management
- Limited documentation for advanced use cases
Code Examples
- Initializing the parser:
http_parser_settings settings;
http_parser *parser = malloc(sizeof(http_parser));
http_parser_init(parser, HTTP_REQUEST);
- Setting up callbacks (a sketch of example callback bodies follows these snippets):
http_parser_settings_init(&settings);
settings.on_url = on_url_callback;
settings.on_header_field = on_header_field_callback;
settings.on_header_value = on_header_value_callback;
settings.on_body = on_body_callback;
- Parsing incoming data:
const char *data = "GET /index.html HTTP/1.1\r\nHost: example.com\r\n\r\n";
size_t parsed = http_parser_execute(parser, &settings, data, strlen(data));
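The snippets above register callbacks but do not show their bodies. As a minimal sketch (the names mirror the placeholders used above; the signature is the http_data_cb type declared in http_parser.h), a data callback receives a pointer/length pair that is only valid for the duration of the call:
#include <stdio.h>
#include "http_parser.h"

int on_url_callback(http_parser *parser, const char *at, size_t length) {
    /* `at` is not NUL-terminated and is only valid during this call,
     * so print (or copy) it with an explicit length. */
    printf("URL: %.*s\n", (int)length, at);
    return 0;  /* returning non-zero would abort parsing */
}

int on_header_field_callback(http_parser *parser, const char *at, size_t length) {
    printf("header field: %.*s\n", (int)length, at);
    return 0;
}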
Getting Started
To use http-parser in your project:
- Clone the repository:
  git clone https://github.com/nodejs/http-parser.git
- Include the header file in your C project:
  #include "http_parser.h"
- Compile the library with your project:
  gcc your_project.c http_parser.c -o your_project
- Initialize the parser and set up callbacks as shown in the code examples above.
- Use http_parser_execute() to parse incoming HTTP data.
Remember to handle memory allocation and deallocation appropriately when using the parser.
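As a rough sketch of that error checking and cleanup (continuing the parsing example above; HTTP_PARSER_ERRNO, http_errno_name() and http_errno_description() come from http_parser.h):
if (parsed != strlen(data)) {
    /* The parser stopped early; report why. */
    enum http_errno err = HTTP_PARSER_ERRNO(parser);
    fprintf(stderr, "parse error: %s (%s)\n",
            http_errno_name(err), http_errno_description(err));
}
free(parser);  /* the parser was malloc'd above, so release it when done */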
Competitor Comparisons
http-parser: HTTP request/response parser for C
Pros of http-parser
- Lightweight and efficient HTTP parsing
- Widely used and battle-tested in production environments
- Supports HTTP/1.0 and HTTP/1.1, including keep-alive and connection upgrades
Cons of http-parser
- Limited to HTTP parsing only
- Requires manual memory management in some cases
- Less actively maintained compared to newer alternatives
Code Comparison
http-parser:
http_parser_settings settings;
settings.on_message_begin = on_message_begin;
settings.on_url = on_url;
settings.on_header_field = on_header_field;
settings.on_header_value = on_header_value;
Both repositories appear to be the same project, so there isn't a meaningful code comparison to make between them. The http-parser project is a single repository that provides HTTP parsing functionality for Node.js and other projects.
Summary
http-parser is a robust and widely-used HTTP parsing library. It offers efficient parsing of HTTP/1.x requests and responses. However, it's limited to HTTP parsing only and may require manual memory management in certain scenarios. The project has been a cornerstone of Node.js HTTP handling but has seen reduced maintenance activity in recent years as newer alternatives have emerged.
uWebSockets: Simple, secure & standards compliant web server for the most demanding of applications
Pros of uWebSockets
- Higher performance and lower latency for WebSocket connections
- Built-in support for SSL/TLS and HTTP/HTTPS
- More comprehensive WebSocket implementation with additional features
Cons of uWebSockets
- Less focused on HTTP parsing compared to http-parser
- Steeper learning curve due to more complex API
- May be overkill for simple HTTP-only applications
Code Comparison
http-parser:
http_parser_settings settings;
settings.on_message_begin = on_message_begin;
settings.on_url = on_url;
settings.on_header_field = on_header_field;
settings.on_header_value = on_header_value;
uWebSockets:
uWS::App().get("/hello", [](auto *res, auto *req) {
res->end("Hello World!");
}).listen(9001, [](auto *token) {
if (token) {
std::cout << "Listening on port 9001" << std::endl;
}
});
Summary
http-parser is a lightweight, focused HTTP parsing library, while uWebSockets is a more comprehensive networking solution with a strong emphasis on WebSocket performance. http-parser is simpler to use for basic HTTP parsing tasks, while uWebSockets offers more features and better performance for WebSocket-heavy applications. The choice between the two depends on the specific requirements of your project.
llhttp: Port of http_parser to llparse
Pros of llhttp
- Significantly faster parsing performance
- Smaller memory footprint
- Parser core is generated from a TypeScript definition, making the state machine easier to maintain
Cons of llhttp
- Relatively newer project, potentially less battle-tested
- May require additional setup or configuration in existing Node.js projects
Code Comparison
http-parser:
int http_parser_execute(http_parser *parser,
const http_parser_settings *settings,
const char *data,
size_t len)
llhttp:
llhttp_errno_t llhttp_execute(llhttp_t* parser,
                              const char* data,
                              size_t len);
Key Differences
- llhttp is designed as a drop-in replacement for http-parser
- llhttp uses a generated state machine for parsing, while http-parser uses a hand-written one
- llhttp offers better performance in most scenarios, especially for larger payloads
- http-parser is written in C, while llhttp is written in TypeScript and compiled to C
Use Cases
- http-parser: Legacy Node.js applications, projects requiring maximum stability
- llhttp: New Node.js projects, applications seeking improved performance and reduced resource usage
Both libraries serve similar purposes, but llhttp represents a more modern approach with performance benefits. The choice between them often depends on specific project requirements and compatibility considerations.
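As a rough illustration of the drop-in nature, a minimal llhttp parse might look like this (llhttp_settings_init, llhttp_init, llhttp_execute and llhttp_errno_name are part of llhttp's C API; the request string is just an example):
#include <stdio.h>
#include <string.h>
#include "llhttp.h"

int main(void) {
    llhttp_t parser;
    llhttp_settings_t settings;
    llhttp_settings_init(&settings);            /* zero out all callbacks */
    llhttp_init(&parser, HTTP_REQUEST, &settings);

    const char *req = "GET / HTTP/1.1\r\nHost: example.com\r\n\r\n";
    llhttp_errno_t err = llhttp_execute(&parser, req, strlen(req));
    if (err != HPE_OK)
        fprintf(stderr, "parse error: %s\n", llhttp_errno_name(err));
    return 0;
}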
picohttpparser: Tiny HTTP parser written in C (used in HTTP::Parser::XS et al.)
Pros of picohttpparser
- Significantly faster parsing speed, especially for small requests
- Smaller codebase, making it easier to maintain and integrate
- Designed for high-performance HTTP servers like H2O
Cons of picohttpparser
- Less feature-rich compared to http-parser
- Limited support for HTTP protocol versions (mainly focuses on HTTP/1.x)
- May require more manual handling for certain edge cases
Code Comparison
picohttpparser:
struct phr_header headers[100];
size_t num_headers = 100;
const char *method, *path;
size_t method_len, path_len;
int minor_version;
int pret = phr_parse_request(buf, len, &method, &method_len, &path, &path_len,
                             &minor_version, headers, &num_headers, last_len);
http-parser:
http_parser_settings settings;
http_parser *parser = malloc(sizeof(http_parser));
http_parser_init(parser, HTTP_REQUEST);
parser->data = my_socket;
size_t nparsed = http_parser_execute(parser, &settings, buf, recved);
Summary
picohttpparser offers superior performance and a more compact codebase, making it ideal for high-performance scenarios. However, http-parser provides more comprehensive HTTP protocol support and features. The choice between the two depends on specific project requirements, with picohttpparser excelling in speed-critical applications and http-parser offering broader functionality.
undici: An HTTP/1.1 client, written from scratch for Node.js
Pros of undici
- Higher performance and throughput compared to http-parser
- More modern codebase with better maintainability
- Optional HTTP/2 support in addition to HTTP/1.1
Cons of undici
- Larger codebase and potentially higher memory footprint
- Steeper learning curve for developers familiar with http-parser
- May require more configuration for advanced use cases
Code Comparison
http-parser:
int http_parser_execute(http_parser *parser,
const http_parser_settings *settings,
const char *data,
size_t len)
undici:
const { Client } = require('undici')
const client = new Client('http://localhost:3000')
const { statusCode, headers, body } = await client.request({
method: 'GET',
path: '/foo'
})
Key Differences
- http-parser is written in C, while undici is written in JavaScript
- http-parser focuses on low-level parsing, while undici provides a higher-level API
- undici offers more features out of the box, such as connection pooling and automatic retries
- http-parser is more lightweight and may be better suited for embedded systems or resource-constrained environments
- undici is designed specifically for Node.js, while http-parser can be used in various environments
Both libraries have their strengths, and the choice between them depends on specific project requirements, performance needs, and developer preferences.
fast-json-stringify: 2x faster than JSON.stringify()
Pros of fast-json-stringify
- Specialized for JSON serialization, offering better performance for this specific task
- Supports schema-based serialization, allowing for optimized and type-safe JSON generation
- Actively maintained with frequent updates and improvements
Cons of fast-json-stringify
- Limited to JSON serialization, while http-parser handles general HTTP parsing
- May require additional setup and schema definition for optimal use
- Potentially larger bundle size due to its specialized nature
Code Comparison
fast-json-stringify:
const fastJson = require('fast-json-stringify')
const stringify = fastJson({
type: 'object',
properties: {
name: { type: 'string' },
age: { type: 'integer' }
}
})
console.log(stringify({ name: 'John', age: 30 }))
http-parser:
const http = require('http')
const HTTPParser = process.binding('http_parser').HTTPParser // legacy internal Node.js binding, not a public API
const parser = new HTTPParser(HTTPParser.REQUEST)
parser.execute(Buffer.from('GET / HTTP/1.1\r\n\r\n'))
Summary
fast-json-stringify is a specialized tool for JSON serialization, offering high performance and schema-based generation. It's ideal for projects heavily focused on JSON output. http-parser, on the other hand, is a more general-purpose HTTP parsing library, suitable for a wider range of HTTP-related tasks. The choice between them depends on the specific requirements of your project and whether you need specialized JSON handling or broader HTTP parsing capabilities.
README
HTTP Parser
http-parser is not actively maintained. New projects and projects looking to migrate should consider llhttp.
This is a parser for HTTP messages written in C. It parses both requests and responses. The parser is designed to be used in high-performance HTTP applications. It does not make any syscalls nor allocations, it does not buffer data, and it can be interrupted at any time. Depending on your architecture, it only requires about 40 bytes of data per message stream (in a web server that is per connection).
Features:
- No dependencies
- Handles persistent streams (keep-alive)
- Decodes chunked encoding
- Upgrade support
- Defends against buffer overflow attacks
The parser extracts the following information from HTTP messages:
- Header fields and values
- Content-Length
- Request method
- Response status code
- Transfer-Encoding
- HTTP version
- Request URL
- Message body
Usage
One http_parser object is used per TCP connection. Initialize the struct using http_parser_init() and set the callbacks. That might look something like this for a request parser:
http_parser_settings settings;
http_parser_settings_init(&settings); /* zero the callbacks before setting any */
settings.on_url = my_url_callback;
settings.on_header_field = my_header_field_callback;
/* ... */
http_parser *parser = malloc(sizeof(http_parser));
http_parser_init(parser, HTTP_REQUEST);
parser->data = my_socket;
When data is received on the socket execute the parser and check for errors.
size_t len = 80*1024, nparsed;
char buf[len];
ssize_t recved;
recved = recv(fd, buf, len, 0);
if (recved < 0) {
/* Handle error. */
}
/* Start up / continue the parser.
* Note we pass recved==0 to signal that EOF has been received.
*/
nparsed = http_parser_execute(parser, &settings, buf, recved);
if (parser->upgrade) {
/* handle new protocol */
} else if (nparsed != recved) {
/* Handle error. Usually just close the connection. */
}
http_parser needs to know where the end of the stream is. For example, sometimes servers send responses without Content-Length and expect the client to consume input (for the body) until EOF. To tell http_parser about EOF, give 0 as the fourth parameter to http_parser_execute(). Callbacks and errors can still be encountered during an EOF, so one must still be prepared to receive them.
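For example (a sketch continuing the recv() example above), when recv() returns 0 the peer has closed the connection and a zero-length execute tells the parser the stream has ended:
if (recved == 0) {
    /* EOF: a zero-length execute lets the parser finish messages that are
     * terminated by connection close (no Content-Length). */
    nparsed = http_parser_execute(parser, &settings, buf, 0);
    if (HTTP_PARSER_ERRNO(parser) != HPE_OK) {
        /* Handle error. */
    }
}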
Scalar valued message information such as status_code, method, and the HTTP version are stored in the parser structure. This data is only temporarily stored in http_parser and gets reset on each new message. If this information is needed later, copy it out of the structure during the headers_complete callback.
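A sketch of copying that information out during headers_complete (the parser fields and http_method_str() come from http_parser.h; my_request_t is a hypothetical application type reached through parser->data):
#include <stdio.h>
#include "http_parser.h"

typedef struct {                      /* hypothetical per-message bookkeeping */
    unsigned int method;
    unsigned short http_major, http_minor;
} my_request_t;

int on_headers_complete(http_parser *parser) {
    my_request_t *req = parser->data; /* set by the application via parser->data */
    req->method     = parser->method;
    req->http_major = parser->http_major;
    req->http_minor = parser->http_minor;
    printf("%s HTTP/%u.%u\n",
           http_method_str((enum http_method)parser->method),
           parser->http_major, parser->http_minor);
    return 0;                         /* 0 = continue parsing the body */
}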
The parser decodes the transfer-encoding for both requests and responses transparently. That is, a chunked encoding is decoded before being sent to the on_body callback.
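So an on_body callback only ever sees decoded body bytes; a minimal sketch (writing the body to stdout purely for illustration):
#include <stdio.h>
#include "http_parser.h"

int on_body(http_parser *parser, const char *at, size_t length) {
    /* `at` already holds de-chunked body data; chunk-size lines and trailers
     * never reach this callback. The pointer is only valid during the call,
     * so consume or copy the bytes here. */
    fwrite(at, 1, length, stdout);
    return 0;
}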
The Special Problem of Upgrade
http_parser supports upgrading the connection to a different protocol. An increasingly common example of this is the WebSocket protocol, which sends a request like
GET /demo HTTP/1.1
Upgrade: WebSocket
Connection: Upgrade
Host: example.com
Origin: http://example.com
WebSocket-Protocol: sample
followed by non-HTTP data.
(See RFC6455 for more information on the WebSocket protocol.)
To support this, the parser will treat this as a normal HTTP message without a body, issuing both on_headers_complete and on_message_complete callbacks. However http_parser_execute() will stop parsing at the end of the headers and return.
The user is expected to check if parser->upgrade has been set to 1 after http_parser_execute() returns. Non-HTTP data begins at the supplied buffer, offset by the return value of http_parser_execute().
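A sketch of that check, continuing the recv() example above (websocket_handle_data is a hypothetical application handler for the upgraded protocol):
nparsed = http_parser_execute(parser, &settings, buf, recved);
if (parser->upgrade) {
    /* The parser stopped after the headers: everything from buf + nparsed up
     * to buf + recved is non-HTTP (e.g. WebSocket) data. Hand it to the next
     * protocol's handler. */
    websocket_handle_data(buf + nparsed, recved - nparsed); /* hypothetical */
} else if (nparsed != recved) {
    /* Handle error. Usually just close the connection. */
}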
Callbacks
During the http_parser_execute() call, the callbacks set in http_parser_settings will be executed. The parser maintains state and never looks behind, so buffering the data is not necessary. If you need to save certain data for later usage, you can do that from the callbacks.
There are two types of callbacks:
- notification
  typedef int (*http_cb) (http_parser*);
  Callbacks: on_message_begin, on_headers_complete, on_message_complete.
- data
  typedef int (*http_data_cb) (http_parser*, const char *at, size_t length);
  Callbacks: (requests only) on_url, (common) on_header_field, on_header_value, on_body.
Callbacks must return 0 on success. Returning a non-zero value indicates error to the parser, making it exit immediately.
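For example (a sketch: the 64 KB cap and the running byte counter are hypothetical application policy, and in real code the counter would live behind parser->data rather than in a global), a callback can abort a request whose headers grow too large:
#include "http_parser.h"

#define MAX_HEADER_BYTES (64 * 1024)      /* hypothetical limit */
static size_t total_header_bytes;

int on_header_field_limited(http_parser *parser, const char *at, size_t length) {
    total_header_bytes += length;
    if (total_header_bytes > MAX_HEADER_BYTES)
        return 1;  /* non-zero: http_parser_execute() stops and reports a callback error */
    return 0;
}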
For cases where it is necessary to pass local information to/from a callback, the http_parser object's data field can be used.
An example of such a case is when using threads to handle a socket connection, parse a request, and then give a response over that socket. By instantiating a thread-local struct containing relevant data (e.g. the accepted socket, memory allocated for callbacks to write into, etc.), a parser's callbacks are able to communicate data between the scope of the thread and the scope of the callback in a threadsafe manner. This allows http_parser to be used in multi-threaded contexts.
Example:
typedef struct {
socket_t sock;
void* buffer;
int buf_len;
} custom_data_t;
int my_url_callback(http_parser* parser, const char *at, size_t length) {
/* parser->data gives access to the thread-local custom_data_t struct.
   Use it to save parsed data into the thread-local buffer for later use,
   or to communicate over the socket.
*/
parser->data;
...
return 0;
}
...
void http_parser_thread(socket_t sock) {
int nparsed = 0;
/* allocate memory for user data */
custom_data_t *my_data = malloc(sizeof(custom_data_t));
/* some information for use by callbacks.
* achieves thread -> callback information flow */
my_data->sock = sock;
/* instantiate a thread-local parser */
http_parser *parser = malloc(sizeof(http_parser));
http_parser_init(parser, HTTP_REQUEST); /* initialise parser */
/* this custom data reference is accessible through the reference to the
parser supplied to callback functions */
parser->data = my_data;
http_parser_settings settings; /* set up callbacks */
http_parser_settings_init(&settings);
settings.on_url = my_url_callback;
/* execute parser */
nparsed = http_parser_execute(parser, &settings, buf, recved);
...
/* parsed information copied from callback.
can now perform action on data copied into thread-local memory from callbacks.
achieves callback -> thread information flow */
my_data->buffer;
...
}
If you parse an HTTP message in chunks (i.e. read() the request line from the socket, parse, read half the headers, parse, etc.) your data callbacks may be called more than once. http_parser guarantees that the data pointer is only valid for the lifetime of the callback. You can also read() into a heap-allocated buffer to avoid copying memory around if this fits your application.
Reading headers can be tricky if you read/parse them partially. Basically, you need to remember whether the last header callback was a field or a value and apply the following logic:
(on_header_field and on_header_value shortened to on_h_*)
------------------------ ------------ --------------------------------------------
| State (prev. callback) | Callback | Description/action |
------------------------ ------------ --------------------------------------------
| nothing (first call) | on_h_field | Allocate new buffer and copy callback data |
| | | into it |
------------------------ ------------ --------------------------------------------
| value | on_h_field | New header started. |
| | | Copy current name,value buffers to headers |
| | | list and allocate new buffer for new name |
------------------------ ------------ --------------------------------------------
| field | on_h_field | Previous name continues. Reallocate name |
| | | buffer and append callback data to it |
------------------------ ------------ --------------------------------------------
| field | on_h_value | Value for current header started. Allocate |
| | | new buffer and copy callback data to it |
------------------------ ------------ --------------------------------------------
| value | on_h_value | Value continues. Reallocate value buffer |
| | | and append callback data to it |
------------------------ ------------ --------------------------------------------
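A rough C sketch of that state machine (the growable strbuf_t type, its realloc-based append, and the push_header helper that stores a finished (field, value) pair are hypothetical application code; only the field/value bookkeeping mirrors the table above, and error handling is omitted):
#include <stdlib.h>
#include <string.h>
#include "http_parser.h"

typedef struct { char *p; size_t len; } strbuf_t;   /* hypothetical growable string */
enum last_cb { NONE, FIELD, VALUE };

static strbuf_t cur_field, cur_value;
static enum last_cb last = NONE;

static void strbuf_append(strbuf_t *b, const char *at, size_t n) {
    b->p = realloc(b->p, b->len + n + 1);            /* first call: realloc(NULL) == malloc */
    memcpy(b->p + b->len, at, n);
    b->len += n;
    b->p[b->len] = '\0';
}

static void push_header(void) {
    /* Hypothetical: hand cur_field/cur_value to the application's header list
     * (which takes ownership), then start fresh buffers. */
    cur_field.p = NULL; cur_field.len = 0;
    cur_value.p = NULL; cur_value.len = 0;
}

int on_header_field_cb(http_parser *p, const char *at, size_t n) {
    if (last == VALUE)                               /* previous header finished */
        push_header();
    strbuf_append(&cur_field, at, n);                /* new name or continuation: append */
    last = FIELD;
    return 0;
}

int on_header_value_cb(http_parser *p, const char *at, size_t n) {
    strbuf_append(&cur_value, at, n);                /* value started or continues: append */
    last = VALUE;
    return 0;
}

int on_headers_complete_cb(http_parser *p) {
    if (last == VALUE)                               /* flush the final header */
        push_header();
    return 0;
}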
Parsing URLs
A simplistic zero-copy URL parser is provided as http_parser_parse_url(). Users of this library may wish to use it to parse URLs constructed from consecutive on_url callbacks.
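A sketch of its use (http_parser_url_init(), http_parser_parse_url(), struct http_parser_url and the UF_* field indices come from http_parser.h; the URL string is just an example):
#include <stdio.h>
#include <string.h>
#include "http_parser.h"

int main(void) {
    const char *url = "http://example.com:8080/index.html?q=1";
    struct http_parser_url u;
    http_parser_url_init(&u);

    if (http_parser_parse_url(url, strlen(url), 0, &u) != 0)
        return 1;                               /* malformed URL */

    /* Zero-copy: each field is an (offset, length) pair into `url`. */
    if (u.field_set & (1 << UF_PATH))
        printf("path: %.*s\n", (int)u.field_data[UF_PATH].len,
               url + u.field_data[UF_PATH].off);
    if (u.field_set & (1 << UF_PORT))
        printf("port: %u\n", (unsigned)u.port);
    return 0;
}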
See examples of reading in headers:
- partial example in C
- from http-parser tests in C
- from Node library in Javascript