
buger/jsonparser

One of the fastest alternative JSON parsers for Go that does not require a schema


Top Related Projects

  • fastjson: Fast JSON parser and validator for Go. No custom structs, no code generation, no reflection
  • gjson: Get JSON values quickly - JSON parser for Go
  • easyjson: Fast JSON serializer for golang.
  • json-iterator/go: A high-performance 100% compatible drop-in replacement of "encoding/json"
  • gabs: For parsing, creating and editing unknown or dynamic JSON in Go

Quick Overview

buger/jsonparser is a fast and efficient JSON parser for Go. It allows for parsing and extracting values from JSON payloads without the need for temporary objects or full unmarshaling. This library is designed for high-performance scenarios where speed and low memory usage are crucial.

Pros

  • Extremely fast JSON parsing and value extraction
  • Low memory footprint, as it doesn't create temporary objects
  • Supports parsing of nested structures and arrays
  • Can work with large JSON payloads efficiently

Cons

  • Less convenient for complex JSON structures compared to standard encoding/json
  • Limited support for modifying JSON data
  • Requires manual type handling for extracted values
  • May have a steeper learning curve for developers used to traditional JSON parsing

Code Examples

  1. Parsing a simple JSON object:
json := []byte(`{"name": "John", "age": 30}`)
name, err := jsonparser.GetString(json, "name")
if err == nil {
    fmt.Println(name) // Output: John
}
  2. Extracting a nested value:
json := []byte(`{"user": {"details": {"city": "New York"}}}`)
city, err := jsonparser.GetString(json, "user", "details", "city")
if err == nil {
    fmt.Println(city) // Output: New York
}
  3. Iterating through an array:
json := []byte(`{"fruits": ["apple", "banana", "orange"]}`)
jsonparser.ArrayEach(json, func(value []byte, dataType jsonparser.ValueType, offset int, err error) {
    fmt.Println(string(value))
}, "fruits")
// Output:
// apple
// banana
// orange

Getting Started

To use buger/jsonparser in your Go project, follow these steps:

  1. Install the package:

    go get github.com/buger/jsonparser
    
  2. Import the package in your Go file:

    import "github.com/buger/jsonparser"
    
  3. Start parsing JSON data using the provided functions:

    json := []byte(`{"name": "Alice", "age": 25}`)
    name, _ := jsonparser.GetString(json, "name")
    age, _ := jsonparser.GetInt(json, "age")
    fmt.Printf("Name: %s, Age: %d\n", name, age)
    

This library is now ready to use in your Go project for efficient JSON parsing and value extraction.

Competitor Comparisons

Fast JSON parser and validator for Go. No custom structs, no code generation, no reflection

Pros of fastjson

  • Generally faster parsing speed, especially for large JSON files
  • Lower memory usage due to its zero-allocation approach
  • Supports streaming JSON parsing for handling large datasets

Cons of fastjson

  • Less feature-rich compared to jsonparser
  • API may be less intuitive for some developers
  • Limited support for complex JSON manipulation

Code Comparison

jsonparser:

data := []byte(`{"name": "John", "age": 30}`)
name, err := jsonparser.GetString(data, "name")
age, err := jsonparser.GetInt(data, "age")

fastjson:

var p fastjson.Parser
v, err := p.Parse(`{"name": "John", "age": 30}`)
name := v.GetStringBytes("name")
age := v.GetInt("age")

Both libraries offer efficient JSON parsing, but they differ in their approach and feature set. jsonparser provides a more extensive API with various helper functions, making it easier to work with complex JSON structures. fastjson, on the other hand, focuses on raw performance and low memory usage, making it ideal for high-performance applications dealing with large JSON datasets.

The choice between the two depends on the specific requirements of your project, balancing between ease of use, feature richness, and performance needs.


Get JSON values quickly - JSON parser for Go

Pros of gjson

  • More feature-rich with additional functionality like JSON modification and creation
  • Supports a wider range of query types, including array filtering and conditional logic
  • Generally faster for complex queries and large JSON documents

Cons of gjson

  • Slightly larger memory footprint
  • May be overkill for simple JSON parsing tasks
  • Learning curve can be steeper due to more advanced features

Code Comparison

jsonparser:

data := []byte(`{"name": "John", "age": 30}`)
value, _ := jsonparser.GetString(data, "name")
fmt.Println(value)

gjson:

json := `{"name": "John", "age": 30}`
value := gjson.Get(json, "name").String()
fmt.Println(value)

Both libraries offer efficient JSON parsing, but gjson provides a more extensive set of features for complex JSON manipulation. jsonparser is lighter and may be preferable for simple parsing tasks, while gjson excels in scenarios requiring advanced querying and modification capabilities. The choice between them depends on the specific requirements of your project and the complexity of JSON operations needed.

Fast JSON serializer for golang.

Pros of easyjson

  • Generates code for faster JSON encoding/decoding
  • Supports custom types and interfaces
  • Provides both marshaling and unmarshaling capabilities

Cons of easyjson

  • Requires code generation step, which adds complexity to the build process
  • May produce larger binary sizes due to generated code
  • Less flexible for parsing arbitrary JSON structures

Code Comparison

easyjson:

//easyjson:json
type Person struct {
    Name string `json:"name"`
    Age  int    `json:"age"`
}

// Usage
person := &Person{}
err := easyjson.Unmarshal(data, person)

jsonparser:

name, err := jsonparser.GetString(data, "name")
age, err := jsonparser.GetInt(data, "age")

// Usage
person := &Person{Name: name, Age: int(age)}

Key Differences

  • easyjson generates code for specific structs, while jsonparser works with raw JSON data
  • jsonparser is more lightweight and doesn't require code generation
  • easyjson provides full marshaling/unmarshaling, while jsonparser focuses on parsing and extracting values
  • jsonparser may be more suitable for scenarios where you need to extract specific fields from large JSON payloads
  • easyjson is generally faster for encoding/decoding complete structs

Both libraries have their strengths and are suitable for different use cases. Choose based on your specific requirements and performance needs.


A high-performance 100% compatible drop-in replacement of "encoding/json"

Pros of json-iterator/go

  • Higher performance for encoding and decoding JSON
  • More feature-rich, supporting custom marshalers and unmarshalers
  • Better compatibility with the standard library's encoding/json package

Cons of json-iterator/go

  • Slightly more complex API compared to jsonparser's simplicity
  • Larger memory footprint due to its feature set
  • May require more setup for advanced use cases

Code Comparison

jsonparser:

value, err := jsonparser.GetString(data, "user", "name")
if err != nil {
    // Handle error
}

json-iterator/go:

result := jsoniter.Get(data, "user", "name")
if result.LastError() != nil {
    // Handle error
}
value := result.ToString()

Both libraries offer efficient JSON parsing, but json-iterator/go provides a more comprehensive solution with better performance and compatibility with the standard library. However, jsonparser excels in simplicity and memory efficiency for basic parsing tasks. The choice between them depends on the specific requirements of your project, such as performance needs, feature requirements, and compatibility with existing code.


For parsing, creating and editing unknown or dynamic JSON in Go

Pros of gabs

  • More feature-rich, offering advanced JSON manipulation capabilities
  • Supports both parsing and creation of JSON structures
  • Provides a fluent, chainable API for easier JSON traversal and modification

Cons of gabs

  • Generally slower performance compared to jsonparser
  • Higher memory usage due to its more complex internal structure
  • Steeper learning curve for basic JSON parsing tasks

Code Comparison

jsonparser:

data := []byte(`{"name": "John", "age": 30}`)
name, err := jsonparser.GetString(data, "name")

gabs:

jsonParsed, _ := gabs.ParseJSON([]byte(`{"name": "John", "age": 30}`))
name := jsonParsed.Path("name").Data().(string)

Both libraries offer JSON parsing capabilities, but gabs provides a more intuitive API for complex operations. jsonparser focuses on high-performance parsing with a simpler API, while gabs offers a wider range of features for JSON manipulation at the cost of some performance.

jsonparser is ideal for projects requiring fast, lightweight JSON parsing, especially when dealing with large JSON payloads. gabs is better suited for applications that need extensive JSON manipulation and creation capabilities, and where ease of use is prioritized over raw performance.


README


Alternative JSON parser for Go (up to 10x faster than the standard library)

It does not require you to know the structure of the payload (e.g. by creating structs), and allows accessing fields by providing their path. It is up to 10 times faster than the standard encoding/json package (depending on payload size and usage) and allocates no memory. See benchmarks below.

Rationale

Originally I made this for a project that relies on a lot of 3rd party APIs that can be unpredictable and complex. I love simplicity and prefer to avoid external dependencies. encoding/json requires you to know your data structures exactly, and if you prefer to use map[string]interface{} instead, it will be very slow and hard to manage. I investigated what's on the market and found that most libraries are just wrappers around encoding/json; there are a few options with their own parsers (ffjson, easyjson), but they still require you to create data structures.

The goal of this project is to push the JSON parser to its performance limits without sacrificing compliance or developer experience.

Example

For the given JSON our goal is to extract the user's full name, number of github followers and avatar.

import "github.com/buger/jsonparser"

...

data := []byte(`{
  "person": {
    "name": {
      "first": "Leonid",
      "last": "Bugaev",
      "fullName": "Leonid Bugaev"
    },
    "github": {
      "handle": "buger",
      "followers": 109
    },
    "avatars": [
      { "url": "https://avatars1.githubusercontent.com/u/14009?v=3&s=460", "type": "thumbnail" }
    ]
  },
  "company": {
    "name": "Acme"
  }
}`)

// You can specify key path by providing arguments to Get function
jsonparser.Get(data, "person", "name", "fullName")

// There are `GetInt` and `GetBoolean` helpers if you know the key's data type exactly
jsonparser.GetInt(data, "person", "github", "followers")

// When you try to get an object, it returns a []byte slice pointing into the original data
// For `company` it will be `{"name": "Acme"}`
jsonparser.Get(data, "company")

// If the key doesn't exist it will return an error
var size int64
if value, err := jsonparser.GetInt(data, "company", "size"); err == nil {
  size = value
}

// You can use the `ArrayEach` helper to iterate array items [item1, item2 .... itemN]
jsonparser.ArrayEach(data, func(value []byte, dataType jsonparser.ValueType, offset int, err error) {
	fmt.Println(jsonparser.Get(value, "url"))
}, "person", "avatars")

// Or you can access fields by index!
jsonparser.GetString(data, "person", "avatars", "[0]", "url")

// You can use `ObjectEach` helper to iterate objects { "key1":object1, "key2":object2, .... "keyN":objectN }
jsonparser.ObjectEach(data, func(key []byte, value []byte, dataType jsonparser.ValueType, offset int) error {
        fmt.Printf("Key: '%s'\n Value: '%s'\n Type: %s\n", string(key), string(value), dataType)
	return nil
}, "person", "name")

// The most efficient way to extract multiple keys is `EachKey`

paths := [][]string{
  []string{"person", "name", "fullName"},
  []string{"person", "avatars", "[0]", "url"},
  []string{"company", "url"},
}
jsonparser.EachKey(data, func(idx int, value []byte, vt jsonparser.ValueType, err error){
  switch idx {
  case 0: // []string{"person", "name", "fullName"}
    ...
  case 1: // []string{"person", "avatars", "[0]", "url"}
    ...
  case 2: // []string{"company", "url"},
    ...
  }
}, paths...)

// For more information see docs below

Reference

The library API is really simple. You just need the Get method to perform any operation; the rest are helpers around it.

You can also view the API at godoc.org

Get

func Get(data []byte, keys ...string) (value []byte, dataType jsonparser.ValueType, offset int, err error)

Receives a data structure and a key path to extract the value from.

Returns:

  • value - Pointer into the original data structure containing the key's value, or an empty slice if nothing was found or an error occurred
  • dataType - Can be: NotExist, String, Number, Object, Array, Boolean or Null
  • offset - Offset from the provided data structure where the key's value ends. Used mostly internally, for example by the ArrayEach helper.
  • err - Returned if the key is not found or on any other parsing issue. If the key is not found, dataType is also set to NotExist

Accepts multiple keys to specify the path to the JSON value (in case of querying nested structures). If no keys are provided, it will try to extract the closest JSON value (a simple one, or an object/array), which is useful for reading streams or arrays; see the ArrayEach implementation.

Note that keys can be array indexes: jsonparser.GetInt(data, "person", "avatars", "[0]", "url"), pretty cool, yeah?

GetString

func GetString(data []byte, keys ...string) (val string, err error)

Returns strings, properly handling escaped and unicode characters. Note that this causes additional memory allocations.

GetUnsafeString

If you need a string in your app, and are ready to sacrifice support for escaped symbols in favor of speed, use GetUnsafeString. It returns a string mapped to the existing byte slice memory, without any allocations:

s, _ := jsonparser.GetUnsafeString(data, "person", "name", "title")
switch s {
case "CEO":
    ...
case "Engineer":
    ...
}

Note that unsafe here means the string is only valid until the GC frees the underlying byte slice. In most cases this means you can use the string only in the current context, and should not pass it anywhere externally: through channels or in any other way.

GetBoolean, GetInt and GetFloat

func GetBoolean(data []byte, keys ...string) (val bool, err error)

func GetFloat(data []byte, keys ...string) (val float64, err error)

func GetInt(data []byte, keys ...string) (val int64, err error)

If you know the key's type, you can use the helpers above. If the key's data type does not match, an error is returned.

ArrayEach

func ArrayEach(data []byte, cb func(value []byte, dataType jsonparser.ValueType, offset int, err error), keys ...string)

Needed for iterating over arrays; accepts a callback function whose arguments match Get's return values.

ObjectEach

func ObjectEach(data []byte, callback func(key []byte, value []byte, dataType ValueType, offset int) error, keys ...string) (err error)

Needed for iterating over objects; accepts a callback function. Example:

handler := func(key []byte, value []byte, dataType jsonparser.ValueType, offset int) error {
	// do stuff here
	return nil
}
jsonparser.ObjectEach(myJson, handler)

EachKey

func EachKey(data []byte, cb func(idx int, value []byte, dataType jsonparser.ValueType, err error), paths ...[]string)

When you need to read multiple keys and are not afraid of a lower-level API, EachKey is your friend. It reads the payload only once and calls the callback function as soon as a path is found. In contrast, when you call Get multiple times, it has to process the payload on every call. Depending on the payload, EachKey can be several times faster than Get. Paths can use nested keys as well!

paths := [][]string{
	[]string{"uuid"},
	[]string{"tz"},
	[]string{"ua"},
	[]string{"st"},
}
var data SmallPayload

jsonparser.EachKey(smallFixture, func(idx int, value []byte, vt jsonparser.ValueType, err error){
	switch idx {
	case 0:
		data.Uuid = string(value)
	case 1:
		v, _ := jsonparser.ParseInt(value)
		data.Tz = int(v)
	case 2:
		data.Ua = string(value)
	case 3:
		v, _ := jsonparser.ParseInt(value)
		data.St = int(v)
	}
}, paths...)

Set

func Set(data []byte, setValue []byte, keys ...string) (value []byte, err error)

Receives an existing data structure, a key path, and a value to set at that path. This functionality is experimental.

Returns:

  • value - Pointer to the original data structure with the key's value updated or added.
  • err - Returned on any parsing issue.

Accepts multiple keys to specify the path to the JSON value (in case of updating or creating nested structures).

Note that keys can be array indexes: jsonparser.Set(data, []byte("http://github.com"), "person", "avatars", "[0]", "url")

Delete

func Delete(data []byte, keys ...string) (value []byte)

Receives an existing data structure and a key path to delete. This functionality is experimental.

Returns:

  • value - Pointer to the original data structure with the key path deleted if it can be found. If no key path is provided, the whole data structure is deleted.

Accepts multiple keys to specify the path to the JSON value (in case of deleting nested structures).

Note that keys can be array indexes: jsonparser.Delete(data, "person", "avatars", "[0]", "url")

What makes it so fast?

  • It does not rely on encoding/json, reflection or interface{}; the only real package dependency is bytes.
  • Operates on the JSON payload at the byte level, giving you pointers into the original data structure: no memory allocation.
  • No automatic type conversions: by default everything is a []byte, but it tells you the value type, so you can convert it yourself (a few helpers are included).
  • Does not parse the full record, only the keys you specify.

Benchmarks

There are 3 benchmark types, trying to simulate real-life usage for small, medium and large JSON payloads. For each metric, lower is better. Time/op is in nanoseconds. Values better than the standard encoding/json are marked in bold. Benchmarks were run on a standard Linode 1024 box.


TLDR

If you want to skip the next sections, we have 2 winners: jsonparser and easyjson. jsonparser is up to 10 times faster than the standard encoding/json package (depending on payload size and usage), and almost infinitely (literally) better in memory consumption, because it operates on the data at the byte level and provides direct slice pointers. easyjson wins on CPU in the medium tests, and frankly I'm impressed with this package: these are remarkable results considering that it is almost a drop-in replacement for encoding/json (it requires some code generation).

It's hard to fully compare jsonparser with easyjson (or ffjson): they are true parsers and fully process the record, unlike jsonparser, which parses only the keys you specify.

If you are searching for a replacement for encoding/json while keeping structs, easyjson is an amazing choice. If you want to process dynamic JSON, have memory constraints, or want more control over your data, you should try jsonparser.

jsonparser performance heavily depends on usage, and it works best when you do not need to process the full record, only some keys. The more calls you make, the slower it gets; in contrast, easyjson (or ffjson, encoding/json) parses the record only once, and then you can make as many calls as you want.

With great power comes great responsibility! :)

Small payload

Each test processes 190 bytes of http log as a JSON record. It should read multiple fields. https://github.com/buger/jsonparser/blob/master/benchmark/benchmark_small_payload_test.go

| Library | time/op | bytes/op | allocs/op |
|---|---|---|---|
| encoding/json struct | 7879 | 880 | 18 |
| encoding/json interface{} | 8946 | 1521 | 38 |
| Jeffail/gabs | 10053 | 1649 | 46 |
| bitly/go-simplejson | 10128 | 2241 | 36 |
| antonholmquist/jason | 27152 | 7237 | 101 |
| github.com/ugorji/go/codec | 8806 | 2176 | 31 |
| mreiferson/go-ujson | 7008 | 1409 | 37 |
| a8m/djson | 3862 | 1249 | 30 |
| pquerna/ffjson | 3769 | 624 | 15 |
| mailru/easyjson | 2002 | 192 | 9 |
| buger/jsonparser | 1367 | 0 | 0 |
| buger/jsonparser (EachKey API) | 809 | 0 | 0 |

The winners are ffjson, easyjson and jsonparser, where jsonparser is up to 9.8x faster than encoding/json, 4.6x faster than ffjson, and slightly faster than easyjson. If you look at memory allocations, jsonparser has no rivals, as it makes no data copies and operates on raw []byte structures and pointers into them.

Medium payload

Each test processes a 2.4kb JSON record (based on Clearbit API). It should read multiple nested fields and 1 array.

https://github.com/buger/jsonparser/blob/master/benchmark/benchmark_medium_payload_test.go

| Library | time/op | bytes/op | allocs/op |
|---|---|---|---|
| encoding/json struct | 57749 | 1336 | 29 |
| encoding/json interface{} | 79297 | 10627 | 215 |
| Jeffail/gabs | 83807 | 11202 | 235 |
| bitly/go-simplejson | 88187 | 17187 | 220 |
| antonholmquist/jason | 94099 | 19013 | 247 |
| github.com/ugorji/go/codec | 114719 | 6712 | 152 |
| mreiferson/go-ujson | 56972 | 11547 | 270 |
| a8m/djson | 28525 | 10196 | 198 |
| pquerna/ffjson | 20298 | 856 | 20 |
| mailru/easyjson | 10512 | 336 | 12 |
| buger/jsonparser | 15955 | 0 | 0 |
| buger/jsonparser (EachKey API) | 8916 | 0 | 0 |

The difference between ffjson and jsonparser in CPU usage is smaller, while the difference in memory consumption is growing. On the other hand, easyjson shows remarkable performance for the medium payload.

gabs, go-simplejson and jason are based on encoding/json and map[string]interface{}, and are really just helpers for unstructured JSON; their performance correlates with encoding/json interface{}, so they skip the next round. go-ujson, while it has its own parser, shows the same performance as encoding/json and also skips the next round. The same goes for ugorji/go/codec, which showed unexpectedly bad performance for complex payloads.

Large payload

Each test processes a 24kb JSON record (based on the Discourse API). It should read 2 arrays and, for each item in the array, get a few fields. Basically it means processing a full JSON file.

https://github.com/buger/jsonparser/blob/master/benchmark/benchmark_large_payload_test.go

| Library | time/op | bytes/op | allocs/op |
|---|---|---|---|
| encoding/json struct | 748336 | 8272 | 307 |
| encoding/json interface{} | 1224271 | 215425 | 3395 |
| a8m/djson | 510082 | 213682 | 2845 |
| pquerna/ffjson | 312271 | 7792 | 298 |
| mailru/easyjson | 154186 | 6992 | 288 |
| buger/jsonparser | 85308 | 0 | 0 |

jsonparser is now the winner, but do not forget that it is a much more lightweight parser than ffjson or easyjson, which have to parse all the data while jsonparser parses only what you need. ffjson, easyjson and jsonparser all have their own parsing code and do not depend on encoding/json or interface{}; that is one of the reasons why they are so fast. easyjson also uses a bit of the unsafe package to reduce memory consumption (in theory this can lead to some unexpected GC issues, but I have not tested it enough).

Also, the last benchmark did not include an EachKey test, because in this particular case we need to read a lot of array values, and using ArrayEach is more efficient.

Questions and support

All bug reports and suggestions should go through GitHub Issues.

Contributing

  1. Fork it
  2. Create your feature branch (git checkout -b my-new-feature)
  3. Commit your changes (git commit -am 'Added some feature')
  4. Push to the branch (git push origin my-new-feature)
  5. Create new Pull Request

Development

All my development happens using Docker, and the repo includes some Make tasks to simplify development.

  • make build - builds the docker image, usually needs to be called only once
  • make test - run tests
  • make fmt - run go fmt
  • make bench - run benchmarks (if you need to run only a single benchmark, modify the BENCHMARK variable in the Makefile)
  • make profile - runs benchmarks and generates 3 files: cpu.out, mem.mprof and a benchmark.test binary, which can be used with go tool pprof
  • make bash - enter the container (I use it for running go tool pprof above)