
tomnomnom/gron

Make JSON greppable!


Top Related Projects

  • jq (30,058 stars): Command-line JSON processor
  • fx (18,903 stars): Terminal JSON viewer & processor
  • jj (1,962 stars): JSON Stream Editor (command line utility)
  • jid (6,859 stars): json incremental digger
  • yq (11,663 stars): yq is a portable command-line YAML, JSON, XML, CSV, TOML and properties processor
  • dasel (7,019 stars): Select, put and delete data from JSON, TOML, YAML, XML and CSV files with a single tool. Supports conversion between formats and can be used as a Go package.

Quick Overview

Gron is a command-line tool that transforms JSON into discrete assignments to make it easier to grep for specific data. It breaks down complex JSON structures into simple, greppable lines, allowing users to quickly search and manipulate JSON data using standard Unix tools.
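The transformation itself is small enough to sketch. The following is a hypothetical Python approximation of the flattening step, not gron's actual Go implementation: it sorts keys the way gron does by default, but skips the bracket-quoting gron applies to keys that aren't bare identifiers.

```python
import json

def flatten(value, path="json"):
    """Emit gron-style assignment lines for a parsed JSON value."""
    if isinstance(value, dict):
        lines = [f"{path} = {{}};"]
        for key in sorted(value):
            # Real gron quotes keys that aren't bare identifiers
            # (e.g. json["User-Agent"]); this sketch assumes simple keys.
            lines += flatten(value[key], f"{path}.{key}")
        return lines
    if isinstance(value, list):
        lines = [f"{path} = [];"]
        for i, child in enumerate(value):
            lines += flatten(child, f"{path}[{i}]")
        return lines
    return [f"{path} = {json.dumps(value)};"]

print("\n".join(flatten({"name": "Tom", "likes": ["code", "cheese"]})))
```

Each scalar ends up on its own line with its full path, which is what makes the output greppable.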

Pros

  • Simplifies searching and filtering of complex JSON structures
  • Integrates well with existing Unix tools like grep, sed, and awk
  • Supports reverse transformation, allowing modified output to be converted back to JSON
  • Lightweight and easy to install

Cons

  • Limited to JSON format; not suitable for other data formats
  • May produce verbose output for large JSON files
  • Requires learning a new syntax for complex queries
  • Not a full-featured JSON manipulation tool; best used in combination with other tools

Getting Started

To install gron on macOS using Homebrew:

brew install gron

For other platforms, you can download a binary from the GitHub releases page or install with Go:

go install github.com/tomnomnom/gron@latest

Basic usage:

gron example.json | grep "name" | gron --ungron

This command will convert the JSON file to gron format, search for lines containing "name", and then convert the results back to JSON.

Competitor Comparisons

jq (30,058 stars)

Command-line JSON processor

Pros of jq

  • More powerful and flexible for complex JSON transformations
  • Supports a wide range of operations, including filtering, mapping, and reducing
  • Has a rich query language for advanced data manipulation

Cons of jq

  • Steeper learning curve due to its complex syntax and features
  • Can be overkill for simple JSON parsing tasks
  • May require more typing for basic operations

Code Comparison

gron example:

gron example.json
json = {};
json.name = "John Doe";
json.age = 30;
json.city = "New York";

jq example:

jq '.' example.json
{
  "name": "John Doe",
  "age": 30,
  "city": "New York"
}

Summary

gron excels at making JSON greppable and easy to search, while jq is more powerful for complex JSON transformations. gron has a simpler syntax and is easier to learn, but jq offers more advanced features for data manipulation. Choose gron for quick JSON inspection and searching, and jq for more complex JSON processing tasks.

fx (18,903 stars)

Terminal JSON viewer & processor

Pros of fx

  • More feature-rich, offering interactive mode and advanced querying capabilities
  • Supports multiple input formats (JSON, YAML, TOML) and output formats
  • Provides a more powerful JavaScript-based query language

Cons of fx

  • Steeper learning curve due to more complex syntax and features
  • Larger codebase and dependencies, potentially slower for simple tasks
  • May be overkill for basic JSON processing needs

Code comparison

fx:

fx '.[0].name' < data.json
fx 'x => x.filter(i => i.age > 30)' < data.json

gron:

gron data.json | grep name
gron data.json | grep 'age =' | awk '$NF > 30'

Summary

fx is a more powerful and versatile tool for working with structured data, offering advanced querying and transformation capabilities. It's ideal for complex data manipulation tasks and interactive exploration. gron, on the other hand, excels in simplicity and ease of use, making it perfect for quick JSON flattening and basic filtering using standard Unix tools. The choice between the two depends on the complexity of the task at hand and the user's familiarity with JavaScript-based querying versus Unix command-line tools.

jj (1,962 stars)

JSON Stream Editor (command line utility)

Pros of jj

  • Supports both JSON and JSONL formats
  • Offers more advanced querying capabilities, including filtering and transformations
  • Provides a rich set of built-in functions for data manipulation

Cons of jj

  • Steeper learning curve due to more complex syntax
  • Less focused on the specific task of flattening JSON for grepping

Code Comparison

gron example:

echo '{"a": 1, "b": {"c": 2}}' | gron
json = {};
json.a = 1;
json.b = {};
json.b.c = 2;

jj example:

echo '{"a": 1, "b": {"c": 2}}' | jj b.c
2

Summary

gron is designed specifically for flattening JSON into greppable format, making it simpler for basic tasks. jj, on the other hand, offers more powerful querying and manipulation capabilities, but with a more complex syntax. gron excels in simplicity and ease of use for its specific purpose, while jj provides greater flexibility and functionality for working with JSON data.

jid (6,859 stars)

json incremental digger

Pros of jid

  • Interactive JSON exploration with real-time filtering
  • Supports incremental search and navigation through complex JSON structures
  • Allows for easy extraction of specific data points from large JSON files

Cons of jid

  • Requires user interaction, less suitable for automated scripts
  • Interactive by design; its output is harder to feed into pipelines than gron's plain-text assignments
  • May be slower for processing very large files compared to gron's static output

Code Comparison

jid example:

echo '{"foo": {"bar": "baz"}}' | jid

gron example:

echo '{"foo": {"bar": "baz"}}' | gron

Key Differences

  • jid provides an interactive interface for exploring JSON data
  • gron flattens JSON into assignable JavaScript-like statements
  • jid is better for ad-hoc exploration, while gron excels in scripting and automation

Use Cases

  • Use jid for interactive debugging and exploring unfamiliar JSON structures
  • Choose gron for creating reusable scripts or processing JSON in pipelines

Both tools offer unique approaches to JSON manipulation, with jid focusing on interactivity and gron on simplification and automation. The choice between them depends on the specific task and workflow requirements.

yq (11,663 stars)

yq is a portable command-line YAML, JSON, XML, CSV, TOML and properties processor

Pros of yq

  • Supports multiple data formats (YAML, JSON, XML, CSV, TOML, properties)
  • More powerful querying and manipulation capabilities
  • Actively maintained with frequent updates

Cons of yq

  • Larger binary size and more dependencies
  • Steeper learning curve due to more complex syntax
  • Slower for simple tasks compared to gron

Code Comparison

gron:

gron example.json | grep name

yq:

yq '.name' example.json

Both tools can be used to extract specific fields from JSON data, but yq offers more advanced querying capabilities:

gron:

gron example.json | grep user.age | sed 's/^json.//' | sed 's/ = /: /'

yq:

yq '.user.age' example.json

While gron flattens JSON into assignable expressions, yq provides a more concise syntax for querying nested structures. gron excels in simplicity and ease of use for basic tasks, while yq offers more powerful features for complex data manipulation across multiple formats.

dasel (7,019 stars)

Select, put and delete data from JSON, TOML, YAML, XML and CSV files with a single tool. Supports conversion between formats and can be used as a Go package.

Pros of dasel

  • Supports multiple data formats (JSON, YAML, TOML, XML) vs. gron's JSON-only focus
  • Allows both reading and writing data, while gron is primarily for flattening/viewing
  • Offers a more versatile query syntax for complex data manipulations

Cons of dasel

  • May have a steeper learning curve due to its more complex functionality
  • Potentially slower for simple JSON flattening tasks compared to gron's specialized approach
  • Less focused on the specific use case of making JSON greppable

Code Comparison

gron:

gron "https://api.github.com/repos/tomnomnom/gron/commits?per_page=1" | grep "commit.author"

dasel:

curl -s "https://api.github.com/repos/tomnomnom/gron/commits?per_page=1" | dasel -r json '.[0].commit.author'

Both tools aim to simplify working with structured data, but they approach the task differently. gron focuses on flattening JSON for easy grepping, while dasel offers a more comprehensive solution for querying and manipulating various data formats. The choice between them depends on the specific use case and the complexity of the data processing required.


README

gron


Make JSON greppable!

gron transforms JSON into discrete assignments to make it easier to grep for what you want and see the absolute 'path' to it. It eases the exploration of APIs that return large blobs of JSON but have terrible documentation.

▶ gron "https://api.github.com/repos/tomnomnom/gron/commits?per_page=1" | fgrep "commit.author"
json[0].commit.author = {};
json[0].commit.author.date = "2016-07-02T10:51:21Z";
json[0].commit.author.email = "mail@tomnomnom.com";
json[0].commit.author.name = "Tom Hudson";

gron can work backwards too, enabling you to turn your filtered data back into JSON:

▶ gron "https://api.github.com/repos/tomnomnom/gron/commits?per_page=1" | fgrep "commit.author" | gron --ungron
[
  {
    "commit": {
      "author": {
        "date": "2016-07-02T10:51:21Z",
        "email": "mail@tomnomnom.com",
        "name": "Tom Hudson"
      }
    }
  }
]

Disclaimer: the GitHub API has fantastic documentation, but it makes for a good example.

Installation

gron has no runtime dependencies. You can just download a binary for Linux, Mac, Windows or FreeBSD and run it. Put the binary in your $PATH (e.g. in /usr/local/bin) to make it easy to use:

▶ tar xzf gron-linux-amd64-0.1.5.tgz
▶ sudo mv gron /usr/local/bin/

If you're a Mac user you can also install gron via brew:

▶ brew install gron

Or if you're a Go user you can use go install:

▶ go install github.com/tomnomnom/gron@latest

It's recommended that you alias ungron or norg (or both!) to gron --ungron. Put something like this in your shell profile (e.g. in ~/.bashrc):

alias norg="gron --ungron"
alias ungron="gron --ungron"

Or you could create a shell script in your $PATH named ungron or norg to affect all users:

#!/bin/bash
gron --ungron "$@"

Usage

Get JSON from a file:

▶ gron testdata/two.json 
json = {};
json.contact = {};
json.contact.email = "mail@tomnomnom.com";
json.contact.twitter = "@TomNomNom";
json.github = "https://github.com/tomnomnom/";
json.likes = [];
json.likes[0] = "code";
json.likes[1] = "cheese";
json.likes[2] = "meat";
json.name = "Tom";

From a URL:

▶ gron http://headers.jsontest.com/
json = {};
json.Host = "headers.jsontest.com";
json["User-Agent"] = "gron/0.1";
json["X-Cloud-Trace-Context"] = "6917a823919477919dbc1523584ba25d/11970839830843610056";

Or from stdin:

▶ curl -s http://headers.jsontest.com/ | gron
json = {};
json.Accept = "*/*";
json.Host = "headers.jsontest.com";
json["User-Agent"] = "curl/7.43.0";
json["X-Cloud-Trace-Context"] = "c70f7bf26661c67d0b9f2cde6f295319/13941186890243645147";

Grep for something and easily see the path to it:

▶ gron testdata/two.json | grep twitter
json.contact.twitter = "@TomNomNom";

gron makes diffing JSON easy too:

▶ diff <(gron two.json) <(gron two-b.json)
3c3
< json.contact.email = "mail@tomnomnom.com";
---
> json.contact.email = "contact@tomnomnom.com";

The output of gron is valid JavaScript:

▶ gron testdata/two.json > tmp.js
▶ echo "console.log(json);" >> tmp.js
▶ nodejs tmp.js
{ contact: { email: 'mail@tomnomnom.com', twitter: '@TomNomNom' },
  github: 'https://github.com/tomnomnom/',
  likes: [ 'code', 'cheese', 'meat' ],
  name: 'Tom' }

It's also possible to obtain gron's output as a JSON stream via the --json switch:

▶ curl -s http://headers.jsontest.com/ | gron --json
[[],{}]
[["Accept"],"*/*"]
[["Host"],"headers.jsontest.com"]
[["User-Agent"],"curl/7.43.0"]
[["X-Cloud-Trace-Context"],"c70f7bf26661c67d0b9f2cde6f295319/13941186890243645147"]
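That stream is just one (path, value) pair per statement. A small Python sketch (illustrative only, not gron's implementation) produces the same shape:

```python
import json

def gron_json_stream(value, path=()):
    """Yield (path, value) pairs shaped like `gron --json` output."""
    if isinstance(value, dict):
        yield list(path), {}
        for key in sorted(value):          # gron sorts keys by default
            yield from gron_json_stream(value[key], path + (key,))
    elif isinstance(value, list):
        yield list(path), []
        for i, child in enumerate(value):
            yield from gron_json_stream(child, path + (i,))
    else:
        yield list(path), value

for pair in gron_json_stream({"Host": "headers.jsontest.com", "Accept": "*/*"}):
    print(json.dumps(pair, separators=(",", ":")))
```

Because the path is a real JSON array rather than a dotted string, downstream tools don't have to parse gron's quoting rules.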

ungronning

gron can also turn its output back into JSON:

▶ gron testdata/two.json | gron -u
{
  "contact": {
    "email": "mail@tomnomnom.com",
    "twitter": "@TomNomNom"
  },
  "github": "https://github.com/tomnomnom/",
  "likes": [
    "code",
    "cheese",
    "meat"
  ],
  "name": "Tom"
}

This means you can use gron with grep and other tools to modify JSON:

▶ gron testdata/two.json | grep likes | gron --ungron
{
  "likes": [
    "code",
    "cheese",
    "meat"
  ]
}

or

▶ gron --json testdata/two.json | grep likes | gron --json --ungron
{
  "likes": [
    "code",
    "cheese",
    "meat"
  ]
}

To preserve array keys, arrays are padded with null when values are missing:

▶ gron testdata/two.json | grep likes | grep -v cheese
json.likes = [];
json.likes[0] = "code";
json.likes[2] = "meat";
▶ gron testdata/two.json | grep likes | grep -v cheese | gron --ungron
{
  "likes": [
    "code",
    null,
    "meat"
  ]
}
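The padding falls out naturally when rebuilding the structure. A rough Python sketch of the ungron direction (hypothetical; it handles only dotted keys and numeric indexes, unlike the real parser) shows where the nulls come from:

```python
import json
import re

# Matches either `.name` or `[0]` path segments.
TOKEN = re.compile(r'\.([A-Za-z_]\w*)|\[(\d+)\]')

def ungron(lines):
    """Rebuild a value from gron-style lines; pads sparse arrays with None."""
    root = None
    for line in lines:
        path, _, rhs = line.rstrip(";").partition(" = ")
        keys = [name if name is not None else int(index)
                for name, index in (m.groups() for m in TOKEN.finditer(path))]
        value = json.loads(rhs)  # "{}", "[]", and scalars all parse here
        if not keys:             # the bare `json = ...` line sets the root
            root = value
            continue
        node = root
        for key in keys[:-1]:
            node = node[key]
        last = keys[-1]
        if isinstance(node, list) and last >= len(node):
            node.extend([None] * (last + 1 - len(node)))  # <-- null padding
        node[last] = value
    return root

print(json.dumps(ungron([
    'json = {};',
    'json.likes = [];',
    'json.likes[0] = "code";',
    'json.likes[2] = "meat";',
])))
```

Assigning to index 2 of a one-element list forces index 1 into existence, and null is the only honest filler.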

If you get creative you can do some pretty neat tricks with gron, and then ungron the output back into JSON.

Get Help

▶ gron --help
Transform JSON (from a file, URL, or stdin) into discrete assignments to make it greppable

Usage:
  gron [OPTIONS] [FILE|URL|-]

Options:
  -u, --ungron     Reverse the operation (turn assignments back into JSON)
  -v, --values     Print just the values of provided assignments
  -c, --colorize   Colorize output (default on tty)
  -m, --monochrome Monochrome (don't colorize output)
  -s, --stream     Treat each line of input as a separate JSON object
  -k, --insecure   Disable certificate validation
  -j, --json       Represent gron data as JSON stream
      --no-sort    Don't sort output (faster)
      --version    Print version information

Exit Codes:
  0	OK
  1	Failed to open file
  2	Failed to read input
  3	Failed to form statements
  4	Failed to fetch URL
  5	Failed to parse statements
  6	Failed to encode JSON

Examples:
  gron /tmp/apiresponse.json
  gron http://jsonplaceholder.typicode.com/users/1 
  curl -s http://jsonplaceholder.typicode.com/users/1 | gron
  gron http://jsonplaceholder.typicode.com/users/1 | grep company | gron --ungron

FAQ

Wasn't this written in PHP before?

Yes it was! The original version is preserved here for posterity.

Why the change to Go?

Mostly to remove PHP as a dependency. There are a lot of people who work with JSON who don't have PHP installed.

Why shouldn't I just use jq?

jq is awesome, and a lot more powerful than gron, but with that power comes complexity. gron aims to make it easier to use the tools you already know, like grep and sed.

gron's primary purpose is to make it easy to find the path to a value in a deeply nested JSON blob when you don't already know the structure; much of jq's power is unlocked only once you know that structure.