gron: Making JSON Greppable with Unix Philosophy

Hook

What if you could search JSON files with grep as easily as you search log files? No query language to learn, no syntax to memorize—just the Unix tools you already know.

Context

Every developer has been there: you’re exploring an unfamiliar API that returns a massive JSON blob, the documentation is incomplete or outdated, and you just need to find where a specific value lives in the nested structure. You could use jq, but that means stopping to look up syntax for the hundredth time. You could pretty-print it and scroll through hundreds of lines. Or you could paste it into a browser-based JSON viewer and click through the hierarchy. All of these options break your flow.

gron solves this by embracing the Unix philosophy: do one thing well, and compose with other tools. Created by Tom Hudson (tomnomnom), gron transforms JSON into a flat, line-based format where each path becomes a discrete assignment statement. Instead of wrestling with nested objects, you get json.contact.email = "mail@tomnomnom.com" on a single line. Now grep, sed, awk, and diff work exactly as you’d expect. The tool has earned over 14,400 GitHub stars because it makes a common task—searching and filtering JSON—trivial instead of tedious.

Technical Insight

System architecture (summarized from the auto-generated diagram): the forward path takes JSON input (file, URL, or stdin) through the Go stdlib JSON parser, a recursive tree traverser, and an assignment formatter (path to statement), producing flat, greppable statements. The reverse path takes assignment lines (filtered or piped) through a statement parser (regex extraction) and a JSON reconstructor (path to tree), producing valid JSON on stdout.

At its core, gron is a JSON-to-assignments transformer with a clever reversible design. When you pipe JSON through gron, it parses the structure and outputs JavaScript-style assignment statements for every leaf value. Here’s what happens when you process a simple JSON file:

$ gron testdata/two.json
json = {};
json.contact = {};
json.contact.email = "mail@tomnomnom.com";
json.contact.twitter = "@TomNomNom";
json.github = "https://github.com/tomnomnom/";
json.likes = [];
json.likes[0] = "code";
json.likes[1] = "cheese";
json.likes[2] = "meat";
json.name = "Tom";

Notice how every value gets its own line with the complete path from root to leaf. Arrays are represented with bracket notation, and objects create intermediate assignment statements. This format is valid JavaScript that you can execute directly in Node.js, but more importantly, it’s perfectly suited for grep.

The real power emerges when you combine gron with Unix tools and then reverse the transformation. Say you’re exploring the GitHub API and want to extract just the author information from a commit:

$ gron "https://api.github.com/repos/tomnomnom/gron/commits?per_page=1" | fgrep "commit.author" | gron --ungron
[
  {
    "commit": {
      "author": {
        "date": "2016-07-02T10:51:21Z",
        "email": "mail@tomnomnom.com",
        "name": "Tom Hudson"
      }
    }
  }
]

This workflow is elegant: gron flattens the JSON, fgrep filters to just the lines containing “commit.author”, and gron --ungron reconstructs valid JSON from those filtered assignments. You’ve extracted a subset of deeply nested data without writing a single jq query.

Architecturally, gron is a single Go binary with zero runtime dependencies. It accepts input from files, URLs (with built-in HTTP support), or stdin, making it trivially composable. The implementation appears to use Go’s standard library JSON parser to traverse the object tree, then generates the assignment statements as it walks. For ungronning, it parses the assignment syntax and rebuilds the JSON structure, handling edge cases like sparse arrays by padding with null values.

The tool also supports a --json output mode that represents gron data as JSON streams—arrays where the first element is the path and the second is the value. This provides an alternative serialization format for tools that prefer structured data over assignment syntax:

$ curl -s http://headers.jsontest.com/ | gron --json
[[],{}]
[["Accept"],"*/*"]
[["Host"],"headers.jsontest.com"]
[["User-Agent"],"curl/7.43.0"]

One underappreciated use case is diffing JSON files. Since gron produces sorted, line-based output, standard diff works beautifully:

$ diff <(gron two.json) <(gron two-b.json)
3c3
< json.contact.email = "mail@tomnomnom.com";
---
> json.contact.email = "contact@tomnomnom.com";

This immediately shows you exactly what changed without the noise of reformatted braces or reordered keys that plagues direct JSON diffs. The tool even supports a --no-sort flag for performance when you don’t need deterministic ordering.

Gotcha

While gron excels at exploration and filtering, it has deliberate limitations. It’s not designed for complex transformations—if you need to compute sums, reshape structures, or perform conditional logic, jq is the right tool. gron’s sweet spot is extraction and filtering, not computation.

Array handling can produce unexpected results when you filter aggressively. To preserve array indices, gron pads missing elements with null. If you grep out specific array elements and ungron the result, you’ll get a sparse array:

$ gron testdata/two.json | grep likes | grep -v cheese | gron --ungron
{
  "likes": [
    "code",
    null,
    "meat"
  ]
}

This is technically correct (it preserves the original indices), but may not be what you want if you’re trying to create a filtered list. You’d need to post-process with jq to compact the array.

Verdict

Use if: You spend time exploring unfamiliar APIs, debugging JSON responses, or need to extract specific fields without learning jq syntax. gron is perfect for DevOps engineers who live in the terminal and prefer composing grep/sed/awk pipelines. It's also excellent for diffing JSON configuration files or finding all occurrences of a value across nested structures. The bidirectional transformation means you can filter with familiar tools and still produce valid JSON output.

Skip if: You need sophisticated JSON transformations, aggregations, or mathematical operations; invest the time to learn jq instead. Also skip it for extremely large JSON files where performance may be a concern, or when you need to work with streaming JSON that doesn't fit in memory.

But for quick, scriptable JSON searches that compose naturally with Unix tools, gron is unbeatable.
