
jprichardson/node-jsonfile

Easily read/write JSON files.


Top Related Projects

  • lowdb: Simple and fast JSON database
  • through2: Tiny wrapper around Node streams2 Transform to avoid explicit subclassing noise
  • JSONStream: rawStream.pipe(JSONStream.parse()).pipe(streamOfObjects)

Quick Overview

jprichardson/node-jsonfile is a lightweight Node.js module for reading and writing JSON files. It wraps the usual fs and JSON.parse/JSON.stringify boilerplate in a convenient interface for common operations such as reading, writing, and appending JSON data.

Pros

  • Simple and easy-to-use API for JSON file operations
  • Supports both synchronous and asynchronous methods (see the sketch after this list)
  • Provides options for pretty-printing JSON output
  • Lightweight with minimal dependencies
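
To illustrate the synchronous and pretty-printing options together, here is a minimal sketch using the readFileSync and writeFileSync methods documented in the README below (config.json is a hypothetical file):

const jsonfile = require('jsonfile')

// Synchronous write, pretty-printed with 2-space indentation
jsonfile.writeFileSync('config.json', { debug: true }, { spaces: 2 })

// Synchronous read; throws on a missing or malformed file by default
const config = jsonfile.readFileSync('config.json')
console.log(config.debug) // true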

Cons

  • Limited to JSON file operations only
  • May not be suitable for very large JSON files due to in-memory processing
  • Lacks advanced features like streaming or partial file updates

Code Examples

Reading a JSON file:

const jsonfile = require('jsonfile')

jsonfile.readFile('file.json')
  .then(obj => console.log(obj))
  .catch(error => console.error(error))

Writing a JSON file:

const jsonfile = require('jsonfile')

const obj = {name: 'JP'}
jsonfile.writeFile('file.json', obj, {spaces: 2})
  .then(res => console.log('Write complete'))
  .catch(error => console.error(error))

Appending to a JSON file:

const jsonfile = require('jsonfile')

const obj = {name: 'JP'}
jsonfile.writeFile('file.json', obj, {flag: 'a'})
  .then(res => console.log('Append complete'))
  .catch(error => console.error(error))

Getting Started

To use jprichardson/node-jsonfile in your project, follow these steps:

  1. Install the package:

    npm install jsonfile
    
  2. Import the module in your JavaScript file:

    const jsonfile = require('jsonfile')
    
  3. Use the provided methods to read, write, or append JSON data:

    // Reading a JSON file
    jsonfile.readFile('file.json')
      .then(obj => console.log(obj))
      .catch(error => console.error(error))
    
    // Writing a JSON file
    const obj = {name: 'JP'}
    jsonfile.writeFile('file.json', obj)
      .then(res => console.log('Write complete'))
      .catch(error => console.error(error))
    

Competitor Comparisons

lowdb: Simple and fast JSON database

Pros of lowdb

  • Provides a full-fledged database-like API with querying and writing capabilities
  • Supports multiple adapters (e.g., LocalStorage, Memory) for different storage options
  • Offers chainable methods for more expressive and readable code (sketched below)
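
A minimal sketch of that chainable style, using the lowdb v1 API shown in the code comparison below (the posts collection is a hypothetical example):

const low = require('lowdb')
const FileSync = require('lowdb/adapters/FileSync')
const db = low(new FileSync('db.json'))

// Seed defaults, then query and update with chained lodash-style calls
db.defaults({ posts: [] }).write()
db.get('posts').push({ title: 'hello' }).write()
console.log(db.get('posts').size().value())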

Cons of lowdb

  • Larger package size and more dependencies compared to node-jsonfile
  • May be overkill for simple JSON file operations
  • Slightly steeper learning curve due to its more extensive API

Code Comparison

node-jsonfile:

const jsonfile = require('jsonfile')

jsonfile.readFile('file.json', (err, obj) => {
  if (err) return console.error(err)
  console.log(obj)
})

lowdb:

// lowdb v1 API with a synchronous file adapter
const low = require('lowdb')
const FileSync = require('lowdb/adapters/FileSync')

const adapter = new FileSync('db.json')
const db = low(adapter)

// Dump the entire database state as a plain object
console.log(db.getState())

Summary

node-jsonfile is a lightweight library focused on reading and writing JSON files, while lowdb provides a more comprehensive solution with database-like features. node-jsonfile is simpler and more straightforward for basic JSON operations, whereas lowdb offers more flexibility and advanced querying capabilities at the cost of increased complexity and package size.

through2: Tiny wrapper around Node streams2 Transform to avoid explicit subclassing noise

Pros of through2

  • Provides a simple and flexible way to create Node.js transform streams
  • Supports both object mode and buffer mode streams (object mode is sketched after this list)
  • Offers better performance for large data processing tasks
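
Object mode, mentioned above, is exposed as through2.obj; a minimal sketch (the record shape is hypothetical):

const through2 = require('through2');

// Object-mode transform: receives and emits JavaScript objects, not buffers
const pickName = through2.obj(function (record, enc, callback) {
  this.push(record.name);
  callback();
});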

Cons of through2

  • Requires more setup and configuration for simple JSON file operations
  • Less intuitive for developers primarily working with JSON files
  • May introduce unnecessary complexity for basic file reading/writing tasks

Code Comparison

through2:

const through2 = require('through2');

// Transform stream that upper-cases each chunk as it flows through
const stream = through2(function (chunk, enc, callback) {
  this.push(chunk.toString().toUpperCase());
  callback();
});

node-jsonfile:

const jsonfile = require('jsonfile');
jsonfile.readFile('file.json', (err, obj) => {
  if (err) return console.error(err);
  console.log(obj);
});

Summary

through2 is a powerful tool for creating transform streams in Node.js, offering flexibility and performance for complex data processing tasks. However, for simple JSON file operations, node-jsonfile provides a more straightforward and intuitive API. through2 excels in scenarios involving large-scale data manipulation, while node-jsonfile is better suited for quick and easy JSON file reading and writing operations.

JSONStream: rawStream.pipe(JSONStream.parse()).pipe(streamOfObjects)

Pros of JSONStream

  • Designed for streaming large JSON files, allowing efficient processing of huge datasets
  • Supports parsing and stringifying JSON data as streams (stringifying is sketched after this list)
  • Provides a more memory-efficient solution for handling large JSON files
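
Stringifying as a stream, mentioned above, works in the opposite direction; a minimal sketch (out.json is a hypothetical output file):

var JSONStream = require('JSONStream');
var fs = require('fs');

// Serialize a stream of objects into a JSON array on disk
var out = JSONStream.stringify();
out.pipe(fs.createWriteStream('out.json'));
out.write({ name: 'JP' });
out.end();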

Cons of JSONStream

  • More complex API compared to node-jsonfile
  • Requires more setup and configuration for basic JSON operations
  • Less suitable for simple, small-scale JSON file operations

Code Comparison

JSONStream:

var JSONStream = require('JSONStream');
var fs = require('fs');

// Stream-parse data.json, emitting each top-level element's name property
fs.createReadStream('data.json')
  .pipe(JSONStream.parse('*.name'))
  .on('data', function (data) {
    console.log('name:', data);
  });

node-jsonfile:

const jsonfile = require('jsonfile');

jsonfile.readFile('data.json', (err, obj) => {
  if (err) return console.error(err);
  console.log(obj);
});

Summary

JSONStream is better suited for handling large JSON files and streaming operations, while node-jsonfile offers a simpler API for basic JSON file operations. JSONStream provides more flexibility and efficiency for complex JSON processing tasks, but it comes with a steeper learning curve. node-jsonfile is more straightforward for small-scale JSON file handling but may not be as efficient for very large files or streaming scenarios.


README

Node.js - jsonfile

Easily read/write JSON files in Node.js. Note: this module cannot be used in the browser.


Why?

Writing JSON.stringify() followed by fs.writeFile(), and fs.readFile() followed by JSON.parse(), each wrapped in try/catch blocks, became annoying.
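
For contrast, this is roughly the boilerplate being replaced (a plain fs sketch, not part of this library):

const fs = require('fs')

// Without jsonfile: wire up fs, JSON.parse, and error handling by hand
fs.readFile('/tmp/data.json', 'utf8', (err, data) => {
  if (err) return console.error(err)
  let obj
  try {
    obj = JSON.parse(data)
  } catch (parseErr) {
    return console.error(parseErr)
  }
  console.dir(obj)
})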

Installation

npm install --save jsonfile

API


readFile(filename, [options], callback)

options (object, default undefined): Pass in any fs.readFile options or set reviver for a JSON reviver.

  • throws (boolean, default: true). If JSON.parse throws an error, pass this error to the callback. If false, returns null for the object.

const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
jsonfile.readFile(file, function (err, obj) {
  if (err) return console.error(err)
  console.dir(obj)
})

You can also use this method with promises. The readFile method will return a promise if you do not pass a callback function.

const jsonfile = require('jsonfile')
const file = '/tmp/data.json'
jsonfile.readFile(file)
  .then(obj => console.dir(obj))
  .catch(error => console.error(error))
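
The reviver and throws options described above can be combined; a minimal sketch (the createdAt field is a hypothetical example):

const jsonfile = require('jsonfile')
const file = '/tmp/data.json'

jsonfile.readFile(file, {
  throws: false, // malformed JSON yields a null object instead of an error
  reviver: (key, value) => (key === 'createdAt' ? new Date(value) : value)
}, function (err, obj) {
  if (err) return console.error(err) // fs errors are still reported
  console.dir(obj) // null if the file contained invalid JSON
})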

readFileSync(filename, [options])

options (object, default undefined): Pass in any fs.readFileSync options or set reviver for a JSON reviver.

  • throws (boolean, default: true). If an error is encountered reading or parsing the file, throw the error. If false, returns null for the object.

const jsonfile = require('jsonfile')
const file = '/tmp/data.json'

console.dir(jsonfile.readFileSync(file))

writeFile(filename, obj, [options], callback)

options: Pass in any fs.writeFile options or set replacer for a JSON replacer. You can also pass in spaces for indentation, override the EOL string, or set the finalEOL flag to false to omit the EOL at the end of the file.

const jsonfile = require('jsonfile')

const file = '/tmp/data.json'
const obj = { name: 'JP' }

jsonfile.writeFile(file, obj, function (err) {
  if (err) console.error(err)
})

Or use with promises as follows:

const jsonfile = require('jsonfile')

const file = '/tmp/data.json'
const obj = { name: 'JP' }

jsonfile.writeFile(file, obj)
  .then(res => {
    console.log('Write complete')
  })
  .catch(error => console.error(error))

formatting with spaces:

const jsonfile = require('jsonfile')

const file = '/tmp/data.json'
const obj = { name: 'JP' }

jsonfile.writeFile(file, obj, { spaces: 2 }, function (err) {
  if (err) console.error(err)
})

overriding EOL:

const jsonfile = require('jsonfile')

const file = '/tmp/data.json'
const obj = { name: 'JP' }

jsonfile.writeFile(file, obj, { spaces: 2, EOL: '\r\n' }, function (err) {
  if (err) console.error(err)
})

disabling the EOL at the end of file:

const jsonfile = require('jsonfile')

const file = '/tmp/data.json'
const obj = { name: 'JP' }

jsonfile.writeFile(file, obj, { spaces: 2, finalEOL: false }, function (err) {
  if (err) console.log(err)
})

appending to an existing JSON file:

You can use fs.writeFile option { flag: 'a' } to achieve this.

const jsonfile = require('jsonfile')

const file = '/tmp/mayAlreadyExistedData.json'
const obj = { name: 'JP' }

jsonfile.writeFile(file, obj, { flag: 'a' }, function (err) {
  if (err) console.error(err)
})
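
using a JSON replacer (a sketch of the replacer option described above; the password field is a hypothetical example):

const jsonfile = require('jsonfile')

const file = '/tmp/data.json'
const obj = { name: 'JP', password: 'secret' }

// Drop the hypothetical password field from the serialized output
jsonfile.writeFile(file, obj, {
  spaces: 2,
  replacer: (key, value) => (key === 'password' ? undefined : value)
}, function (err) {
  if (err) console.error(err)
})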

writeFileSync(filename, obj, [options])

options: Pass in any fs.writeFileSync options or set replacer for a JSON replacer. You can also pass in spaces for indentation, override the EOL string, or set the finalEOL flag to false to omit the EOL at the end of the file.

const jsonfile = require('jsonfile')

const file = '/tmp/data.json'
const obj = { name: 'JP' }

jsonfile.writeFileSync(file, obj)

formatting with spaces:

const jsonfile = require('jsonfile')

const file = '/tmp/data.json'
const obj = { name: 'JP' }

jsonfile.writeFileSync(file, obj, { spaces: 2 })

overriding EOL:

const jsonfile = require('jsonfile')

const file = '/tmp/data.json'
const obj = { name: 'JP' }

jsonfile.writeFileSync(file, obj, { spaces: 2, EOL: '\r\n' })

disabling the EOL at the end of file:

const jsonfile = require('jsonfile')

const file = '/tmp/data.json'
const obj = { name: 'JP' }

jsonfile.writeFileSync(file, obj, { spaces: 2, finalEOL: false })

appending to an existing JSON file:

You can use fs.writeFileSync option { flag: 'a' } to achieve this.

const jsonfile = require('jsonfile')

const file = '/tmp/mayAlreadyExistedData.json'
const obj = { name: 'JP' }

jsonfile.writeFileSync(file, obj, { flag: 'a' })

License

(MIT License)

Copyright 2012-2016, JP Richardson jprichardson@gmail.com
