JSON (JavaScript Object Notation) is the backbone of data sharing on the web. Whether you’re building a frontend app that fetches API responses or a backend scraper that parses local files, knowing how to read JSON in JavaScript is essential.

In this guide, we'll walk through the different ways to read JSON files in JavaScript, depending on whether you are working in the browser or in Node.js. We will also explore how to integrate proxy support to safely scale your requests using Proxying.io.

What is a JSON File?

JSON is a lightweight data format often used for APIs and config files. Here’s what a simple JSON file might look like:

{
  "name": "Proxying.io",
  "type": "proxy service",
  "features": ["residential", "datacenter", "rotating"]
}

You can store data like this in a file such as data.json (for example, the output of a web scraping job) and read it from your code, either in the browser or in Node.js.

How to Read JSON in the Browser

If you are building a frontend app, you will typically load JSON via HTTP using the fetch API. Here’s a basic example:

fetch('https://example.com/data.json')
  .then((response) => response.json())
  .then((data) => {
    console.log(data.name); // Output: Proxying.io
  })
  .catch((error) => {
    console.error('Error fetching JSON:', error);
  });

This approach is great for loading static JSON files hosted on a server or for getting data from an API.
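The same request can also be written with async/await, which many codebases prefer for readability. This sketch reuses the URL and field names from the example above:

async function loadJson(url) {
  const response = await fetch(url);
  // fetch() only rejects on network failures, so check the HTTP status explicitly
  if (!response.ok) {
    throw new Error(`HTTP error: ${response.status}`);
  }
  return response.json();
}

loadJson('https://example.com/data.json')
  .then((data) => console.log(data.name)) // Output: Proxying.io
  .catch((error) => console.error('Error fetching JSON:', error));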

Using Proxies with Fetch

When scraping data from protected sources, route your request through a proxy. Services like Proxying.io help you bypass rate limits, geo-restrictions, and CAPTCHAs.

You can’t directly configure proxies in the browser’s fetch(), but if you’re using a headless browser such as Puppeteer, Proxying.io can be integrated at the request layer, as sketched below.
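Here is a minimal sketch of how a proxy endpoint could be wired into Puppeteer. The host, port, and credentials below are placeholders, not real values:

const puppeteer = require('puppeteer');

(async () => {
  // Route all browser traffic through the proxy endpoint (placeholder address)
  const browser = await puppeteer.launch({
    args: ['--proxy-server=http://proxy.proxying.io:8000'],
  });
  const page = await browser.newPage();
  // Placeholder credentials; in practice, load these from environment variables
  await page.authenticate({ username: 'your-proxy-user', password: 'password' });

  const response = await page.goto('https://example.com/data.json');
  const data = await response.json(); // Puppeteer can parse a JSON response directly
  console.log(data.name); // Output: Proxying.io
  await browser.close();
})();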

How to Read a JSON File in Node.js

When working in Node.js, you have more flexibility in reading both remote and local JSON files. Here are the most common methods.

Using require() (Only for Static JSON)

Node.js allows you to import static JSON files directly using require():

const data = require('./data.json');
console.log(data.name); // Output: Proxying.io

This method is synchronous and only suits local JSON files that don’t change at runtime, because Node.js caches the parsed result after the first require().
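You can see the caching behavior directly: repeated require() calls return the same parsed object, even if the file changes on disk in between:

const first = require('./data.json');
// ...even if data.json is edited on disk at this point...
const second = require('./data.json');
console.log(first === second); // true — same cached object, not a fresh read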

Using the fs Module

For more control, use the built-in fs (file system) module to read JSON files asynchronously:

const fs = require('fs');
fs.readFile('./data.json', 'utf8', (err, jsonString) => {
  if (err) {
    console.error('Error reading file:', err);
    return;
  }
  try {
    const data = JSON.parse(jsonString);
    console.log(data.type); // Output: proxy service
  } catch (err) {
    console.error('Error parsing JSON:', err);
  }
});

This is the preferred method when dealing with larger files or when performance matters.
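If you prefer async/await, the same read can be done with the promise-based fs API (available as fs/promises in modern Node.js):

const fs = require('fs/promises');

async function readJsonFile(path) {
  // Read the raw text, then parse; JSON.parse throws on malformed input
  const jsonString = await fs.readFile(path, 'utf8');
  return JSON.parse(jsonString);
}

readJsonFile('./data.json')
  .then((data) => console.log(data.features)) // Output: ['residential', 'datacenter', 'rotating']
  .catch((err) => console.error('Error reading or parsing JSON:', err));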

Reading Remote JSON with Axios and Proxying.io

If you need to fetch remote JSON in Node.js and want proxy support, Axios is a great HTTP client that works well with Proxying.io:

npm install axios https-proxy-agent

Then in your code:

const axios = require('axios');
// Destructured import matches current https-proxy-agent versions (v5+)
const { HttpsProxyAgent } = require('https-proxy-agent');

const proxyAgent = new HttpsProxyAgent('http://your-proxy-user:[email protected]:8000');

axios.get('https://example.com/data.json', { httpsAgent: proxyAgent })
  .then((response) => {
    console.log(response.data);
  })
  .catch((error) => {
    console.error('Error fetching remote JSON:', error);
  });

This method is especially useful for scraping data behind bot protection. With Proxying.io’s rotating proxies, you can keep your IP clean while scaling your operations.

Best Practices for Reading JSON

  • Always validate the JSON structure before using it.
  • Use asynchronous APIs (fs.promises or fetch) for better performance.
  • When scraping or reading from external sources, use proxies to avoid blocks.
  • Don’t use require() for dynamic files; Node.js caches the result.
  • Avoid hardcoding proxy credentials; store them securely in .env files, as shown in the sketch below.
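For example, with the dotenv package you can keep the proxy URL out of your source code. The PROXY_URL variable name here is an illustration, not a fixed convention:

require('dotenv').config();
const { HttpsProxyAgent } = require('https-proxy-agent');

// PROXY_URL is loaded from .env, e.g. PROXY_URL=http://user:[email protected]:8000
const proxyAgent = new HttpsProxyAgent(process.env.PROXY_URL);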

Conclusion

Reading JSON in JavaScript is a foundational skill for both frontend and backend development. Whether you’re fetching API responses in the browser or parsing local files in Node.js, JavaScript offers simple and efficient ways to work with JSON data. For developers handling data at scale, especially from protected or geo-restricted sources, integrating proxies like those from Proxying.io adds an essential layer of reliability and security. By combining proper JSON handling techniques with proxy best practices, you can build robust, scalable, and resilient data-driven applications.

Frequently Asked Questions (FAQs)

What is a JSON file used for?

A JSON file contains structured data in key-value pairs and is commonly used for API responses, configuration files, and data exchange between servers and clients in JavaScript applications.

Why shouldn’t I use require() to read dynamic JSON files?

require() caches the file, so if the JSON changes during runtime, you won’t get the updated content. Use fs.readFile or fs.promises.readFile instead for dynamic data.

How do I use Axios with a proxy in Node.js?

You can use the https-proxy-agent library to configure Axios with a Proxying.io proxy endpoint, allowing you to bypass rate limits and geo-blocks when fetching JSON data.
