How to handle large data sets with Axios?

Fri Apr 14 2023

3 min read

As more and more data is being generated every day, efficient handling of large data sets has become increasingly important for developers.

Axios, a popular HTTP client library for JavaScript, provides a powerful and flexible interface for fetching and manipulating data.

Handling large data sets with Axios

When it comes to handling large data sets with Axios, there are a few different approaches you can take. One of the most effective ways is to use streams.

Using streams allows you to fetch data in chunks rather than loading the entire response into memory at once. This can help you avoid running into memory issues when working with particularly large data sets.

Using Axios with streams

Let's dive into how you can use streams with Axios to fetch and work with large data sets.

Step #1: Create a pass-through stream

First, create a stream using Node.js's built-in stream.PassThrough class. A pass-through stream is writable as well as readable, so the response can be piped into it and then consumed in chunks. (A plain stream.Readable won't work as a pipe destination, because pipe() requires a writable stream.)

js
const { PassThrough } = require('stream');
const stream = new PassThrough();

Step #2: Make a request with a stream response

Make a request to the server with the responseType option set to stream. This tells Axios to return a stream instead of a fully buffered response.

js
const axios = require('axios');
const response = await axios.get('/large-data-set', {
  responseType: 'stream'
});
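Before consuming the stream, you can inspect the response headers, for instance to log the expected download size. A minimal sketch, assuming the server sends a Content-Length header:

js
const total = Number(response.headers['content-length'] || 0);
console.log('Expecting ' + total + ' bytes');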

Step #3: Pipe response to readable stream

Next, pipe the response data into the pass-through stream using the response stream's pipe() method. This pushes data from the response into your stream in chunks.

js
response.data.pipe(stream);
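If you want errors from either end of the pipe handled in one place, Node.js's built-in stream.pipeline() function is a safer alternative to pipe(). A minimal sketch, reusing the response and stream from the steps above:

js
const { pipeline } = require('stream');
pipeline(response.data, stream, (err) => {
  // pipeline() forwards errors from any stream in the chain to this callback
  if (err) {
    console.error('Streaming failed:', err);
  }
});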

Step #4: Consume data from readable stream

Finally, consume data from the readable stream by listening for the data event. This event will be emitted each time a chunk of data is received from the server.

js
stream.on('data', chunk => {
  console.log(chunk.toString());
});
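Note that raw data events can split records at arbitrary byte boundaries. If the endpoint happens to return newline-delimited JSON (an assumption for illustration, not something Axios guarantees), Node.js's built-in readline module can reassemble complete lines before you parse them. A minimal sketch:

js
const readline = require('readline');
const rl = readline.createInterface({ input: stream });
rl.on('line', (line) => {
  // assumes one JSON object per line of the response
  const record = JSON.parse(line);
  console.log(record);
});
rl.on('close', () => {
  console.log('Finished reading the data set');
});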

Handling errors

When fetching large data sets, it's important to handle errors properly to avoid hanging requests or crashing the application. Here are some best practices for handling errors with Axios:

Use a try-catch block

Wrap your Axios request in a try-catch block to catch any errors thrown while making the request. With a streamed response, errors that occur mid-stream are emitted as error events rather than thrown, so listen for those too, as shown after the example below.

js
try {
  const response = await axios.get('/large-data-set', {
    responseType: 'stream'
  });
  console.log(response.data);
} catch (error) {
  console.error(error);
}
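A minimal sketch of that error listener, attached to the response stream from the earlier steps:

js
response.data.on('error', (err) => {
  console.error('Stream error:', err);
});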

Handle timeout errors

If a request takes longer than the specified timeout value, Axios rejects it with an error whose code property is set to ECONNABORTED. You can handle this by checking the error's code and retrying the request if necessary.

js
try {
  const response = await axios.get('/large-data-set', {
    timeout: 5000 // 5 seconds
  });
  console.log(response.data);
} catch (error) {
  if (error.code === 'ECONNABORTED') {
    console.error('Request timed out:', error);
    // retry request
  } else {
    console.error(error);
  }
}
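A simple way to retry is a small wrapper that re-issues the request a fixed number of times before giving up. The wrapper name, retry count, and endpoint below are assumptions for illustration:

js
async function fetchWithRetry(url, options, retries = 3) {
  for (let attempt = 1; attempt <= retries; attempt++) {
    try {
      return await axios.get(url, options);
    } catch (error) {
      // only retry timeouts; rethrow anything else immediately
      if (error.code !== 'ECONNABORTED' || attempt === retries) {
        throw error;
      }
      console.warn('Attempt ' + attempt + ' timed out, retrying...');
    }
  }
}
const response = await fetchWithRetry('/large-data-set', { timeout: 5000 });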

Wrap up

When dealing with particularly large data sets, streaming responses can be especially useful to avoid running into memory issues. Using Axios to handle the stream makes it quite easy to fetch large data sets.
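Putting the pieces together: a common pattern for very large downloads is to stream the response straight to disk, so nothing is buffered in memory. A minimal sketch, where the large-data.json output path is an assumption for illustration:

js
const fs = require('fs');
const { pipeline } = require('stream/promises');
const axios = require('axios');
const response = await axios.get('/large-data-set', {
  responseType: 'stream',
  timeout: 5000
});
// pipeline() resolves when the file is fully written and rejects on error
await pipeline(response.data, fs.createWriteStream('large-data.json'));
console.log('Download complete');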

If you want some APIs to test Axios against with large data sets, we have written a guide on the world’s largest API hub, which offers thousands of APIs.
