Last week, we learned about making objects iterable in JavaScript. We have also covered iterators and generators in the past. Today, we’ll look at async iterables, which are useful for working with asynchronous data streams. Processing data in chunks as it arrives can sometimes be more performant than loading a large data set into memory all at once. Async iterables differ from their synchronous counterparts in that their values are wrapped in promises. Async iterables have a [Symbol.asyncIterator]() method somewhere along their prototype chain, and we iterate over them using the for await...of statement. Because the process of creating async iterables is similar to creating synchronous iterables, we’ll just look at a couple of built-in examples.
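
As a quick refresher, the easiest way to create our own async iterable is with an async generator function. The ticker() function below is a hypothetical example for illustration only; we won’t need it for the rest of this article.

async function* ticker(count) {
  for (let i = 1; i <= count; i++) {
    // Each yielded value arrives wrapped in a promise
    await new Promise((resolve) => setTimeout(resolve, 100));
    yield i;
  }
}

async function run() {
  for await (const tick of ticker(3)) {
    console.log(tick); // 1, 2, 3
  }
}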

Fetching data in the browser

The global fetch() function, which is part of the Fetch API, returns a Response object. The Response interface has a .body property that points to a ReadableStream object. The ReadableStream interface, which is part of the Streams API, implements the async iterable protocol. This means we can asynchronously iterate over the chunks in a stream using the for await...of statement.

async function readData(url) {
  const response = await fetch(url);

  for await (const chunk of response.body) {
    // Do something with each chunk
    console.log(chunk);
  }
}

Each chunk is a Uint8Array, which is a form of typed array. Decoding the raw binary data is one of the tricky parts. If we know the response body is textual, we can use the .pipeThrough() method of the ReadableStream interface to pipe it through a TextDecoderStream object, which is part of the Encoding API.

async function readData(url) {
  const response = await fetch(url);
  const textDecoderStream = new TextDecoderStream();
  const stream = response.body.pipeThrough(textDecoderStream);

  for await (const chunk of stream) {
    // Do something with each chunk
    console.log(chunk);
  }
}

Another tricky part is parsing chunks of JSON data. We can’t guarantee that each chunk will be a complete JSON object. For example, if we try passing the /photos endpoint from JSONPlaceholder to the readData() function, we’ll get about 10 chunks of incomplete JSON data, broken up at seemingly random intervals. Parsing one chunk at a time would require careful programming that is outside the scope of this article. However, we would potentially reap the performance benefit of processing a large amount of data piece by piece, rather than loading it into memory all at once.
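
That said, here is a minimal sketch of the simplest workaround: accumulating the decoded chunks in a string and calling JSON.parse() once the stream has ended. Note that this gives up chunk-by-chunk parsing, and the readJson() function name is an assumption for illustration; the URL is the JSONPlaceholder endpoint mentioned above.

async function readJson(url) {
  const response = await fetch(url);
  const stream = response.body.pipeThrough(new TextDecoderStream());
  let buffer = "";

  for await (const chunk of stream) {
    // Collect each decoded chunk until the stream ends
    buffer += chunk;
  }

  return JSON.parse(buffer);
}

readJson("https://jsonplaceholder.typicode.com/photos").then((photos) => {
  console.log(photos.length);
});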

For more on the Streams API, I recommend the guides in the MDN Web Docs.

Note that asynchronously iterating over the ReadableStream that the .body property of the Response interface points to has only been supported since version 116 of Chrome and Edge and version 102 of Firefox and Opera. It is not supported in Safari at the time of writing.
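
If we need to cover Safari or older browsers today, one option is to fall back to the stream’s .getReader() method, which has much wider support. The version of readData() below is a sketch that tries async iteration first and reads the stream manually otherwise.

async function readData(url) {
  const response = await fetch(url);
  const stream = response.body.pipeThrough(new TextDecoderStream());

  if (Symbol.asyncIterator in stream) {
    for await (const chunk of stream) {
      // Do something with each chunk
      console.log(chunk);
    }
    return;
  }

  // Fallback: pull chunks manually with a reader
  const reader = stream.getReader();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    console.log(value);
  }
}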

Reading files in Node.js

In Node.js, we might wish to read a file line by line. Since version 18.11.0 (the latest LTS version is 18.18.2 at the time of writing), there has been a filehandle.readLines() convenience method that creates a readline interface and streams over a file. The readline interface implements the async iterable protocol, meaning we can asynchronously iterate over each line using the for await...of statement.

import * as fs from "fs/promises";

async function readLines(path) {
  const file = await fs.open(path);

  for await (const line of file.readLines()) {
    // Do something with each line
    console.log(line);
  }
}

If we needed to support older versions of Node.js, we could pair the fs.createReadStream() method with the readline module instead. Because this code uses ECMAScript modules, it should be comfortably supported back to version 14.0.0 (the latest release of that LTS line is 14.21.3). If we were willing to use CommonJS modules, we could push support back to version 10.17.0 (the latest release of that LTS line is 10.24.1).

import * as fs from "fs";
import * as readline from "readline";

async function readLines(path) {
  const input = fs.createReadStream(path);
  const rl = readline.createInterface({ input });

  for await (const line of rl) {
    // Do something with each line
    console.log(line);
  }
}

Summary

  • Async iterables are useful for working with asynchronous data streams.
  • Iterating over a large data set in chunks can sometimes be more performant than loading it into memory all at once.
  • The values of async iterables are wrapped in promises.
  • An object is an async iterable if it has a [Symbol.asyncIterator]() method somewhere along its prototype chain.
  • Async iterables can be asynchronously iterated over using the for await...of statement.