
@std/json

Overview

Utilities for parsing and serializing streaming JSON data.

import { JsonStringifyStream } from "@std/json"; import { assertEquals } from "@std/assert"; const stream = ReadableStream.from([{ foo: "bar" }, { baz: 100 }]) .pipeThrough(new JsonStringifyStream()); assertEquals(await Array.fromAsync(stream), [ `{"foo":"bar"}\n`, `{"baz":100}\n` ]); 

Add to your project

deno add jsr:@std/json 
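If you use npm with Node.js, the JSR CLI offers an equivalent install command (this assumes your project already has a package.json):

npx jsr add @std/json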

See all symbols in @std/json on JSR.

What is JSON streaming?

JSON streaming is a technique for processing JSON data in a continuous flow, rather than loading the entire dataset into memory at once. This is particularly useful for handling large JSON files or real-time data feeds.
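For example, a large newline-delimited JSON file can be consumed one value at a time. A minimal sketch, assuming a hypothetical file big.ndjson and the TextLineStream helper from the separate @std/streams package:

import { JsonParseStream } from "@std/json/parse-stream";
import { TextLineStream } from "@std/streams/text-line-stream";

// Non-streaming: the whole file must fit in memory before parsing starts.
// const everything = JSON.parse(await Deno.readTextFile("big.json"));

// Streaming: each value is parsed and handled as soon as its line arrives,
// so memory use stays proportional to one value, not the whole file.
const file = await Deno.open("big.ndjson");
const values = file.readable
  .pipeThrough(new TextDecoderStream()) // bytes -> text
  .pipeThrough(new TextLineStream())    // text -> lines
  .pipeThrough(new JsonParseStream());  // lines -> JSON values

for await (const value of values) {
  console.log(value);
}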

Why use @std/json?

It lets you stream JSON in and out, so you can work with large datasets without loading everything into memory at once.

Examples

Parse concatenated JSON (multiple JSON values back-to-back)

import { ConcatenatedJsonParseStream } from "@std/json/concatenated-json-parse-stream";

// Stream contains two JSON documents back-to-back without delimiters.
const input = ReadableStream.from([
  '{"a":1}{',
  '"b":2}',
]);

const parsed = input.pipeThrough(new ConcatenatedJsonParseStream());
console.log(await Array.fromAsync(parsed)); // [{ a: 1 }, { b: 2 }]

Produce NDJSON (JSON Lines) from objects

import { JsonStringifyStream } from "@std/json/stringify-stream"; const data = [{ id: 1 }, { id: 2 }, { id: 3 }]; // Add a trailing newline after each JSON value for NDJSON const ndjson = ReadableStream .from(data) .pipeThrough(new JsonStringifyStream({ suffix: "\n" })); // Post to a server that accepts application/x-ndjson await fetch("/ingest", { method: "POST", headers: { "content-type": "application/x-ndjson" }, body: ndjson, }); 

Consume NDJSON safely (split lines across chunk boundaries)

import { JsonParseStream } from "@std/json/parse-stream"; // Split by newlines, even if a line is split across chunks function lineSplitter() { let buffer = ""; return new TransformStream<string, string>({ transform(chunk, controller) { buffer += chunk; const lines = buffer.split(/\r?\n/); buffer = lines.pop() ?? ""; // keep last partial line for (const line of lines) if (line) controller.enqueue(line); }, flush(controller) { if (buffer) controller.enqueue(buffer); }, }); } const res = await fetch("/stream.ndjson"); const values = res.body! .pipeThrough(new TextDecoderStream()) .pipeThrough(lineSplitter()) .pipeThrough(new JsonParseStream()); for await (const obj of values) { // Handle each JSON object as it arrives console.log(obj); } 

Transform a JSON stream on the fly

import { JsonParseStream } from "@std/json/parse-stream"; import { JsonStringifyStream } from "@std/json/stringify-stream"; // Incoming objects -> map -> outgoing NDJSON function mapStream<T, U>(map: (t: T) => U) { return new TransformStream<T, U>({ transform: (t, c) => c.enqueue(map(t)) }); } const response = await fetch("/objects.jsonl"); const uppercased = response.body! .pipeThrough(new TextDecoderStream()) .pipeThrough(lineSplitter()) // from previous example .pipeThrough(new JsonParseStream<{ name: string }>()) .pipeThrough(mapStream((o) => ({ name: o.name.toUpperCase() }))) .pipeThrough(new JsonStringifyStream({ suffix: "\n" })); // Pipe to another request, a file, or process further await fetch("/store", { method: "POST", body: uppercased }); 

Tips

  • Use JsonStringifyStream/JsonParseStream to compose pipelines with fetch() and file streams (see the file-to-file sketch after this list).
  • Be explicit about encoding boundaries: these streams work with strings, so bridge to and from bytes with TextEncoderStream/TextDecoderStream (UTF-8) at the edges of the pipeline.
  • For NDJSON, use JsonStringifyStream({ suffix: "\n" }) when producing, and split lines before JsonParseStream when consuming.
  • Use ConcatenatedJsonParseStream when your input is a stream of back-to-back JSON values with no separators.
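As a rough sketch of the first two tips combined (file names hypothetical), here is a file-to-file pipeline with explicit encoding boundaries:

import { JsonParseStream } from "@std/json/parse-stream";
import { JsonStringifyStream } from "@std/json/stringify-stream";
import { TextLineStream } from "@std/streams/text-line-stream";

const input = await Deno.open("input.ndjson");
const output = await Deno.create("output.ndjson");

await input.readable
  .pipeThrough(new TextDecoderStream())   // bytes -> text
  .pipeThrough(new TextLineStream())      // text -> lines
  .pipeThrough(new JsonParseStream())     // lines -> values
  .pipeThrough(new JsonStringifyStream()) // values -> JSON lines ("\n" suffix by default)
  .pipeThrough(new TextEncoderStream())   // text -> bytes
  .pipeTo(output.writable);               // closes both files when done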
