Over the coming months, Cloudflare Workers will start to roll out built-in compatibility with Node.js core APIs as part of an effort to support increased compatibility across JavaScript runtimes.
We are happy to announce today that the first of these Node.js APIs – AsyncLocalStorage, EventEmitter, Buffer, assert, and parts of util – are now available for use. These APIs are provided directly by the open-source Cloudflare Workers runtime, with no need to bundle polyfill implementations into your own code.
These new APIs are available today — start using them by enabling the nodejs_compat compatibility flag in your Workers.
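If you configure your Worker with a wrangler.toml file, enabling the flag might look like the minimal sketch below (the name, entry point, and compatibility_date are placeholders for your own project settings):
name = "my-worker"                        # placeholder project name
main = "src/index.js"                     # placeholder entry point
compatibility_date = "2023-03-01"         # use your project's own date
compatibility_flags = [ "nodejs_compat" ]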
Async Context Tracking with the AsyncLocalStorage API
The AsyncLocalStorage API provides a way to track context across asynchronous operations. It allows you to pass a value through your program, even across multiple layers of asynchronous code, without having to pass a context value between operations.
Consider an example where we want to add debug logging that works through multiple layers of an application, where each log contains the ID of the current request. Without AsyncLocalStorage, it would be necessary to explicitly pass the request ID down through every function call that might invoke the logging function:
function logWithId(id, state) {
  console.log(`${id} - ${state}`);
}

function doSomething(id) {
  // We don't actually use id for anything in this function!
  // It's only here because logWithId needs it.
  logWithId(id, "doing something");
  setTimeout(() => doSomethingElse(id), 10);
}

function doSomethingElse(id) {
  logWithId(id, "doing something else");
}

let idSeq = 0;

export default {
  async fetch(req) {
    const id = idSeq++;
    doSomething(id);
    logWithId(id, 'complete');
    return new Response("ok");
  }
}
While this approach works, it can be cumbersome to coordinate correctly, especially as the complexity of an application grows. Using AsyncLocalStorage, this becomes significantly easier by eliminating the need to explicitly pass the context around. Our application functions (doSomething and doSomethingElse in this case) never need to know about the request ID at all, while the logWithId function does exactly what we need it to:
import { AsyncLocalStorage } from 'node:async_hooks';

const requestId = new AsyncLocalStorage();

function logWithId(state) {
  console.log(`${requestId.getStore()} - ${state}`);
}

function doSomething() {
  logWithId("doing something");
  setTimeout(() => doSomethingElse(), 10);
}

function doSomethingElse() {
  logWithId("doing something else");
}

let idSeq = 0;

export default {
  async fetch(req) {
    return requestId.run(idSeq++, () => {
      doSomething();
      logWithId('complete');
      return new Response("ok");
    });
  }
}
With the nodejs_compat compatibility flag enabled, import statements are used to access specific APIs. The Workers implementation of these APIs requires the use of the node: specifier prefix that was introduced recently in Node.js (e.g. node:async_hooks, node:events, etc.).
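For example, the modules discussed in this post are all imported with that prefix:
import { AsyncLocalStorage } from 'node:async_hooks';
import { EventEmitter } from 'node:events';
import { Buffer } from 'node:buffer';
import assert from 'node:assert';
import { promisify, callbackify, types } from 'node:util';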
We implement a subset of the AsyncLocalStorage API in order to keep things as simple as possible. Specifically, we've chosen not to support the enterWith() and disable() APIs that are found in the Node.js implementation, simply because they make async context tracking more brittle and error-prone.
Conceptually, at any given moment within a worker, there is a current "Asynchronous Context Frame", which consists of a map of storage cells, each holding a store value for a specific AsyncLocalStorage instance. Calling asyncLocalStorage.run(...) causes a new frame to be created, inheriting the storage cells of the current frame, but using the newly provided store value for the cell associated with asyncLocalStorage.
import { AsyncLocalStorage } from 'node:async_hooks';

const als1 = new AsyncLocalStorage();
const als2 = new AsyncLocalStorage();

// Code here runs in the root frame. There are two storage cells,
// one for als1, and one for als2. The store value for each is
// undefined.

als1.run(123, () => {
  // als1.run(...) creates a new frame (1). The store value for als1
  // is set to 123, the store value for als2 is still undefined.
  // This new frame is set to "current".
  als2.run(321, () => {
    // als2.run(...) creates another new frame (2). The store value
    // for als1 is still 123, the store value for als2 is set to 321.
    // This new frame is set to "current".
    console.log(als1.getStore(), als2.getStore());
  });
  // Frame (1) is restored as the current. The store value for als1
  // is still 123, but the store value for als2 is undefined again.
});
// The root frame is restored as the current. The store values for
// both als1 and als2 are undefined again.
Whenever an asynchronous operation is initiated in JavaScript (for example, creating a new JavaScript promise, scheduling a timer, etc.), the current frame is captured and associated with that operation, allowing the store values at the moment the operation was initialized to be propagated and restored as needed.
const als = new AsyncLocalStorage();

const p1 = als.run(123, () => {
  return Promise.resolve(1).then(() => console.log(als.getStore())); // prints 123
});

const p2 = Promise.resolve(1);
const p3 = als.run(321, () => {
  return p2.then(() => console.log(als.getStore())); // prints 321
});

als.run('ABC', () => setInterval(() => {
  // prints "ABC" to the console once a second…
  console.log(als.getStore());
}, 1000));

als.run('XYZ', () => queueMicrotask(() => {
  console.log(als.getStore()); // prints "XYZ"
}));
Note that for unhandled promise rejections, the "unhandledrejection" event will automatically propagate the context that is associated with the promise that was rejected. This behavior is different from other types of events emitted by EventTarget implementations, which will propagate whichever frame is current when the event is emitted.
const asyncLocalStorage = new AsyncLocalStorage();

asyncLocalStorage.run(123, () => Promise.reject('boom'));
asyncLocalStorage.run(321, () => Promise.reject('boom2'));

addEventListener('unhandledrejection', (event) => {
  // prints 123 for the first unhandled rejection ('boom'), and
  // 321 for the second unhandled rejection ('boom2')
  console.log(asyncLocalStorage.getStore());
});
Workers can use the AsyncLocalStorage.snapshot() method to create their own objects that capture and propagate the context:
const asyncLocalStorage = new AsyncLocalStorage();

class MyResource {
  #runInAsyncFrame = AsyncLocalStorage.snapshot();

  doSomething(...args) {
    return this.#runInAsyncFrame((...args) => {
      console.log(asyncLocalStorage.getStore());
    }, ...args);
  }
}

const resource1 = asyncLocalStorage.run(123, () => new MyResource());
const resource2 = asyncLocalStorage.run(321, () => new MyResource());

resource1.doSomething(); // prints 123
resource2.doSomething(); // prints 321
For more, refer to the Node.js documentation about the AsyncLocalStorage API.
There is currently an effort underway to add a new AsyncContext mechanism (inspired by AsyncLocalStorage) to the JavaScript language itself. While it is still early days for the TC39 proposal, there is good reason to expect it to progress through the committee. Once it does, we look forward to being able to make it available in the Cloudflare Workers platform. We expect our implementation of AsyncLocalStorage to be compatible with this new API.
The proposal for AsyncContext provides an excellent set of examples and a description of why async context tracking is useful.
Events with EventEmitter
The EventEmitter API is one of the most fundamental Node.js APIs and is critical to supporting many other higher level APIs, including streams, crypto, net, and more. An EventEmitter is an object that emits named events that cause listeners to be called.
import { EventEmitter } from 'node:events';

const emitter = new EventEmitter();

emitter.on('hello', (...args) => {
  console.log(...args);
});

emitter.emit('hello', 1, 2, 3);
The implementation in the Workers runtime fully supports the entire Node.js EventEmitter API, including the captureRejections option that allows improved handling of async functions as event handlers:
import { EventEmitter } from 'node:events';

const emitter = new EventEmitter({ captureRejections: true });

emitter.on('hello', async (...args) => {
  throw new Error('boom');
});

emitter.on('error', (err) => {
  // the async promise rejection is emitted here!
});

emitter.emit('hello'); // triggers the async handler above
Please refer to the Node.js documentation for more details on the use of the EventEmitter API: https://nodejs.org/dist/latest-v19.x/docs/api/events.html#events.
Buffer
The Buffer API in Node.js predates the introduction of the standard TypedArray and DataView APIs in JavaScript by many years and has persisted as one of the most commonly used Node.js APIs for manipulating binary data. Today, every Buffer instance extends from the standard Uint8Array class but adds a range of unique capabilities such as built-in base64 and hex encoding/decoding, byte-order manipulation, and encoding-aware substring searching.
import { Buffer } from 'node:buffer';
const buf = Buffer.from('hello world', 'utf8');
console.log(buf.toString('hex'));
// Prints: 68656c6c6f20776f726c64
console.log(buf.toString('base64'));
// Prints: aGVsbG8gd29ybGQ=
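The byte-order manipulation and encoding-aware substring searching mentioned above work as they do in Node.js. As a quick sketch, the numeric values below are simply the first two UTF-8 bytes of "hello world" (0x68 and 0x65) read in each byte order:
import { Buffer } from 'node:buffer';
const buf = Buffer.from('hello world', 'utf8');
console.log(buf.readUInt16BE(0));
// Prints: 26725 (0x6865, big-endian)
console.log(buf.readUInt16LE(0));
// Prints: 25960 (0x6568, little-endian)
console.log(buf.indexOf('world', 0, 'utf8'));
// Prints: 6
console.log(buf.includes('xyz'));
// Prints: false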
Because a Buffer extends from Uint8Array, it can be used in any Workers API that currently accepts a Uint8Array, such as creating a new Response:
const response = new Response(Buffer.from("hello world"));
Or interacting with streams:
const writable = getWritableStreamSomehow();
const writer = writable.getWriter();
writer.write(Buffer.from("hello world"));
Please refer to the Node.js documentation for more details on the use of the Buffer API: https://nodejs.org/dist/latest-v19.x/docs/api/buffer.html.
Assertions
The assert module in Node.js provides a number of assertions that are useful when building tests.
import {
  strictEqual,
  deepStrictEqual,
  ok,
  doesNotReject,
} from 'node:assert';

strictEqual(1, 1); // ok!
strictEqual(1, "1"); // fails! throws AssertionError

deepStrictEqual({ a: { b: 1 }}, { a: { b: 1 }}); // ok!
deepStrictEqual({ a: { b: 1 }}, { a: { b: 2 }}); // fails! throws AssertionError

ok(true); // ok!
ok(false); // fails! throws AssertionError

await doesNotReject(async () => {}); // ok!
await doesNotReject(async () => { throw new Error('boom') }); // fails! throws AssertionError
In the Workers implementation of assert, all assertions run in what Node.js calls the "strict assertion mode", which means that non-strict methods behave like their corresponding strict methods. For instance, deepEqual() will behave like deepStrictEqual().
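As a minimal sketch of what this means in practice, the loosely-equal comparison below would pass in Node.js' legacy (non-strict) mode, but throws here because deepEqual() behaves like deepStrictEqual():
import { deepEqual } from 'node:assert';
deepEqual({ a: 1 }, { a: 1 }); // ok!
deepEqual({ a: 1 }, { a: '1' }); // fails! throws AssertionError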
Please refer to the Node.js documentation for more details on the use of the assertion API: https://nodejs.org/dist/latest-v19.x/docs/api/assert.html.
Promisify/Callbackify
The promisify and callbackify APIs in Node.js provide a means of bridging between a Promise-based programming model and a callback-based model.
The promisify method allows taking a Node.js-style callback function and converting it into a Promise-returning async function:
import { promisify } from 'node:util';

function foo(args, callback) {
  try {
    callback(null, 1);
  } catch (err) {
    // Errors are emitted to the callback via the first argument.
    callback(err);
  }
}

const promisifiedFoo = promisify(foo);
await promisifiedFoo(args);
Similarly, callbackify converts a Promise-returning async function into a Node.js-style callback function:
import { callbackify } from 'node:util';

async function foo(args) {
  throw new Error('boom');
}

const callbackifiedFoo = callbackify(foo);
callbackifiedFoo(args, (err, value) => {
  if (err) throw err;
});
Together, these utilities make it easy to handle the tricky nuances involved in bridging between callbacks and promises.
Please refer to the Node.js documentation for more information on how to use these APIs: https://nodejs.org/dist/latest-v19.x/docs/api/util.html#utilcallbackifyoriginal, https://nodejs.org/dist/latest-v19.x/docs/api/util.html#utilpromisifyoriginal.
Type brand-checking with util.types
The util.types API provides a reliable and generally more efficient way of checking that values are instances of various built-in types.
import { types } from 'node:util';

types.isAnyArrayBuffer(new ArrayBuffer()); // Returns true
types.isAnyArrayBuffer(new SharedArrayBuffer()); // Returns true

types.isArrayBufferView(new Int8Array()); // true
types.isArrayBufferView(Buffer.from('hello world')); // true
types.isArrayBufferView(new DataView(new ArrayBuffer(16))); // true
types.isArrayBufferView(new ArrayBuffer()); // false

function foo() {
  types.isArgumentsObject(arguments); // Returns true
}

types.isAsyncFunction(function foo() {}); // Returns false
types.isAsyncFunction(async function foo() {}); // Returns true

// .. and so on
Please refer to the Node.js documentation for more information on how to use the type check APIs: https://nodejs.org/dist/latest-v19.x/docs/api/util.html#utiltypes. The Workers implementation currently does not provide implementations of the util.types.isExternal(), util.types.isProxy(), util.types.isKeyObject(), or util.types.isWebAssemblyCompiledModule() APIs.
What's next
Keep your eyes open for more Node.js core APIs coming to Cloudflare Workers soon! We currently have implementations of the string decoder, streams, and crypto APIs in active development. These will be introduced into the Workers runtime incrementally over time, and any worker using the nodejs_compat compatibility flag will automatically pick up the new modules as they are added.