# Tracing garbage collection
This guide will go through the fundamentals of garbage collection traces.
By the end of this guide, you'll be able to:
- Enable traces in your Node.js application
- Interpret traces
- Identify potential memory issues in your Node.js application
There's a lot to learn about how the garbage collector works, but if you learn one thing, it's that when the GC is running, your code is not.
You may want to know how often the garbage collector runs, how long each run takes, and what the outcome is.
## Setup
For the purpose of this guide, we'll use this script:
```js
// script.mjs
import os from 'node:os';

let len = 1_000_000;
const entries = new Set();

function addEntry() {
  const entry = {
    timestamp: Date.now(),
    memory: os.freemem(),
    totalMemory: os.totalmem(),
    uptime: os.uptime(),
  };

  entries.add(entry);
}

function summary() {
  console.log(`Total: ${entries.size} entries`);
}

// execution
(() => {
  while (len > 0) {
    addEntry();
    process.stdout.write(`~~> ${len} entries to record\r`);
    len--;
  }

  summary();
})();
```
Even if the leak is evident here, finding the source of a leak could be cumbersome in the context of a real-world application.
## Running with garbage collection traces
You can see traces for garbage collection in the console output of your process using the `--trace-gc` flag.

```bash
node --trace-gc script.mjs
```
Note: you can find the source code of this exercise in the Node.js Diagnostics repository.
It should output something like:
```
[39067:0x158008000]     2297 ms: Scavenge 117.5 (135.8) -> 102.2 (135.8) MB, 0.8 / 0.0 ms  (average mu = 0.994, current mu = 0.994) allocation failure
[39067:0x158008000]     2375 ms: Scavenge 120.0 (138.3) -> 104.7 (138.3) MB, 0.9 / 0.0 ms  (average mu = 0.994, current mu = 0.994) allocation failure
[39067:0x158008000]     2453 ms: Scavenge 122.4 (140.8) -> 107.1 (140.8) MB, 0.7 / 0.0 ms  (average mu = 0.994, current mu = 0.994) allocation failure
[39067:0x158008000]     2531 ms: Scavenge 124.9 (143.3) -> 109.6 (143.3) MB, 0.7 / 0.0 ms  (average mu = 0.994, current mu = 0.994) allocation failure
[39067:0x158008000]     2610 ms: Scavenge 127.1 (145.5) -> 111.8 (145.5) MB, 0.7 / 0.0 ms  (average mu = 0.994, current mu = 0.994) allocation failure
[39067:0x158008000]     2688 ms: Scavenge 129.6 (148.0) -> 114.2 (148.0) MB, 0.8 / 0.0 ms  (average mu = 0.994, current mu = 0.994) allocation failure
[39067:0x158008000]     2766 ms: Scavenge 132.0 (150.5) -> 116.7 (150.5) MB, 1.1 / 0.0 ms  (average mu = 0.994, current mu = 0.994) allocation failure
Total: 1000000 entries
```
Hard to read? Let's review a few concepts and explain the output of the `--trace-gc` flag.
## Examining a trace with `--trace-gc`
The `--trace-gc` (or `--trace_gc`, either is fine) flag outputs all garbage collection events to the console.
The composition of each line can be described as:

```
[13973:0x110008000]       44 ms: Scavenge 2.4 (3.2) -> 2.0 (4.2) MB, 0.5 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure
```
| Token value | Interpretation |
| --- | --- |
| 13973 | PID of the running process |
| 0x110008000 | Isolate (JS heap instance) |
| 44 ms | Time since the process started, in ms |
| Scavenge | Type / phase of GC |
| 2.4 | Heap used before GC, in MB |
| (3.2) | Total heap before GC, in MB |
| 2.0 | Heap used after GC, in MB |
| (4.2) | Total heap after GC, in MB |
| 0.5 / 0.0 ms (average mu = 1.000, current mu = 1.000) | Time spent in GC, in ms |
| allocation failure | Reason for GC |
We'll only focus on two events here:
- Scavenge
- Mark-sweep
The heap is divided into spaces. Amongst these, we have a space called the "new" space and another one called the "old" space.
👉 In reality, the heap structure is a bit different, but we'll stick to a simpler version for this article. If you want more details, we encourage you to look at this talk by Peter Marshall about Orinoco.
### Scavenge
Scavenge is the name of the algorithm that performs garbage collection in the new space. The new space is where objects are created; it is designed to be small and fast to garbage-collect.
Let's imagine a Scavenge scenario:

- we allocated `A`, `B`, `C` & `D`.
  `| A | B | C | D | <unallocated> |`
- we want to allocate `E`
- not enough space, the memory is exhausted
- then, a (garbage) collection is triggered
- dead objects are collected
- living objects will stay
- assuming `B` and `D` were dead
  `| A | C | <unallocated> |`
- now we can allocate `E`
  `| A | C | E | <unallocated> |`
V8 will promote objects that have not been garbage collected after two Scavenge operations to the old space.
👉 Full Scavenge scenario
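The collection step above can be sketched in plain JavaScript. This is a toy model for illustration only (the `allocate` helper and the fixed-size new space are invented for this example; it is not how V8 actually implements semi-space evacuation):

```js
// Toy model of the Scavenge scenario (illustration only, not V8's
// actual implementation). The "new space" holds a fixed number of slots.
const NEW_SPACE_SIZE = 4;
let newSpace = []; // allocated objects, in order

function allocate(obj, isAlive) {
  if (newSpace.length === NEW_SPACE_SIZE) {
    // Space exhausted: trigger a collection, keeping only live objects
    // (in V8, live objects are evacuated to the other semi-space).
    newSpace = newSpace.filter(entry => entry.isAlive());
  }
  newSpace.push({ obj, isAlive });
}

// Allocate A, B, C, D; B and D will be dead by the time E is allocated.
const dead = new Set(['B', 'D']);
for (const name of ['A', 'B', 'C', 'D']) {
  allocate(name, () => !dead.has(name));
}
allocate('E', () => true); // triggers a collection, then allocates E

console.log(newSpace.map(e => e.obj)); // [ 'A', 'C', 'E' ]
```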
### Mark-sweep
Mark-sweep is used to collect objects from the old space. The old space is where objects that survived the new space live.
This algorithm is composed of two phases:
- Mark: marks objects that are still alive as black and the others as white.
- Sweep: scans for white objects and converts them to free space.
👉 In fact, the Mark and Sweep steps are a bit more elaborate. Please read this document for more details.
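To make the two phases concrete, here is a toy mark-sweep in plain JavaScript (a sketch for illustration; the `markSweep` helper and the `refs` field are invented for this example, and real V8 marking is incremental and far more elaborate):

```js
// Toy mark-sweep: the heap is a list of objects that may reference each
// other; everything unreachable from the roots is swept.
function markSweep(heap, roots) {
  const marked = new Set(); // "black" objects
  const stack = [...roots];
  while (stack.length > 0) {
    const obj = stack.pop();
    if (marked.has(obj)) continue;
    marked.add(obj); // Mark phase: this object is reachable
    stack.push(...(obj.refs ?? []));
  }
  // Sweep phase: keep marked objects; white objects become free space.
  return heap.filter(obj => marked.has(obj));
}

const a = { name: 'a' };
const b = { name: 'b', refs: [a] }; // b keeps a alive
const c = { name: 'c' }; // unreachable
console.log(markSweep([a, b, c], [b]).map(o => o.name)); // [ 'a', 'b' ]
```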
## `--trace-gc` in action
### Memory leak
Now, if you return quickly to the previous terminal window, you will see many `Mark-sweep` events in the console. We can also see that the amount of memory collected after each event is insignificant.
Now that we are garbage collection experts, what can we deduce?

We probably have a memory leak! But how can we be sure of that? (Reminder: it is pretty apparent in this example, but what about a real-world application?) And how can we spot the context?
### How to get the context of bad allocations
- Suppose we observe that the old space is continuously increasing.
- Reduce `--max-old-space-size` so that the total heap is closer to the limit.
- Run the program until you hit the out-of-memory error.
- The produced log shows the failing context.
- If it hits OOM, increase the heap size by ~10% and repeat a few times. If the same pattern is observed, it indicates a memory leak.
- If there is no OOM, then freeze the heap size at that value: a packed heap reduces memory footprint and computation latency.
For example, try to run `script.mjs` with the following command:

```bash
node --trace-gc --max-old-space-size=50 script.mjs
```
You should experience an OOM:
```
[...]

<--- Last few GCs --->

[40928:0x148008000]      509 ms: Mark-sweep 46.8 (65.8) -> 40.6 (77.3) MB, 6.4 / 0.0 ms  (+ 1.4 ms in 11 steps since start of marking, biggest step 0.2 ms, walltime since start of marking 24 ms) (average mu = 0.977, current mu = 0.977) finalize incrementa
[40928:0x148008000]      768 ms: Mark-sweep 56.3 (77.3) -> 47.1 (83.0) MB, 35.9 / 0.0 ms  (average mu = 0.927, current mu = 0.861) allocation failure scavenge might not succeed

<--- JS stacktrace --->

FATAL ERROR: Reached heap limit Allocation failed - JavaScript heap out of memory
[...]
```
Now, try it with 100 MB:

```bash
node --trace-gc --max-old-space-size=100 script.mjs
```
You should experience something similar; the only difference is that the last GC trace will contain a bigger heap size.
```
<--- Last few GCs --->

[40977:0x128008000]     2066 ms: Mark-sweep (reduce) 99.6 (102.5) -> 99.6 (102.5) MB, 46.7 / 0.0 ms  (+ 0.0 ms in 0 steps since start of marking, biggest step 0.0 ms, walltime since start of marking 47 ms) (average mu = 0.154, current mu = 0.155) allocati
[40977:0x128008000]     2123 ms: Mark-sweep (reduce) 99.6 (102.5) -> 99.6 (102.5) MB, 47.7 / 0.0 ms  (+ 0.0 ms in 0 steps since start of marking, biggest step 0.0 ms, walltime since start of marking 48 ms) (average mu = 0.165, current mu = 0.175) allocati
```
Note: in the context of a real application, it can be cumbersome to find the leaked object in the code. A heap snapshot can help you find it. Visit the guide dedicated to heap snapshots.
### Slowness
How do you assert whether too many garbage collections are happening, or whether they are causing an overhead?

- Review the trace data, specifically the time between consecutive collections.
- Review the trace data, specifically the time spent in GC.
- If the time between two GCs is less than the time spent in GC, the application is severely starving.
- If the time between two GCs and the time spent in GC are both very high, the application can probably use a smaller heap.
- If the time between two GCs is much greater than the time spent in GC, the application is relatively healthy.
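These heuristics can be sketched as a small helper. The `gcHealth` function and its 10x threshold are hypothetical choices for illustration; the inputs are `{ start, duration }` pairs in milliseconds, as you could parse from `--trace-gc` output:

```js
// Toy heuristic for the rules above (hypothetical helper, arbitrary
// threshold). Each event is { start, duration } in milliseconds.
function gcHealth(events) {
  let inGc = 0; // total time spent inside GC
  let betweenGc = 0; // total time spent running application code
  for (let i = 0; i < events.length; i++) {
    inGc += events[i].duration;
    if (i > 0) {
      betweenGc += events[i].start - (events[i - 1].start + events[i - 1].duration);
    }
  }
  if (betweenGc < inGc) return 'starving';
  return betweenGc > 10 * inGc ? 'healthy' : 'check heap size';
}

// Scavenges ~78 ms apart, each taking under 1 ms: the application is healthy.
console.log(gcHealth([
  { start: 2297, duration: 0.8 },
  { start: 2375, duration: 0.9 },
  { start: 2453, duration: 0.7 },
]));
// healthy
```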
## Fix the leak
Now let's fix the leak. Instead of using an object to store our entries, we could use a file.
Let's modify our script a bit:
```js
// script-fix.mjs
import fs from 'node:fs/promises';
import os from 'node:os';

let len = 1_000_000;
const fileName = `entries-${Date.now()}`;

async function addEntry() {
  const entry = {
    timestamp: Date.now(),
    memory: os.freemem(),
    totalMemory: os.totalmem(),
    uptime: os.uptime(),
  };

  await fs.appendFile(fileName, JSON.stringify(entry) + '\n');
}

async function summary() {
  const stats = await fs.lstat(fileName);
  console.log(`File size ${stats.size} bytes`);
}

// execution
(async () => {
  await fs.writeFile(fileName, '----START---\n');

  while (len > 0) {
    await addEntry();
    process.stdout.write(`~~> ${len} entries to record\r`);
    len--;
  }

  await summary();
})();
```
Using a `Set` to store data is not a bad practice at all; you should just be mindful of the memory footprint of your program.
Note: you can find the source code of this exercise in the Node.js Diagnostics repository.
Now, let's execute this script:

```bash
node --trace-gc script-fix.mjs
```
You should observe two things:
- `Mark-sweep` events appear less frequently
- the memory footprint doesn't exceed 25 MB, versus more than 130 MB with the first script
It makes a lot of sense as the new version puts less pressure on the memory than the first one.
Takeaway: what do you think about improving this script further? You probably noticed that the new version of the script is slow. What if we used a `Set` again and wrote its content to a file only when the memory reaches a specific size? The `getHeapStatistics()` API could help you.
## Bonus: Trace garbage collection programmatically
### Using the v8 module

You might want to avoid getting traces for the entire lifetime of your process. In that case, set the flag from within the process. The `v8` module exposes an API to set flags on the fly.
```js
import v8 from 'v8';

// enabling trace-gc
v8.setFlagsFromString('--trace-gc');

// disabling trace-gc
v8.setFlagsFromString('--notrace-gc');
```
### Using performance hooks
In Node.js, you can use performance hooks to trace garbage collection.
```js
const { PerformanceObserver } = require('node:perf_hooks');

// Create a performance observer
const obs = new PerformanceObserver(list => {
  const entry = list.getEntries()[0];
  /*
  The entry is an instance of PerformanceEntry containing
  metrics of a single garbage collection event.
  For example:
  PerformanceEntry {
    name: 'gc',
    entryType: 'gc',
    startTime: 2820.567669,
    duration: 1.315709,
    kind: 1
  }
  */
});

// Subscribe to notifications of GCs
obs.observe({ entryTypes: ['gc'] });

// Stop subscription
obs.disconnect();
```
### Examining a trace with performance hooks
You can get GC statistics as PerformanceEntry from the callback in PerformanceObserver.
For example:
```json
{
  "name": "gc",
  "entryType": "gc",
  "startTime": 2820.567669,
  "duration": 1.315709,
  "kind": 1
}
```
| Property | Interpretation |
| --- | --- |
| name | The name of the performance entry. |
| entryType | The type of the performance entry. |
| startTime | The high-resolution millisecond timestamp marking the starting time of the performance entry. |
| duration | The total number of milliseconds elapsed for this entry. |
| kind | The type of garbage collection operation that occurred. |
| flags | Additional information about the GC. |
For more information, you can refer to the documentation about performance hooks.