# Quickstart (Node.js)

## Prerequisites
- Node.js 18+ (Node 20 recommended)
- A package manager (pnpm, npm, or yarn)
- An OpenAI API key if you want to follow the adapter steps
## 1. Install the core packages
```bash
pnpm add @accordkit/tracer @accordkit/provider-openai openai
```
The tracer is framework-agnostic. You can drop it into an Express API, a background worker, or any environment that runs Node.
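For example, here is a minimal sketch of recording events from an Express route handler. The Express wiring is illustrative; the Tracer and FileSink setup mirrors the file-sink example in the next step.

```ts
// Illustrative sketch: a shared Tracer used inside an Express route handler.
// The Tracer/FileSink configuration matches the file-sink example below.
import express from 'express';
import { FileSink, Tracer } from '@accordkit/tracer';

const tracer = new Tracer({ sink: new FileSink({ filePath: './traces.jsonl' }) });

const app = express();
app.use(express.json());

app.post('/chat', async (req, res) => {
  // Record the incoming user message as a trace event.
  await tracer.message({ role: 'user', content: req.body.prompt });
  res.json({ ok: true });
});

app.listen(3000);
```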
## 2. Emit traces locally with the file sink
```ts
import { FileSink, Tracer } from '@accordkit/tracer';

const tracer = new Tracer({
  sink: new FileSink({ filePath: './traces.jsonl' }),
});

await tracer.message({
  role: 'user',
  content: 'Hello from AccordKit!',
});

console.log('Trace written to traces.jsonl');
```
Run the script and you will see newline-delimited JSON (JSONL) entries with span metadata and payloads. This is perfect for local debugging and for piping into the Viewer.
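If you want to inspect those entries programmatically rather than in the Viewer, a few lines of Node are enough. The exact fields on each event depend on the tracer's event model, so treat the shape as illustrative; this sketch only assumes one JSON object per line.

```ts
// Read traces.jsonl and print each entry. The field layout of an event is
// defined by the tracer's event model; this only assumes one JSON object
// per line of the file.
import { readFileSync } from 'node:fs';

const entries = readFileSync('./traces.jsonl', 'utf8')
  .split('\n')
  .filter((line) => line.trim().length > 0)
  .map((line) => JSON.parse(line));

for (const entry of entries) {
  console.log(entry);
}
```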
## 3. Forward events over HTTP
Switch to the HTTP sink when you want to deliver traces to a service or your own ingest endpoint.
```ts
import { HttpSink, Tracer } from '@accordkit/tracer';

const tracer = new Tracer({
  sink: new HttpSink({
    endpoint: 'https://example.com/accordkit/ingest',
    headers: { 'x-api-key': process.env.ACCORDKIT_API_KEY ?? '' },
  }),
});
```
The HTTP sink batches JSONL payloads, retries transient failures, and exposes `flush()` / `close()` for clean shutdowns. Use `resolveIngestEndpoint` if you adopt AccordKit’s default endpoint layout.
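To make sure batched events are delivered before the process exits, call those methods from a shutdown hook. A minimal sketch, assuming `flush()` and `close()` return promises; the endpoint and header are the same placeholders as above.

```ts
// Keep a reference to the sink so it can be flushed and closed on shutdown.
import { HttpSink, Tracer } from '@accordkit/tracer';

const sink = new HttpSink({
  endpoint: 'https://example.com/accordkit/ingest',
  headers: { 'x-api-key': process.env.ACCORDKIT_API_KEY ?? '' },
});
const tracer = new Tracer({ sink });

process.on('SIGTERM', async () => {
  await sink.flush(); // deliver any batched JSONL payloads
  await sink.close(); // then release the sink before exiting
  process.exit(0);
});
```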
## 4. Wrap the OpenAI SDK

`@accordkit/provider-openai` gives you a drop-in instrumentation layer for the official OpenAI client.
```ts
import OpenAI from 'openai';
import { Tracer, FileSink } from '@accordkit/tracer';
import { withOpenAI } from '@accordkit/provider-openai';

const tracer = new Tracer({ sink: new FileSink() });
const client = withOpenAI(
  new OpenAI({ apiKey: process.env.OPENAI_API_KEY }),
  tracer,
);

const completion = await client.chat.completions.create({
  model: 'gpt-4o',
  messages: [
    { role: 'system', content: 'You are AccordKit.' },
    { role: 'user', content: 'Say hi!' },
  ],
});

console.log(completion.choices[0]?.message);
```
Every request/response is captured as a trace with token usage, timing information, and error metadata.
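Errors surface in the traces too. Continuing from the wrapped client above, a failed request still produces a trace entry, so the try/catch below is only ordinary application error handling, not adapter-specific API.

```ts
// Continuing from the wrapped client above: failed requests are still
// traced, so this try/catch is only ordinary application error handling.
try {
  await client.chat.completions.create({
    model: 'gpt-4o',
    messages: [{ role: 'user', content: 'Say hi!' }],
  });
} catch (err) {
  // The request, timing, and error metadata are already in the trace output.
  console.error('OpenAI request failed:', err);
}
```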
## 5. Explore traces with the Viewer
Install the Viewer globally, then point it at the generated JSONL file:
```bash
pnpm add --global @accordkit/viewer
accordkit-viewer --input ./traces.jsonl
```
You will get a searchable UI with span timelines, streaming deltas, and prompt/response diffs.
## Next steps
- Learn more about the event model the tracer emits.
- Review available sinks to decide how to route data in each environment.
- Dive deeper into the OpenAI adapter for streaming and advanced options.