## Basic Setup

```typescript
import { initialize } from "@anyway-sh/node-server-sdk";

initialize({
  appName: "my-app",
});
```
## Collector Configuration

### Environment Variables (Recommended)

```bash
export ANYWAY_BASE_URL="https://collector.anyway.sh/"
export ANYWAY_HEADERS="Authorization=your-api-key"
```
### Direct Configuration

```typescript
import { initialize } from "@anyway-sh/node-server-sdk";

initialize({
  appName: "my-app",
  baseUrl: "https://collector.anyway.sh/",
  headers: { Authorization: "your-api-key" },
});
```
> **Warning:** Avoid hardcoding API keys in your source code. Use environment variables or a secrets manager.
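For example, the key can be read from the `ANYWAY_API_KEY` environment variable at startup (a minimal sketch; the fail-fast check is illustrative, not part of the SDK):

```typescript
// Read the API key from the environment so it never lands in source control.
const apiKey = process.env.ANYWAY_API_KEY ?? "";
if (!apiKey) {
  // Illustrative guard: fail fast when the key is missing.
  console.warn("ANYWAY_API_KEY is not set; traces will be rejected by the collector");
}

// Headers object to pass to initialize({ appName: "my-app", headers }).
const headers: Record<string, string> = { Authorization: apiKey };
```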
## Configuration Options

| Parameter | Type | Default | Description |
|---|---|---|---|
| `appName` | `string` | `npm_package_name` | Application name for trace grouping |
| `apiKey` | `string` | `ANYWAY_API_KEY` env var | API key, sent with a `Bearer` prefix. Use `headers` instead for raw key auth |
| `baseUrl` | `string` | `ANYWAY_BASE_URL` or `https://collector.anyway.sh` | Collector endpoint URL |
| `disableBatch` | `boolean` | `false` | Send spans immediately (useful for debugging) |
| `logLevel` | `"debug" \| "info" \| "warn" \| "error"` | — | SDK and instrumentation log level |
| `exporter` | `SpanExporter` | OTLP exporter | Custom OpenTelemetry span exporter |
| `processor` | `SpanProcessor` | `BatchSpanProcessor` | Custom OpenTelemetry span processor |
| `headers` | `Record<string, string>` | — | Headers sent with trace data |
| `instrumentModules` | `object` | — | Explicit module references for ESM projects |
| `tracingEnabled` | `boolean` | `true` | Enable/disable tracing entirely |
| `pricingEnabled` | `boolean` | `true` | Calculate and attach cost attributes to spans |
| `pricingJsonPath` | `string` | — | Path to a custom pricing JSON file |
| `silenceInitializationMessage` | `boolean` | `false` | Suppress the startup console message |
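As an illustration, a local-debugging configuration might combine `disableBatch` and `logLevel` from the table above (a sketch; the option values shown are examples, not recommended production settings):

```typescript
// Illustrative debug settings: export each span as it ends and log verbosely.
const debugOptions = {
  appName: "my-app",
  disableBatch: true, // skip batching so spans reach the collector immediately
  logLevel: "debug" as const, // most verbose SDK/instrumentation logging
};

// Then pass the options to the SDK:
// initialize(debugOptions);
```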
## ESM / Next.js

ESM projects cannot use OpenTelemetry's require-based auto-instrumentation. Pass provider modules explicitly:

```typescript
import { initialize } from "@anyway-sh/node-server-sdk";
import OpenAI from "openai";
import * as anthropic from "@anthropic-ai/sdk";

initialize({
  appName: "my-app",
  baseUrl: "https://collector.anyway.sh/",
  headers: { Authorization: "your-api-key" },
  instrumentModules: {
    openAI: OpenAI,
    anthropic: anthropic,
  },
});
```
### Available instrumentModules

| Key | Type | Provider |
|---|---|---|
| `openAI` | `typeof OpenAI` | OpenAI |
| `anthropic` | `typeof anthropic` | Anthropic |
| `cohere` | `typeof cohere` | Cohere |
| `bedrock` | `typeof bedrock` | AWS Bedrock |
| `google_vertexai` | `typeof vertexai` | Google Vertex AI |
| `google_aiplatform` | `typeof aiplatform` | Google AI Platform |
| `pinecone` | `typeof pinecone` | Pinecone |
| `chromadb` | `typeof chromadb` | ChromaDB |
| `qdrant` | `typeof qdrant` | Qdrant |
| `together` | `typeof Together` | Together AI |
| `llamaIndex` | `typeof llamaindex` | LlamaIndex |
| `mcp` | `typeof mcp` | MCP (Model Context Protocol) |
## Custom Exporters

For advanced use cases, provide a custom OpenTelemetry exporter:

```typescript
import { initialize } from "@anyway-sh/node-server-sdk";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";

const customExporter = new OTLPTraceExporter({
  url: "https://your-collector:4318/v1/traces",
  headers: { Authorization: "Bearer your-key" },
});

initialize({
  appName: "my-app",
  exporter: customExporter,
});
```
## Custom Pricing

Provide your own pricing data to calculate costs for models not included in the default pricing:

```typescript
initialize({
  appName: "my-app",
  pricingJsonPath: "./pricing.json",
});
```

To disable cost calculation:

```typescript
initialize({
  appName: "my-app",
  pricingEnabled: false,
});
```
The pricing file maps model names to per-token prices:

```json
{
  "chat": {
    "model-name": {
      "promptPrice": 0.00015,
      "completionPrice": 0.0006
    }
  }
}
```
- `promptPrice`: cost per 1K input tokens (USD)
- `completionPrice`: cost per 1K output tokens (USD)
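To make the units concrete, a hypothetical helper (not part of the SDK, which attaches cost attributes automatically when `pricingEnabled` is true) shows how these per-1K-token prices combine into a call cost:

```typescript
// Hypothetical cost estimate from per-1K-token prices.
function estimateCost(
  promptTokens: number,
  completionTokens: number,
  promptPrice: number, // USD per 1K input tokens
  completionPrice: number, // USD per 1K output tokens
): number {
  return (promptTokens / 1000) * promptPrice + (completionTokens / 1000) * completionPrice;
}

// 2,000 input tokens and 500 output tokens at the example prices above:
const cost = estimateCost(2000, 500, 0.00015, 0.0006);
// (2000/1000) * 0.00015 + (500/1000) * 0.0006 ≈ 0.0006 USD
```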
## Next Steps

- **Tracing LLM Calls**: trace OpenAI and Anthropic API calls