Basic Setup

import { initialize } from "@anyway-sh/node-server-sdk";

initialize({
  appName: "my-app",
});

Collector Configuration

The SDK can read collector settings from environment variables:

export ANYWAY_BASE_URL="https://collector.anyway.sh/"
export ANYWAY_HEADERS="Authorization=your-api-key"

Direct Configuration

import { initialize } from "@anyway-sh/node-server-sdk";

initialize({
  appName: "my-app",
  baseUrl: "https://collector.anyway.sh/",
  headers: { Authorization: "your-api-key" },
});
Avoid hardcoding API keys in your source code. Use environment variables or a secrets manager.
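For example, the raw key can be loaded from an environment variable at startup. A minimal sketch — the variable name `COLLECTOR_API_KEY` is illustrative, not one the SDK reads itself:

```typescript
import { initialize } from "@anyway-sh/node-server-sdk";

// Read the key from the environment instead of committing it to source.
// COLLECTOR_API_KEY is an example name; pick whatever your deployment uses.
const apiKey = process.env.COLLECTOR_API_KEY;
if (!apiKey) {
  throw new Error("COLLECTOR_API_KEY is not set");
}

initialize({
  appName: "my-app",
  baseUrl: "https://collector.anyway.sh/",
  headers: { Authorization: apiKey },
});
```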

Configuration Options

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `appName` | `string` | `npm_package_name` | Application name for trace grouping |
| `apiKey` | `string` | `ANYWAY_API_KEY` env var | API key, sent with a `Bearer` prefix; use `headers` instead for raw-key auth |
| `baseUrl` | `string` | `ANYWAY_BASE_URL` or `https://collector.anyway.sh` | Collector endpoint URL |
| `disableBatch` | `boolean` | `false` | Send spans immediately (useful for debugging) |
| `logLevel` | `"debug" \| "info" \| "warn" \| "error"` | | SDK and instrumentation log level |
| `exporter` | `SpanExporter` | OTLP exporter | Custom OpenTelemetry span exporter |
| `processor` | `SpanProcessor` | `BatchSpanProcessor` | Custom OpenTelemetry span processor |
| `headers` | `Record<string, string>` | | Headers sent with trace data |
| `instrumentModules` | `object` | | Explicit module references for ESM projects |
| `tracingEnabled` | `boolean` | `true` | Enable/disable tracing entirely |
| `pricingEnabled` | `boolean` | `true` | Calculate and attach cost attributes to spans |
| `pricingJsonPath` | `string` | | Path to a custom pricing JSON file |
| `silenceInitializationMessage` | `boolean` | `false` | Suppress the startup console message |
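As an example, a local debugging setup might combine a few of these options. A sketch using the documented option names:

```typescript
import { initialize } from "@anyway-sh/node-server-sdk";

// Verbose local setup: export each span immediately and log at debug level.
initialize({
  appName: "my-app",
  disableBatch: true, // skip batching so spans appear right away
  logLevel: "debug",  // verbose SDK and instrumentation logging
});
```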

ESM / Next.js

ESM projects cannot use OpenTelemetry’s require-based auto-instrumentation. Pass provider modules explicitly:
import { initialize } from "@anyway-sh/node-server-sdk";
import OpenAI from "openai";
import * as anthropic from "@anthropic-ai/sdk";

initialize({
  appName: "my-app",
  baseUrl: "https://collector.anyway.sh/",
  headers: { Authorization: "your-api-key" },
  instrumentModules: {
    openAI: OpenAI,
    anthropic: anthropic,
  },
});

Available instrumentModules

| Key | Type | Provider |
| --- | --- | --- |
| `openAI` | `typeof OpenAI` | OpenAI |
| `anthropic` | `typeof anthropic` | Anthropic |
| `cohere` | `typeof cohere` | Cohere |
| `bedrock` | `typeof bedrock` | AWS Bedrock |
| `google_vertexai` | `typeof vertexai` | Google Vertex AI |
| `google_aiplatform` | `typeof aiplatform` | Google AI Platform |
| `pinecone` | `typeof pinecone` | Pinecone |
| `chromadb` | `typeof chromadb` | ChromaDB |
| `qdrant` | `typeof qdrant` | Qdrant |
| `together` | `typeof Together` | Together AI |
| `llamaIndex` | `typeof llamaindex` | LlamaIndex |
| `mcp` | `typeof mcp` | MCP (Model Context Protocol) |

Custom Exporters

For advanced use cases, provide a custom OpenTelemetry exporter:
import { initialize } from "@anyway-sh/node-server-sdk";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";

const customExporter = new OTLPTraceExporter({
  url: "https://your-collector:4318/v1/traces",
  headers: { Authorization: "Bearer your-key" },
});

initialize({
  appName: "my-app",
  exporter: customExporter,
});
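The same pattern applies to the `processor` option from the configuration table. A sketch that tunes OpenTelemetry's `BatchSpanProcessor` — the batching values are illustrative, not defaults:

```typescript
import { initialize } from "@anyway-sh/node-server-sdk";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";

// Batch more spans per export request and export less often.
const processor = new BatchSpanProcessor(
  new OTLPTraceExporter({ url: "https://your-collector:4318/v1/traces" }),
  {
    maxExportBatchSize: 512,      // spans per export request
    scheduledDelayMillis: 10_000, // wait up to 10s between exports
  }
);

initialize({
  appName: "my-app",
  processor,
});
```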

Custom Pricing

Provide your own pricing data to calculate costs for models not covered by the default pricing table:
initialize({
  appName: "my-app",
  pricingJsonPath: "./pricing.json",
});
To disable cost calculation:
initialize({
  appName: "my-app",
  pricingEnabled: false,
});

Pricing JSON Format

{
  "chat": {
    "model-name": {
      "promptPrice": 0.00015,
      "completionPrice": 0.0006
    }
  }
}
  • promptPrice: Cost per 1K input tokens (USD)
  • completionPrice: Cost per 1K output tokens (USD)
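Under this format, a span's cost is (prompt tokens / 1000) × `promptPrice` plus (completion tokens / 1000) × `completionPrice`. A minimal sketch of that arithmetic — the helper below is illustrative, not part of the SDK:

```typescript
// Shape of one model entry in the pricing JSON above.
interface ModelPricing {
  promptPrice: number;     // USD per 1K input tokens
  completionPrice: number; // USD per 1K output tokens
}

// Hypothetical helper mirroring the per-1K-token cost calculation.
function calculateCost(
  pricing: ModelPricing,
  promptTokens: number,
  completionTokens: number
): number {
  return (
    (promptTokens / 1000) * pricing.promptPrice +
    (completionTokens / 1000) * pricing.completionPrice
  );
}

// "model-name" from the JSON above: 2,000 input + 500 output tokens
const cost = calculateCost(
  { promptPrice: 0.00015, completionPrice: 0.0006 },
  2000,
  500
);
// 2 * 0.00015 + 0.5 * 0.0006 = 0.0006 USD
```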

Next Steps

Tracing LLM Calls

Trace OpenAI and Anthropic API calls