
Claude API TypeScript Tutorial: Complete Guide with Node.js Examples

Complete TypeScript/Node.js tutorial for the Claude API — SDK installation, type-safe messages, streaming, tool use, and production patterns with working code examples.

The Anthropic TypeScript SDK (@anthropic-ai/sdk) provides full type coverage for every API surface — MessageParam, Message, ContentBlock, streaming events, and tool use schemas — so you catch malformed requests at compile time rather than at runtime. Install with npm install @anthropic-ai/sdk, set ANTHROPIC_API_KEY in your environment, and you have a strongly-typed client that works in Node.js, Deno, and Next.js edge or server runtimes.


How do I install the Anthropic TypeScript SDK?

npm install @anthropic-ai/sdk
# optional: loads environment variables from a .env file
npm install dotenv

The SDK ships its own type declarations — no @types/ package needed. Your tsconfig.json needs moduleResolution set to node16 or bundler to resolve the package's subpath exports correctly:

{
  "compilerOptions": {
    "target": "ES2022",
    "module": "NodeNext",
    "moduleResolution": "node16",
    "strict": true,
    "outDir": "dist"
  }
}

If you are on an older TypeScript project using "moduleResolution": "node", upgrade to node16 or bundler. The subpath imports used throughout this guide (such as @anthropic-ai/sdk/resources) may not resolve under the legacy resolver, which surfaces as missing-type errors.


How do I make a basic API call with TypeScript types?

Import the SDK's named types alongside the default client. The key types you will use on every call are MessageParam (for input) and Message (for the response):

import Anthropic from "@anthropic-ai/sdk";
import type { MessageParam, Message, ContentBlock } from "@anthropic-ai/sdk/resources";
import * as dotenv from "dotenv";

dotenv.config();

const client = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY, // defaults to env var if omitted
});

async function basicCall(): Promise<void> {
  const messages: MessageParam[] = [
    { role: "user", content: "What is the capital of France?" },
  ];

  const response: Message = await client.messages.create({
    model: "claude-sonnet-4-5",
    max_tokens: 256,
    messages,
  });

  // ContentBlock is a discriminated union: TextBlock | ToolUseBlock | ...
  const firstBlock: ContentBlock = response.content[0];

  if (firstBlock.type === "text") {
    console.log(firstBlock.text); // TypeScript knows .text exists here
  }
}

basicCall();

The discriminated union on ContentBlock is the pattern you will use everywhere — TypeScript narrows the type once you check .type.
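
For example, a small helper (the name textOf is ours, not part of the SDK) can pull all of the text out of a response without any casts:

import type { Message } from "@anthropic-ai/sdk/resources";

// Concatenate the text from every text block; other block types contribute nothing.
function textOf(response: Message): string {
  return response.content
    .map((block) => (block.type === "text" ? block.text : ""))
    .join("");
}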


How do I add a system prompt?

Pass system as a top-level string alongside messages. It acts as privileged instructions for the whole request; include it on every call if you want it to apply across a multi-turn conversation:

const response: Message = await client.messages.create({
  model: "claude-sonnet-4-5",
  max_tokens: 512,
  system: "You are a concise technical writer. Answer in plain English, no jargon.",
  messages: [{ role: "user", content: "Explain TCP three-way handshake." }],
});

How do I implement multi-turn conversations in TypeScript?

Maintain a typed MessageParam[] array and push both user messages and assistant responses before each call. The role field is typed as the literal union "user" | "assistant", so a stray "system" role in the array is a compile-time error, but message ordering itself is validated by the API at request time rather than by the compiler:

async function multiTurnChat(): Promise<void> {
  const history: MessageParam[] = [];

  const turns = [
    "I want to learn Rust. Where do I start?",
    "Which of those resources is best for someone who already knows TypeScript?",
    "Give me a concrete first project.",
  ];

  for (const userText of turns) {
    history.push({ role: "user", content: userText });

    const response: Message = await client.messages.create({
      model: "claude-sonnet-4-5",
      max_tokens: 512,
      messages: history,
    });

    const assistantText =
      response.content[0].type === "text" ? response.content[0].text : "";

    history.push({ role: "assistant", content: assistantText });

    console.log(`User: ${userText}`);
    console.log(`Claude: ${assistantText}\n`);
  }
}

How do I stream responses in TypeScript?

Use client.messages.stream() and iterate over the returned async iterable with for await. Each iteration yields a typed MessageStreamEvent discriminated union:

import type { MessageStreamEvent } from "@anthropic-ai/sdk/resources";

async function streamResponse(): Promise<void> {
  const stream = client.messages.stream({
    model: "claude-sonnet-4-5",
    max_tokens: 1024,
    messages: [{ role: "user", content: "Write a short poem about compilers." }],
  });

  process.stdout.write("Claude: ");

  for await (const event of stream) {
    if (
      event.type === "content_block_delta" &&
      event.delta.type === "text_delta"
    ) {
      process.stdout.write(event.delta.text);
    }
  }

  console.log(); // newline after stream ends

  // The final Message is available after iteration completes
  const finalMessage = await stream.finalMessage();
  console.log("Stop reason:", finalMessage.stop_reason);
}

stream.finalMessage() resolves after the stream closes and gives you the full assembled Message object with token counts — useful for logging.


How do I implement tool use in TypeScript?

Define tools using the SDK's Tool type, handle tool_use blocks in the response, and loop until stop_reason is "end_turn":

import type { Tool, MessageParam, ToolResultBlockParam } from "@anthropic-ai/sdk/resources";

// Typed tool definition
const tools: Tool[] = [
  {
    name: "get_weather",
    description: "Returns current temperature and conditions for a city.",
    input_schema: {
      type: "object",
      properties: {
        city: { type: "string", description: "City name, e.g. Seoul" },
        unit: { type: "string", enum: ["celsius", "fahrenheit"] },
      },
      required: ["city"],
    },
  },
];

// Stub implementation — replace with a real weather API
function getWeather(city: string, unit = "celsius"): string {
  return JSON.stringify({ city, temperature: 22, unit, conditions: "Sunny" });
}

async function toolUseLoop(): Promise<void> {
  const messages: MessageParam[] = [
    { role: "user", content: "What is the weather in Seoul right now?" },
  ];

  while (true) {
    const response = await client.messages.create({
      model: "claude-sonnet-4-5",
      max_tokens: 1024,
      tools,
      messages,
    });

    // Add Claude's response to history
    messages.push({ role: "assistant", content: response.content });

    if (response.stop_reason === "end_turn") {
      const finalBlock = response.content.find((b) => b.type === "text");
      if (finalBlock?.type === "text") console.log(finalBlock.text);
      break;
    }

    if (response.stop_reason === "tool_use") {
      const toolResults: MessageParam["content"] = [];

      for (const block of response.content) {
        if (block.type !== "tool_use") continue; // narrows block to ToolUseBlock

        const input = block.input as { city: string; unit?: string };
        const result = getWeather(input.city, input.unit);

        toolResults.push({
          type: "tool_result",
          tool_use_id: block.id,
          content: result,
        });
      }

      messages.push({ role: "user", content: toolResults });
    }
  }
}

The loop terminates only when stop_reason is "end_turn" — Claude may call multiple tools in sequence before giving a final answer.
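
If you want a safety net against a runaway loop, one option is to bound the number of round-trips. A minimal sketch — MAX_TOOL_TURNS is our own constant, not an SDK setting:

// Same loop as above, but bounded instead of while (true).
const MAX_TOOL_TURNS = 10;

for (let turn = 0; turn < MAX_TOOL_TURNS; turn++) {
  const response = await client.messages.create({
    model: "claude-sonnet-4-5",
    max_tokens: 1024,
    tools,
    messages,
  });

  messages.push({ role: "assistant", content: response.content });

  if (response.stop_reason !== "tool_use") break; // end_turn, max_tokens, etc.

  // ...execute the requested tools and push the tool_result message, as shown above...
}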


How do I use prompt caching in TypeScript?

The cache_control property is part of the standard TextBlockParam type, so you can attach it directly to a system content block — pass system as an array of text blocks instead of a plain string:

import type { TextBlockParam } from "@anthropic-ai/sdk/resources";

const systemWithCache: TextBlockParam[] = [
  {
    type: "text",
    text:
      "You are an expert on the Rust programming language...\n" +
      // ... large static context document (ideally 1024+ tokens) ...
      "[END OF STATIC CONTEXT]",
    cache_control: { type: "ephemeral" },
  },
];

const response = await client.messages.create({
  model: "claude-sonnet-4-5",
  max_tokens: 512,
  system: systemWithCache,
  messages: [{ role: "user", content: "What are Rust lifetimes?" }],
});

// Cache usage is reported in response.usage:
// cache_creation_input_tokens on the first call, cache_read_input_tokens on later hits
console.log(response.usage);

Cache hits reduce input token costs by up to 90%. Mark stable, large content — system prompts, reference documents, tool schemas — with cache_control, and send the cached prefix byte-for-byte identical on every call: the cache matches on exact prefix content, so reusing the same array across requests is an easy way to guarantee a match.
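
To verify a hit, reuse systemWithCache in a second request and inspect the cache fields on usage — cache_creation_input_tokens and cache_read_input_tokens are returned by the Messages API when caching applies:

const second = await client.messages.create({
  model: "claude-sonnet-4-5",
  max_tokens: 512,
  system: systemWithCache,
  messages: [{ role: "user", content: "Explain the borrow checker." }],
});

// First call: cache_creation_input_tokens > 0 (the prefix was written to the cache).
// Second call: cache_read_input_tokens > 0 (the prefix was read back at the discounted rate).
console.log(second.usage.cache_read_input_tokens);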


How do I handle errors from the Anthropic SDK in TypeScript?

Wrap calls in try/catch. The SDK exposes APIError and its subclasses as statics on the Anthropic class — all carry a typed status and message:

import Anthropic from "@anthropic-ai/sdk";

async function safeCall(): Promise<string | null> {
  try {
    const response = await client.messages.create({
      model: "claude-sonnet-4-5",
      max_tokens: 256,
      messages: [{ role: "user", content: "Hello" }],
    });

    return response.content[0].type === "text" ? response.content[0].text : null;
  } catch (err) {
    if (err instanceof Anthropic.AuthenticationError) {
      console.error("Invalid API key — check ANTHROPIC_API_KEY");
    } else if (err instanceof Anthropic.RateLimitError) {
      console.error("Rate limit hit — back off and retry");
    } else if (err instanceof Anthropic.BadRequestError) {
      console.error("Malformed request:", err.message);
    } else if (err instanceof Anthropic.APIError) {
      console.error(`API error ${err.status}:`, err.message);
    } else {
      throw err; // rethrow unknown errors
    }
    return null;
  }
}

For production services, pair RateLimitError handling with exponential backoff. The SDK already retries connection errors, request timeouts (408), rate limits (429), and 5xx responses twice by default with a short exponential backoff — adjust maxRetries and timeout in the constructor to tune this:

const client = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY,
  maxRetries: 3, // retries on 429 and 5xx with exponential backoff
  timeout: 30_000, // 30 s request timeout
});
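
If you need backoff outside the client's built-in retries — for example around a batch job that keeps hitting the limit — a hand-rolled wrapper is straightforward. A minimal sketch; withBackoff and the delay schedule are our own choices, not SDK features:

// Retry only rate-limit errors, doubling the wait on each attempt.
async function withBackoff<T>(fn: () => Promise<T>, attempts = 5): Promise<T> {
  for (let i = 0; ; i++) {
    try {
      return await fn();
    } catch (err) {
      if (!(err instanceof Anthropic.RateLimitError) || i >= attempts - 1) throw err;
      const delayMs = 1_000 * 2 ** i; // 1 s, 2 s, 4 s, ...
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}

const response = await withBackoff(() =>
  client.messages.create({
    model: "claude-sonnet-4-5",
    max_tokens: 256,
    messages: [{ role: "user", content: "Hello" }],
  })
);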

How do I build a streaming API route in Next.js?

Use the App Router's route.ts convention with the Node.js runtime. Return a ReadableStream with Content-Type: text/event-stream for SSE-compatible clients:

// app/api/chat/route.ts
import Anthropic from "@anthropic-ai/sdk";
import { NextRequest } from "next/server";

export const runtime = "nodejs";

const client = new Anthropic();

export async function POST(req: NextRequest) {
  const { messages } = await req.json();

  const stream = client.messages.stream({
    model: "claude-sonnet-4-5",
    max_tokens: 1024,
    messages,
  });

  const readable = new ReadableStream({
    async start(controller) {
      for await (const event of stream) {
        if (
          event.type === "content_block_delta" &&
          event.delta.type === "text_delta"
        ) {
          controller.enqueue(
            new TextEncoder().encode(`data: ${JSON.stringify({ text: event.delta.text })}\n\n`)
          );
        }
      }
      controller.close();
    },
  });

  return new Response(readable, {
    headers: { "Content-Type": "text/event-stream" },
  });
}

On the client side, read the stream with fetch and the Web Streams API:

const res = await fetch("/api/chat", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ messages: [{ role: "user", content: prompt }] }),
});

const reader = res.body!.getReader();
const decoder = new TextDecoder();

while (true) {
  const { done, value } = await reader.read();
  if (done) break;

  const lines = decoder.decode(value, { stream: true }).split("\n\n").filter(Boolean);
  for (const line of lines) {
    if (line.startsWith("data: ")) {
      const { text } = JSON.parse(line.slice(6));
      setOutput((prev) => prev + text); // React state update
    }
  }
}

Frequently asked questions

Does the Anthropic TypeScript SDK work in the browser? Not directly — you should never expose your ANTHROPIC_API_KEY in client-side code. Route all Claude API calls through a server-side endpoint (Next.js route handler, Express, Fastify) that holds the key in an environment variable.
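
A minimal Express sketch of such an endpoint — the route path and port are arbitrary choices:

import express from "express";
import Anthropic from "@anthropic-ai/sdk";

const app = express();
app.use(express.json());

const client = new Anthropic(); // reads ANTHROPIC_API_KEY from the server environment

app.post("/api/chat", async (req, res) => {
  const response = await client.messages.create({
    model: "claude-sonnet-4-5",
    max_tokens: 512,
    messages: req.body.messages,
  });
  res.json(response);
});

app.listen(3000);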

Which TypeScript version does @anthropic-ai/sdk require? TypeScript 4.7 or later is needed for the node16 and bundler module resolution modes, and TypeScript 5.x is recommended — older compilers cannot resolve the SDK's subpath exports and may reject its newer type declarations.

How do I type the messages array from user input safely? Import MessageParam and validate the role field before inserting user-supplied data. A minimal check: if (role !== "user" && role !== "assistant") throw new Error(...). For production, pair with a schema validator like Zod.
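
A minimal sketch with Zod — the schema and helper name are illustrative, not part of the SDK:

import { z } from "zod";
import type { MessageParam } from "@anthropic-ai/sdk/resources";

// Accept only the two roles the Messages API allows and a non-empty string body.
const IncomingMessage = z.object({
  role: z.enum(["user", "assistant"]),
  content: z.string().min(1),
});

function toMessageParam(raw: unknown): MessageParam {
  return IncomingMessage.parse(raw); // throws ZodError on malformed input
}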

Can I use async/await instead of the streaming for await loop? Yes. client.messages.create() returns a standard Promise for the complete response. Use streaming only when you want to display text incrementally as it is generated; use the promise form when you need the full response before proceeding.

What is the difference between stream() and create() with stream: true? client.messages.stream() returns a MessageStream helper with convenience methods (finalMessage(), finalText(), an on() event emitter). client.messages.create({ stream: true }) returns the raw SSE event async iterable. Prefer stream() in application code — it is easier to work with and fully typed.
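
For example, the event-emitter form of the helper removes the manual for await loop:

const stream = client.messages
  .stream({
    model: "claude-sonnet-4-5",
    max_tokens: 512,
    messages: [{ role: "user", content: "Summarise the HTTP/2 spec in one paragraph." }],
  })
  .on("text", (text) => process.stdout.write(text));

const finalMessage = await stream.finalMessage();
console.log("\nOutput tokens:", finalMessage.usage.output_tokens);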




Take It Further

Claude Agent SDK Cookbook: 40 Production Patterns — TypeScript patterns for the full SDK: type-safe tool use chains, streaming React hooks, multi-agent coordination with typed message passing, Next.js integration patterns, and the error handling middleware that handles rate limits gracefully in production.

→ Get the Agent SDK Cookbook — $49

30-day money-back guarantee. Instant download.

AI Disclosure: Drafted with Claude Code; all TypeScript examples tested with @anthropic-ai/sdk and Node.js 20 as of April 2026.
