
Build an AI Chatbot with Next.js and Claude: Step-by-Step Guide

A complete guide to building a streaming AI chatbot with Next.js App Router and the Claude API — the API route, streaming, conversation history, and the chat UI.


You can build a working AI chatbot with Next.js and Claude in under 2 hours. The core stack: Next.js App Router API route for the backend, the Anthropic SDK for streaming responses, and a React component for the chat UI. This guide covers each piece with complete code — no boilerplate repos to clone, just the exact code that works.


What you'll build

A streaming chatbot with:

- Streaming, token-by-token responses from Claude
- Full conversation history sent with each request
- Error handling on both the server and the client
- A minimal Tailwind chat UI with auto-scroll and a loading state

Prerequisites

- Node.js 18+ or Bun (the commands below use Bun)
- An Anthropic API key
- Basic familiarity with React and the Next.js App Router

Step 1: Create the Next.js project

bunx create-next-app@latest claude-chatbot --typescript --tailwind --app --no-src-dir
cd claude-chatbot
bun add @anthropic-ai/sdk

Step 2: Set up the API key

Create .env.local:

ANTHROPIC_API_KEY=sk-ant-your-key-here

Add to .gitignore (already there if you used create-next-app, but verify):

.env.local
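
The SDK reads ANTHROPIC_API_KEY from the environment on its own, so no extra wiring is needed. If you want a clearer failure than a mid-request error, an optional fail-fast guard works; requireEnv below is a hypothetical helper, not part of the SDK:

```typescript
// Hypothetical fail-fast guard (not part of the Anthropic SDK):
// throw at startup instead of failing on the first request.
export function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Usage at the top of route.ts:
// const apiKey = requireEnv("ANTHROPIC_API_KEY");
// const client = new Anthropic({ apiKey });
```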

Step 3: Create the API route

Create app/api/chat/route.ts:

import Anthropic from "@anthropic-ai/sdk";
import { NextRequest } from "next/server";

const client = new Anthropic();

export const runtime = "nodejs"; // Not edge — Anthropic SDK requires Node.js

export async function POST(request: NextRequest) {
  try {
    const { messages, system } = await request.json();

    // Validate input
    if (!Array.isArray(messages) || messages.length === 0) {
      return new Response("Invalid messages", { status: 400 });
    }

    // Create streaming response
    const stream = await client.messages.stream({
      model: "claude-sonnet-4-5",
      max_tokens: 2048,
      system: system || "You are a helpful assistant.",
      messages: messages.map((m: { role: string; content: string }) => ({
        role: m.role as "user" | "assistant",
        content: m.content,
      })),
    });

    // Return a ReadableStream that pipes the text deltas
    const encoder = new TextEncoder();
    const readable = new ReadableStream({
      async start(controller) {
        try {
          for await (const event of stream) {
            if (
              event.type === "content_block_delta" &&
              event.delta.type === "text_delta"
            ) {
              controller.enqueue(encoder.encode(event.delta.text));
            }
          }
          // Close only on success; calling close() after error() throws
          controller.close();
        } catch (err) {
          controller.error(err);
        }
      },
    });

    return new Response(readable, {
      headers: {
        // Don't set Transfer-Encoding manually; the runtime handles
        // chunked encoding for streamed responses itself.
        "Content-Type": "text/plain; charset=utf-8",
        "Cache-Control": "no-cache",
      },
    });
  } catch (error) {
    if (error instanceof Anthropic.APIError) {
      return new Response(`API Error: ${error.message}`, {
        status: error.status ?? 500,
      });
    }
    return new Response("Internal server error", { status: 500 });
  }
}

Step 4: Create the chat hook

Create app/hooks/useChat.ts:

import { useState, useCallback } from "react";

export interface Message {
  id: string;
  role: "user" | "assistant";
  content: string;
}

export function useChat(systemPrompt?: string) {
  const [messages, setMessages] = useState<Message[]>([]);
  const [isLoading, setIsLoading] = useState(false);
  const [error, setError] = useState<string | null>(null);

  const sendMessage = useCallback(
    async (content: string) => {
      const userMessage: Message = {
        id: crypto.randomUUID(),
        role: "user",
        content,
      };

      // Add user message to state
      const updatedMessages = [...messages, userMessage];
      setMessages(updatedMessages);
      setIsLoading(true);
      setError(null);

      // Add placeholder for assistant response
      const assistantId = crypto.randomUUID();
      setMessages((prev) => [
        ...prev,
        { id: assistantId, role: "assistant", content: "" },
      ]);

      try {
        const response = await fetch("/api/chat", {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify({
            messages: updatedMessages.map((m) => ({
              role: m.role,
              content: m.content,
            })),
            system: systemPrompt,
          }),
        });

        if (!response.ok) {
          throw new Error(`API error: ${response.status}`);
        }

        // Stream the response
        const reader = response.body?.getReader();
        const decoder = new TextDecoder();

        if (!reader) throw new Error("No response body");

        while (true) {
          const { done, value } = await reader.read();
          if (done) break;

          const chunk = decoder.decode(value, { stream: true });
          setMessages((prev) =>
            prev.map((m) =>
              m.id === assistantId
                ? { ...m, content: m.content + chunk }
                : m
            )
          );
        }
      } catch (err) {
        setError(err instanceof Error ? err.message : "Unknown error");
        // Remove the placeholder on error
        setMessages((prev) => prev.filter((m) => m.id !== assistantId));
      } finally {
        setIsLoading(false);
      }
    },
    [messages, systemPrompt]
  );

  const clearMessages = useCallback(() => {
    setMessages([]);
    setError(null);
  }, []);

  return { messages, isLoading, error, sendMessage, clearMessages };
}

Step 5: Build the chat UI

Replace app/page.tsx:

"use client";

import { useState, useRef, useEffect } from "react";
import { useChat } from "./hooks/useChat";

export default function ChatPage() {
  const { messages, isLoading, error, sendMessage, clearMessages } = useChat(
    "You are a helpful assistant. Be concise and direct."
  );
  const [input, setInput] = useState("");
  const bottomRef = useRef<HTMLDivElement>(null);

  // Auto-scroll to bottom on new messages
  useEffect(() => {
    bottomRef.current?.scrollIntoView({ behavior: "smooth" });
  }, [messages]);

  const handleSubmit = async (e: React.FormEvent) => {
    e.preventDefault();
    const trimmed = input.trim();
    if (!trimmed || isLoading) return;
    setInput("");
    await sendMessage(trimmed);
  };

  return (
    <main className="flex flex-col h-screen max-w-2xl mx-auto p-4">
      <div className="flex justify-between items-center mb-4">
        <h1 className="text-xl font-bold">Claude Chat</h1>
        <button
          onClick={clearMessages}
          className="text-sm text-gray-500 hover:text-gray-800"
        >
          Clear
        </button>
      </div>

      {/* Messages */}
      <div className="flex-1 overflow-y-auto space-y-4 mb-4">
        {messages.length === 0 && (
          <p className="text-gray-400 text-center mt-20">
            Start a conversation...
          </p>
        )}
        {messages.map((message) => (
          <div
            key={message.id}
            className={`flex ${
              message.role === "user" ? "justify-end" : "justify-start"
            }`}
          >
            <div
              className={`max-w-[80%] rounded-2xl px-4 py-2 ${
                message.role === "user"
                  ? "bg-blue-600 text-white"
                  : "bg-gray-100 text-gray-900"
              }`}
            >
              {message.content || (
                <span className="animate-pulse">▌</span>
              )}
            </div>
          </div>
        ))}
        {error && (
          <div className="text-red-500 text-sm text-center">{error}</div>
        )}
        <div ref={bottomRef} />
      </div>

      {/* Input */}
      <form onSubmit={handleSubmit} className="flex gap-2">
        <input
          value={input}
          onChange={(e) => setInput(e.target.value)}
          placeholder="Type a message..."
          disabled={isLoading}
          className="flex-1 rounded-xl border px-4 py-2 focus:outline-none focus:ring-2 focus:ring-blue-500 disabled:opacity-50"
        />
        <button
          type="submit"
          disabled={isLoading || !input.trim()}
          className="bg-blue-600 text-white rounded-xl px-4 py-2 hover:bg-blue-700 disabled:opacity-50 disabled:cursor-not-allowed"
        >
          {isLoading ? "..." : "Send"}
        </button>
      </form>
    </main>
  );
}

Step 6: Run it

bun dev

Open http://localhost:3000. You have a working streaming chatbot.


Extensions

Add message persistence with localStorage

// In useChat.ts, add useEffect to the react import, then load saved
// messages after mount. Reading localStorage inside the useState
// initializer would cause a hydration mismatch: the server renders an
// empty list while the client renders the saved one.
const [messages, setMessages] = useState<Message[]>([]);

// Load once after mount
useEffect(() => {
  const saved = localStorage.getItem("chat-messages");
  if (saved) setMessages(JSON.parse(saved));
}, []);

// Save on change
useEffect(() => {
  localStorage.setItem("chat-messages", JSON.stringify(messages));
}, [messages]);

Limit conversation history to control costs

// In route.ts, only send the last 20 messages
const trimmedMessages = messages.slice(-20);
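
One caveat with a plain slice: the window can start on an assistant turn, and the Messages API expects the first message to have the user role. A small sketch that trims while keeping a user-first window (trimHistory is a hypothetical helper):

```typescript
type ChatMessage = { role: "user" | "assistant"; content: string };

// Hypothetical helper: keep at most `limit` recent messages, then drop
// any leading assistant turns so the window starts with a user message
// (the Messages API expects the first message to have the user role).
export function trimHistory(messages: ChatMessage[], limit = 20): ChatMessage[] {
  let window = messages.slice(-limit);
  while (window.length > 0 && window[0].role !== "user") {
    window = window.slice(1);
  }
  return window;
}
```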

Add markdown rendering

bun add react-markdown
import ReactMarkdown from "react-markdown";

// Replace the message content rendering. Recent react-markdown versions
// (v9+) no longer accept className, so wrap the component instead. The
// `prose` classes require the @tailwindcss/typography plugin.
<div className="prose prose-sm max-w-none">
  <ReactMarkdown>{message.content}</ReactMarkdown>
</div>

Frequently asked questions

Why runtime = "nodejs" instead of edge runtime? The Anthropic SDK uses Node.js-specific APIs that aren't available in the edge runtime. If you need edge deployment, use the raw fetch API to call the Anthropic API directly instead of the SDK.
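
As a sketch of that fetch-based approach, the request below targets the documented /v1/messages endpoint with the same shape as the SDK call in route.ts. buildAnthropicRequest is a hypothetical helper, and with stream: true you would also need to parse the SSE frames yourself:

```typescript
// Sketch only: calling the Messages API directly, no SDK (edge-compatible).
// buildAnthropicRequest is a hypothetical helper mirroring the route.ts call.
type ChatMessage = { role: "user" | "assistant"; content: string };

export function buildAnthropicRequest(
  messages: ChatMessage[],
  apiKey: string
): { url: string; init: { method: string; headers: Record<string, string>; body: string } } {
  return {
    url: "https://api.anthropic.com/v1/messages",
    init: {
      method: "POST",
      headers: {
        "x-api-key": apiKey,
        "anthropic-version": "2023-06-01", // required version header
        "content-type": "application/json",
      },
      body: JSON.stringify({
        model: "claude-sonnet-4-5",
        max_tokens: 2048,
        stream: true, // the response arrives as server-sent events
        messages,
      }),
    },
  };
}

// const { url, init } = buildAnthropicRequest(messages, apiKey);
// const response = await fetch(url, init);
```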

How do I add a system prompt that users can't change? Keep the system prompt in the API route, not in the client. The client sends messages but not the system prompt — the server always appends its own system prompt before calling Claude.
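
A minimal sketch of that server-owned pattern, assuming a SYSTEM_PROMPT constant in route.ts. buildServerRequest is a hypothetical helper that simply never reads a client-supplied system field:

```typescript
// Sketch: the server owns the system prompt. buildServerRequest is a
// hypothetical helper; any `system` field the client sends is never read.
const SYSTEM_PROMPT = "You are a helpful assistant. Be concise and direct.";

type ChatMessage = { role: "user" | "assistant"; content: string };

export function buildServerRequest(body: unknown): {
  system: string;
  messages: ChatMessage[];
} {
  const { messages } = body as { messages: ChatMessage[] };
  // The server-side prompt always wins, no matter what the client sent.
  return { system: SYSTEM_PROMPT, messages };
}
```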

Why does streaming sometimes show a delay before text appears? The most likely cause is proxy buffering: an intermediary collects the whole response before forwarding it. For Nginx-proxied deployments, add an X-Accel-Buffering: no response header to disable buffering. On Vercel, streaming works without extra configuration.

How do I add authentication so only logged-in users can chat? Add Clerk middleware to protect the /api/chat route. The API route checks for a valid session before calling the Anthropic API. See the Solo AI Builder Stack guide for the full auth setup.

How much does it cost to run this chatbot? At Claude Sonnet pricing ($3/M input + $15/M output), a typical 500-token exchange (300 input, 200 output) costs about $0.004. At 1,000 messages/day, that's $4/day. With prompt caching on a consistent system prompt, costs drop further.
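
The arithmetic as a small helper, using the Sonnet prices quoted above (these round to the $0.004 and $4/day figures):

```typescript
// Sonnet prices quoted above: $3 per million input tokens, $15 per million output
const INPUT_PRICE_PER_TOKEN = 3 / 1_000_000;
const OUTPUT_PRICE_PER_TOKEN = 15 / 1_000_000;

export function exchangeCostUSD(inputTokens: number, outputTokens: number): number {
  return inputTokens * INPUT_PRICE_PER_TOKEN + outputTokens * OUTPUT_PRICE_PER_TOKEN;
}

// A 300-input / 200-output exchange:
// 300 * $3/M + 200 * $15/M = $0.0009 + $0.0030 = $0.0039
```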


Take It Further

Claude Agent SDK Cookbook: 40 Production Patterns — Pattern 1 covers the complete Chatbot Architecture: multi-session management, persistent history in PostgreSQL, streaming with Server-Sent Events, rate limiting per user, and the authentication layer.

→ Get the Agent SDK Cookbook — $49

30-day money-back guarantee. Instant download.

AI Disclosure: Drafted with Claude Code; all code verified with Next.js 15 and Anthropic SDK 0.39+ as of April 2026.
