
⚛️ Frontend Streaming Updates

Now let’s update your React frontend to handle streaming responses. You’ll create a new component that connects to your streaming backend and displays text as it arrives.


Create src/StreamingChat.jsx and start with the imports and basic state:

import { useState, useRef } from "react";
import { Send, Bot, User } from "lucide-react";

function StreamingChat() {
  const [messages, setMessages] = useState([]);
  const [input, setInput] = useState("");
  const [isStreaming, setIsStreaming] = useState(false);
  const abortControllerRef = useRef(null);

  // You'll add functions here next
  return <div>Streaming chat coming soon...</div>;
}

export default StreamingChat;

What you’ve set up:

  • messages - Array to store all chat messages
  • input - Current text in the input field
  • isStreaming - Boolean to track if you’re receiving a stream
  • abortControllerRef - Way to cancel streams if needed

Step 2: Create the AI Message Placeholder Function


Add this function inside your component:

const createAiPlaceholder = () => {
  const aiMessageId = Date.now() + 1;
  const aiMessage = {
    text: "",
    isUser: false,
    id: aiMessageId,
    isStreaming: true,
  };
  setMessages((prev) => [...prev, aiMessage]);
  return aiMessageId;
};

Why you need this:

  • Creates an empty AI message immediately when user sends
  • Users see the AI “thinking” right away
  • Returns the ID so you can update this specific message later

Add this function to handle the actual streaming:

const readStream = async (response, aiMessageId) => {
  const reader = response.body.getReader();
  const decoder = new TextDecoder();

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;

    const chunk = decoder.decode(value, { stream: true });

    // Update the AI message with new content
    setMessages((prev) =>
      prev.map((msg) =>
        msg.id === aiMessageId ? { ...msg, text: msg.text + chunk } : msg
      )
    );
  }
};

How this works:

  • getReader() - Gets a stream reader from the response
  • TextDecoder() - Converts binary data to text (see the note after this list)
  • Loop reads each chunk and immediately updates the UI
  • Each chunk gets appended to the existing text
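
One detail worth calling out is the { stream: true } option. Network chunks can split a multi-byte UTF-8 character in half, and the streaming option tells the decoder to hold the partial bytes until the next chunk arrives instead of emitting a garbled character. Here’s a small standalone illustration (not part of the component):

// Standalone illustration of { stream: true } with a character split across chunks
const decoder = new TextDecoder();
const bytes = new TextEncoder().encode("héllo"); // "é" is two bytes in UTF-8
const firstChunk = bytes.slice(0, 2); // ends in the middle of "é"
const secondChunk = bytes.slice(2);

let text = "";
text += decoder.decode(firstChunk, { stream: true }); // "h" ("é" is held back)
text += decoder.decode(secondChunk, { stream: true }); // "éllo"
console.log(text); // "héllo"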

Now add the main function that ties everything together. This is the heart of your streaming chat:

const sendMessage = async () => {
  // Guard clause - don't send if input is empty or already streaming
  if (!input.trim() || isStreaming) return;

  // Add user message to chat immediately
  const userMessage = { text: input, isUser: true, id: Date.now() };
  setMessages((prev) => [...prev, userMessage]);

  // Store input and clear the field
  const currentInput = input;
  setInput("");
  setIsStreaming(true);

  // Create AI placeholder message
  const aiMessageId = createAiPlaceholder();

  try {
    // Create abort controller for canceling requests
    abortControllerRef.current = new AbortController();

    // Make streaming request to backend
    const response = await fetch("http://localhost:8000/api/chat/stream", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ message: currentInput }),
      signal: abortControllerRef.current.signal,
    });

    if (!response.ok) {
      throw new Error("Failed to get response");
    }

    // Read the stream and update UI
    await readStream(response, aiMessageId);

    // Mark streaming as complete
    setMessages((prev) =>
      prev.map((msg) =>
        msg.id === aiMessageId ? { ...msg, isStreaming: false } : msg
      )
    );
  } catch (error) {
    // Handle different types of errors
    if (error.name === "AbortError") {
      console.log("Request was cancelled");
    } else {
      console.error("Streaming error:", error);
      setMessages((prev) =>
        prev.map((msg) =>
          msg.id === aiMessageId
            ? {
                ...msg,
                text: "Sorry, something went wrong.",
                isStreaming: false,
              }
            : msg
        )
      );
    }
  } finally {
    // Always clean up, regardless of success or failure
    setIsStreaming(false);
    abortControllerRef.current = null;
  }
};

Let’s break down each section:

if (!input.trim() || isStreaming) return;
const userMessage = { text: input, isUser: true, id: Date.now() };
setMessages((prev) => [...prev, userMessage]);

What happens here:

  • Check if input is empty or you’re already streaming (prevent double-sends)
  • Create user message object with unique ID
  • Add user message to chat immediately (instant feedback)

const currentInput = input;
setInput("");
setIsStreaming(true);
const aiMessageId = createAiPlaceholder();

What happens here:

  • Save the input text (you’ll clear the field but need the text for API)
  • Clear input field so user can type next message
  • Set streaming state to true (disables input, shows stop button)
  • Create empty AI message that you’ll fill with streaming text

abortControllerRef.current = new AbortController();

const response = await fetch("http://localhost:8000/api/chat/stream", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ message: currentInput }),
  signal: abortControllerRef.current.signal,
});

What happens here:

  • Create abort controller (lets user cancel mid-stream)
  • Make POST request to your streaming endpoint
  • Include the signal so you can cancel if needed
  • Send the user’s message in the request body

await readStream(response, aiMessageId);

setMessages((prev) =>
  prev.map((msg) =>
    msg.id === aiMessageId ? { ...msg, isStreaming: false } : msg
  )
);

What happens here:

  • Call your readStream function to process chunks
  • When done, find the AI message and mark streaming as complete
  • This removes the typing cursor and finalizes the message

if (error.name === "AbortError") {
  console.log("Request was cancelled");
} else {
  // Show error message in chat
}

What happens here:

  • AbortError means user clicked stop (this is normal)
  • Other errors are real problems (network, server issues)
  • You show a friendly error message in the chat instead of crashing

finally {
  setIsStreaming(false);
  abortControllerRef.current = null;
}

What happens here:

  • finally always runs, even if there were errors
  • Re-enable the input field
  • Clean up the abort controller reference

The complete flow:

  1. User types message and presses send
  2. User message appears instantly in chat
  3. Empty AI message appears with typing cursor
  4. Request goes to backend streaming endpoint
  5. Text chunks come back and fill up the AI message
  6. When done, typing cursor disappears
  7. Input is re-enabled for next message

Add these small helper functions:

const handleKeyPress = (e) => {
  if (e.key === "Enter" && !e.shiftKey && !isStreaming) {
    e.preventDefault();
    sendMessage();
  }
};

const stopStreaming = () => {
  if (abortControllerRef.current) {
    abortControllerRef.current.abort();
  }
};

What these do:

  • handleKeyPress - Send the message when the user presses Enter (see the note below)
  • stopStreaming - Cancel the stream if user wants to stop
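
One small aside: React marks the onKeyPress prop as deprecated in favor of onKeyDown. If you prefer to follow that recommendation, the same handler works unchanged; only the prop name on the input changes:

<input
  type="text"
  value={input}
  onChange={(e) => setInput(e.target.value)}
  onKeyDown={handleKeyPress}
  placeholder="Type your message..."
  disabled={isStreaming}
/>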

Replace your return statement with this complete UI:

return (
  <div className="min-h-screen bg-gray-100 flex items-center justify-center p-4">
    <div className="bg-white rounded-lg shadow-lg w-full max-w-2xl h-[600px] flex flex-col">
      {/* Header */}
      <div className="bg-blue-500 text-white p-4 rounded-t-lg">
        <h1 className="text-xl font-bold">Streaming AI Chat</h1>
        <p className="text-blue-100">Real-time responses!</p>
      </div>

      {/* Messages */}
      <div className="flex-1 overflow-y-auto p-4 space-y-4">
        {messages.length === 0 && (
          <div className="text-center text-gray-500 mt-20">
            <Bot className="w-12 h-12 mx-auto mb-4 text-gray-400" />
            <p>Send a message to see streaming in action!</p>
          </div>
        )}
        {messages.map((message) => (
          <div
            key={message.id}
            className={`flex items-start space-x-3 ${
              message.isUser ? "justify-end" : "justify-start"
            }`}
          >
            {!message.isUser && (
              <div className="bg-blue-500 p-2 rounded-full">
                <Bot className="w-4 h-4 text-white" />
              </div>
            )}
            <div
              className={`max-w-xs lg:max-w-md px-4 py-2 rounded-lg ${
                message.isUser
                  ? "bg-blue-500 text-white"
                  : "bg-gray-200 text-gray-800"
              }`}
            >
              {message.text}
              {message.isStreaming && (
                <span className="inline-block w-2 h-4 bg-blue-500 ml-1 animate-pulse" />
              )}
            </div>
            {message.isUser && (
              <div className="bg-gray-500 p-2 rounded-full">
                <User className="w-4 h-4 text-white" />
              </div>
            )}
          </div>
        ))}
      </div>

      {/* Input */}
      <div className="border-t p-4">
        <div className="flex space-x-2">
          <input
            type="text"
            value={input}
            onChange={(e) => setInput(e.target.value)}
            onKeyPress={handleKeyPress}
            placeholder="Type your message..."
            className="flex-1 border border-gray-300 rounded-lg px-4 py-2 focus:outline-none focus:ring-2 focus:ring-blue-500"
            disabled={isStreaming}
          />
          {isStreaming ? (
            <button
              onClick={stopStreaming}
              className="bg-red-500 hover:bg-red-600 text-white px-4 py-2 rounded-lg transition-colors"
            >
              Stop
            </button>
          ) : (
            <button
              onClick={sendMessage}
              disabled={!input.trim()}
              className="bg-blue-500 hover:bg-blue-600 disabled:bg-gray-300 text-white p-2 rounded-lg transition-colors"
            >
              <Send className="w-5 h-5" />
            </button>
          )}
        </div>
      </div>
    </div>
  </div>
);

Here’s your complete component with all pieces together:

import { useState, useRef } from "react";
import { Send, Bot, User } from "lucide-react";

function StreamingChat() {
  const [messages, setMessages] = useState([]);
  const [input, setInput] = useState("");
  const [isStreaming, setIsStreaming] = useState(false);
  const abortControllerRef = useRef(null);

  const createAiPlaceholder = () => {
    const aiMessageId = Date.now() + 1;
    const aiMessage = {
      text: "",
      isUser: false,
      id: aiMessageId,
      isStreaming: true,
    };
    setMessages((prev) => [...prev, aiMessage]);
    return aiMessageId;
  };

  const readStream = async (response, aiMessageId) => {
    const reader = response.body.getReader();
    const decoder = new TextDecoder();

    while (true) {
      const { done, value } = await reader.read();
      if (done) break;

      const chunk = decoder.decode(value, { stream: true });

      // Update the AI message with new content
      setMessages((prev) =>
        prev.map((msg) =>
          msg.id === aiMessageId ? { ...msg, text: msg.text + chunk } : msg
        )
      );
    }
  };

  const sendMessage = async () => {
    if (!input.trim() || isStreaming) return;

    const userMessage = { text: input, isUser: true, id: Date.now() };
    setMessages((prev) => [...prev, userMessage]);

    const currentInput = input;
    setInput("");
    setIsStreaming(true);

    // Create AI message placeholder
    const aiMessageId = createAiPlaceholder();

    try {
      // Create abort controller for canceling requests
      abortControllerRef.current = new AbortController();

      const response = await fetch("http://localhost:8000/api/chat/stream", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ message: currentInput }),
        signal: abortControllerRef.current.signal,
      });

      if (!response.ok) {
        throw new Error("Failed to get response");
      }

      // Read the stream
      await readStream(response, aiMessageId);

      // Mark streaming as complete
      setMessages((prev) =>
        prev.map((msg) =>
          msg.id === aiMessageId ? { ...msg, isStreaming: false } : msg
        )
      );
    } catch (error) {
      if (error.name === "AbortError") {
        console.log("Request was cancelled");
      } else {
        console.error("Streaming error:", error);
        // Update AI message with error
        setMessages((prev) =>
          prev.map((msg) =>
            msg.id === aiMessageId
              ? {
                  ...msg,
                  text: "Sorry, something went wrong.",
                  isStreaming: false,
                }
              : msg
          )
        );
      }
    } finally {
      setIsStreaming(false);
      abortControllerRef.current = null;
    }
  };

  const handleKeyPress = (e) => {
    if (e.key === "Enter" && !e.shiftKey && !isStreaming) {
      e.preventDefault();
      sendMessage();
    }
  };

  const stopStreaming = () => {
    if (abortControllerRef.current) {
      abortControllerRef.current.abort();
    }
  };

  return (
    <div className="min-h-screen bg-gray-100 flex items-center justify-center p-4">
      <div className="bg-white rounded-lg shadow-lg w-full max-w-2xl h-[600px] flex flex-col">
        {/* Header */}
        <div className="bg-blue-500 text-white p-4 rounded-t-lg">
          <h1 className="text-xl font-bold">Streaming AI Chat</h1>
          <p className="text-blue-100">Real-time responses!</p>
        </div>

        {/* Messages */}
        <div className="flex-1 overflow-y-auto p-4 space-y-4">
          {messages.length === 0 && (
            <div className="text-center text-gray-500 mt-20">
              <Bot className="w-12 h-12 mx-auto mb-4 text-gray-400" />
              <p>Send a message to see streaming in action!</p>
            </div>
          )}
          {messages.map((message) => (
            <div
              key={message.id}
              className={`flex items-start space-x-3 ${
                message.isUser ? "justify-end" : "justify-start"
              }`}
            >
              {!message.isUser && (
                <div className="bg-blue-500 p-2 rounded-full">
                  <Bot className="w-4 h-4 text-white" />
                </div>
              )}
              <div
                className={`max-w-xs lg:max-w-md px-4 py-2 rounded-lg ${
                  message.isUser
                    ? "bg-blue-500 text-white"
                    : "bg-gray-200 text-gray-800"
                }`}
              >
                {message.text}
                {message.isStreaming && (
                  <span className="inline-block w-2 h-4 bg-blue-500 ml-1 animate-pulse" />
                )}
              </div>
              {message.isUser && (
                <div className="bg-gray-500 p-2 rounded-full">
                  <User className="w-4 h-4 text-white" />
                </div>
              )}
            </div>
          ))}
        </div>

        {/* Input */}
        <div className="border-t p-4">
          <div className="flex space-x-2">
            <input
              type="text"
              value={input}
              onChange={(e) => setInput(e.target.value)}
              onKeyPress={handleKeyPress}
              placeholder="Type your message..."
              className="flex-1 border border-gray-300 rounded-lg px-4 py-2 focus:outline-none focus:ring-2 focus:ring-blue-500"
              disabled={isStreaming}
            />
            {isStreaming ? (
              <button
                onClick={stopStreaming}
                className="bg-red-500 hover:bg-red-600 text-white px-4 py-2 rounded-lg transition-colors"
              >
                Stop
              </button>
            ) : (
              <button
                onClick={sendMessage}
                disabled={!input.trim()}
                className="bg-blue-500 hover:bg-blue-600 disabled:bg-gray-300 text-white p-2 rounded-lg transition-colors"
              >
                <Send className="w-5 h-5" />
              </button>
            )}
          </div>
        </div>
      </div>
    </div>
  );
}

export default StreamingChat;

Now update your src/App.jsx to render the new component (a sketch of a regular/streaming toggle follows after the code):

import StreamingChat from "./StreamingChat";

function App() {
  return <StreamingChat />;
}

export default App;
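
If you’d like to keep your original non-streaming chat around and switch between the two, a minimal toggle could look like the sketch below. It assumes your existing component is exported from ./Chat, so adjust the import to match your project.

// Sketch only: toggle between the regular and streaming chat components.
// Assumes the non-streaming version lives at ./Chat (adjust to your project).
import { useState } from "react";
import Chat from "./Chat";
import StreamingChat from "./StreamingChat";

function App() {
  const [useStreaming, setUseStreaming] = useState(true);

  return (
    <div>
      <button onClick={() => setUseStreaming((prev) => !prev)}>
        Switch to {useStreaming ? "regular" : "streaming"} chat
      </button>
      {useStreaming ? <StreamingChat /> : <Chat />}
    </div>
  );
}

export default App;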

To try it out:

  1. Start your backend: npm run dev (in your backend folder)
  2. Start your frontend: npm run dev (in your frontend folder)
  3. Send a message like “Write a short story about a cat”

You should see the AI’s response appear word by word in real-time!


Common issues and fixes:

  • Stream doesn’t work: check that your backend streaming endpoint is running
  • Text appears all at once: make sure you’re using /api/chat/stream, not /api/chat
  • Console errors about AbortController: this is normal when canceling requests
  • UI doesn’t update: check that message IDs are unique (see the note below)
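
If you ever hit the duplicate-ID case (for example, two messages created in the same millisecond), one option is to generate IDs with crypto.randomUUID() instead of Date.now(). This assumes a reasonably modern browser:

// Hypothetical alternative: collision-proof message IDs
const userMessage = { text: input, isUser: true, id: crypto.randomUUID() };
const aiMessageId = crypto.randomUUID(); // instead of Date.now() + 1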

Your streaming chat now has:

  • ✅ Real-time text streaming like ChatGPT
  • ✅ Visual streaming indicator (cursor)
  • ✅ Ability to stop streaming mid-response
  • ✅ Proper error handling
  • ✅ Clean UI that updates in real-time

The difference in user experience is incredible! 🚀