
Implementing Server-Sent Events (SSE) for Real‑Time ChatGPT Streaming in Frontend and Node.js Backend

This article explains how to use Server‑Sent Events (SSE) and the EventSource API to stream ChatGPT responses in real time, covering both browser‑side JavaScript examples and Node.js backend proxy code, along with configuration tips for Nginx and compatibility notes.

TAL Education Technology

Background

The author noticed the character-by-character typing effect in ChatGPT and found that the service streams its response over HTTP using the text/event-stream MIME type.

Concepts

Server‑Sent Events (SSE) are a one‑way communication mechanism where the server pushes messages to the client over an HTTP connection. The browser provides the EventSource API to receive these messages. Compared with WebSocket, SSE is simpler but only supports server‑to‑client communication.

Front‑end usage

To subscribe, create an EventSource object pointing at a URL that returns an SSE stream. Example:

const evtSource = new EventSource("ssedemo.php");

Listening to generic messages:

evtSource.onmessage = function(event) {
  const newElement = document.createElement("li");
  const eventList = document.getElementById("list");
  // Use textContent rather than innerHTML so the payload cannot inject markup.
  newElement.textContent = "message: " + event.data;
  eventList.appendChild(newElement);
};

Listening to a named event (e.g., ping):

evtSource.addEventListener("ping", function(event) {
  const newElement = document.createElement("li");
  const eventList = document.getElementById("list");
  const time = JSON.parse(event.data).time;
  newElement.textContent = "ping at " + time;
  eventList.appendChild(newElement);
});

Back‑end (Node.js) implementation

The server must set the Content-Type: text/event-stream header and write each message with res.write. Note that because the example below emits a named event (chatdata), the browser must subscribe with addEventListener('chatdata', ...); onmessage fires only for unnamed message events. A minimal example:

const { v4: uuid } = require('uuid'); // assumes the uuid package is installed

module.exports = async function (req, res) {
  const id = uuid();
  const sendEvent = (data) => {
    const retry = 2000; // reconnection delay hint for the browser, in ms
    const event = 'chatdata';
    const msg = `event: ${event}\nid: ${id}\nretry: ${retry}\ndata: ${JSON.stringify(data)}\n\n`;
    res.write(msg);
  };
  res.setHeader('Content-Type', 'text/event-stream');
  res.setHeader('Cache-Control', 'no-cache');
  res.setHeader('Connection', 'keep-alive');
  let count = 0;
  const timer = setInterval(() => {
    if (count > 10) {
      clearInterval(timer);
      sendEvent({ message: '[DONE]' });
      res.end(); // close the stream once we are done
      return;
    }
    count++;
    sendEvent({ message: `{"count": "${count}"}` });
  }, 500);
  // Stop writing if the client disconnects first.
  req.on('close', () => clearInterval(timer));
};

Event‑stream format

Each SSE message consists of fields such as event, data, id, and retry. Each field occupies one name: value line, and a blank line terminates the message. Lines beginning with a colon are comments and are ignored.
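These framing rules can be sketched as a small serializer, mirroring the msg template in the Node example above (the function name and field handling are illustrative, not part of the original article):

```javascript
// Serialize one SSE message. Each field is a "name: value" line;
// a blank line terminates the message. Any field may be omitted.
function formatSSE({ event, id, retry, data }) {
  let msg = '';
  if (event !== undefined) msg += `event: ${event}\n`;
  if (id !== undefined) msg += `id: ${id}\n`;
  if (retry !== undefined) msg += `retry: ${retry}\n`;
  // A multi-line payload becomes multiple data: lines, which the
  // browser rejoins with "\n" before firing the event.
  String(data).split('\n').forEach((line) => { msg += `data: ${line}\n`; });
  return msg + '\n'; // the empty line ends the message
}
```

For example, `formatSSE({ data: 'a\nb' })` yields two `data:` lines followed by the terminating blank line.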

ChatGPT proxy layer

OpenAI's streaming API can be wrapped in a Node.js service that forwards the SSE stream to the browser. The proxy sets the same SSE headers, pipes the streamed response from OpenAI through, and allows custom processing of each chunk.

const axios = require('axios');
const { Readable } = require('stream');

module.exports = async function (req, res) {
  const sendEvent = (data) => { res.write(`data: ${JSON.stringify(data)}\n\n`); };
  res.setHeader('Content-Type', 'text/event-stream');
  res.setHeader('Cache-Control', 'no-cache');
  res.setHeader('Connection', 'keep-alive');
  const requestOptions = {
    method: 'post',
    url: 'https://api.openai.com/v1/chat/completions',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    data: {
      model: "gpt-3.5-turbo",
      messages: [{ role: 'user', content: 'Hello' }],
      max_tokens: 2048,
      stream: true, // ask OpenAI for an SSE stream rather than one JSON body
    },
    responseType: 'stream', // make axios hand back the raw response stream
  };
  axios(requestOptions).then((response) => {
    const dataStream = Readable.from(response.data);
    // Each chunk carries one or more raw "data: ..." SSE lines from OpenAI;
    // forward it to the browser wrapped in our own SSE message.
    dataStream.on('data', (chunk) => { sendEvent({ message: chunk.toString() }); });
    dataStream.on('end', () => {
      sendEvent({ message: '[DONE]' });
      res.end();
    });
  }).catch((error) => {
    sendEvent({ message: `[ERROR] ${error.message}` });
    res.end();
  });
};

Frontend consumption of the proxy

import { useState } from 'react';

export default function ChatMain(props) {
  const [contentStream, setContentStream] = useState('');
  async function onSubmit(event, text) {
    const user = 'user';
    // Encode the user text so it is safe inside the query string.
    const params = `user=${user}&content=${encodeURIComponent(text)}`;
    const source = new EventSource(`/api/chatStream?${params}`, { withCredentials: true });
    let textStream = '';
    source.onmessage = function(event) {
      const sourceData = JSON.parse(event.data);
      if (sourceData.message === '[DONE]') { source.close(); return; }
      // One proxied chunk may contain several raw "data: ..." lines from OpenAI.
      const list = sourceData.message.split('\n\n') || [];
      list.forEach(item => {
        if (!item.startsWith('data:')) return;
        const payload = item.slice(6).trim();
        // OpenAI terminates its stream with "data: [DONE]", which is not JSON.
        if (payload === '[DONE]') { source.close(); return; }
        const resultData = JSON.parse(payload);
        const chunkData = resultData?.choices?.[0]?.delta?.content;
        if (chunkData) { textStream += chunkData; setContentStream(textStream); }
      });
    };
    source.onerror = (error) => { console.error(error); source.close(); };
  }
  // ... render contentStream, e.g. return <div>{contentStream}</div>;
}
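The chunk-parsing step inside onmessage can be isolated into a pure helper, which makes it easy to unit-test. This is a sketch under the same assumptions as the component above (the function name is illustrative):

```javascript
// Extract assistant text from a raw OpenAI SSE chunk such as:
//   'data: {"choices":[{"delta":{"content":"Hi"}}]}\n\ndata: [DONE]\n\n'
function extractDeltas(sseChunk) {
  let text = '';
  for (const item of sseChunk.split('\n\n')) {
    if (!item.startsWith('data:')) continue;
    const payload = item.slice(5).trim();
    if (payload === '[DONE]') break; // end-of-stream sentinel, not JSON
    const parsed = JSON.parse(payload);
    // Streaming responses put each text fragment in choices[0].delta.content.
    text += parsed?.choices?.[0]?.delta?.content ?? '';
  }
  return text;
}
```

The component's onmessage handler would then reduce to appending `extractDeltas(sourceData.message)` to the accumulated text.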

Usage scenarios & notes

SSE is suitable for real‑time chat, financial data, stock quotes, weather updates, etc., where only server‑to‑client push is needed. When deploying behind Nginx, ensure the proxy disables buffering and keeps the connection alive:

location /api/ {
  proxy_pass http://xxxx.com;
  proxy_set_header Connection "";   # clear the Connection header so upstream keep-alive works
  proxy_http_version 1.1;           # keep-alive upstream connections require HTTP/1.1
  proxy_buffering off;              # flush each SSE chunk to the client immediately
  proxy_cache off;                  # never cache the event stream
}
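Long completions can also exceed Nginx's default 60-second proxy read timeout, which would cut the stream mid-response. Raising it inside the same location block is often worthwhile (the value here is illustrative):

```nginx
  proxy_read_timeout 300s;  # allow long-running streams (default is 60s)
```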

Browser compatibility is good: SSE is part of the HTML standard and EventSource is supported by all modern browsers (Chrome, Firefox, Safari, Edge); only Internet Explorer never implemented it.
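For code that may still run in older environments, it is cheap to feature-detect before opening a stream. A minimal sketch (the helper name and fallback are illustrative, not from the original):

```javascript
// Returns true when the environment provides the EventSource constructor.
// Accepting the global object as a parameter keeps the check testable.
function supportsSSE(globalObj = globalThis) {
  return typeof globalObj.EventSource === 'function';
}

// Usage in the browser (illustrative):
// if (supportsSSE()) { new EventSource('/api/chatStream').onmessage = ...; }
// else { /* fall back to long polling or fetch() with a ReadableStream */ }
```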

Tags: Backend, Frontend, Streaming, Node.js, ChatGPT, Server-Sent Events, EventSource
Written by

TAL Education Technology

TAL Education is a technology-driven education company committed to the mission of 'making education better through love and technology'. The TAL technology team has always been dedicated to educational technology research and innovation. This is the external platform of the TAL technology team, sharing weekly curated technical articles and recruitment information.
