LLM Chat Gateway
The llmchat-gateway is an example application that demonstrates how to use websocket-router to create a gateway for the LLM Chat Server.
It acts as a secure entry point (HTTPS) that proxies WebSocket traffic to the backend chat server.
Architecture
- Gateway: Listens on HTTPS port 8443. Serves the static UI and routes /chat WebSocket connections to the backend.
- Backend: llmchat-server running on HTTP port 8080.
- Routing: Uses WebSocketRouterHandler to forward messages based on path or serviceId. Also uses DirectRegistry for service discovery.
Configuration
The configuration is located in src/main/resources/config/values.yml.
Server & Handler
Configured for HTTPS on port 8443.
```yaml
server.httpsPort: 8443
server.enableHttp: false
server.enableHttps: true
handler.paths:
  - path: '/'
    method: 'GET'
    exec:
      - resource
  - path: '/chat'
    method: 'GET'
    exec:
      - router
```
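The two path entries above form a small dispatch table: GET / runs the resource chain (serving the static UI), while GET /chat runs the router chain (forwarding to the backend). A minimal sketch of that dispatch idea, using hypothetical helper names rather than the actual light-4j handler code:

```java
import java.util.List;
import java.util.Map;

// Illustrative sketch of path+method dispatch, mirroring handler.paths above.
// Not the real light-4j HandlerConfig implementation.
public class DispatchSketch {
    // key: "METHOD path", value: chain of handler names to execute
    static final Map<String, List<String>> PATHS = Map.of(
        "GET /", List.of("resource"),
        "GET /chat", List.of("router")
    );

    static List<String> resolve(String method, String path) {
        // An unmatched request gets an empty chain here; the real server
        // would respond with a 404 instead.
        return PATHS.getOrDefault(method + " " + path, List.of());
    }

    public static void main(String[] args) {
        System.out.println(resolve("GET", "/"));
        System.out.println(resolve("GET", "/chat"));
    }
}
```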
WebSocket Router
Maps requests to the backend service.
```yaml
websocket-router.pathPrefixService:
  /chat: com.networknt.llmchat-1.0.0
```
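Conceptually, pathPrefixService is a prefix match from the request path to a serviceId: any request under /chat resolves to com.networknt.llmchat-1.0.0. A hedged sketch of that lookup (illustrative only; the real matching logic lives inside websocket-router):

```java
import java.util.Map;
import java.util.Optional;

// Illustrative prefix-to-serviceId lookup, mirroring
// websocket-router.pathPrefixService above. Simplified, not the library code.
public class PrefixRouteSketch {
    static final Map<String, String> PREFIX_TO_SERVICE = Map.of(
        "/chat", "com.networknt.llmchat-1.0.0"
    );

    static Optional<String> serviceFor(String path) {
        String bestPrefix = "";
        String service = null;
        for (Map.Entry<String, String> e : PREFIX_TO_SERVICE.entrySet()) {
            String prefix = e.getKey();
            // match the prefix exactly, or as a leading path segment
            boolean matches = path.equals(prefix) || path.startsWith(prefix + "/");
            if (matches && prefix.length() > bestPrefix.length()) {
                bestPrefix = prefix;
                service = e.getValue();
            }
        }
        return Optional.ofNullable(service);
    }

    public static void main(String[] args) {
        System.out.println(serviceFor("/chat"));
        System.out.println(serviceFor("/metrics"));
    }
}
```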
Service Registry
Uses DirectRegistry to locate the backend server (llmchat-server) at http://localhost:8080.
```yaml
service.singletons:
  - com.networknt.registry.Registry:
      - com.networknt.registry.support.DirectRegistry
direct-registry:
  com.networknt.llmchat-1.0.0:
    - http://localhost:8080
```
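DirectRegistry is essentially a static map from serviceId to one or more backend URLs; when a service lists multiple instances, callers can rotate across them. A rough stand-in for that behavior (a simplified sketch, not the real light-4j Registry interface):

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.atomic.AtomicInteger;

// Simplified stand-in for DirectRegistry: a static serviceId -> URL list,
// with round-robin selection when a service has multiple instances.
public class DirectRegistrySketch {
    static final Map<String, List<String>> SERVICES = Map.of(
        "com.networknt.llmchat-1.0.0", List.of("http://localhost:8080")
    );
    static final AtomicInteger counter = new AtomicInteger();

    static String discover(String serviceId) {
        List<String> urls = SERVICES.get(serviceId);
        if (urls == null || urls.isEmpty()) {
            throw new IllegalArgumentException("unknown serviceId: " + serviceId);
        }
        // rotate through instances; with a single URL this always returns it
        return urls.get(Math.floorMod(counter.getAndIncrement(), urls.size()));
    }

    public static void main(String[] args) {
        System.out.println(discover("com.networknt.llmchat-1.0.0"));
    }
}
```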
Running the Example
- Start Ollama: Ensure Ollama is running.
- Start Backend: From light-example-4j/websocket/llmchat-server, run mvn exec:java.
- Start Gateway: From light-example-4j/websocket/llmchat-gateway, run mvn exec:java. (Note: ensure you have built it first with mvn clean install.)
Usage
Open your browser and navigate to https://localhost:8443.
- You might see a security warning because the server.keystore uses a self-signed certificate. Accept it to proceed.
- The chat interface is served from the gateway.
- When you click “Connect”, it opens a secure WebSocket (wss://) to the gateway.
- The gateway routes frames to llmchat-server, which invokes the LLM and streams the response back.