Streamlit chat in Docker Compose

This example runs a tiny Streamlit chat UI alongside the gateway in a single Docker Compose project.

Best for

Use this example when you want a minimal browser UI instead of a CLI or backend-only demo.

What it demonstrates

  • a small chat interface talking to LunarGate through the OpenAI-compatible API
  • streaming responses rendered live in the UI
  • a gateway container built from a small wrapper image with a baked-in config.yaml
  • a compose-local setup that teammates can run with one command
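The streaming bullet above comes down to parsing the OpenAI-style SSE chunks the gateway emits. Here is a minimal sketch of that parsing step; the helper name and payload shapes are illustrative, not taken from app/app.py:

```python
import json

def iter_stream_tokens(sse_lines):
    """Yield content tokens from OpenAI-style SSE lines ("data: {...}")."""
    for raw in sse_lines:
        line = raw.strip()
        if not line.startswith("data:"):
            continue  # skip blank lines and keep-alive comments
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break  # the stream's end-of-message sentinel
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        token = delta.get("content")
        if token:  # role-only deltas carry no text
            yield token
```

In a Streamlit app, a generator like this is typically handed to st.write_stream so tokens render live as they arrive; whether app/app.py parses SSE by hand or uses a client library, the token flow is the same.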

Run it

cp .env.example .env
cp config-simple.yaml.example config.yaml
docker compose up --build

Then open:

http://127.0.0.1:8501

What you get

  • gateway: http://127.0.0.1:8080
  • Streamlit UI: http://127.0.0.1:8501
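Because the gateway speaks the OpenAI-compatible API, you can poke it from the host without the UI. A sketch of building such a request with only the standard library; the model name and API key are placeholders, not values from this example's config:

```python
import json
import urllib.request

GATEWAY_BASE = "http://127.0.0.1:8080/v1"  # assumption: gateway serves /v1 here

def build_chat_request(messages, model="some-model", stream=True, api_key="sk-local"):
    """Build (but do not send) an OpenAI-style chat completions request."""
    body = json.dumps({"model": model, "messages": messages, "stream": stream})
    return urllib.request.Request(
        f"{GATEWAY_BASE}/chat/completions",
        data=body.encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# Sending it (with the stack running) would look like:
# with urllib.request.urlopen(build_chat_request([{"role": "user", "content": "hi"}])) as resp:
#     for raw in resp:  # iterate SSE lines when stream=True
#         ...
```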

What to inspect

  • app/app.py for the Streamlit chat flow
  • app/Dockerfile for the UI container build
  • docker-compose.yml for service wiring
  • gateway.Dockerfile for the same remote-Docker-safe wrapper image pattern used by the smaller compose example
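The service wiring in docker-compose.yml roughly follows this shape; the service names, the GATEWAY_BASE_URL variable, and the port bindings here are assumptions for illustration, and the repository's docker-compose.yml is the source of truth:

```yaml
# Hypothetical sketch, not the example's actual compose file.
services:
  gateway:
    build:
      context: .
      dockerfile: gateway.Dockerfile   # wrapper image with config.yaml baked in
    env_file: .env
    ports:
      - "127.0.0.1:8080:8080"
  ui:
    build: ./app                       # Streamlit container from app/Dockerfile
    environment:
      # the UI reaches the gateway by service name on the compose network
      GATEWAY_BASE_URL: http://gateway:8080/v1
    ports:
      - "127.0.0.1:8501:8501"
    depends_on:
      - gateway
```

The key point is that the browser talks to the UI on 8501, while the UI talks to the gateway over the compose-internal network by service name rather than via 127.0.0.1.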

Why this example matters

A lot of gateway demos stop at CLI requests. This one shows the next step:

  • keep the gateway self-hosted
  • keep the app simple
  • still get a usable chat UX for demos, testing, and internal prototypes

Good companion pages

  • Read Node Express with streaming if you want to compare browser-oriented UI streaming with backend SSE forwarding.
  • Read HTTP API if you want to understand the OpenAI-compatible streaming surface underneath the UI.