Getting Started

This guide gets you from a fresh clone to a working local Baumbart chatbot. The main target is the browser-based chatui, backed by the full Qanary pipeline and the Leipzig tree knowledge graph.

Baumbart is not a single process. A working local setup needs:

  • Docker services for Virtuoso, the Qanary pipeline, and the Leipzig tree knowledge graph
  • Local Bun processes for the chat UI, shared packages, and the Qanary components
  • An OpenRouter API key for the LLM-backed parts of the system

Before you start, make sure you have:

  • Bun >= 1.3.2
  • Docker with Compose
  • An OpenRouter API key

From the repository root, install all workspace dependencies:

Terminal window
bun install

The fastest path is the helper script in the repository root. It creates any missing .env files from their checked-in examples, writes the shared local defaults, and switches the Qanary components to host.docker.internal so the Docker pipeline can reach the locally running component processes:

Terminal window
OPENROUTER_API_KEY=sk-or-v1-... bun run env:setup:docker

Replace sk-or-v1-... with your real key when you run it. The helper also writes the root .env used by Docker Compose and the benchmark command, and it rejects keys that do not look like OpenRouter keys. If VIRTUOSO_DBA_PASSWORD is not already set, it generates one for you.
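
The helper's actual validation logic is not reproduced here; as a rough sketch, a minimal shape check for an OpenRouter key (assuming the sk-or-v1- prefix that OpenRouter keys currently use) could look like:

```shell
# Sketch only: a minimal shape check for an OpenRouter-style key.
# The helper's real validation may be stricter.
key="sk-or-v1-example-not-a-real-key"   # placeholder value
case "$key" in
  sk-or-v1-*) echo "key format looks ok" ;;
  *) echo "error: not an OpenRouter-style key" >&2; exit 1 ;;
esac
# → key format looks ok
```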

If you want to keep the component .env files in host-local mode instead, run:

Terminal window
OPENROUTER_API_KEY=sk-or-v1-... bun run env:setup

If you prefer to do the setup manually, create local .env files from the checked-in examples:

Terminal window
cp apps/chatui/.env.example apps/chatui/.env
cp apps/chatcli/.env.example apps/chatcli/.env
cp apps/qanary-component-eat-simple/.env.example apps/qanary-component-eat-simple/.env
cp apps/qanary-component-nerd-simple/.env.example apps/qanary-component-nerd-simple/.env
cp apps/qanary-component-dis/.env.example apps/qanary-component-dis/.env
cp apps/qanary-component-relation-detection/.env.example apps/qanary-component-relation-detection/.env
cp apps/qanary-component-sparql-generation/.env.example apps/qanary-component-sparql-generation/.env
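
Equivalently, the copies above can be done in one loop. A sketch, using only the paths already listed and skipping any .env that already exists:

```shell
# Sketch: create each missing .env from its checked-in example in one pass.
for dir in apps/chatui apps/chatcli apps/qanary-component-*; do
  if [ -f "$dir/.env.example" ] && [ ! -f "$dir/.env" ]; then
    cp "$dir/.env.example" "$dir/.env"
    echo "created $dir/.env"
  fi
done
```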

You also need a root-level .env file:

Terminal window
cat > .env <<'EOF'
VIRTUOSO_DBA_PASSWORD=your-secure-password
OPENROUTER_API_KEY=sk-or-v1-...
QANARY_API_BASE_URL=http://localhost:8080
TRIPLESTORE_URL=http://localhost:8890/sparql
EOF

Update the new .env files with real values:

  • In apps/chatui/.env and apps/chatcli/.env:
    • OPENROUTER_API_KEY must contain a valid OpenRouter key
    • QANARY_API_BASE_URL should stay http://localhost:8080
    • TRIPLESTORE_URL should stay http://localhost:8890/sparql
  • In every apps/qanary-component-*/.env:
    • SPRING_BOOT_ADMIN_URL should stay http://localhost:8080/
    • QANARY_PORT should keep the port from the example file
    • OPENROUTER_API_KEY must be filled in for the LLM-backed components
  • In apps/qanary-component-sparql-generation/.env:
    • NEOGEOCODER_EMAIL must be set to a valid email address for OpenStreetMap geocoding requests
  • In the root .env:
    • VIRTUOSO_DBA_PASSWORD is required by docker compose
    • OPENROUTER_API_KEY is reused by the benchmark command
    • QANARY_API_BASE_URL should stay http://localhost:8080
    • TRIPLESTORE_URL should stay http://localhost:8890/sparql
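
To sanity-check the result, a small sketch that reports required keys missing from an env file. It is demonstrated on a temporary file so it runs anywhere; in the repo, point envfile at the real .env (or a component .env) instead:

```shell
# Sketch: report required keys that are missing from an env file.
envfile=$(mktemp)
printf 'VIRTUOSO_DBA_PASSWORD=example\nOPENROUTER_API_KEY=sk-or-v1-example\n' > "$envfile"
for var in VIRTUOSO_DBA_PASSWORD OPENROUTER_API_KEY QANARY_API_BASE_URL TRIPLESTORE_URL; do
  grep -q "^${var}=" "$envfile" || echo "missing: $var"
done
# → missing: QANARY_API_BASE_URL
# → missing: TRIPLESTORE_URL
rm -f "$envfile"
```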

The Qanary pipeline runs in Docker, but the Qanary components run on your host machine during local development. Because of that, each component .env should use:

Terminal window
QANARY_HOST=host.docker.internal

If QANARY_HOST is left as localhost, the pipeline container will try to call itself instead of your locally running component.

The helper command bun run env:setup:docker performs this change automatically.
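
If you edit the files by hand, the same switch can be scripted. A sketch, demonstrated on a throwaway file so it runs anywhere; in the repo you would target each apps/qanary-component-*/.env instead:

```shell
# Sketch: flip QANARY_HOST from localhost to host.docker.internal.
# Uses GNU sed's in-place mode; on macOS, use `sed -i ''` instead.
f=$(mktemp)
printf 'QANARY_PORT=40500\nQANARY_HOST=localhost\n' > "$f"
sed -i 's/^QANARY_HOST=.*/QANARY_HOST=host.docker.internal/' "$f"
grep '^QANARY_HOST=' "$f"   # → QANARY_HOST=host.docker.internal
rm -f "$f"
```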

Start the Docker services from the repository root:

Terminal window
docker compose up -d virtuoso qanary_pipeline leipzig-tree-knowledge-graph

This starts:

  • virtuoso on port 8890 for Qanary annotations and SPARQL reads
  • qanary_pipeline on port 8080 to orchestrate the question-answering flow
  • leipzig-tree-knowledge-graph on port 8000 as the domain knowledge base
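
Once the containers are up, you can probe the three ports with curl. A hedged sketch — the exact endpoints are assumptions based on the list above, and any connection error or non-2xx response prints "down":

```shell
# Sketch: quick reachability probes for the three Docker services.
check() { curl -fsS --max-time 5 "$1" >/dev/null 2>&1 && echo "ok:   $1" || echo "down: $1"; }
check http://localhost:8890/sparql   # virtuoso
check http://localhost:8080/         # qanary_pipeline
check http://localhost:8000/         # leipzig-tree-knowledge-graph
```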

In a second terminal, start the monorepo dev processes:

Terminal window
bun run dev

This is the main local development path. It starts the workspace watchers, chatui, and the Qanary components needed by the pipeline.

Open the Vite URL shown in the terminal. In a standard local setup that will be:

http://localhost:5173

When the page loads, you should see the Baumbart greeting and an active chat input.

Use this sample question from the repository:

Wie viel wurde im Stadtteil Connewitz gegossen?
("How much was watered in the Connewitz district?")

Your local setup is working if:

  • the page loads
  • the initial Baumbart greeting is visible
  • your message is accepted by the UI
  • the chatbot returns an answer instead of a pipeline, websocket, or configuration error

If you want to verify the chatbot outside the browser, you can also run the CLI:

Terminal window
bun run dev --filter=chatcli

If you want to benchmark the live chatbot with the curated demo question set, run:

Terminal window
bun run benchmark:demo

For the benchmark-specific workflow and report format, see Running the Demo Benchmark.

Common problems and their causes:

  • Missing OPENROUTER_API_KEY: the chat UI or one of the LLM-backed components will fail when it tries to generate or classify text.
  • QANARY_HOST=localhost: the Qanary pipeline container cannot reach locally running components. Use host.docker.internal instead.
  • Docker services not healthy: make sure ports 8080, 8890, and 8000 are reachable and that docker compose ps shows the services as running.
  • Component ports not available: the local Qanary components need ports 40500 to 40504. If one of these ports is occupied, the affected component will not start correctly.
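
As a sketch (bash-specific, using its /dev/tcp pseudo-device), you can probe whether the component ports are already taken before starting the dev processes:

```shell
# Sketch: probe the component ports with bash's /dev/tcp pseudo-device.
# A successful connect means something is already listening there.
for port in 40500 40501 40502 40503 40504; do
  if (exec 3<>"/dev/tcp/127.0.0.1/$port") 2>/dev/null; then
    echo "port $port is in use"
  else
    echo "port $port looks free"
  fi
done
```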