# qanary-component-helpers/llm-provider

This page is auto-generated from TSDoc output. Edit the source file, not this Markdown.

- Package: `@leipzigtreechat/qanary-component-helpers`
- Source file: `packages/qanary-component-helpers/src/llm-provider.ts`
## Summary

Environment variables controlling the LLM backend for all Qanary components:

- `OPENROUTER_API_KEY` – API key for OpenRouter (required at call time)
- `LLM_MODEL` – model slug understood by OpenRouter (default: `deepseek/deepseek-v3.2`)

Swap the provider or model at runtime without touching source code: just change the env vars in your `.env` file or execution environment.
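For example, a minimal `.env` (the key value is a placeholder, not a real key):

```shell
# .env – example configuration for the LLM provider helpers
OPENROUTER_API_KEY=sk-or-placeholder   # required before any model call
LLM_MODEL=deepseek/deepseek-v3.2       # optional; shown here with its default
```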
Security note: the API key is read lazily (only when a model is requested)
and is never stored in a variable that outlives the factory call, mirroring
the spirit of Effect’s Redacted pattern used in the chatbot package.
It is intentionally never logged or included in error messages.
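The lazy-read pattern described above can be sketched as follows. This is a hypothetical illustration, not the actual implementation: `createModelHandle` and `ModelHandle` are invented names, and the real helper hands the key to the OpenRouter client instead of discarding it.

```typescript
// Hypothetical sketch of the lazy-read pattern: the key is read only inside
// the factory call and never stored in a longer-lived variable, so it cannot
// leak via module state or later log statements.
type ModelHandle = { configured: true };

function createModelHandle(env: Record<string, string | undefined>): ModelHandle {
  const apiKey = env["OPENROUTER_API_KEY"]; // read lazily, at call time
  if (apiKey === undefined || apiKey === "") {
    // Fail fast with a descriptive message; the key value itself is never
    // included in the error.
    throw new Error("OPENROUTER_API_KEY is not set – configure it in your .env file");
  }
  // ...here the real helper would pass apiKey to the provider client...
  return { configured: true }; // apiKey goes out of scope after this call
}
```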
## Functions

### getLlmModel

```ts
function getLlmModel(): LanguageModel
```

Returns a configured `LanguageModel` instance ready for use with the Vercel AI SDK (`generateText`, `generateObject`, …).
Throws a descriptive error when `OPENROUTER_API_KEY` is absent, so components fail fast and clearly on misconfiguration rather than producing cryptic downstream errors.
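This fail-fast behavior can be probed once at component startup. A usage-side sketch, with a hypothetical helper name (`isLlmConfigured` is not part of the package):

```typescript
// Probe the LLM configuration once at startup so a missing
// OPENROUTER_API_KEY surfaces immediately instead of mid-request.
function isLlmConfigured(getModel: () => unknown): boolean {
  try {
    getModel(); // throws if OPENROUTER_API_KEY is absent
    return true;
  } catch {
    return false;
  }
}
```

In a real component you would pass `getLlmModel` as the argument and abort startup when this returns `false`.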
#### Returns

`LanguageModel`

#### Examples

```ts
import { getLlmModel } from "@leipzigtreechat/qanary-component-helpers";
import { generateObject } from "ai";

const { object } = await generateObject({
  model: getLlmModel(),
  schema: MySchema,
  prompt: "...",
});
```

Defined at: line 42
## Constants

### DEFAULT_MODEL

```ts
const DEFAULT_MODEL: string
```

Fallback model slug used when `LLM_MODEL` is unset (see the environment-variable reference in the Summary above).
Defined at: line 20
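The documented fallback behavior can be sketched as below. `resolveModel` is a hypothetical name for illustration, not an exported helper; the default slug is the one stated in the Summary.

```typescript
// Sketch of the documented fallback: LLM_MODEL wins when set,
// otherwise the documented default slug applies.
const DEFAULT_MODEL = "deepseek/deepseek-v3.2";

function resolveModel(env: Record<string, string | undefined>): string {
  return env["LLM_MODEL"] ?? DEFAULT_MODEL;
}
```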