Configuration
Configuring gollm
gollm offers flexible configuration options to customize your LLM interactions. Here are the main ways to configure gollm:
Environment Variables
You can set the following environment variables:
- `GOLLM_PROVIDER`: The default LLM provider (e.g., "openai", "anthropic")
- `GOLLM_MODEL`: The default model to use
- `GOLLM_API_KEY`: Your API key for the chosen provider
- `GOLLM_MAX_TOKENS`: The maximum number of tokens for the response
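For example, in a shell session (the values below are placeholders; substitute your own provider, model, and key):

```bash
export GOLLM_PROVIDER="openai"
export GOLLM_MODEL="gpt-4o-mini"
export GOLLM_API_KEY="your-api-key"
export GOLLM_MAX_TOKENS="1024"
```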
Code-based Configuration
Use configuration options when creating a new LLM instance:
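A minimal sketch, assuming gollm's functional-option style (`gollm.NewLLM` with setters such as `gollm.SetProvider`, `gollm.SetModel`, `gollm.SetAPIKey`, and `gollm.SetMaxTokens`) and the module path `github.com/teilomillet/gollm`; check the package documentation for the exact option names in your version:

```go
package main

import (
	"fmt"
	"log"
	"os"

	"github.com/teilomillet/gollm"
)

func main() {
	// Create an LLM instance with explicit configuration options.
	llm, err := gollm.NewLLM(
		gollm.SetProvider("openai"),
		gollm.SetModel("gpt-4o-mini"),
		gollm.SetAPIKey(os.Getenv("GOLLM_API_KEY")),
		gollm.SetMaxTokens(1024),
	)
	if err != nil {
		log.Fatalf("failed to create LLM: %v", err)
	}
	_ = llm // the configured instance is now ready for prompts
	fmt.Println("LLM instance created and configured")
}
```

Options passed here take precedence over the environment variables described above, which makes this approach useful when a single application needs differently configured LLM instances.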
Configuration File
You can also use a JSON configuration file:
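For example, a file mirroring the options above might look like this (the field names and file layout are illustrative, not a schema mandated by gollm; keeping the API key in an environment variable rather than the file is generally preferable):

```json
{
  "provider": "openai",
  "model": "gpt-4o-mini",
  "max_tokens": 1024
}
```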
Load the configuration file in your code:
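One way to do this is sketched below using the standard library's `encoding/json` to decode the file and then apply the values through gollm's option functions. The file name `gollm_config.json` and the `fileConfig` struct are assumptions for illustration; gollm may also provide its own loader.

```go
package main

import (
	"encoding/json"
	"log"
	"os"

	"github.com/teilomillet/gollm"
)

// fileConfig mirrors the JSON file shown above; the field names are illustrative.
type fileConfig struct {
	Provider  string `json:"provider"`
	Model     string `json:"model"`
	MaxTokens int    `json:"max_tokens"`
}

func main() {
	// Read and decode the JSON configuration file.
	data, err := os.ReadFile("gollm_config.json")
	if err != nil {
		log.Fatalf("failed to read config file: %v", err)
	}
	var cfg fileConfig
	if err := json.Unmarshal(data, &cfg); err != nil {
		log.Fatalf("failed to parse config file: %v", err)
	}

	// Apply the decoded values when creating the LLM instance.
	llm, err := gollm.NewLLM(
		gollm.SetProvider(cfg.Provider),
		gollm.SetModel(cfg.Model),
		gollm.SetAPIKey(os.Getenv("GOLLM_API_KEY")),
		gollm.SetMaxTokens(cfg.MaxTokens),
	)
	if err != nil {
		log.Fatalf("failed to create LLM: %v", err)
	}
	_ = llm // use the configured LLM instance
}
```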
Choose the configuration method that best suits your project's needs and structure.