Ollama Example
Using Ollama with gollm
This guide describes how to use Ollama with the gollm library.
Usage Example
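Below is a minimal, self-contained sketch of an Ollama-backed program. It assumes gollm's functional-options API (`gollm.NewLLM`, `gollm.NewPrompt`, `llm.Generate`) and the `github.com/teilomillet/gollm` import path; check these against the version of gollm you are using.

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/teilomillet/gollm"
)

func main() {
	// Create a new LLM instance backed by a locally running Ollama server.
	llm, err := gollm.NewLLM(
		gollm.SetProvider("ollama"),
		gollm.SetModel("llama3.1"),
	)
	if err != nil {
		log.Fatalf("failed to create LLM: %v", err)
	}

	// Build a prompt and generate a response.
	prompt := gollm.NewPrompt("Explain what a goroutine is in one sentence.")
	response, err := llm.Generate(context.Background(), prompt)
	if err != nil {
		log.Fatalf("failed to generate response: %v", err)
	}

	fmt.Println(response)
}
```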
Configuration Options
When creating a new LLM instance with Ollama, you can use the following configuration options:
- `gollm.SetProvider("ollama")`: Specifies Ollama as the provider.
- `gollm.SetModel(modelName)`: Sets the Ollama model to use (e.g., "llama3.1").
- `gollm.SetDebugLevel(level)`: Sets the debug level for logging.
- `gollm.SetOllamaEndpoint(endpoint)`: Sets a custom Ollama API endpoint (optional).
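Combined, these options might look like the following sketch, a drop-in replacement for the `gollm.NewLLM` call in the example above. `gollm.LogLevelInfo` is assumed here as an example log-level constant and may differ in your gollm version.

```go
llm, err := gollm.NewLLM(
	gollm.SetProvider("ollama"),
	gollm.SetModel("llama3.1"),
	gollm.SetDebugLevel(gollm.LogLevelInfo),           // assumed constant; check your version's log levels
	gollm.SetOllamaEndpoint("http://localhost:11434"), // optional; shown here set to the default
)
if err != nil {
	log.Fatalf("failed to create LLM: %v", err)
}
```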
Important Notes
Server Requirement: Ensure that the Ollama server is running before executing your Go program. Start it with:
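```bash
ollama serve
```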
Model Availability: Make sure you've pulled the model before referencing it in your program:
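```bash
ollama pull llama3.1
```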
Custom Endpoint: Use `gollm.SetOllamaEndpoint()` to specify a custom Ollama API endpoint if you're not using the default `http://localhost:11434`.
Error Handling: Always check for errors when creating the LLM instance and generating responses.
By following this guide, you can easily use Ollama with gollm in your Go programs. The configuration options allow you to customize the Ollama setup according to your needs.