This section provides practical examples that show how to use gollm in common scenarios, from a basic question-answering call to structured JSON output. The snippets keep imports and error handling to a minimum; a fully self-contained version of the first example follows it for reference.
Basic Question Answering
// Create an LLM client once; the later examples reuse this llm value.
llm, _ := gollm.NewLLM(gollm.SetProvider("openai"), gollm.SetModel("gpt-4o-mini"))

// Build a prompt and generate a response.
prompt := gollm.NewPrompt("What is the capital of France?")
response, _ := llm.Generate(context.Background(), prompt)
fmt.Println(response)
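For reference, here is the same example as a self-contained program. The module path github.com/teilomillet/gollm and the gollm.SetAPIKey option with an OPENAI_API_KEY environment variable are assumptions about your setup; adjust them to match how you install and configure gollm.

package main

import (
    "context"
    "fmt"
    "log"
    "os"

    "github.com/teilomillet/gollm"
)

func main() {
    // Configure the client. Passing the key via SetAPIKey and OPENAI_API_KEY is one
    // assumed way to supply credentials; adapt it to your environment.
    llm, err := gollm.NewLLM(
        gollm.SetProvider("openai"),
        gollm.SetModel("gpt-4o-mini"),
        gollm.SetAPIKey(os.Getenv("OPENAI_API_KEY")),
    )
    if err != nil {
        log.Fatalf("failed to create LLM client: %v", err)
    }

    // Build the prompt and ask the model.
    prompt := gollm.NewPrompt("What is the capital of France?")
    response, err := llm.Generate(context.Background(), prompt)
    if err != nil {
        log.Fatalf("generation failed: %v", err)
    }
    fmt.Println(response)
}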
Summarization with Context
// WithContext supplies background information for the request; WithMaxLength caps the length of the answer.
prompt := gollm.NewPrompt("Summarize the main points in 3 sentences.",
    gollm.WithContext("The text is about the history of the Internet."),
    gollm.WithMaxLength(100),
)
response, _ := llm.Generate(context.Background(), prompt)
fmt.Println(response)
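In most real summarization tasks, the document itself goes into the context rather than a description of it. A minimal sketch of that pattern, reusing the llm client from the first example (articleText is a placeholder for your own text):

// articleText stands in for the document you actually want summarized.
articleText := "The Internet grew out of ARPANET research in the late 1960s ..."

prompt := gollm.NewPrompt("Summarize the main points in 3 sentences.",
    gollm.WithContext(articleText),
    gollm.WithMaxLength(100),
)
summary, err := llm.Generate(context.Background(), prompt)
if err != nil {
    log.Fatal(err)
}
fmt.Println(summary)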
Comparing Models for a Specific Task
// Run the same prompt against several provider/model configurations and compare the results.
configs := []*gollm.Config{
    {Provider: "openai", Model: "gpt-4o-mini"},
    {Provider: "anthropic", Model: "claude-3-5-sonnet-20240620"},
}
prompt := "Explain how a computer works to a 5-year-old."
results, _ := gollm.CompareModels(context.Background(), prompt, nil, configs...)

// Print a summary analysis of how the models responded.
fmt.Println(gollm.AnalyzeComparisonResults(results))
JSON Output with Validation
// Recipe is the structure we expect the model's JSON output to match.
type Recipe struct {
    Name        string   `json:"name"`
    Ingredients []string `json:"ingredients"`
    Steps       []string `json:"steps"`
}

prompt := gollm.NewPrompt("Create a recipe for chocolate chip cookies.",
    gollm.WithOutput("Respond in JSON format with 'name', 'ingredients', and 'steps' fields."),
)

// WithJSONSchemaValidation turns on gollm's JSON schema validation for this generation.
response, _ := llm.Generate(context.Background(), prompt, gollm.WithJSONSchemaValidation())

// Decode the response; check the error so malformed JSON is caught rather than silently ignored.
var recipe Recipe
if err := json.Unmarshal([]byte(response), &recipe); err != nil {
    log.Fatalf("model did not return valid JSON: %v", err)
}
fmt.Printf("Recipe: %+v\n", recipe)
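Schema validation guards against malformed output, but it is still worth sanity-checking the decoded values before using them. A minimal sketch in plain Go (not a gollm feature):

// Reject responses that parsed but came back empty or incomplete.
if recipe.Name == "" || len(recipe.Ingredients) == 0 || len(recipe.Steps) == 0 {
    log.Fatalf("model returned an incomplete recipe: %+v", recipe)
}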
These examples cover gollm's core workflow: creating a client, building prompts with context and output options, comparing providers on the same task, and validating structured output. Use them as starting points and adapt the prompts, models, and options to your specific needs.