Examples inside Prompts
The `gollm` package supports the use of examples in prompts, which can help guide the language model toward more accurate and consistent responses. Examples can be added directly in code or loaded from a file.
How it works
Examples are additional context provided to the LLM to demonstrate the desired output format or style. They can be particularly useful for:
1. Demonstrating specific formats or structures you want the LLM to follow.
2. Providing context for domain-specific tasks.
3. Guiding the tone or style of the LLM's responses.
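For instance, to steer the model toward a specific structure (the first point above), the examples themselves can exhibit that structure. The snippet below is a minimal sketch; the task and the JSON field names in the example strings are purely illustrative:

// The example strings show the exact JSON shape we want the model to imitate.
prompt := gollm.NewPrompt("Summarize the following support ticket as JSON",
    gollm.WithExamples(
        `{"sentiment": "negative", "summary": "Customer cannot log in after password reset"}`,
        `{"sentiment": "positive", "summary": "Customer thanks support for the quick refund"}`,
    ),
)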
Usage
Adding Examples Directly in Code
You can add examples directly when creating a new prompt using the `WithExamples` option:
prompt := gollm.NewPrompt("Generate a haiku about technology",
    gollm.WithExamples(
        "Silicon dreams flow\nCircuits pulse with ones and zeros\nInnovation blooms",
        "Digital whispers\nConnecting worlds with keystrokes\nHumanity linked",
    ),
)
Loading Examples from a File
You can also load examples from a file using the `WithExamplesFromFile` option:
prompt := gollm.NewPrompt("Generate a haiku about nature",
    gollm.WithExamplesFromFile("path/to/haiku_examples.txt"),
)
The file should contain one example per line for `.txt` files, or be in JSON Lines format for `.jsonl` files.
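As an illustration, the haiku examples shown above could be stored in JSON Lines form (say, as haiku_examples.jsonl rather than the `.txt` path used earlier), with each line carrying a `content` field as described under File Format for Examples below. The `\n` escapes inside the JSON strings keep each haiku's line breaks, assuming the loader preserves newlines from the parsed values:

{"content": "Silicon dreams flow\nCircuits pulse with ones and zeros\nInnovation blooms"}
{"content": "Digital whispers\nConnecting worlds with keystrokes\nHumanity linked"}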
Example: Creative Writing Assistant
Here's a more comprehensive example demonstrating how to use examples in a creative writing assistant:
package main

import (
    "context"
    "fmt"
    "log"
    "os"

    "github.com/teilomillet/gollm"
)

func main() {
    llm, err := gollm.NewLLM(
        gollm.SetProvider("openai"),
        gollm.SetModel("gpt-4o-mini"),
        gollm.SetAPIKey(os.Getenv("OPENAI_API_KEY")),
        gollm.SetMaxTokens(300),
    )
    if err != nil {
        log.Fatalf("Failed to create LLM: %v", err)
    }

    ctx := context.Background()

    // Example 1: Using inline examples
    promptWithInlineExamples := gollm.NewPrompt(
        "Write a short story about a robot discovering emotions",
        gollm.WithExamples(
            "Spark felt something new as it watched the sunset. A warmth in its circuits that wasn't from overheating...",
            "XR-7 couldn't compute the error in its system. Why did its servos whir faster when the human smiled?",
        ),
        gollm.WithDirectives(
            "Keep the story under 100 words",
            "Focus on the robot's internal experience",
        ),
    )

    response, err := llm.Generate(ctx, promptWithInlineExamples)
    if err != nil {
        log.Fatalf("Error generating story: %v", err)
    }
    fmt.Printf("Story with inline examples:\n%s\n\n", response)

    // Example 2: Loading examples from a file
    promptWithFileExamples := gollm.NewPrompt(
        "Write a poem about the changing seasons",
        gollm.WithExamplesFromFile("examples/season_poems.txt"),
        gollm.WithDirectives(
            "Use vivid imagery",
            "Include at least one metaphor",
        ),
    )

    response, err = llm.Generate(ctx, promptWithFileExamples)
    if err != nil {
        log.Fatalf("Error generating poem: %v", err)
    }
    fmt.Printf("Poem with examples from file:\n%s\n", response)
}
In this example:
1. We create an LLM instance with appropriate settings.
2. We demonstrate two ways of using examples: inline examples for a short story prompt, and examples loaded from a file for a poetry prompt.
3. We use additional directives to further guide the LLM's output.
File Format for Examples
When using `WithExamplesFromFile`, the file should be formatted as follows:
For `.txt` files:
Example 1 content
Example 2 content
Example 3 content
For `.jsonl` files:
{"content": "Example 1 content"}
{"content": "Example 2 content"}
{"content": "Example 3 content"}
Considerations
Quality over Quantity: A few well-chosen examples are often more effective than many mediocre ones.
Relevance: Ensure your examples are closely related to the task you're asking the LLM to perform.
Variety: If possible, provide examples that cover different aspects or styles within your task.
Token Limits: Remember that examples count towards your token limit, so balance the number of examples with the space needed for the LLM's response.
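One practical way to strike that balance is to trim your candidate examples to a rough budget before building the prompt. The sketch below is only an illustration: the helper name, the candidateHaikus slice, and the four-characters-per-token heuristic are assumptions, not part of `gollm`:

// selectExamplesWithinBudget keeps candidates, in order, until a rough
// character budget is used up (~4 characters per token is a crude heuristic).
func selectExamplesWithinBudget(candidates []string, tokenBudget int) []string {
    charBudget := tokenBudget * 4
    var kept []string
    used := 0
    for _, ex := range candidates {
        if used+len(ex) > charBudget {
            break
        }
        kept = append(kept, ex)
        used += len(ex)
    }
    return kept
}

// candidateHaikus is assumed to be defined elsewhere; the trimmed slice is
// passed to WithExamples just like a hand-written list.
prompt := gollm.NewPrompt("Generate a haiku about technology",
    gollm.WithExamples(selectExamplesWithinBudget(candidateHaikus, 150)...),
)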
Advanced Usage
For more advanced usage, including dynamic example selection or generation, refer to the `examples/advanced_examples.go` file in the repository.
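As a rough sketch of what dynamic example selection can look like (an illustration, not the code in that file), you might score each candidate example by word overlap with the task and pass only the best matches to `WithExamples`. The helpers below rely only on the standard library's `strings` and `sort` packages:

// pickRelevantExamples returns the k candidates that share the most words
// with the task. A deliberately simple relevance heuristic for illustration.
func pickRelevantExamples(task string, candidates []string, k int) []string {
    taskWords := map[string]bool{}
    for _, w := range strings.Fields(strings.ToLower(task)) {
        taskWords[w] = true
    }
    scored := make([]string, len(candidates))
    copy(scored, candidates)
    sort.SliceStable(scored, func(i, j int) bool {
        return overlap(scored[i], taskWords) > overlap(scored[j], taskWords)
    })
    if k > len(scored) {
        k = len(scored)
    }
    return scored[:k]
}

// overlap counts how many words in the example also appear in the task.
func overlap(example string, taskWords map[string]bool) int {
    n := 0
    for _, w := range strings.Fields(strings.ToLower(example)) {
        if taskWords[w] {
            n++
        }
    }
    return n
}

The selected examples are then used exactly like hand-written ones, for example gollm.NewPrompt(task, gollm.WithExamples(pickRelevantExamples(task, pool, 2)...)).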