Batch Prompt Optimizer
This example demonstrates how to use the BatchPromptOptimizer in gollm. Here's a step-by-step breakdown of the code:
Setting up the environment: The example starts by creating an LLM client using the Groq provider, similar to the previous example:
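The original listing is not reproduced here; the following is a minimal sketch of that setup, assuming gollm's functional-options API (gollm.NewLLM with SetProvider, SetModel, SetAPIKey) and a GROQ_API_KEY environment variable. The model name is illustrative; check the models your Groq account exposes.

```go
package main

import (
	"log"
	"os"

	"github.com/teilomillet/gollm"
)

func createLLM() gollm.LLM {
	// Create an LLM client backed by the Groq provider.
	llm, err := gollm.NewLLM(
		gollm.SetProvider("groq"),
		gollm.SetModel("llama-3.1-70b-versatile"), // illustrative model name
		gollm.SetAPIKey(os.Getenv("GROQ_API_KEY")),
		gollm.SetMaxTokens(1024),
	)
	if err != nil {
		log.Fatalf("failed to create LLM client: %v", err)
	}
	return llm
}
```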
Creating the BatchPromptOptimizer: The code creates a new BatchPromptOptimizer instance:
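A sketch of that step, assuming gollm exposes a NewBatchPromptOptimizer constructor that wraps an existing client and a public Verbose field:

```go
// Construct the batch optimizer around the client created above.
optimizer := gollm.NewBatchPromptOptimizer(llm)
optimizer.Verbose = true // emit progress output while each prompt is optimized
```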
This optimizer is set to verbose mode, which will provide detailed output during the optimization process.
Defining prompt examples: The code defines a slice of PromptExample structs, each representing a different prompt to be optimized:
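A sketch of such a slice, assuming gollm's PromptExample struct has Name, Prompt, Description, Threshold, and Metrics fields; the two example entries and their metrics below are invented for illustration, not taken from the original code:

```go
examples := []gollm.PromptExample{
	{
		Name:        "Creative Writing",
		Prompt:      "Write the opening paragraph of a mystery novel set in a small coastal town.",
		Description: "Create an engaging and atmospheric opening",
		Threshold:   0.9,
		Metrics: []gollm.Metric{
			{Name: "Atmosphere", Description: "How well the writing evokes the setting"},
			{Name: "Intrigue", Description: "How effectively it sets up the mystery"},
		},
	},
	{
		Name:        "Technical Documentation",
		Prompt:      "Explain how garbage collection works in Go.",
		Description: "Produce a clear, accurate technical explanation",
		Threshold:   0.85,
		Metrics: []gollm.Metric{
			{Name: "Accuracy", Description: "Technical correctness of the explanation"},
			{Name: "Clarity", Description: "How easy the explanation is to follow"},
		},
	},
}
```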
Each PromptExample includes a name, prompt, description, threshold, and custom metrics.
Running batch optimization: The code runs the batch optimization process:
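A sketch of the call, assuming an OptimizePrompts method that takes a context and the examples slice and returns one result per example:

```go
ctx := context.Background()
results := optimizer.OptimizePrompts(ctx, examples)
```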
This optimizes all the prompts in the examples slice simultaneously.
Processing and displaying results: The code then iterates through the optimization results, displaying information for each optimized prompt:
For each result, it prints:
The name of the example
The original prompt
Either an error message (if optimization failed) or:
The optimized prompt
The content generated using the optimized prompt
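The loop above can be sketched as follows, assuming each result carries Name, OriginalPrompt, OptimizedPrompt, GeneratedContent, and Error fields (field names inferred from the output described above; verify against your gollm version):

```go
for _, result := range results {
	fmt.Printf("\nExample: %s\n", result.Name)
	fmt.Printf("Original Prompt: %s\n", result.OriginalPrompt)
	if result.Error != nil {
		// Optimization failed for this prompt; report and move on.
		fmt.Printf("Error: %v\n", result.Error)
	} else {
		fmt.Printf("Optimized Prompt: %s\n", result.OptimizedPrompt)
		fmt.Printf("Generated Content: %s\n", result.GeneratedContent)
	}
}
```

Handling errors per result means one failed optimization does not abort the whole batch.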
This example shows how to use the BatchPromptOptimizer to optimize multiple prompts at once: setting up the optimizer, defining prompt examples with custom metrics, running the batch optimization, and handling the results.