2. Prompt types

  1. Setting up the LLM: The example begins by creating an LLM instance with a specific configuration:

    llm, err := gollm.NewLLM(
        gollm.SetProvider("openai"),
        gollm.SetModel("gpt-4o-mini"),
        gollm.SetAPIKey(apiKey),
        gollm.SetMaxTokens(300),
        gollm.SetMaxRetries(3),
        gollm.SetDebugLevel(gollm.LogLevelInfo),
    )

    This configuration uses OpenAI's gpt-4o-mini model, caps responses at 300 tokens, allows up to 3 retries, and sets the debug level to Info.
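
    Before the client is used, the returned error should be checked and a context created for the calls that follow. A minimal sketch, assuming the standard library's context and log packages are imported:

    if err != nil {
        log.Fatalf("failed to create LLM client: %v", err)
    }
    // Context passed to every Generate call in the examples below.
    ctx := context.Background()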

  2. Example 1: Basic Prompt with Structured Output. This example demonstrates creating a simple prompt that requests structured output:

    basicPrompt := gollm.NewPrompt("List the top 3 benefits of exercise",
        gollm.WithOutput("JSON array of benefits, each with a 'title' and 'description'"),
    )

    The WithOutput option specifies that the response should be a JSON array of benefits, each with a 'title' and a 'description'.
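
    Sending the prompt uses the same llm.Generate call that appears in the later examples; a short sketch (the output handling is illustrative):

    // Generate a response for the structured-output prompt.
    basicResponse, err := llm.Generate(ctx, basicPrompt)
    if err != nil {
        log.Fatalf("generation failed: %v", err)
    }
    fmt.Println("Benefits of exercise:", basicResponse)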

  3. Example 2: Prompt with Directives, Output, and Context. This example shows how to create a more complex prompt with multiple components:

    directivePrompt := gollm.NewPrompt("Propose a solution to reduce urban traffic congestion",
        gollm.WithDirectives(
            "Consider both technological and policy-based approaches",
            "Address environmental concerns",
            "Consider cost-effectiveness",
        ),
        gollm.WithOutput("Solution proposal in markdown format with headings"),
        gollm.WithContext("The city has a population of 2 million and limited public transportation."),
    )

    This prompt gives the model explicit directives to follow, requests the output as markdown with headings, and supplies context about the city's size and transportation constraints.

  4. Example 3: Prompt with Examples and Max Length. This example demonstrates how to provide examples and limit the response length:

    examplesPrompt := gollm.NewPrompt("Write a short, engaging tweet about climate change",
        gollm.WithExamples(
            "🌍 Small actions, big impact! Reduce, reuse, recycle to fight climate change. #ClimateAction",
            "🌡️ Climate change is real, and it's happening now. Let's act before it's too late! #ClimateEmergency",
        ),
        gollm.WithMaxLength(30),
    )

    The WithExamples option provides sample responses to guide the model's style and tone, while WithMaxLength asks it to keep the response short (a maximum length of 30 in this case).

  5. Example 4: Prompt Template with Dynamic Content. This example showcases the use of prompt templates for dynamic content:

    templatePrompt := gollm.NewPromptTemplate(
        "ProductDescription",
        "Generate a product description",
        "Create an engaging product description for a {{.ProductType}} named '{{.ProductName}}'. "+
            "Target audience: {{.TargetAudience}}. Highlight {{.NumFeatures}} key features.",
        gollm.WithPromptOptions(
            gollm.WithDirectives(
                "Use persuasive language",
                "Include a call-to-action",
            ),
            gollm.WithOutput("Product description in HTML format"),
        ),
    )

    This template allows for dynamic insertion of product details and includes additional options like directives and output format.
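
    To produce a concrete prompt, the placeholders are filled in before generation. The sketch below assumes PromptTemplate exposes an Execute method that takes a map of field values; the product details shown are purely illustrative:

    // Fill in the template placeholders (the Execute signature is assumed).
    productPrompt, err := templatePrompt.Execute(map[string]interface{}{
        "ProductType":    "smart water bottle",
        "ProductName":    "AquaTrack",
        "TargetAudience": "health-conscious professionals",
        "NumFeatures":    3,
    })
    if err != nil {
        log.Fatalf("template execution failed: %v", err)
    }
    productResponse, err := llm.Generate(ctx, productPrompt)
    if err != nil {
        log.Fatalf("generation failed: %v", err)
    }
    fmt.Println(productResponse)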

  6. Example 5: JSON Schema Generation and Validation. This example demonstrates how to generate a JSON schema for a prompt and validate prompts:

    schemaPrompt := gollm.NewPrompt("Generate a user profile",
        gollm.WithOutput(`JSON object with name, age, and interests`),
        gollm.WithDirectives(
            "Name should be a string",
            "Age should be an integer",
            "Interests should be an array of strings",
        ),
    )
    schemaBytes, err := schemaPrompt.GenerateJSONSchema()

    It also shows how to validate prompts using the Validate() method.
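
    Continuing from the snippet above, the schema bytes can be printed and the prompt checked before it is sent; this sketch assumes Validate returns an error:

    // err here is the error returned by GenerateJSONSchema above.
    if err != nil {
        log.Fatalf("schema generation failed: %v", err)
    }
    fmt.Printf("Generated JSON schema:\n%s\n", string(schemaBytes))

    // Validate checks that the prompt itself is well formed.
    if err := schemaPrompt.Validate(); err != nil {
        log.Fatalf("prompt validation failed: %v", err)
    }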

  7. Example 6: Chained Prompts. This example demonstrates how to chain multiple prompts together:

    ideaPrompt := gollm.NewPrompt("Generate a unique business idea in the sustainability sector")
    ideaResponse, err := llm.Generate(ctx, ideaPrompt)
    
    analysisPrompt := gollm.NewPrompt(fmt.Sprintf("Analyze the following business idea: %s", ideaResponse),
        gollm.WithDirectives(
            "Identify potential challenges",
            "Suggest target market",
            "Propose a monetization strategy",
        ),
        gollm.WithOutput("Analysis in JSON format with 'challenges', 'targetMarket', and 'monetization' keys"),
    )

    This chains two prompts: one to generate a business idea, and another to analyze that idea.
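
    The chain is completed by sending the analysis prompt, whose text embeds the first response, to the same model; a short sketch (error handling for the first call is omitted here):

    // Run the second step of the chain on the idea produced by the first step.
    analysisResponse, err := llm.Generate(ctx, analysisPrompt)
    if err != nil {
        log.Fatalf("analysis generation failed: %v", err)
    }
    fmt.Println("Business idea analysis:", analysisResponse)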

  8. Example 7: Prompt with JSON Schema Validation. This final example shows how to use JSON schema validation with a prompt:

    jsonSchemaPrompt := gollm.NewPrompt("Generate a user profile",
        gollm.WithOutput(`JSON object with the following schema:
        {
            "type": "object",
            "properties": {
                "name": {"type": "string"},
                "age": {"type": "integer", "minimum": 18},
                "interests": {"type": "array", "items": {"type": "string"}}
            },
            "required": ["name", "age", "interests"]
        }`),
    )
    jsonSchemaResponse, err := llm.Generate(ctx, jsonSchemaPrompt, gollm.WithJSONSchemaValidation())

    This example defines a JSON schema for the expected output and uses the WithJSONSchemaValidation() option to ensure the response conforms to this schema.
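
    Because the validated response follows that schema, it can be decoded directly into a Go struct with the standard encoding/json package; the UserProfile struct below is illustrative, not part of gollm, and the sketch assumes the response is returned as a raw JSON string:

    // Illustrative struct mirroring the schema above (not part of gollm).
    type UserProfile struct {
        Name      string   `json:"name"`
        Age       int      `json:"age"`
        Interests []string `json:"interests"`
    }

    var profile UserProfile
    if err := json.Unmarshal([]byte(jsonSchemaResponse), &profile); err != nil {
        log.Fatalf("failed to decode validated response: %v", err)
    }
    fmt.Printf("Generated profile for %s (age %d) with %d interests\n",
        profile.Name, profile.Age, len(profile.Interests))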

In summary, this example file showcases a wide range of gollm's capabilities: structured outputs, complex prompts with directives and context, reusable prompt templates, JSON schema generation and prompt validation, chained prompts, and schema validation of responses. Together, these features demonstrate the flexibility of the gollm package for a variety of natural language processing tasks.
