1. Basic usage

  1. Setting up the environment: The example starts by checking for the OPENAI_API_KEY environment variable. This is crucial for authenticating with the OpenAI API.

    apiKey := os.Getenv("OPENAI_API_KEY")
    if apiKey == "" {
        log.Fatalf("OPENAI_API_KEY environment variable is not set")
    }
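
    The snippets in this walkthrough leave out the import block of the full example file. A minimal sketch of the imports they rely on (the gollm module path shown is an assumption; match it to your go.mod):

    import (
        "context"
        "fmt"
        "log"
        "os"
        "time"

        "github.com/teilomillet/gollm" // assumed module path
    )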
  2. Creating an LLM instance: The example creates a new LLM instance with the gollm.NewLLM() function and a set of configuration options.

    llm, err := gollm.NewLLM(
        gollm.SetProvider("openai"),
        gollm.SetModel("gpt-3.5-turbo"),
        gollm.SetAPIKey(apiKey),
        gollm.SetMaxTokens(200),
        gollm.SetMaxRetries(3),
        gollm.SetRetryDelay(time.Second*2),
        gollm.SetDebugLevel(gollm.LogLevelInfo),
    )
    if err != nil {
        log.Fatalf("failed to create LLM: %v", err)
    }

    This configuration:

    • Sets the provider to OpenAI

    • Uses the GPT-3.5-turbo model

    • Sets the API key

    • Limits the response to 200 tokens

    • Configures retry behavior (3 retries with a 2-second delay)

    • Sets the debug level to Info

  3. Basic Prompt: The example demonstrates creating a simple prompt and generating a response (a fuller sketch of the surrounding call follows the snippet).

    basicPrompt := gollm.NewPrompt("Explain the concept of 'recursion' in programming.")
    response, err := llm.Generate(ctx, basicPrompt)
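
    The ctx argument is a standard context.Context that the snippet leaves undefined; a minimal sketch of the surrounding code, with illustrative error handling and output:

    ctx := context.Background()

    response, err := llm.Generate(ctx, basicPrompt)
    if err != nil {
        log.Fatalf("failed to generate response: %v", err)
    }
    fmt.Println(response)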

Example 2: Advanced Prompt

This example demonstrates creating a more complex prompt with directives and an output specification.

advancedPrompt := gollm.NewPrompt("Compare and contrast functional and object-oriented programming paradigms",
    gollm.WithDirectives(
        "Provide at least three key differences",
        "Include examples for each paradigm",
        "Discuss strengths and weaknesses",
    ),
    gollm.WithOutput("Comparison of Programming Paradigms:"),
    gollm.WithMaxLength(300),
)

This advanced prompt:

  • Asks for a comparison of programming paradigms

  • Provides specific directives for the AI to follow

  • Sets an output prefix

  • Limits the response length to 300 tokens
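
Generating a response with the advanced prompt works the same way as in the basic example; a minimal sketch, with illustrative error handling:

response, err := llm.Generate(ctx, advancedPrompt)
if err != nil {
    log.Fatalf("failed to generate response: %v", err)
}
fmt.Println(response)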

Example 3: Prompt with Context

This example demonstrates how to provide context to a prompt:

contextPrompt := gollm.NewPrompt("Summarize the main points",
    gollm.WithContext("The Internet of Things (IoT) refers to the interconnected network of physical devices embedded with electronics, software, sensors, and network connectivity, which enables these objects to collect and exchange data."),
    gollm.WithMaxLength(100),
)

This prompt:

  • Asks for a summary

  • Provides context about IoT

  • Limits the response to 100 tokens

Example 4: JSON Schema Generation and Validation

This part of the example shows how to generate a JSON schema for a prompt and how to validate prompts:

schemaBytes, err := advancedPrompt.GenerateJSONSchema()
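
If schema generation succeeds, schemaBytes holds the schema as a JSON document; a minimal sketch of checking and printing it (illustrative):

if err != nil {
    log.Fatalf("failed to generate JSON schema: %v", err)
}
fmt.Printf("Prompt JSON schema:\n%s\n", schemaBytes)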

It also demonstrates validation by creating an invalid prompt:

invalidPrompt := gollm.NewPrompt("") // Invalid because Input is required
err = invalidPrompt.Validate()
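
Because the prompt text is empty, Validate is expected to return an error; a minimal sketch of reporting it (illustrative):

if err != nil {
    fmt.Printf("validation failed as expected: %v\n", err)
}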

Example 5: Using Chain of Thought

The final example demonstrates the use of the Chain of Thought feature:

cotPrompt := "Explain the process of photosynthesis step by step."
cotResponse, err := gollm.ChainOfThought(ctx, llm, cotPrompt)

This feature helps generate step-by-step explanations for complex topics.
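
A minimal sketch of consuming the Chain of Thought result, with illustrative error handling:

if err != nil {
    log.Fatalf("chain of thought request failed: %v", err)
}
fmt.Println(cotResponse)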

In summary, this example file provides a comprehensive overview of gollm's basic usage, showcasing various prompt types, configuration options, and advanced features like JSON schema validation and Chain of Thought reasoning.
