Getting Started with gollm

This guide will help you set up and start using gollm in your Go projects.

Installation

To install gollm, use the following command:

go get github.com/teilomillet/gollm
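
If your project is not yet a Go module, initialize one before running go get (the module path below is only a placeholder; use your own):

go mod init example.com/myapp
go get github.com/teilomillet/gollm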

Basic Setup

Here's a simple example to get you started:

package main

import (
	"context"
	"fmt"
	"log"

	"github.com/teilomillet/gollm"
)

func main() {
	// Create an LLM client configured for OpenAI's gpt-4o-mini model.
	llm, err := gollm.NewLLM(
		gollm.SetProvider("openai"),
		gollm.SetModel("gpt-4o-mini"),
		gollm.SetMaxTokens(100),
		gollm.SetAPIKey("your-api-key-here"),
	)
	if err != nil {
		log.Fatalf("Failed to create LLM: %v", err)
	}

	ctx := context.Background()

	// Build a prompt and send it to the model.
	prompt := gollm.NewPrompt("Tell me a short joke about programming.")
	response, err := llm.Generate(ctx, prompt)
	if err != nil {
		log.Fatalf("Failed to generate text: %v", err)
	}
	fmt.Printf("Response: %s\n", response)
}

This example demonstrates how to create an LLM instance, set up a simple prompt, and generate a response.
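
Hardcoding an API key is fine for a quick experiment, but in practice you will usually read it from the environment. Below is a minimal variation of the example above that does this; the OPENAI_API_KEY variable name is only a conventional choice for this sketch, not something gollm requires.

package main

import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/teilomillet/gollm"
)

func main() {
	// Read the API key from the environment instead of hardcoding it
	// (assumes OPENAI_API_KEY is exported in your shell).
	llm, err := gollm.NewLLM(
		gollm.SetProvider("openai"),
		gollm.SetModel("gpt-4o-mini"),
		gollm.SetMaxTokens(100),
		gollm.SetAPIKey(os.Getenv("OPENAI_API_KEY")),
	)
	if err != nil {
		log.Fatalf("Failed to create LLM: %v", err)
	}

	prompt := gollm.NewPrompt("Tell me a short joke about programming.")
	response, err := llm.Generate(context.Background(), prompt)
	if err != nil {
		log.Fatalf("Failed to generate text: %v", err)
	}
	fmt.Printf("Response: %s\n", response)
}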
