# mistral-client
HTTP client for Mistral AI written in Go.
## Requirements
- Go 1.25 or higher
## Installation
## Features
Basic features:
- Chat Completion: Synchronous and streaming support.
- Embeddings: Generate text embeddings with various encoding formats and dimensions.
- Tool Calling: Native support for function calling and tool usage.
- Multi-modal Input: Handle images, audio, and documents in your messages.
- Structured Output: Support for JSON Mode and JSON Schema.
- Advanced Client: Built-in retry logic, rate limiting, and custom HTTP client configuration.
- Model Management: List, search, and retrieve details for Mistral models.
What sets this client apart:
- Caching: Cache responses to avoid unnecessary repeated API calls (e.g. for local development runs).
- MLflow integration: Get your prompts from MLflow Prompt Registry instead of writing them in your code.
Coming soon:
- Fake models: Use fake models for local development and testing.
## Getting Started
To start using the Mistral client, you first need to create an instance of the client with your API key:
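```go
client := mistral.New(apiKey)
```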
You can also customize the client with various options:
```go
client := mistral.New(apiKey,
	mistral.WithClientTimeout(60*time.Second),
	mistral.WithRetry(4, 1*time.Second, 3*time.Second),
)
```
## Basic Usage
### Chat Completion
```go
req := mistral.NewChatCompletionRequest("mistral-small-latest", []mistral.ChatMessage{
	mistral.NewUserMessageFromString("Hello! How are you today?"),
})

res, err := client.ChatCompletion(context.Background(), req)
if err != nil {
	log.Fatal(err)
}

fmt.Println(res.AssistantMessage().Content().String())
```
### Tool Calling
```go
tools := []mistral.Tool{
	mistral.NewTool("get_weather", "Get the weather for a location", mistral.PropertyDefinition{
		Type: "object",
		Properties: map[string]mistral.PropertyDefinition{
			"location": {Type: "string", Description: "The city and state, e.g. San Francisco, CA"},
		},
	}),
}

req := mistral.NewChatCompletionRequest("mistral-small-latest", []mistral.ChatMessage{
	mistral.NewUserMessageFromString("What's the weather in Paris?"),
}, mistral.WithTools(tools))

res, err := client.ChatCompletion(context.Background(), req)
if err != nil {
	log.Fatal(err)
}
// Inspect res for the model's tool call and its arguments.
```
### Embeddings
```go
req := mistral.NewEmbeddingRequest("mistral-embed", []string{"Mistral AI is awesome!"})

res, err := client.Embeddings(context.Background(), req)
if err != nil {
	log.Fatal(err)
}

for _, vector := range res.Embeddings() {
	fmt.Println(vector)
}
```
## Examples
You can find more detailed examples in the `examples` folder.
Interacting with Mistral models:
- Chat Completion: Basic usage of the chat completion API.
- Chat Completion (Advanced): Advanced options like retry, rate limiting, and timeout.
- Chat Completion (Structured Output): Constrain the model's output to a specific schema.
- Chat Audio: Transcribe and interact with audio files.
- Chat Vision: Interact with images.
- Chat Streaming: Stream responses from the API.
- Embeddings: Generate text embeddings.
- Tools / Function Calling: Use tools and function calling.
- Tools with streaming: Use tools and function calling with streaming.
Model discovery:
- Get Model: Retrieve details for a specific model.
- List Models: List and search available models.
Specific features:
- Caching: Cache responses to avoid unnecessary repeated API calls.
- MLflow integration: Get your prompt from MLflow Prompt Registry.
- Use Go templates in your prompts: Write your prompts using Go template and Sprig syntax and render them in your code.