Category: AI & ML · Authentication: Bearer Token

Mistral AI REST API

Open-weight AI models with frontier-level performance

Mistral AI provides state-of-the-art large language models through a simple REST API. Developers use Mistral for chat completions, embeddings, and function calling with models like Mistral Large, Mistral Small, and open-weight alternatives. The API offers streaming responses, JSON mode, and tool integration for building AI-powered applications with competitive pricing and performance.

Base URL: https://api.mistral.ai/v1

API Endpoints

Method   Endpoint                      Description
POST     /chat/completions             Generate chat completions using Mistral models with support for streaming, function calling, and JSON mode
POST     /embeddings                   Create vector embeddings from text using Mistral's embedding models for semantic search and RAG applications
GET      /models                       List all available Mistral models with their capabilities, pricing, and context window sizes
POST     /fim/completions              Fill-in-the-middle completions for code generation and completion tasks with Codestral models
POST     /agents/completions           Execute agent workflows with function calling and tool use for autonomous task execution
GET      /models/{model_id}            Retrieve detailed information about a specific Mistral model, including parameters and capabilities
POST     /moderations                  Analyze content for policy violations, harmful content, and safety guardrails
POST     /classifiers                  Classify text into predefined categories using Mistral's classification capabilities
GET      /files                        List uploaded files used for fine-tuning and retrieval augmented generation
POST     /files                        Upload files for fine-tuning datasets or document retrieval in RAG applications
DELETE   /files/{file_id}              Delete a previously uploaded file from Mistral storage
POST     /fine_tuning/jobs             Create a fine-tuning job to customize Mistral models on your specific dataset
GET      /fine_tuning/jobs             List all fine-tuning jobs with their status, metrics, and configuration
GET      /fine_tuning/jobs/{job_id}    Get detailed status and metrics for a specific fine-tuning job
DELETE   /fine_tuning/jobs/{job_id}    Cancel an in-progress fine-tuning job
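As a sketch of the /embeddings endpoint above: the helper below builds a request body and extracts vectors from a response, assuming the OpenAI-compatible shape Mistral documents ({"data": [{"index": ..., "embedding": [...]}]}). The model name "mistral-embed" is Mistral's published embedding model at the time of writing; query GET /models for the current list.

```python
import json

def build_embeddings_request(texts, model="mistral-embed"):
    # Body for POST /v1/embeddings: a model name plus a list of input strings.
    return {"model": model, "input": list(texts)}

def extract_embeddings(response_json):
    # Assumes the response shape {"data": [{"index": 0, "embedding": [...]}, ...]};
    # sort by index so vectors line up with the input order.
    items = sorted(response_json["data"], key=lambda d: d["index"])
    return [item["embedding"] for item in items]

payload = build_embeddings_request(["semantic search", "vector retrieval"])
print(json.dumps(payload))
```

No network call is made here; pass the built payload as the JSON body of a POST to the base URL plus /embeddings with your Bearer token.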

Code Examples

curl https://api.mistral.ai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $MISTRAL_API_KEY" \
  -d '{
    "model": "mistral-large-latest",
    "messages": [
      {
        "role": "user",
        "content": "What is the best French cheese?"
      }
    ],
    "temperature": 0.7,
    "max_tokens": 1000
  }'
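The same chat completion request can be sketched in Python using only the standard library. The request body mirrors the curl example above; the response parsing assumes the documented shape where the assistant reply sits at choices[0].message.content.

```python
import json
import os
import urllib.request

def build_chat_request(messages, model="mistral-large-latest",
                       temperature=0.7, max_tokens=1000):
    # Same JSON body as the curl example above.
    return {
        "model": model,
        "messages": messages,
        "temperature": temperature,
        "max_tokens": max_tokens,
    }

def chat_completion(messages, **kwargs):
    # POST /v1/chat/completions with a Bearer token from the environment.
    req = urllib.request.Request(
        "https://api.mistral.ai/v1/chat/completions",
        data=json.dumps(build_chat_request(messages, **kwargs)).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    # The assistant's reply lives at choices[0].message.content.
    return data["choices"][0]["message"]["content"]

body = build_chat_request([{"role": "user",
                            "content": "What is the best French cheese?"}])
print(json.dumps(body, indent=2))
```

Calling chat_completion(...) sends the request for real, so it needs MISTRAL_API_KEY set; the final lines only print the built body.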

Connect Mistral AI to AI Clients

Deploy a Mistral AI MCP server on IOX Cloud and connect it to Claude, ChatGPT, Cursor, or any AI client. Your AI assistant gets direct access to Mistral AI through these tools:

mistral_chat: Generate conversational responses using Mistral's chat models with context awareness and streaming support
mistral_embed: Create semantic embeddings for text to enable similarity search, clustering, and retrieval augmented generation
mistral_function_call: Execute structured function calls with Mistral models for tool use, API integration, and autonomous agents
mistral_code_complete: Generate code completions and fill-in-the-middle suggestions using Codestral models for development workflows
mistral_moderate: Analyze and classify content for safety, policy compliance, and harmful content detection with Mistral's moderation system
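To illustrate the function-calling capability behind a tool like mistral_function_call, the sketch below builds a chat request that declares a tool the model may call. The get_weather tool is hypothetical and used only for illustration; the schema follows the OpenAI-style "tools" array that Mistral's chat completions endpoint accepts.

```python
import json

def build_function_call_request(user_prompt, model="mistral-large-latest"):
    # Hypothetical get_weather tool, declared with a JSON Schema for its
    # arguments so the model can emit a structured call.
    tools = [{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }]
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_prompt}],
        "tools": tools,
        "tool_choice": "auto",  # let the model decide whether to call the tool
    }

print(json.dumps(build_function_call_request("Weather in Paris?"), indent=2))
```

When the model chooses to call the tool, the response carries the call (function name plus JSON arguments) instead of plain text; your code executes the function and sends the result back in a follow-up message.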

Deploy in 60 seconds

Describe what you need, AI generates the code, and IOX deploys it globally.

Deploy Mistral AI MCP Server →

Related APIs