AI
Connect your AI agents to additional LLM providers, embedding services, and external tools beyond the built-in models.
Integration-based model connectors
By default, Squid AI agents support models from OpenAI, Anthropic, Google Gemini, and Grok (xAI). Integration-based model connectors let you go further by connecting to any additional LLM provider or self-hosted model.
Common use cases include:
- Self-hosted models: Connect to locally hosted models through tools like Ollama or vLLM
- AWS Bedrock: Access models available through AWS Bedrock, including Amazon Nova and other providers
- Custom endpoints: Connect to any service that exposes an OpenAI-compatible API
- Custom embeddings: Use your own embedding models for knowledge base indexing and retrieval
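Most of these use cases rely on the same request shape: any OpenAI-compatible endpoint accepts a POST to `/v1/chat/completions` with a JSON body. A minimal sketch in TypeScript, assuming a local Ollama server and a `llama3.1` model (both the host and the model name are assumptions, not Squid requirements):

```typescript
// Any OpenAI-compatible server exposes this route; here we assume
// a local Ollama instance (hypothetical host and model name).
const endpoint = 'http://localhost:11434/v1/chat/completions';

// The standard OpenAI-compatible chat request body.
const body = {
  model: 'llama3.1',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Hello!' },
  ],
};

// Sending the request is a plain POST with a JSON body; an Authorization
// header is only needed if the server enforces an API key.
async function chat(): Promise<string> {
  const res = await fetch(endpoint, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(body),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Because the connector only assumes this request/response shape, the same configuration works for Ollama, vLLM, or any other service that implements the OpenAI chat API.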
How integration-based models work
Once you add an AI connector in the Squid Console, you can reference it by its connector ID when configuring an AI agent. Specify the integrationId (your connector ID) and the model name you configured:
```typescript
await squid.ai().agent('my-agent').updateModel({
  integrationId: 'my-connector-id',
  model: 'model-name',
});
```
To learn more about configuring AI agents with custom models, see the AI agent documentation.
MCP (Model Context Protocol)
While integration-based model connectors expand the LLMs available to your agents, the MCP connector expands what your agents can do. MCP is an open protocol that lets Squid AI agents access tools and perform actions through external services.
With an MCP connector, your agents can connect to any MCP server, whether it's a custom Squid MCP server you build or a third-party MCP server, and use the tools it exposes. This is the primary way to give your agents capabilities beyond conversation, such as querying databases, calling APIs, or triggering workflows.
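At the protocol level, an MCP server advertises its tools in response to a `tools/list` request, and each tool carries a name, a description, and a JSON Schema for its input. The sketch below shows that shape; the `query_orders` tool is a made-up example, but the `name` / `description` / `inputSchema` fields follow the MCP specification:

```typescript
// Shape of a single tool entry as defined by the MCP spec.
interface McpTool {
  name: string;
  description?: string;
  inputSchema: {
    type: 'object';
    properties?: Record<string, unknown>;
    required?: string[];
  };
}

// Illustrative tools/list result an agent might receive from an MCP
// server. "query_orders" is a hypothetical tool, not a Squid built-in.
const toolsListResult: { tools: McpTool[] } = {
  tools: [
    {
      name: 'query_orders',
      description: 'Look up orders for a customer by email',
      inputSchema: {
        type: 'object',
        properties: { email: { type: 'string' } },
        required: ['email'],
      },
    },
  ],
};
```

The agent reads these schemas to decide when to call a tool and what arguments to pass, which is how a single MCP connector can expose databases, APIs, or workflows without agent-side code changes.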
To get started, see the MCP connector setup guide.
Available AI connectors
| Connector | Description |
|---|---|
| OpenAI Compatible Chat | Connect to any OpenAI-compatible chat API |
| OpenAI Compatible Embedding | Use any OpenAI-compatible embedding API for knowledge base indexing |
| AWS Bedrock | Connect to AWS Bedrock for access to Amazon and third-party models |
| MCP | Connect to MCP servers to provide tools and actions for AI agents |