@squidcloud/backend

    Interface LlmServiceOptions

    Options for the @llmService decorator.

    interface LlmServiceOptions {
        models: Record<UserAiChatModelName, LlmModelMetadata>;
        supportsFunctions?: boolean;
    }

    Properties

    models: Record<UserAiChatModelName, LlmModelMetadata>

    Map of model names to their metadata.

    supportsFunctions?: boolean

    Whether this LLM service supports native function calling. If true, the service handles tool calls directly (e.g., via MCP). If false or undefined, the execution plan fallback will be used when tools are provided. Defaults to false.
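
    For orientation, a minimal sketch of how these options might be supplied to the @llmService decorator is shown below. The import path, the class name ExampleLlmService, the model key 'example-model', and the placeholder metadata value are illustrative assumptions rather than part of the documented API; consult the package's type definitions for the exact shapes.

    // Assumed import path; verify against the package's actual exports.
    import { llmService, LlmServiceOptions, LlmModelMetadata, UserAiChatModelName } from '@squidcloud/backend';

    // Placeholder metadata value; the real LlmModelMetadata fields are omitted here.
    const exampleModelMetadata = {} as LlmModelMetadata;

    @llmService({
        // Map each supported model name to its metadata. The key must be a valid
        // UserAiChatModelName value; 'example-model' is only a stand-in.
        models: {
            'example-model': exampleModelMetadata,
        } as Record<UserAiChatModelName, LlmModelMetadata>,
        // This service handles tool calls natively (e.g., via MCP), so the
        // execution plan fallback is not used when tools are provided.
        supportsFunctions: true,
    } satisfies LlmServiceOptions)
    class ExampleLlmService {
        // Service implementation (chat handling, etc.) goes here.
    }

    Omitting supportsFunctions (or setting it to false) keeps the default behavior, in which the execution plan fallback handles any tools passed to the service.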