## MCP Tools
No MCP tools are currently available for this plugin.
## Configuration
You can configure the LLM plugin using the `unpage configure` command, which will guide you through selecting a provider and model. Your configuration is stored in `~/.unpage/profiles/<profile_name>/config.yaml`.
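As a rough sketch, the relevant section of that file might look like the following; the key names and nesting here are illustrative assumptions, not confirmed by this page, and the model string is only an example:

```yaml
# Hypothetical config.yaml layout -- key names are illustrative
plugins:
  llm:
    settings:
      model: "openai/gpt-4o"   # LiteLLM-style provider/model string (example value)
```

Running `unpage configure` writes the provider and model choices for you, so manual edits are only needed for settings the interactive flow doesn't cover.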
## Model Selection

The LLM plugin supports the following model providers and recommended models:

- OpenAI
- Anthropic
- Amazon Bedrock

Amazon Bedrock uses the `us-east-1` region if no region is set in the environment.
## Advanced Configuration

The LLM plugin uses sensible defaults for most settings:

- `temperature`: 0 (most deterministic responses)
- `max_tokens`: set to the model's maximum context length
- `cache`: true (enables response caching for efficiency)
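If you need different values, a hedged sketch of how the corresponding keys might appear in `config.yaml` (the nesting is an assumption; only the setting names and defaults come from this page):

```yaml
# Hypothetical overrides; setting names mirror the documented defaults
settings:
  temperature: 0      # 0 = most deterministic responses
  max_tokens: 8192    # example cap; default is the model's maximum context length
  cache: true         # response caching enabled
```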
## Supported Models

In addition to the recommended models listed above, the LLM plugin supports all models supported by LiteLLM. You can view the full list of supported models in the LiteLLM documentation. To use a model not included in the interactive configuration, edit your `config.yaml` file directly with the appropriate `provider/model` string format.

## Environment Variables
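For example, to select a Bedrock-hosted model that the interactive flow doesn't offer, the entry might look like this (the key name and the specific model ID are illustrative, following LiteLLM's `provider/model` convention):

```yaml
# Hypothetical fragment; the model ID shown is only an example
settings:
  model: "bedrock/anthropic.claude-3-sonnet-20240229-v1:0"
```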
You can also configure the LLM plugin using environment variables:

- `OPENAI_API_KEY`: for OpenAI models
- `ANTHROPIC_API_KEY`: for Anthropic models
- `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY`: for Amazon Bedrock models
- `AWS_REGION` or `AWS_DEFAULT_REGION`: region for Amazon Bedrock (defaults to `us-east-1`)
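These variables can be exported in your shell before running Unpage; the values below are placeholders, not real credentials:

```shell
# Placeholder credentials -- substitute your real keys
export OPENAI_API_KEY="sk-your-openai-key"
export ANTHROPIC_API_KEY="sk-ant-your-anthropic-key"

# For Amazon Bedrock
export AWS_ACCESS_KEY_ID="AKIAEXAMPLE"
export AWS_SECRET_ACCESS_KEY="example-secret"
export AWS_REGION="us-east-1"   # or AWS_DEFAULT_REGION
```

Only the variables for the provider you actually use need to be set.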