LLM Support
Use various LLMs to handle your agents and tasks.
Overview
The BeaconLabs framework reads API keys from a .env file or from environment variables. Once you provide the necessary keys for the services you want, you can select the LLM for each agent with the model parameter.
The supported LLMs are:
OpenAI
openai/gpt-4o
openai/o3-mini
Azure
azure/gpt-4o
Anthropic
claude/claude-3-5-sonnet
AWS Bedrock
bedrock/claude-3-5-sonnet
DeepSeek
deepseek/deepseek-chat
Setting up .env
To use these LLMs, create a .env file like the example below. Fill in only the variables required by the LLM you want to use. The .env file must be located in your working directory.
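The original file listing did not survive here, so the sketch below uses the common provider-convention variable names (OPENAI_API_KEY, ANTHROPIC_API_KEY, and so on); check the BeaconLabs documentation for the exact names the framework expects.

```shell
# Example .env — variable names follow the usual provider conventions
# and are an assumption; BeaconLabs may expect different names.
OPENAI_API_KEY=sk-...

AZURE_OPENAI_API_KEY=...
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com/

ANTHROPIC_API_KEY=...

AWS_ACCESS_KEY_ID=...
AWS_SECRET_ACCESS_KEY=...
AWS_REGION=us-east-1

DEEPSEEK_API_KEY=...
```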
Using a Specific Model in Agent
Using the model parameter, you can easily select which LLM each agent will use. Here's an example:
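The original example code was lost, and the BeaconLabs Agent API is not shown in this section, so the following is a hypothetical sketch: a minimal stand-in Agent class illustrating how a per-agent model parameter selects the LLM using the provider-prefixed identifiers listed above. The real class name, constructor arguments, and import path may differ.

```python
from dataclasses import dataclass

# Hypothetical stand-in for the framework's Agent class;
# the real BeaconLabs API may differ.
@dataclass
class Agent:
    name: str
    model: str  # provider-prefixed model id, e.g. "openai/gpt-4o"

# Each agent can use a different LLM via the model parameter.
researcher = Agent(name="Researcher", model="openai/gpt-4o")
summarizer = Agent(name="Summarizer", model="claude/claude-3-5-sonnet")

print(researcher.model)  # openai/gpt-4o
print(summarizer.model)  # claude/claude-3-5-sonnet
```

Because the model is set per agent, different agents in the same workflow can run on different providers as long as the matching keys are present in your .env file.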