1
Clone the starter repo
Choose your preferred programming language and get started with the BeeAI Framework starter template.
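For example, cloning the Python starter might look like this (the repository URL is an assumption; use the starter that matches your chosen language):

```shell
# Clone the Python starter template (URL assumed; a TypeScript starter is also available)
git clone https://github.com/i-am-bee/beeai-framework-py-starter.git
cd beeai-framework-py-starter
```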
2
Install the dependencies
If you’re using Python, make sure you have uv installed.
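A minimal setup for the Python starter might look like the following, assuming the project is managed with uv:

```shell
# Install uv if you don't have it yet (official install script)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Install the project dependencies declared in pyproject.toml
uv sync
```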
3
Configure your Environment Variables
Create an `.env` file with the contents from `.env.template`.
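For example, copy the template and then fill in your values:

```shell
# Create your local .env from the provided template
cp .env.template .env
```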
4
Configure your LLM Backend
If you choose to run a local model, Ollama must be installed and running, with the granite3.3 model pulled. If you run into issues, run `ollama list` to verify the model name and ensure granite3.3 is installed or that your alias points to it.
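For example, to pull the model and confirm it is available (standard Ollama CLI commands):

```shell
# Download the granite3.3 model locally
ollama pull granite3.3

# List installed models to confirm granite3.3 (or your alias) is present
ollama list
```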
If you choose to use a hosted model, edit `LLM_CHAT_MODEL_NAME` in the `.env` file. Add your API key for your preferred provider to your `.env` file and uncomment the line.
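A sketch of the relevant `.env` entries, assuming an OpenAI-hosted model; the exact model string format and key variable name for your provider may differ, so check `.env.template`:

```ini
# Hosted model example (model string and key name are assumptions; see .env.template)
LLM_CHAT_MODEL_NAME=openai:gpt-4o-mini
OPENAI_API_KEY=your-api-key-here
```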
5
Run the Agent
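To start the agent from the repo root, something like the following should work (the entrypoint filename is an assumption and may differ in the starter template):

```shell
# Run the example agent (replace main.py with the starter's actual entrypoint)
uv run python main.py
```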
This agent is an activity planner that can help you plan your day. Prompt it with your task and location. Exit the loop by typing “q” and pressing Enter. Take a look inside the code file to understand the example agent. Congratulations! You’ve run your first BeeAI agent.