These are the first steps to set up and use the “Bee Agent Framework” in combination with watsonx.ai.
TL;DR: What are AI Agents?
AI agents are autonomous software entities designed to perceive their environment, reason about it, and take actions to achieve specific goals.
Key Characteristics of AI Agents:
- Autonomy: They operate independently without continuous human guidance.
- Perception: They gather data from their environment using sensors or inputs (e.g., cameras, text data, or API streams).
- Reasoning: They use logical or learned rules to interpret data and make decisions.
- Learning: Some AI agents can improve their performance over time using feedback (reinforcement learning) or by analyzing data (supervised or unsupervised learning).
- Action: They perform specific tasks or make decisions to influence their environment, such as controlling a robot, making predictions, or executing trades.
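To make the perceive/reason/act cycle concrete, here is a minimal TypeScript sketch of that loop. It is purely illustrative; every type and function name in it is hypothetical and not taken from any framework.

// A toy agent loop: perceive -> reason -> act. Illustrative only; all names are hypothetical.
type Observation = { text: string };
type Action = { tool: string; input: string } | { answer: string };

interface Agent {
  perceive(): Promise<Observation>;          // gather input (user message, API stream, ...)
  reason(obs: Observation): Promise<Action>; // decide the next step (LLM call, rules, ...)
  act(action: Action): Promise<void>;        // run a tool or emit the final answer
}

async function runAgent(agent: Agent, maxSteps = 5): Promise<void> {
  for (let step = 0; step < maxSteps; step++) {
    const observation = await agent.perceive();
    const action = await agent.reason(observation);
    await agent.act(action);
    if ("answer" in action) break; // stop once the agent has produced a final answer
  }
}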
What is Bee Agent Framework?
Bee is an open-source framework for building, deploying, and serving powerful agentic workflows at scale.
The Bee Agent Framework makes it easy to build scalable agent-based workflows with your model of choice. It has been designed to perform robustly with IBM Granite and Llama 3.x models, and the team is actively working on optimizing its performance with other popular LLMs.
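To give an idea of what this looks like in code, below is a sketch adapted from the framework's README. The exact import paths, adapter names, and tool names change between versions, so treat it as the shape to expect rather than copy-paste material; the starter's src/agent.ts is the canonical, up-to-date version (and builds the LLM from the LLM_BACKEND variable we configure below).

// Sketch of a Bee agent, adapted from the framework README; import paths and
// class names may differ in your installed version, so check src/agent.ts in the starter.
import { BeeAgent } from "bee-agent-framework/agents/bee/agent";
import { TokenMemory } from "bee-agent-framework/memory/tokenMemory";
import { DuckDuckGoSearchTool } from "bee-agent-framework/tools/search/duckDuckGoSearch";
import { OpenMeteoTool } from "bee-agent-framework/tools/weather/openMeteo";
import { OllamaChatLLM } from "bee-agent-framework/adapters/ollama/chat"; // the starter picks the backend (e.g. watsonx) from LLM_BACKEND instead

const llm = new OllamaChatLLM(); // placeholder backend for this sketch

const agent = new BeeAgent({
  llm,                              // the chat model driving the agent
  memory: new TokenMemory({ llm }), // keeps the conversation within the token budget
  tools: [new DuckDuckGoSearchTool(), new OpenMeteoTool()], // tools the agent may call
});

const response = await agent.run({ prompt: "What is the most visited place in Paris?" });
console.log(response.result.text);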
My first steps: installing and using the framework with watsonx.ai
There are two main repositories to use when you begin writing agents:
Resources:
- Starter: github.com/i-am-bee/bee-agent-framework-starter
- Framework: github.com/i-am-bee/bee-agent-framework
I followed the clear instructions step by step, as follows:
git clone https://github.com/i-am-bee/bee-agent-framework-starter.git
cd bee-agent-framework-starter
#if using NVM (my case)
nvm install
#installing the dependencies
npm ci
### if you hit an npm permissions error left over from a previous setup
sudo chown -R 501:20 "/Users/your-user-profile/.npm"
npm ci
### end of troubleshooting
Build the ‘.env’ file:
# LLM Provider (watsonx/ollama/openai/groq/bam)
LLM_BACKEND=watsonx
## watsonx specific information
WATSONX_API_KEY="YOUR-API-KEY"
WATSONX_PROJECT_ID="YOUR-PROJECT-ID"
WATSONX_MODEL="meta-llama/llama-3-1-70b-instruct"
WATSONX_REGION="us-south"
# Framework related
BEE_FRAMEWORK_LOG_PRETTY=true
BEE_FRAMEWORK_LOG_LEVEL="info"
BEE_FRAMEWORK_LOG_SINGLE_LINE="false"
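Before running the agent, it is worth checking that these variables are actually visible to your process. Here is a tiny, hypothetical helper script (not part of the starter); run it with the variables exported, or with whatever runner loads your .env file.

// check-env.ts: hypothetical helper, not part of the starter.
// Verifies that the watsonx-related variables are set before running the agent.
const required = [
  "LLM_BACKEND",
  "WATSONX_API_KEY",
  "WATSONX_PROJECT_ID",
  "WATSONX_MODEL",
  "WATSONX_REGION",
];

const missing = required.filter((name) => !process.env[name]);
if (missing.length > 0) {
  console.error(`Missing environment variables: ${missing.join(", ")}`);
  process.exit(1);
}
console.log("All watsonx environment variables are set.");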
Skip this section if you already know how to obtain your WATSONX_API_KEY and WATSONX_PROJECT_ID.
How to get your WATSONX_PROJECT_ID: the easiest way is to open your ‘watsonx.ai’ instance, select the code generation icon, and copy the project ID from the generated snippet.
First step: select the icon to see the generated code.
Second step (in whichever language you choose): copy the project ID.
Follow these steps to obtain your API key: click on the menu and select ‘Access (IAM)’.
Once your API key is generated, either copy it somewhere safe or download the JSON file; otherwise you will have to create a new one, since the key cannot be displayed again!
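If you want to confirm the API key works before wiring it into the framework, you can exchange it for an access token against IBM Cloud's standard IAM token endpoint. This is just an optional sanity check I am adding for illustration (it relies on the built-in fetch of Node 18+); the Bee framework does not need it.

// check-apikey.ts: optional sanity check, illustration only (not part of the starter).
// Exchanges WATSONX_API_KEY for an IBM Cloud IAM access token.
const apiKey = process.env.WATSONX_API_KEY;
if (!apiKey) throw new Error("WATSONX_API_KEY is not set");

const response = await fetch("https://iam.cloud.ibm.com/identity/token", {
  method: "POST",
  headers: { "Content-Type": "application/x-www-form-urlencoded" },
  body: new URLSearchParams({
    grant_type: "urn:ibm:params:oauth:grant-type:apikey",
    apikey: apiKey,
  }),
});

if (!response.ok) {
  throw new Error(`IAM token request failed: ${response.status} ${response.statusText}`);
}
const data = (await response.json()) as { access_token: string };
console.log(`Received an IAM token (${data.access_token.length} characters); the API key is valid.`);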
Test your setup and credentials with these first interactions:
# No tool calling
npm run start src/agent.ts <<< "Hello Bee"
# Calls Wikipedia
npm run start src/agent.ts <<< "What is the most visited place in Paris/France?"
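A note on the <<< "Hello Bee" part: it is a Bash here-string, which feeds the text to the script's standard input; the starter reads the prompt from stdin, which is why this works. A stripped-down, hypothetical version of that reading step looks like this:

// read-prompt.ts: a hypothetical, stripped-down illustration of reading a prompt
// from standard input, which is how the text after <<< reaches the agent script.
import readline from "node:readline/promises";
import { stdin as input, stdout as output } from "node:process";

const rl = readline.createInterface({ input, output });
const prompt = await rl.question("User: ");
rl.close();

console.log(`Prompt received: ${prompt}`);
// ...at this point the real script would hand the prompt to the agent, e.g. agent.run({ prompt }).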
Conclusion
These are the very first steps to set up and configure the environment. Stay tuned for the next steps of my learning path! 👨‍🎓
Useful links
- Get to know AI agents: https://developer.ibm.com/articles/awb-ai-agents-introduction/?mhsrc=ibmsearch_a&mhq=bee%20agent
- Building LLM Agent with IBM Bee Agent Framework: https://www.youtube.com/watch?v=C-pZXA6Te_o
- Bee framework: https://github.com/i-am-bee