Wanted to play around with agents, so I decided to go for one of the more popular frameworks: CrewAI.
Stubborn as I am, I decided to install it locally. It took me an afternoon of bug fixing and library checking to get everything up and running.
Findings:
CrewAI runs on LiteLLM, which interacts more easily with Ollama than with my preferred LM Studio. So now I'm also running Ollama, maxing out my laptop's hard disk by having some models twice (luckily Apple is so cheap with storage and memory :-D).
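To make that concrete, here is a minimal sketch of pointing CrewAI at a local Ollama server through LiteLLM's `ollama/` model prefix. This is my assumption of the wiring, not the exact code I ran: the agent/task wording is made up, and it assumes a recent CrewAI version that ships the `LLM` class.

```python
# Minimal sketch (assumptions flagged above): CrewAI routes model calls
# through LiteLLM, so an "ollama/..." model name plus a local base_url
# keeps everything on the laptop.
from crewai import Agent, Task, Crew, LLM

# Point CrewAI at the local Ollama server (Ollama's default port is 11434)
llm = LLM(model="ollama/llama2", base_url="http://127.0.0.1:11434")

greeter = Agent(
    role="Greeter",
    goal="Greet the user politely",
    backstory="A friendly agent running fully locally.",
    llm=llm,
)

greet_task = Task(
    description="Greet the user in one short sentence.",
    expected_output="A single-sentence greeting.",
    agent=greeter,
)

crew = Crew(agents=[greeter], tasks=[greet_task])
print(crew.kickoff())
```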
Went down the rabbit hole debugging; I expected the issue to be in LiteLLM. I got strange errors there, related to a calculation module that didn't recognize the model name. Switching to one of the standard models only solved part of the problem.
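My guess (not verified against the LiteLLM source) is that this was LiteLLM's cost-calculation path, which looks the model name up in an internal pricing table and fails for names it doesn't know. A small sketch of that behaviour, with `my-custom-model` as a made-up name:

```python
# Sketch only: litellm.completion_cost() looks the model up in an internal
# pricing table; unknown names raise, known ones return a cost estimate.
import litellm

try:
    # "my-custom-model" is a hypothetical name that isn't in the table
    litellm.completion_cost(model="my-custom-model",
                            prompt="Hello", completion="Hi there!")
except Exception as exc:
    print(f"cost lookup failed: {exc}")

# A standard model name resolves fine
print(litellm.completion_cost(model="gpt-3.5-turbo",
                              prompt="Hello", completion="Hi there!"))
```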
The biggest challenge was this LiteLLM error: `LiteLLM:ERROR: litellm_logging.py RuntimeError: can't register atexit after shutdown`. First I tried to vibe-code and vibe-ask my way out of it, but unfortunately my local models are not strong enough for the mighty answer. So I went back to 1999 and used good old Stack Overflow, which had the answer: https://stackoverflow.com/questions/65467329/server-in-a-thread-python3-9-0aiohttp-runtimeerror-cant-register-atexit-a
The error was not really related to the Python version I was running, nor to LiteLLM itself; adding two imports at the top of the script fixed it:

```python
# Added these two imports: apparently they register their atexit handlers
# at import time, so importing them early (before anything shuts down)
# avoids the "can't register atexit after shutdown" RuntimeError.
import concurrent.futures.thread
import concurrent.futures.process

# The rest is just the standard completion example from the LiteLLM side,
# pointed at the local Ollama server.
from litellm import completion

print("start test")
response = completion(
    model="ollama/llama2",
    messages=[{"content": "Hello, how are you?", "role": "user"}],
    api_base="http://127.0.0.1:11434",
)
print(response.object)
print("done test")
```
Going back to CrewAI, it worked again. Once more I learned a lot about venvs and Python uninstalls and reinstalls.
To be continued with the CrewAI learnings.