```python
from agno.agent import Agent, RunOutput  # noqa
from agno.models.ollama import Ollama
from ollama import Client as OllamaClient

agent = Agent(
    model=Ollama(id="llama3.1:8b", client=OllamaClient()),
    markdown=True,
)

# Print the response in the terminal
agent.print_response("Share a 2 sentence horror story")
```
Create a virtual environment
```shell
python3 -m venv .venv
source .venv/bin/activate
```
Install Ollama, then pull the model
```shell
ollama pull llama3.1:8b
```
Install libraries
```shell
pip install -U ollama agno
```
Run Agent
```shell
python cookbook/models/ollama/set_client.py
```