This example demonstrates how to enable structured outputs when chatting with an agent. For comparison, see the original structured outputs example from Ollama. To learn more about structured outputs, see OpenAI's Structured Outputs guide.
- Install `coagent` (see Installation).
- Start a NATS server (see Distributed).
- Run the model with Ollama:

  ```bash
  ollama run llama3.1
  ```
Run the agent as a script:

```bash
python examples/structured-outputs/local_agent.py
```
Run the agent as a daemon:

```bash
python examples/structured-outputs/daemon_agent.py
```
Then communicate with the agent using the `coagent` CLI:
```bash
coagent structured -H type:StructuredOutput --chat -d '{
  "input": {
    "role": "user",
    "content": "I have two friends. The first is Ollama 22 years old busy saving the world, and the second is Alonso 23 years old and wants to hang out. Return a list of friends in JSON format"
  },
  "output_schema": {
    "type": "json_schema",
    "json_schema": {
      "name": "FriendList",
      "strict": true,
      "schema": {
        "type": "object",
        "properties": {
          "friends": {
            "items": {"$ref": "#/$defs/FriendInfo"},
            "type": "array"
          }
        },
        "required": ["friends"],
        "$defs": {
          "FriendInfo": {
            "type": "object",
            "properties": {
              "name": {"type": "string"},
              "age": {"type": "integer"},
              "is_available": {"type": "boolean"}
            },
            "required": ["name", "age", "is_available"]
          }
        }
      }
    }
  }
}'
```
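Writing the `schema` payload by hand is tedious and error-prone. The JSON Schema in the command above has the shape that Pydantic generates from a pair of models, so one way to build the same `output_schema` value is to let Pydantic produce it. This is only an illustration, not part of the example; the `output_schema` key and wrapper structure are taken from the request body above, and Pydantic is an assumed extra dependency here:

```python
from pydantic import BaseModel


class FriendInfo(BaseModel):
    name: str
    age: int
    is_available: bool


class FriendList(BaseModel):
    friends: list[FriendInfo]


# Wrap Pydantic's generated JSON Schema in the same
# "json_schema" envelope used by the CLI request above.
output_schema = {
    "type": "json_schema",
    "json_schema": {
        "name": FriendList.__name__,
        "strict": True,
        "schema": FriendList.model_json_schema(),
    },
}
```

The generated schema places `FriendInfo` under `$defs` and lists `name`, `age`, and `is_available` as required, matching the hand-written JSON in the command.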