initial version #228

Draft · wants to merge 2 commits into base: main
5 changes: 4 additions & 1 deletion .gitignore
@@ -23,4 +23,7 @@ dist
.DS_Store
packages/fixie/.env

.env
.env

# Ignore .yarn folder in examples folder
packages/examples/**/.yarn/*
14 changes: 14 additions & 0 deletions packages/examples/usage-tracking/package.json
@@ -0,0 +1,14 @@
{
  "name": "usage-tracking",
  "version": "0.0.1",
  "description": "A simple app that reports out token usage for agent conversations.",
  "main": "usage-tracking.js",
  "type": "module",
  "license": "MIT",
  "dependencies": {
    "axios": "^1.6.5",
    "dotenv": "^16.4.1",
    "fixie": "^6.4.0",
    "gpt-3-encoder": "^1.1.4"
  }
}
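For reference, a plausible way to run this example locally. The environment variable names match those read in usage-tracking.js; the install/run commands are standard npm usage, not something stated in the PR itself.

```shell
# From packages/examples/usage-tracking/
npm install                # pulls in axios, dotenv, fixie, gpt-3-encoder
export FIXIE_API_KEY=...   # your Fixie API key
export FIXIE_AGENT_ID=...  # the agent to report usage on
node usage-tracking.js
```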
106 changes: 106 additions & 0 deletions packages/examples/usage-tracking/usage-tracking.js
@@ -0,0 +1,106 @@
// Import the Fixie client
import { FixieClient } from 'fixie';
import 'dotenv/config';
import axios from 'axios';
import { encode } from 'gpt-3-encoder';

// Set the API key and create the Fixie client
const FIXIE_API_KEY = process.env.FIXIE_API_KEY;

// Set the ID of the agent we want to get usage info on
const AGENT_ID = process.env.FIXIE_AGENT_ID;

// const fixieClient = new FixieClient({ apiKey: FIXIE_API_KEY }); // TODO once we have the right method

let numConversations = 0;
let numMessages = 0;
let lenConversations = 0;
let numTokens = 0;

console.log('calling listAgentConversations');
getAgentConversations(AGENT_ID).then((data) => {
  numConversations = data.length;

  console.log(`Got back ${data.length} conversations`);
  data.forEach((element) => {
    console.log('------------------------------------------------------------\n');
    console.log(`Conversation ID: ${element.id}`);
    console.log(`Total conversation turns\t${element.turns.length}`);

    getConvoTurns(element);

    // We want the content of any messages that are (with state "done"):
    //   - role "user",      kind "text"             -> use message.content
    //   - role "assistant", kind "text"             -> use message.content
    //   - role "assistant", kind "functionResponse" -> use message.response
  });

  console.log('\n\n============================================================');
  console.log(`Final stats for agent ${AGENT_ID}:\n`);
  console.log(`Total Conversations\t${numConversations}\n`);
  console.log(`Total Agent Messages\t${numMessages}\n`);
  console.log(`Total Characters\t${lenConversations}\n`);
  console.log(`Total LLM Tokens\t${numTokens}\n`);
  console.log('============================================================');
});

function getConvoTurns(conversation) {
  let numConvoMessages = 0;
  let numConvoChars = 0;
  let numConvoTokens = 0;

  // Iterate through the conversation and process all the turns
  conversation.turns.forEach((turn) => {
    numConvoMessages += turn.messages.length;

    // Iterate through the turn messages and log their stats
    turn.messages.forEach((message) => {
      numMessages++;
      // Only count messages from completed turns
      if (turn.state === 'done') {
        // Function responses
        if (turn.role === 'assistant' && message.kind === 'functionResponse') {
          lenConversations += message.response.length;
          const encoded = encode(message.response);
          numTokens += encoded.length;

          numConvoChars += message.response.length;
          numConvoTokens += encoded.length;
[Review comment — Contributor]
This ignores the quadratic effect where messages become input tokens to subsequent generations (e.g. a conversation with 10 messages uses >> 10 times the number of input tokens as a conversation with 1 message).

But old messages also get pruned as they fill up the context window, so calculating it accurately is even trickier. The easiest thing to get approximately correct is to calculate the number of output tokens, because that's just the assistant text/functionCall messages, but I'm guessing you want an estimate for both :)
        } else if (message.kind === 'text' && (turn.role === 'assistant' || turn.role === 'user')) {
          lenConversations += message.content.length;
          const encoded = encode(message.content);
          numTokens += encoded.length;

          numConvoChars += message.content.length;
          numConvoTokens += encoded.length;
        }
      }
    });
  });

  console.log(`Total conversation messages\t${numConvoMessages}`);
  console.log(`Total conversation characters\t${numConvoChars}`);
  console.log(`Total conversation tokens\t${numConvoTokens}`);
}
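Following the reviewer's point about messages becoming input tokens to subsequent generations, a rough estimate can be made by keeping a running context size and charging it to every assistant message. This is only a sketch of that idea: the per-message token counts are assumed to be precomputed (e.g. via gpt-3-encoder's `encode(...).length`), and it deliberately ignores context-window pruning, which the reviewer notes makes an exact number much harder.

```javascript
// Rough estimate of input vs. output tokens for one conversation.
// Each assistant generation re-reads everything before it, so input
// tokens grow roughly quadratically with conversation length.
// `messages` is an array of { role, tokens } with per-message token
// counts already computed.
function estimateTokens(messages) {
  let inputTokens = 0;
  let outputTokens = 0;
  let contextSoFar = 0;

  for (const message of messages) {
    if (message.role === 'assistant') {
      // The model read the whole prior context to produce this message...
      inputTokens += contextSoFar;
      // ...and the message itself is output.
      outputTokens += message.tokens;
    }
    // Every message, whichever role, joins the context for later turns.
    contextSoFar += message.tokens;
  }
  return { inputTokens, outputTokens };
}
```

For example, a four-message exchange with token counts 10 (user), 5 (assistant), 8 (user), 7 (assistant) yields 10 + 23 = 33 estimated input tokens and 12 output tokens; an upper bound, since pruning only shrinks the real context.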

async function getAgentConversations(agentId) {
  try {
    const response = await axios({
      method: 'get',
      maxBodyLength: Infinity,
      url: `https://api.fixie.ai/api/v1/agents/${agentId}/conversations`,
      headers: {
        Authorization: `Bearer ${FIXIE_API_KEY}`,
      },
    });
    return response.data.conversations;
  } catch (error) {
    console.log(error);
    return []; // keep the caller from crashing on `data.length` if the request fails
  }
}

[Review comment — Contributor]
Is this endpoint paginated?

[Reply — Contributor, Author]
yes it is. changed the implementation to handle paging.
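The review thread notes the conversations endpoint is paginated and that a later commit handles paging. A generic loop for that looks like the sketch below. The cursor field names (`next_page_token` in the response, a page-token request parameter) are assumptions, not confirmed details of the Fixie API; the page fetcher is injected so the loop itself can be exercised without network access.

```javascript
// Hypothetical pagination loop. `fetchPage` takes a page token (null for
// the first page) and resolves to
// { conversations: [...], next_page_token: string | null }.
async function getAllConversations(fetchPage) {
  const conversations = [];
  let pageToken = null;
  do {
    const page = await fetchPage(pageToken);
    conversations.push(...page.conversations);
    pageToken = page.next_page_token ?? null;
  } while (pageToken);
  return conversations;
}
```

With axios, `fetchPage` would issue the same authorized GET as above, passing the token via the request's `params` under whatever name the real API expects.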