Fix breakage with Open AI's llm-chat-token-limit (#77)
Also fix unused variable in llm-tester.

Add byte-compilation to the CI to catch issues like this in the future.
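For context: the broken method declared its argument as `_` (ignored) while its body referenced a free variable `provider`, and it called an Ollama accessor from OpenAI code. A minimal sketch (hypothetical function name; assumes Emacs's built-in byte compiler) of how byte-compilation catches the first class of error:

```elisp
;; A function whose body references a variable that is neither an
;; argument nor defvar'd triggers a "reference to free variable"
;; warning at byte-compile time; --warnings-as-errors makes that
;; warning fail the CI build instead of passing silently.
(defun demo-token-limit (_provider)
  ;; `provider' is free here -- the argument was named `_provider'.
  (identity provider))
```

At runtime this only fails when the method is actually called, which is why the bug survived until byte-compilation was added to CI.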
ahyatt authored Sep 3, 2024
1 parent 54d6e9a commit e19a678
Showing 4 changed files with 9 additions and 4 deletions.
4 changes: 4 additions & 0 deletions .github/workflows/ci.yaml
@@ -31,6 +31,10 @@ jobs:
       - name: Check out the source code
         uses: actions/checkout@v4
 
+      - name: Byte-compile the project
+        run: |
+          eldev -dtT compile --warnings-as-errors
+
       - name: Lint the project
         run: |
           eldev -p -dtT lint
2 changes: 2 additions & 0 deletions NEWS.org
@@ -1,3 +1,5 @@
+* Version 0.17.4
+- Fix problem with Open AI's =llm-chat-token-limit=.
 * Version 0.17.3
 - More fixes with Claude and Ollama function calling conversation, thanks to Paul Nelson.
 - Make =llm-chat-streaming-to-point= more efficient, just inserting new text, thanks to Paul Nelson.
4 changes: 2 additions & 2 deletions llm-openai.el
@@ -270,8 +270,8 @@ RESPONSE can be nil if the response is complete."
                          4096)
                     (t 4096))))
 
-(cl-defmethod llm-chat-token-limit ((_ llm-openai-compatible))
-  (llm-provider-utils-model-token-limit (llm-ollama-chat-model provider)))
+(cl-defmethod llm-chat-token-limit ((provider llm-openai-compatible))
+  (llm-provider-utils-model-token-limit (llm-openai-chat-model provider)))
 
 (cl-defmethod llm-capabilities ((_ llm-openai))
   (list 'streaming 'embeddings 'function-calls))
Expand Down
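The corrected method derives the token limit from the provider instance it actually receives. A hedged usage sketch (the constructor name follows the `cl-defstruct` convention and the keyword and model name are assumptions, not taken from this commit):

```elisp
;; Hypothetical usage: construct an OpenAI-compatible provider and ask
;; for its chat token limit, which now reads the provider's own chat
;; model via `llm-openai-chat-model' instead of an Ollama accessor.
(require 'llm-openai)
(let ((provider (make-llm-openai-compatible :chat-model "gpt-4o-mini")))
  (llm-chat-token-limit provider))
```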
3 changes: 1 addition & 2 deletions llm-tester.el
@@ -262,8 +262,7 @@ of by calling the `describe_function' function."

 (defun llm-tester-function-calling-sync (provider)
   "Test that PROVIDER can call functions."
-  (let ((prompt (llm-tester-create-test-function-prompt))
-        (result (llm-chat provider (llm-tester-create-test-function-prompt))))
+  (let ((result (llm-chat provider (llm-tester-create-test-function-prompt))))
     (cond ((stringp result)
            (llm-tester-log
             "ERROR: Provider %s returned a string instead of a function result"
