
Commit

Update the byzerllm dependency to 0.1.171 and add token-statistics output to LongContextRAG. Bumps the byzerllm version in requirements.txt, adds a token-statistics yield in the LongContextRAG class, and updates the package version to 0.1.282.

auto_coder_000000001857_chat_action.yml_f052739084f2d9a868e866c056cf8d21
allwefantasy committed Mar 6, 2025
1 parent dd45704 commit 8402555
Showing 3 changed files with 8 additions and 2 deletions.
2 changes: 1 addition & 1 deletion requirements.txt
@@ -46,7 +46,7 @@ tokenizers
 
 # camelot-py
 # llama_index
-byzerllm[saas]>=0.1.170
+byzerllm[saas]>=0.1.171
 patch
 diff_match_patch
 GitPython
6 changes: 6 additions & 0 deletions src/autocoder/rag/long_context_rag.py
@@ -788,6 +788,12 @@ def generate_sream():
                         tokens=request_tokens
                     )
                 ))
+
+                yield ("", SingleOutputMeta(input_tokens_count=rag_stat.recall_stat.total_input_tokens + rag_stat.chunk_stat.total_input_tokens,
+                                            generated_tokens_count=rag_stat.recall_stat.total_generated_tokens +
+                                            rag_stat.chunk_stat.total_generated_tokens,
+                                            reasoning_content="qa_model_thinking"
+                                            ))
+
                 if LLMComputeEngine is not None and not self.args.disable_inference_enhance:
                     llm_compute_engine = LLMComputeEngine(
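For context, the new yield means the streaming generator emits (text, SingleOutputMeta) tuples whose metadata carries the combined recall- and chunk-stage token counts. Below is a minimal consumer sketch; the simplified SingleOutputMeta stand-in and the fake_stream generator are illustrative assumptions, not the actual autocoder API.

# Illustrative sketch only: a simplified stand-in for autocoder's SingleOutputMeta
# and a hypothetical stream, showing how a caller might tally the token statistics
# yielded by LongContextRAG.
from dataclasses import dataclass

@dataclass
class SingleOutputMeta:  # assumption: reduced to the fields used in this commit
    input_tokens_count: int = 0
    generated_tokens_count: int = 0
    reasoning_content: str = ""

def fake_stream():
    # assumption: mimics the (text, meta) tuples the RAG generator yields,
    # including the new token-statistics tuple added in this commit
    yield ("", SingleOutputMeta(input_tokens_count=1200,
                                generated_tokens_count=340,
                                reasoning_content="qa_model_thinking"))
    yield ("Answer text...", SingleOutputMeta(generated_tokens_count=25))

total_in = total_out = 0
for text, meta in fake_stream():
    total_in += meta.input_tokens_count
    total_out += meta.generated_tokens_count
    if text:
        print(text, end="")
print(f"\ninput tokens: {total_in}, generated tokens: {total_out}")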
2 changes: 1 addition & 1 deletion src/autocoder/version.py
@@ -1 +1 @@
-__version__ = "0.1.281"
+__version__ = "0.1.282"
