docs(project): add config-cpu.ini
tpoisonooo committed Aug 14, 2024
1 parent 9e67ed6 commit 11f5fb6
Showing 3 changed files with 5 additions and 5 deletions.
4 changes: 2 additions & 2 deletions README.md
@@ -145,8 +145,8 @@ The following are the GPU memory requirements for different features, the differ

| Configuration Example | GPU mem Requirements | Description | Verified on Linux |
| :----------------------------------------------: | :------------------: | :-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------: | :---------------------------------------------------------------------------: |
| [config-cpu.ini](./config-cpu.ini) | - | Use siliconcloud API <br/> for text only | ![](https://img.shields.io/badge/x86-passed-blue?style=for-the-badge) |
| [config-2G.ini](./config-2G.ini) | 2GB | Use openai API (such as [kimi](https://kimi.moonshot.cn), [deepseek](https://platform.deepseek.com/usage), [stepfun](https://platform.stepfun.com/) and [siliconcloud](https://siliconflow.cn/)) to search for text only | ![](https://img.shields.io/badge/1660ti%206G-passed-blue?style=for-the-badge) |
| [config-cpu.ini](./config-cpu.ini) | - | Use [siliconcloud](https://siliconflow.cn/) API <br/> for text only | ![](https://img.shields.io/badge/x86-passed-blue?style=for-the-badge) |
| [config-2G.ini](./config-2G.ini)                  | 2GB                  | Use openai API (such as [kimi](https://kimi.moonshot.cn), [deepseek](https://platform.deepseek.com/usage) and [stepfun](https://platform.stepfun.com/)) to search for text only | ![](https://img.shields.io/badge/1660ti%206G-passed-blue?style=for-the-badge) |
| [config-multimodal.ini](./config-multimodal.ini) | 10GB | Use openai API for LLM, image and text retrieval | ![](https://img.shields.io/badge/3090%2024G-passed-blue?style=for-the-badge) |
| \[Standard Edition\] [config.ini](./config.ini) | 19GB | Local deployment of LLM, single modality | ![](https://img.shields.io/badge/3090%2024G-passed-blue?style=for-the-badge) |
| [config-advanced.ini](./config-advanced.ini) | 80GB | local LLM, anaphora resolution, single modality, practical for WeChat group | ![](https://img.shields.io/badge/A100%2080G-passed-blue?style=for-the-badge) |
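
The practical difference between config-cpu.ini and the GPU configurations above is that text generation is delegated to the siliconcloud API instead of a locally hosted model. Below is a minimal sketch of the relevant toggle, assuming config-cpu.ini follows the stock config.ini layout; the `[llm]` section and the `enable_local` / `enable_remote` key names come from that assumption, not from this commit:

```ini
# Sketch only: assumes the [llm] section layout of the stock config.ini.
# With no local model loaded, the pipeline needs no GPU memory at all.
[llm]
enable_local = 0    # skip loading a local LLM
enable_remote = 1   # route all generation to the remote siliconcloud API
```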
4 changes: 2 additions & 2 deletions README_zh.md
@@ -143,8 +143,8 @@ For the web-version video tutorial, see [BiliBili](https://www.bilibili.com/video/BV1S2421N7mn)

| Configuration Example | GPU mem Requirements | Description | Devices verified on Linux |
| :----------------------------------------------: | :------: | :------------------------------------------------------------------------------------------------------------------------------------------------------------------: | :---------------------------------------------------------------------------: |
| [config-cpu.ini](./config-cpu.ini) | - | Use siliconcloud API <br/> for text retrieval only | ![](https://img.shields.io/badge/x86-passed-blue?style=for-the-badge) |
| [config-2G.ini](./config-2G.ini) | 2GB | Use openai API (such as [kimi](https://kimi.moonshot.cn), [deepseek](https://platform.deepseek.com/usage), [stepfun](https://platform.stepfun.com/) and [siliconcloud](https://siliconflow.cn/)) <br/> for text retrieval only | ![](https://img.shields.io/badge/1660ti%206G-passed-blue?style=for-the-badge) |
| [config-cpu.ini](./config-cpu.ini) | - | Use [siliconcloud](https://siliconflow.cn/) API <br/> for text retrieval only | ![](https://img.shields.io/badge/x86-passed-blue?style=for-the-badge) |
| [config-2G.ini](./config-2G.ini) | 2GB | Use openai API (such as [kimi](https://kimi.moonshot.cn), [deepseek](https://platform.deepseek.com/usage) and [stepfun](https://platform.stepfun.com/)) <br/> for text retrieval only | ![](https://img.shields.io/badge/1660ti%206G-passed-blue?style=for-the-badge) |
| [config-multimodal.ini](./config-multimodal.ini) | 10GB | Use openai API for the LLM, image and text retrieval | ![](https://img.shields.io/badge/3090%2024G-passed-blue?style=for-the-badge) |
| \[Standard Edition\] [config.ini](./config.ini) | 19GB | Local LLM deployment, single modality | ![](https://img.shields.io/badge/3090%2024G-passed-blue?style=for-the-badge) |
| [config-advanced.ini](./config-advanced.ini) | 80GB | Local LLM, anaphora resolution, single modality, practical for WeChat groups | ![](https://img.shields.io/badge/A100%2080G-passed-blue?style=for-the-badge) |
2 changes: 1 addition & 1 deletion config-cpu.ini
@@ -52,7 +52,7 @@ local_llm_bind_port = 8888
# for internlm, see https://internlm.intern-ai.org.cn/api/document

remote_type = "siliconcloud"
remote_api_key = "="
remote_api_key = ""
# max text length for remote LLM.
# use 128000 for kimi, 192000 for gpt/xi-api, 16000 for deepseek, 128000 for zhipuai, 40000 for internlm2
remote_llm_max_text_length = 40000
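
For reference, this is roughly how the block above reads once a real token is filled in. A minimal sketch using only the keys visible in this hunk; the section header and the placeholder token value are illustrative assumptions, not part of the commit:

```ini
# Sketch only: the section name and the token value are placeholders.
[llm.server]
local_llm_bind_port = 8888
remote_type = "siliconcloud"
remote_api_key = "sk-your-siliconcloud-token"   # paste your own key here
# max text length for the remote LLM, as set in this commit
remote_llm_max_text_length = 40000
```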
