add geekai to llm provider #3774

Open · wants to merge 3 commits into main
1 change: 1 addition & 0 deletions core/llm/autodetect.ts
@@ -65,6 +65,7 @@ const PROVIDER_SUPPORTS_IMAGES: string[] = [
"sagemaker",
"continue-proxy",
"openrouter",
"geekai",
"vertexai",
"azure",
"scaleway",
18 changes: 18 additions & 0 deletions core/llm/llms/GeekAI.ts
@@ -0,0 +1,18 @@
import { LLMOptions } from "../../index.js";
import { osModelsEditPrompt } from "../templates/edit.js";

import OpenAI from "./OpenAI.js";

class GeekAI extends OpenAI {
static providerName = "geekai";
static defaultOptions: Partial<LLMOptions> = {
apiBase: "https://geekai.dev/api/v1/",
model: "gpt-4o-mini",
promptTemplates: {
edit: osModelsEditPrompt,
},
useLegacyCompletionsEndpoint: false,
};
}

export default GeekAI;
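Because `GeekAI` subclasses the `OpenAI` provider and points `apiBase` at `https://geekai.dev/api/v1/`, the endpoint is presumably OpenAI-compatible. A minimal sanity-check sketch under that assumption, using the official `openai` npm package (not part of this PR); the model name and the `GEEKAI_API_KEY` environment variable are placeholders:

```ts
// Sketch assuming the GeekAI endpoint speaks the OpenAI chat-completions
// protocol, as implied by the subclassing above; GEEKAI_API_KEY is a placeholder.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://geekai.dev/api/v1/",
  apiKey: process.env.GEEKAI_API_KEY,
});

const completion = await client.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Hello from Continue!" }],
});

console.log(completion.choices[0]?.message?.content);
```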
2 changes: 2 additions & 0 deletions core/llm/llms/index.ts
@@ -22,6 +22,7 @@ import Fireworks from "./Fireworks";
import Flowise from "./Flowise";
import FreeTrial from "./FreeTrial";
import FunctionNetwork from "./FunctionNetwork";
import GeekAI from "./GeekAI";
import Gemini from "./Gemini";
import Groq from "./Groq";
import HuggingFaceInferenceAPI from "./HuggingFaceInferenceAPI";
@@ -88,6 +89,7 @@ export const LLMClasses = [
Azure,
WatsonX,
OpenRouter,
GeekAI,
Nvidia,
Vllm,
SambaNova,
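Registration in `LLMClasses` is what lets `"provider": "geekai"` in `config.json` resolve to the new class. A sketch of the lookup pattern; the actual resolver in core is not shown in this diff:

```ts
// Illustrative lookup sketch only; the real resolution code is outside this diff.
import GeekAI from "./GeekAI";

const LLMClasses = [GeekAI /* ...the other provider classes listed above... */];

const cls = LLMClasses.find((llm) => llm.providerName === "geekai");
// cls === GeekAI, so a config entry with "provider": "geekai" maps to this class.
```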
21 changes: 21 additions & 0 deletions docs/docs/customize/model-providers/more/geekai.md
@@ -0,0 +1,21 @@
# GeekAI

GeekAI is a unified interface for commercial and open-source models, giving you access to the best models at lower prices than the official rates. You can sign up [here](https://geekai.dev/login), create your API key on the [keys page](https://geekai.dev/user/api_keys), and then choose a model from the [list of supported models](https://geekai.dev/models).

Change `~/.continue/config.json` to look like the following.

```json title="config.json"
{
"models": [
{
"title": "Claude 3.5 Sonnet",
"provider": "geekai",
"model": "claude-3-5-sonnet-latest",
"apiBase": "https://geekai.dev/api/v1",
"apiKey": "..."
}
]
}
```

Learn more about available settings [here](https://geekai.dev/docs/api).
4 changes: 4 additions & 0 deletions docs/docusaurus.config.js
@@ -373,6 +373,10 @@ const config = {
to: "/customize/model-providers/more/openrouter",
from: "/reference/Model Providers/openrouter",
},
{
to: "/customize/model-providers/more/geekai",
from: "/reference/Model Providers/geekai",
},
{
to: "/customize/model-providers/more/replicatellm",
from: "/reference/Model Providers/replicatellm",
21 changes: 21 additions & 0 deletions (file path not rendered; localized copy of the GeekAI docs page)
@@ -0,0 +1,21 @@
# GeekAI

GeekAI is a cost-effective AI model evaluation and proxy platform that automatically routes your use case to the cheapest available AI model. You can sign up [here](https://geekai.dev/login), create your API key on the [API keys page](https://geekai.dev/user/api_keys), and then pick a model (by model name) from the [model marketplace](https://geekai.dev/models).

Change `~/.continue/config.json` to look like the following:

```json title="config.json"
{
"models": [
{
"title": "Claude 3.5 Sonnet",
"provider": "geekai",
"model": "claude-3-5-sonnet-latest",
"apiBase": "https://geekai.dev/api/v1",
"apiKey": "..."
}
]
}
```

See the [API docs](https://geekai.dev/docs/api) to learn more about the available settings.
780 changes: 597 additions & 183 deletions extensions/vscode/config_schema.json

Large diffs are not rendered by default.

1 change: 1 addition & 0 deletions packages/openai-adapters/README.md
@@ -37,6 +37,7 @@ They are concerned with:
- [x] Deepseek
- [ ] Flowise
- [x] Function Network
- [x] GeekAI
- [x] Gemini
- [x] Groq
- [ ] HuggingFace Inference API
2 changes: 2 additions & 0 deletions packages/openai-adapters/src/index.ts
@@ -59,6 +59,8 @@ export function constructLlmApi(config: LLMConfig): BaseLlmApi | undefined {
return openAICompatible("http://127.0.0.1:5000/v1/", config);
case "openrouter":
return openAICompatible("https://openrouter.ai/api/v1/", config);
case "geekai":
return openAICompatible("https://geekai.dev/api/v1/", config);
case "cerebras":
return openAICompatible("https://api.cerebras.ai/v1/", config);
case "kindo":
1 change: 1 addition & 0 deletions packages/openai-adapters/src/types.ts
@@ -50,6 +50,7 @@ export const OpenAIConfigSchema = BasePlusConfig.extend({
z.literal("kindo"),
z.literal("msty"),
z.literal("openrouter"),
z.literal("geekai"),
z.literal("sambanova"),
z.literal("text-gen-webui"),
z.literal("vllm"),

Unchanged files with check annotations

GitHub Actions / core-checks:
- core/autocomplete/CompletionProvider.ts, line 8: There should be no empty line within import group
- core/autocomplete/CompletionProvider.ts, line 261: Promises must be awaited, end with a call to .catch, end with a call to .then with a rejection handler or be explicitly marked as ignored with the `void` operator
- core/autocomplete/constants/AutocompleteLanguageInfo.ts, line 31: Strings must use doublequote
- core/autocomplete/context/ImportDefinitionsService.ts, line 24: Promises must be awaited, end with a call to .catch, end with a call to .then with a rejection handler or be explicitly marked as ignored with the `void` operator
- core/autocomplete/context/root-path-context/test/testUtils.ts, line 1: There should be at least one empty line between import groups
- core/autocomplete/context/root-path-context/test/testUtils.ts, line 1: `@jest/globals` import should occur after import of `path`
- core/autocomplete/context/root-path-context/test/testUtils.ts, line 5: There should be at least one empty line between import groups
- core/autocomplete/filtering/test/testCases.ts, lines 66, 257, and 303: Strings must use doublequote

GitHub Actions / vscode-checks:
- extensions/vscode/src/VsCodeIde.ts, line 8: There should be no empty line within import group
- extensions/vscode/src/VsCodeIde.ts, line 10: There should be at least one empty line between import groups
- extensions/vscode/src/VsCodeIde.ts, line 10: `uri-js` import should occur before import of `vscode`
- extensions/vscode/src/VsCodeIde.ts, line 17: There should be at least one empty line between import groups
- extensions/vscode/src/VsCodeIde.ts, line 31: `./stubs/SecretStorage` import should occur before import of `./util/ideUtils`
- extensions/vscode/src/autocomplete/completionProvider.ts, line 24: There should be at least one empty line between import groups
- extensions/vscode/src/autocomplete/completionProvider.ts, line 25: There should be at least one empty line between import groups
- extensions/vscode/src/autocomplete/completionProvider.ts, line 25: `core/util/ollamaHelper` import should occur before import of `uri-js`
- extensions/vscode/src/autocomplete/completionProvider.ts, line 73: Expected '===' and instead saw '=='
- extensions/vscode/src/autocomplete/lsp.ts, line 11: There should be at least one empty line between import groups