Added OpenAI function calls
TheR1D committed Jan 8, 2024
1 parent 4b670cf commit f728b03
Showing 14 changed files with 351 additions and 35 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/lint_test.yml
@@ -31,7 +31,7 @@ jobs:
- name: ruff
run: ruff sgpt tests scripts
- name: mypy
-        run: mypy sgpt
+        run: mypy sgpt --exclude function.py --exclude handler.py --exclude default_functions
- name: unittests
run: |
export OPENAI_API_KEY=test_api_key
115 changes: 87 additions & 28 deletions README.md
@@ -277,6 +277,57 @@ It is a Python script that uses the random module to generate and print a random
```
It is also possible to pick up conversations from chat sessions (which were created using the `--chat` option) and continue them in REPL mode.
### Function calling
[Function calls](https://platform.openai.com/docs/guides/function-calling) are a powerful feature provided by OpenAI, and ShellGPT offers a convenient way to define and use functions. To create a custom function, navigate to `~/.config/shell_gpt/functions` and create a new `.py` file named after the function. Inside this file, define your function using the following syntax:
```python
# execute_shell_command.py
import subprocess
from pydantic import Field
from instructor import OpenAISchema
class Function(OpenAISchema):
    """
    Executes a shell command and returns the output (result).
    """

    shell_command: str = Field(..., example="ls -la", description="Shell command to execute.")

    class Config:
        title = "execute_shell_command"

    @classmethod
    def execute(cls, shell_command: str) -> str:
        result = subprocess.run(shell_command.split(), capture_output=True, text=True)
        return f"Exit code: {result.returncode}, Output:\n{result.stdout}"
```
The docstring inside the class is passed to the OpenAI API as the function's description, along with the `title` attribute and the parameter descriptions. The `execute` method is called if the LLM decides to use your function. In this case we are allowing the LLM to execute arbitrary shell commands on our system, which is not a good idea :). Since we return the command's output, the LLM can analyze it and decide whether it fits the prompt. Here is an example of how the function might be executed by the LLM:
```shell
sgpt "What are the files in /tmp folder?"
# -> @FunctionCall execute_shell_command(shell_command="ls /tmp")
# -> The /tmp folder contains the following files and directories:
# -> test.txt
# -> test.json
# ...
```
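ShellGPT relies on `instructor`'s `OpenAISchema` to convert the `Function` class above into the function schema the OpenAI API expects. As a rough orientation, the result looks along these lines (a hand-written sketch; the library's exact output may differ in detail):

```python
import json

# Hand-written sketch of the schema derived from the Function class above;
# instructor's exact output may differ in detail.
schema = {
    "name": "execute_shell_command",  # from Config.title
    "description": "Executes a shell command and returns the output (result).",  # class docstring
    "parameters": {
        "type": "object",
        "properties": {
            "shell_command": {
                "type": "string",
                "description": "Shell command to execute.",  # from Field(...)
            },
        },
        "required": ["shell_command"],  # Field(...) marks the parameter as required
    },
}

print(json.dumps(schema, indent=2))
```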
Note that if for some reason the function (`execute_shell_command`) returns an error, the LLM may still try to accomplish the task based on the output. Suppose `jq` is not installed on our system and we ask the LLM to parse a JSON file:
```shell
sgpt "parse /tmp/test.json file using jq and return only email value"
# -> @FunctionCall execute_shell_command(shell_command="jq -r '.email' /tmp/test.json")
# -> It appears that jq is not installed on the system. Let me try to install it using brew.
# -> @FunctionCall execute_shell_command(shell_command="brew install jq")
# -> jq has been successfully installed. Let me try to parse the file again.
# -> @FunctionCall execute_shell_command(shell_command="jq -r '.email' /tmp/test.json")
# -> The email value in /tmp/test.json is johndoe@example.
```
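This retry behavior falls out of a simple loop: each function's output (including errors) is fed back to the model, which can then issue another call or answer. A stripped-down sketch of the assumed control flow (with stub functions standing in for the model and the dispatcher, not ShellGPT's actual handler code):

```python
def run_until_done(messages, model_call, dispatch):
    # Keep querying the model; whenever it requests a function, execute it,
    # append the result as a message, and ask again. Stop on a plain answer.
    while True:
        reply = model_call(messages)
        call = reply.get("function_call")
        if call is None:
            return reply["content"]
        output = dispatch(call["name"], call["arguments"])
        messages.append({"role": "function", "name": call["name"], "content": output})

# Stub model: first requests a command, then answers once it sees the output.
def fake_model(messages):
    if any(m.get("role") == "function" for m in messages):
        return {"content": "done"}
    return {"function_call": {"name": "execute_shell_command",
                              "arguments": '{"shell_command": "ls /tmp"}'}}

# Stub dispatcher: pretends to run the command and returns its output.
def fake_dispatch(name, arguments):
    return "Exit code: 0, Output:\ntest.txt"

print(run_until_done([], fake_model, fake_dispatch))
# -> done
```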
It is also possible to chain multiple function calls in a single prompt:
```shell
sgpt "Play music and open hacker news"
# -> @FunctionCall play_music()
# -> @FunctionCall open_url(url="https://news.ycombinator.com")
# -> Music is now playing, and Hacker News has been opened in your browser. Enjoy!
```
This is just a simple example of how function calls can be used. They are a truly powerful feature that can accomplish a variety of complex tasks, like `sgpt "order my favorite pizza"`. We have a dedicated [category](https://github.com/TheR1D/shell_gpt/discussions/categories/functions) in GitHub Discussions for sharing and discussing functions. The LLM might execute destructive commands, so use this feature at your own risk❗️
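Chaining works because each model-issued call is dispatched independently by name. A toy dispatcher illustrating the idea (hypothetical registry and function names, not ShellGPT's internals):

```python
import json

# Toy registry mapping function names to callables; in ShellGPT each entry
# comes from a Function class in ~/.config/shell_gpt/functions.
REGISTRY = {
    "play_music": lambda: "music started",
    "open_url": lambda url: f"opened {url}",
}

def dispatch(name: str, arguments: str) -> str:
    """Route one model-issued call (name + JSON argument string) to its implementation."""
    return REGISTRY[name](**json.loads(arguments))

# The model issues calls one after another; each result is returned to it.
print(dispatch("play_music", "{}"))
# -> music started
print(dispatch("open_url", '{"url": "https://news.ycombinator.com"}'))
# -> opened https://news.ycombinator.com
```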
### Roles
ShellGPT allows you to create custom roles, which can be utilized to generate code, shell commands, or to fulfill your specific needs. To create a new role, use the `--create-role` option followed by the role name. You will be prompted to provide a description for the role, along with other details. This will create a JSON file in `~/.config/shell_gpt/roles` with the role name. Inside this directory, you can also edit default `sgpt` roles, such as **shell**, **code**, and **default**. Use the `--list-roles` option to list all available roles, and the `--show-role` option to display the details of a specific role. Here's an example of a custom role:
```shell
@@ -333,40 +384,48 @@ DEFAULT_EXECUTE_SHELL_CMD=false
DISABLE_STREAMING=false
# The pygment theme to view markdown (default/describe role).
CODE_THEME=default
# Path to a directory with functions.
OPENAI_FUNCTIONS_PATH=/Users/user/.config/shell_gpt/functions
# Print output of functions when LLM uses them.
SHOW_FUNCTIONS_OUTPUT=false
# Allows LLM to use functions.
OPENAI_USE_FUNCTIONS=true
```
Possible options for `DEFAULT_COLOR`: black, red, green, yellow, blue, magenta, cyan, white, bright_black, bright_red, bright_green, bright_yellow, bright_blue, bright_magenta, bright_cyan, bright_white.
Possible options for `CODE_THEME`: https://pygments.org/styles/
### Full list of arguments
```text
╭─ Arguments ─────────────────────────────────────────────────────────────────────────────────────────────────╮
│ prompt [PROMPT] The prompt to generate completions for. │
╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
╭─ Options ───────────────────────────────────────────────────────────────────────────────────────────────────╮
│ --model TEXT OpenAI GPT model to use. [default: gpt-3.5-turbo] │
│ --temperature FLOAT RANGE [0.0<=x<=2.0] Randomness of generated output. [default: 0.0] │
│ --top-probability FLOAT RANGE [0.0<=x<=1.0] Limits highest probable tokens (words). [default: 1.0] │
│ --editor Open $EDITOR to provide a prompt. [default: no-editor] │
│ --cache Cache completion results. [default: cache] │
│ --help Show this message and exit. │
╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
╭─ Assistance Options ────────────────────────────────────────────────────────────────────────────────────────╮
│ --shell -s Generate and execute shell commands. │
│ --describe-shell -d Describe a shell command. │
│ --code --no-code Generate only code. [default: no-code] │
╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
╭─ Chat Options ──────────────────────────────────────────────────────────────────────────────────────────────╮
│ --chat TEXT Follow conversation with id, use "temp" for quick session. [default: None] │
│ --repl TEXT Start a REPL (Read–eval–print loop) session. [default: None] │
│ --show-chat TEXT Show all messages from provided chat id. [default: None] │
│ --list-chats List all existing chat ids. [default: no-list-chats] │
╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
╭─ Role Options ──────────────────────────────────────────────────────────────────────────────────────────────╮
│ --role TEXT System role for GPT model. [default: None] │
│ --create-role TEXT Create role. [default: None] │
│ --show-role TEXT Show role. [default: None] │
│ --list-roles List roles. [default: no-list-roles] │
╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
╭─ Arguments ──────────────────────────────────────────────────────────────────────────────────────────────╮
│ prompt [PROMPT] The prompt to generate completions for. │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────╯
╭─ Options ────────────────────────────────────────────────────────────────────────────────────────────────╮
│ --model TEXT Large language model to use. [default: gpt-4-1106-preview] │
│ --temperature FLOAT RANGE [0.0<=x<=2.0] Randomness of generated output. [default: 0.0] │
│ --top-probability FLOAT RANGE [0.0<=x<=1.0] Limits highest probable tokens (words). [default: 1.0] │
│ --editor Open $EDITOR to provide a prompt. [default: no-editor] │
│ --cache Cache completion results. [default: cache] │
│ --version Show version. │
│ --help Show this message and exit. │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────╯
╭─ Assistance Options ─────────────────────────────────────────────────────────────────────────────────────╮
│ --shell -s Generate and execute shell commands. │
│ --describe-shell -d Describe a shell command. │
│ --code -c Generate only code. │
│ --functions --no-functions Allow function calls. [default: functions] │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────╯
╭─ Chat Options ───────────────────────────────────────────────────────────────────────────────────────────╮
│ --chat TEXT Follow conversation with id, use "temp" for quick session. [default: None] │
│ --repl TEXT Start a REPL (Read–eval–print loop) session. [default: None] │
│ --show-chat TEXT Show all messages from provided chat id. [default: None] │
│ --list-chats -lc List all existing chat ids. │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────╯
╭─ Role Options ───────────────────────────────────────────────────────────────────────────────────────────╮
│ --role TEXT System role for GPT model. [default: None] │
│ --create-role TEXT Create role. [default: None] │
│ --show-role TEXT Show role. [default: None] │
│ --list-roles -lr List roles. │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────╯
```
## LocalAI
2 changes: 2 additions & 0 deletions pyproject.toml
@@ -34,6 +34,7 @@ dependencies = [
"rich >= 13.1.0, < 14.0.0",
"distro >= 1.8.0, < 2.0.0",
"openai >= 1.6.1, < 2.0.0",
"instructor >= 0.4.5, < 1.0.0",
'pyreadline3 >= 3.4.1, < 4.0.0; sys_platform == "win32"',
]

@@ -81,6 +82,7 @@ skip = "__init__.py"

[tool.mypy]
strict = true
exclude = ["function.py", "handler.py", "default_functions"]

[tool.ruff]
select = [
2 changes: 1 addition & 1 deletion sgpt/__version__.py
@@ -1 +1 @@
-__version__ = "1.0.1"
+__version__ = "1.0.2"
25 changes: 25 additions & 0 deletions sgpt/app.py
@@ -7,6 +7,8 @@
from click.types import Choice

from sgpt.config import cfg
from sgpt.default_functions.init_functions import install_functions as inst_funcs
from sgpt.function import get_openai_schemas
from sgpt.handlers.chat_handler import ChatHandler
from sgpt.handlers.default_handler import DefaultHandler
from sgpt.handlers.repl_handler import ReplHandler
@@ -57,9 +59,16 @@ def main(
),
code: bool = typer.Option(
False,
"--code",
"-c",
help="Generate only code.",
rich_help_panel="Assistance Options",
),
functions: bool = typer.Option(
cfg.get("OPENAI_USE_FUNCTIONS") == "true",
help="Allow function calls.",
rich_help_panel="Assistance Options",
),
editor: bool = typer.Option(
False,
help="Open $EDITOR to provide a prompt.",
@@ -92,6 +101,8 @@ def main(
),
list_chats: bool = typer.Option(
False,
"--list-chats",
"-lc",
help="List all existing chat ids.",
callback=ChatHandler.list_ids,
rich_help_panel="Chat Options",
@@ -115,6 +126,8 @@
),
list_roles: bool = typer.Option(
False,
"--list-roles",
"-lr",
help="List roles.",
callback=SystemRole.list,
rich_help_panel="Role Options",
@@ -125,6 +138,12 @@
callback=install_shell_integration,
hidden=True, # Hiding since should be used only once.
),
install_functions: bool = typer.Option(
False,
help="Install default functions.",
callback=inst_funcs,
hidden=True, # Hiding since should be used only once.
),
) -> None:
stdin_passed = not sys.stdin.isatty()

@@ -154,6 +173,8 @@ def main(
else SystemRole.get(role)
)

function_schemas = (get_openai_schemas() or None) if functions else None

if repl:
# Will be in infinite loop here until user exits with Ctrl+C.
ReplHandler(repl, role_class).handle(
@@ -163,6 +184,7 @@ def main(
top_p=top_probability,
chat_id=repl,
caching=cache,
functions=function_schemas,
)

if chat:
@@ -173,6 +195,7 @@ def main(
top_p=top_probability,
chat_id=chat,
caching=cache,
functions=function_schemas,
)
else:
full_completion = DefaultHandler(role_class).handle(
@@ -181,6 +204,7 @@ def main(
temperature=temperature,
top_p=top_probability,
caching=cache,
functions=function_schemas,
)

while shell and not stdin_passed:
@@ -201,6 +225,7 @@
temperature=temperature,
top_p=top_probability,
caching=cache,
functions=function_schemas,
)
continue
break
3 changes: 2 additions & 1 deletion sgpt/cache.py
@@ -39,7 +39,8 @@ def wrapper(*args: Any, **kwargs: Any) -> Generator[str, None, None]:
for i in func(*args, **kwargs):
result += i
yield i
cache_file.write_text(result)
if "@FunctionCall" not in result:
cache_file.write_text(result)
self._delete_oldest_files(self.length) # type: ignore

return wrapper