
Commit

updated demo description and typo fixes
WiolaGreen committed Jul 10, 2024
1 parent f8b4268 commit fd2c0de
Showing 4 changed files with 11 additions and 22 deletions.
4 changes: 2 additions & 2 deletions docs/_docs/02-how-to-setup-llm-proxy-project.md
Original file line number Diff line number Diff line change
@@ -28,9 +28,9 @@ This section contains information about the AI providers in use. Once you have a
1. **AI Providers (Select Provider)**: This functionality allows users to choose from a list of available AI providers to integrate into their project. The dropdown list typically contains a variety of AI providers offering different services or functionalities.
2. **Deployment name**: The deployment name is a unique identifier for the deployment of your AI model or service within the project. It is used to create a proxy URL for a project.
3. **Api base URL**: The API base URL is the root URL for the API endpoints provided by your AI service or model. It serves as the starting point for accessing various functionalities and resources offered by the AI provider.
4. **Proxy URL**: The proxy URL serves as an intermediary between the project and the AI provider's API. It is generated automatically based on the deployment name and project slug. After filled in neccsesary information copy this link to used it
4. **Proxy URL**: The proxy URL serves as an intermediary between the project and the AI provider's API. It is generated automatically based on the deployment name and project slug. After filling in the necessary information, copy this link to use it.

After entering the necessary information, click Add AI Provider to add it to the project. **You need to add at least one Ai Provider**
After entering the necessary information, click Add AI Provider to add it to the project. **You need to add at least one AI Provider**

## Complete Project Creation

23 changes: 6 additions & 17 deletions docs/_docs/03-demo.md
@@ -10,29 +10,18 @@ toc_sticky: true
---


## Prompt Sail demo
## Prompt Sail Demo


Experience the functionalities of Prompt Sail firsthand by utilizing our [**demo**](https://try-promptsail.azurewebsites.net/).
The Demo provides an insight into Prompt Sail's user interface. It allows you to explore the layout of the organization and project dashboards. You can also see which details of stored transactions (AI API calls) are logged and what statistics are provided for transactions, projects, and the organization.

Experience the functionalities of Prompt Sail firsthand by utilizing our [**FREE DEMO**](https://try-promptsail.azurewebsites.net/).
{: .notice--warning}

**No installation or account creation is required, making it a straightforward process.**

In the Demo, you also have the opportunity to test the logging of your LLM API calls. The easiest way to proceed is to follow the instructions in the section [Make your first LLM API call](https://promptsail.github.io/prompt_sail/docs/quick-start-guide/#make-your-first-api-call).

The demo provides an insight into the user interface of Prompt Sail, and its architecture, and allows to test the capture of the transactions (Gen AI model calls).

To log your AI API call in the demo:

1. Visit [**demo page**](https://try-promptsail.azurewebsites.net/).
2. Access the list of projects by clicking the "Get Started" button. (No account creation or login is required.)
3. Select an existing project or create a new one.
4. Obtain the proxy URL required to initiate the logging of calls to Gen AI models.
1. In the Project Dashboard, navigate to the AI Providers tab. Select an available deployment, or create your own if the required provider is not listed.
2. Copy the auto-generated proxy URL and integrate it into your script/application that communicates with the chosen Gen AI model.
5. Execute your script/application with the proxy URL and return to the Prompt Sail demo.
6. In the Project Dashboard, under the Transactions tab, you will see a table containing transactions, including the one from your AI API call.
7. By clicking on the transaction ID, you will go to its details.
8. The Overview tab, meanwhile, provides a comprehensive view of all transaction statistics for the project.
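The URL swap in step 4 can be sketched in a few lines of Python. This is a minimal illustration using only the standard library; the proxy URL, endpoint path, and header names below are assumptions for demonstration, not the exact Prompt Sail scheme:

```python
import json
import urllib.request

# Hypothetical proxy URL copied from the AI Providers tab (step 4);
# the real value is auto-generated from your project slug and deployment name.
PROXY_URL = "https://try-promptsail.azurewebsites.net/api/my-project/my-deployment"

def make_chat_request(proxy_url: str, payload: dict, api_key: str) -> urllib.request.Request:
    """Build a provider-style chat request, but aimed at the Prompt Sail proxy.

    The proxy forwards the request to the real Gen AI provider and logs the
    transaction, so the body and headers are unchanged; only the URL differs.
    """
    return urllib.request.Request(
        url=f"{proxy_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",  # your provider API key, passed through
            "Content-Type": "application/json",
        },
        method="POST",
    )

if __name__ == "__main__":
    req = make_chat_request(
        PROXY_URL,
        {"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "Hello"}]},
        api_key="sk-...",  # placeholder key
    )
    # urllib.request.urlopen(req) would send it; the resulting transaction
    # then appears under the project's Transactions tab (step 6).
    print(req.full_url)
```

After running a script like this, return to the demo and look for the new row in the Transactions table.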

**Warning**: The Prompt Sail demo is public, so avoid including any sensitive information in your prompts when testing the demo. Each new deployment of [**demo page**](https://try-promptsail.azurewebsites.net/) resets the database, causing projects and transactions in the demo to be regularly deleted.
**Warning**: The Prompt Sail Demo is public, so avoid including any sensitive information in your prompts when testing it. Each new deployment of the [**Demo page**](https://try-promptsail.azurewebsites.net/) resets the database, so projects and transactions in the Demo are regularly deleted.
{: .notice--warning}
4 changes: 2 additions & 2 deletions docs/_docs/10-Input-output-proxy.md
@@ -14,10 +14,10 @@ toc_sticky: true

Prompt Sail stores transactions by acting as a proxy for libraries and capturing the request and response data.

All the magic happens when you replace **api_base**(or similar parameter) which originaly points to your LLM provider endpoint by ours **proxy_url**. Thanks to this substitution we can bypass your request and grab response transparently.
All the magic happens when you replace **api_base** (or a similar parameter), which originally points to your LLM provider's endpoint, with our **proxy_url**. Thanks to this substitution we can pass your request through and capture the response transparently.


Before you start using Prompt Sail as a proxy, you need to configure the `project` and add `ai-providers` via UI, those information eventaully will be used to create your unique **proxy_url**.
Before you start using Prompt Sail as a proxy, you need to configure the `project` and add `ai-providers` via the UI; this information will eventually be used to create your unique **proxy_url**.

In one project you can have multiple AI deployments, each with its own **proxy_url**.
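To make the substitution concrete, here is a minimal sketch. The proxy URL below is hypothetical, as is the shape of the config dict; the point is that only **api_base** changes, while the API key and every other setting stay exactly as they were:

```python
# Original client configuration, pointing straight at the provider.
provider_config = {
    "api_base": "https://api.openai.com/v1",
    "api_key": "sk-...",  # placeholder provider key, unchanged by the proxy
}

# Hypothetical proxy_url copied from the project's AI Providers tab.
PROXY_URL = "https://try-promptsail.azurewebsites.net/api/my-project/my-deployment"

# Swap only api_base; Prompt Sail forwards the request to the provider
# and logs the transaction on the way through.
proxied_config = {**provider_config, "api_base": PROXY_URL}
```

Because nothing else changes, switching a script between direct and proxied calls is a one-line edit.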

2 changes: 1 addition & 1 deletion docs/_docs/21-Organization-Dashboard.md
@@ -1,6 +1,6 @@
---
title: "Organization Dashboard"
permalink: /docs/organization-dashboard
permalink: /docs/organization-dashboard/
excerpt: "The Organization Dashboard page serves as a central hub for managing projects within the organization"
last_modified_at: 2024-06-03T14:06:00+01:00
redirect_from:
