
docs: integrations references update 7 #25217

Merged
22 changes: 18 additions & 4 deletions docs/docs/integrations/chat/llamacpp.ipynb
@@ -4,9 +4,23 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# ChatLlamaCpp\n",
"\n",
"This notebook provides a quick overview for getting started with chat model intergrated with [llama cpp python](https://github.com/abetlen/llama-cpp-python)."
"# Llama.cpp\n",
"\n",
">[llama-cpp-python](https://github.com/abetlen/llama-cpp-python) provides simple Python bindings for `@ggerganov`'s\n",
">[llama.cpp](https://github.com/ggerganov/llama.cpp).\n",
">\n",
">This package provides:\n",
">\n",
"> - Low-level access to C API via ctypes interface.\n",
"> - High-level Python API for text completion\n",
"> - `OpenAI`-like API\n",
"> - `LangChain` compatibility\n",
"> - `LlamaIndex` compatibility\n",
"> - OpenAI compatible web server\n",
"> - Local Copilot replacement\n",
"> - Function Calling support\n",
"> - Vision API support\n",
"> - Multiple Models\n"
]
},
{
@@ -410,7 +424,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.8"
"version": "3.10.12"
}
},
"nbformat": 4,
4 changes: 2 additions & 2 deletions docs/docs/integrations/chat/octoai.ipynb
@@ -4,7 +4,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# ChatOctoAI\n",
"# OctoAI\n",
Collaborator:
admittedly this is inconsistent in the docs, but the template for chat models specifies a "Chat" prefix (see issue here: #22296)

Collaborator Author @leo-gan, Aug 14, 2024:
@ccurme Change the template, please.
The integration example page is a document for humans. The page title should not be the names of classes, especially because the examples are not "example per class" pages. Integration examples are about the integrated System/Service/Package not about the integration implementation (hence not about classes/functions). We have API Ref that is exactly about the implementation.
The docs/integrations/ pages for the first-time readers who chose the best integration for the use case or learn how this integration works [in the most base use case].
The API Reference is for developers who work with real projects and need all internal details.

"\n",
"[OctoAI](https://docs.octoai.cloud/docs) offers easy access to efficient compute and enables users to integrate their choice of AI models into applications. The `OctoAI` compute service helps you run, tune, and scale AI applications easily.\n",
"\n",
@@ -99,7 +99,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.7"
"version": "3.10.12"
},
"vscode": {
"interpreter": {
32 changes: 23 additions & 9 deletions docs/docs/integrations/chat/perplexity.ipynb
@@ -15,9 +15,9 @@
"id": "bf733a38-db84-4363-89e2-de6735c37230",
"metadata": {},
"source": [
"# ChatPerplexity\n",
"# Perplexity\n",
"\n",
"This notebook covers how to get started with Perplexity chat models."
"This notebook covers how to get started with `Perplexity` chat models."
]
},
{
@@ -37,17 +37,31 @@
"from langchain_core.prompts import ChatPromptTemplate"
]
},
{
"cell_type": "markdown",
"id": "b26e2035-2f81-4451-ba44-fa2e2d5aeb62",
"metadata": {},
"source": [
"The code provided assumes that your PPLX_API_KEY is set in your environment variables. If you would like to manually specify your API key and also choose a different model, you can use the following code:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "d986aac6-1bae-4608-8514-d3ba5b35b10e",
"metadata": {},
"outputs": [],
"source": [
"chat = ChatPerplexity(\n",
" temperature=0, pplx_api_key=\"YOUR_API_KEY\", model=\"llama-3-sonar-small-32k-online\"\n",
")"
]
},
{
"cell_type": "markdown",
"id": "97a8ce3a",
"metadata": {},
"source": [
"The code provided assumes that your PPLX_API_KEY is set in your environment variables. If you would like to manually specify your API key and also choose a different model, you can use the following code:\n",
"\n",
"```python\n",
"chat = ChatPerplexity(temperature=0, pplx_api_key=\"YOUR_API_KEY\", model=\"llama-3-sonar-small-32k-online\")\n",
"```\n",
"\n",
"You can check a list of available models [here](https://docs.perplexity.ai/docs/model-cards). For reproducibility, we can set the API key dynamically by taking it as an input in this notebook."
]
},
@@ -221,7 +235,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.18"
"version": "3.10.12"
}
},
"nbformat": 4,
44 changes: 34 additions & 10 deletions docs/docs/integrations/providers/llamacpp.mdx
@@ -1,26 +1,50 @@
# Llama.cpp

This page covers how to use [llama.cpp](https://github.com/ggerganov/llama.cpp) within LangChain.
It is broken into two parts: installation and setup, and then references to specific Llama-cpp wrappers.
>[llama-cpp-python](https://github.com/abetlen/llama-cpp-python) provides simple Python bindings for `@ggerganov`'s
>[llama.cpp](https://github.com/ggerganov/llama.cpp).
>
>This package provides:
>
> - Low-level access to C API via ctypes interface.
> - High-level Python API for text completion
> - `OpenAI`-like API
> - `LangChain` compatibility
> - `LlamaIndex` compatibility
> - OpenAI compatible web server
> - Local Copilot replacement
> - Function Calling support
> - Vision API support
> - Multiple Models

## Installation and Setup
- Install the Python package with `pip install llama-cpp-python`

- Install the Python package
```bash
pip install llama-cpp-python
```
- Download one of the [supported models](https://github.com/ggerganov/llama.cpp#description) and convert it to the llama.cpp format per the [instructions](https://github.com/ggerganov/llama.cpp)

## Wrappers

### LLM
## Chat models

See a [usage example](/docs/integrations/chat/llamacpp).

```python
from langchain_community.chat_models import ChatLlamaCpp
```

## LLMs

See a [usage example](/docs/integrations/llms/llamacpp).

There exists a LlamaCpp LLM wrapper, which you can access with
```python
from langchain_community.llms import LlamaCpp
```
For a more detailed walkthrough of this, see [this notebook](/docs/integrations/llms/llamacpp)

### Embeddings
## Embedding models

See a [usage example](/docs/integrations/text_embedding/llamacpp).

There exists a LlamaCpp Embeddings wrapper, which you can access with
```python
from langchain_community.embeddings import LlamaCppEmbeddings
```
For a more detailed walkthrough of this, see [this notebook](/docs/integrations/text_embedding/llamacpp)
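Embedding wrappers such as `LlamaCppEmbeddings` return plain Python lists of floats from `embed_query`/`embed_documents`, so downstream similarity math needs no extra dependencies. A minimal sketch, with stand-in vectors where a real call would require a local model file:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    # Standard cosine similarity; embedding wrappers return plain Python
    # lists of floats, so this works on their output directly.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Stand-in vectors; with a real model you would use:
#   emb = LlamaCppEmbeddings(model_path="...").embed_query(text)
v1 = [0.1, 0.2, 0.3]
v2 = [0.1, 0.2, 0.3]
print(round(cosine_similarity(v1, v2), 3))  # → 1.0 (identical vectors)
```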
21 changes: 21 additions & 0 deletions docs/docs/integrations/providers/maritalk.mdx
@@ -0,0 +1,21 @@
# MariTalk

>[MariTalk](https://www.maritaca.ai/en) is an LLM-based chatbot trained to meet the needs of Brazilian users.

## Installation and Setup

You need a MariTalk API key.

You also need to install the `httpx` Python package.

```bash
pip install httpx
```

## Chat models

See a [usage example](/docs/integrations/chat/maritalk).

```python
from langchain_community.chat_models import ChatMaritalk
```
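MariTalk is reached over plain HTTPS, which is why `httpx` is the only required package; `ChatMaritalk` handles the wire format internally. As a hypothetical sketch of the chat-message shape (the field names here are illustrative assumptions, not MariTalk's real schema):

```python
def build_chat_payload(messages: list[tuple[str, str]], max_tokens: int = 200) -> dict:
    # Illustrative only: field names are assumptions, not MariTalk's real schema.
    # ChatMaritalk builds and sends the actual request via httpx internally.
    return {
        "messages": [{"role": role, "content": content} for role, content in messages],
        "max_tokens": max_tokens,
    }

payload = build_chat_payload([("user", "Olá!")])
print(payload["messages"][0]["role"])  # → user
```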
34 changes: 34 additions & 0 deletions docs/docs/integrations/providers/mlx.mdx
@@ -0,0 +1,34 @@
# MLX

>[MLX](https://ml-explore.github.io/mlx/build/html/index.html) is a `NumPy`-like array framework
> designed for efficient and flexible machine learning on `Apple` silicon,
> brought to you by `Apple machine learning research`.


## Installation and Setup

Install several Python packages:

```bash
pip install mlx-lm transformers huggingface_hub
```


## Chat models


See a [usage example](/docs/integrations/chat/mlx).

```python
from langchain_community.chat_models.mlx import ChatMLX
```

## LLMs

### MLX Local Pipelines

See a [usage example](/docs/integrations/llms/mlx_pipelines).

```python
from langchain_community.llms.mlx_pipeline import MLXPipeline
```
37 changes: 37 additions & 0 deletions docs/docs/integrations/providers/octoai.mdx
@@ -0,0 +1,37 @@
# OctoAI

>[OctoAI](https://docs.octoai.cloud/docs) offers easy access to efficient compute
> and enables users to integrate their choice of AI models into applications.
> The `OctoAI` compute service helps you run, tune, and scale AI applications easily.


## Installation and Setup

- Install the `openai` Python package:
```bash
pip install openai
```
- Register on `OctoAI` and get an API Token from [your OctoAI account page](https://octoai.cloud/settings).


## Chat models

See a [usage example](/docs/integrations/chat/octoai).

```python
from langchain_community.chat_models import ChatOctoAI
```

## LLMs

See a [usage example](/docs/integrations/llms/octoai).

```python
from langchain_community.llms.octoai_endpoint import OctoAIEndpoint
```

## Embedding models

```python
from langchain_community.embeddings.octoai_embeddings import OctoAIEmbeddings
```
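OctoAI's endpoint speaks an OpenAI-compatible chat-completions protocol, which is why the `openai` package is the only install needed. A sketch of the request body under that assumption (the model name is a placeholder, not a guaranteed OctoAI model):

```python
import json

def chat_request(model: str, prompt: str) -> dict:
    # OpenAI-compatible chat-completions request body.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

body = chat_request("some-model-name", "Hello")
print(json.dumps(body, sort_keys=True))
```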
25 changes: 25 additions & 0 deletions docs/docs/integrations/providers/perplexity.mdx
@@ -0,0 +1,25 @@
# Perplexity

>[Perplexity](https://www.perplexity.ai/pro) is an AI-powered answer engine. Its `Pro` plan adds unlimited Pro Search,
> upgraded AI models, unlimited file upload, image generation, and API credits.
>
> You can check a [list of available models](https://docs.perplexity.ai/docs/model-cards).

## Installation and Setup

Install the `openai` Python package:

```bash
pip install openai
```

Get your API key from [here](https://docs.perplexity.ai/docs/getting-started).

## Chat models

See a [usage example](/docs/integrations/chat/perplexity).

```python
from langchain_community.chat_models import ChatPerplexity
```
22 changes: 18 additions & 4 deletions docs/docs/integrations/text_embedding/llamacpp.ipynb
@@ -4,9 +4,23 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# Llama-cpp\n",
"# Llama.cpp\n",
"\n",
"This notebook goes over how to use Llama-cpp embeddings within LangChain"
">[llama-cpp-python](https://github.com/abetlen/llama-cpp-python) provides simple Python bindings for `@ggerganov`'s\n",
">[llama.cpp](https://github.com/ggerganov/llama.cpp).\n",
">\n",
">This package provides:\n",
">\n",
"> - Low-level access to C API via ctypes interface.\n",
"> - High-level Python API for text completion\n",
"> - `OpenAI`-like API\n",
"> - `LangChain` compatibility\n",
"> - `LlamaIndex` compatibility\n",
"> - OpenAI compatible web server\n",
"> - Local Copilot replacement\n",
"> - Function Calling support\n",
"> - Vision API support\n",
"> - Multiple Models\n"
]
},
{
@@ -80,9 +94,9 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.1"
"version": "3.10.12"
}
},
"nbformat": 4,
"nbformat_minor": 2
"nbformat_minor": 4
}