Add notebook
Showing 2 changed files with 193 additions and 1 deletion.
@@ -0,0 +1,192 @@
{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Hugging Face Pipelines\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
"If you're opening this Notebook on colab, you will probably need to install Langchain and 🤗 Optimum. Uncomment the following cell and run it." | ||
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "#! pip install langchain-huggingface optimum[ipex]"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
"Make sure your version of langchain-huggingface is at least v0.2 and 🤗 Optimum is at least v1.22.0 since the functionality was introduced in these versions:" | ||
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {},
   "outputs": [
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "/home/echarlaix/miniconda3/envs/ipex/lib/python3.10/site-packages/tqdm/auto.py:21: TqdmWarning: IProgress not found. Please update jupyter and ipywidgets. See https://ipywidgets.readthedocs.io/en/stable/user_install.html\n",
      " from .autonotebook import tqdm as notebook_tqdm\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "optimum-intel version is 1.22.0.dev0\n"
     ]
    }
   ],
   "source": [
    "from optimum.intel.version import __version__\n",
    "\n",
    "print(\"optimum-intel version is\", __version__)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "langchain-huggingface version is 0.1.2\n"
     ]
    }
   ],
   "source": [
    "from optimum.intel.utils.import_utils import _langchain_hf_version\n",
    "\n",
    "print(\"langchain-huggingface version is\", _langchain_hf_version)"
   ]
  },
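  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "If you want to verify these requirements programmatically, here is a minimal sketch using `packaging` (the thresholds are the versions quoted above):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from packaging import version\n",
    "\n",
    "# Warn when an installed version predates the feature\n",
    "# (note: dev builds such as 1.22.0.dev0 sort before the final release)\n",
    "if version.parse(__version__) < version.parse(\"1.22.0\"):\n",
    "    print(\"optimum-intel may be too old for backend='ipex'\")\n",
    "if version.parse(_langchain_hf_version) < version.parse(\"0.2\"):\n",
    "    print(\"langchain-huggingface may be too old for backend='ipex'\")"
   ]
  },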
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Model Loading\n",
    "\n",
    "Models can be loaded by specifying the model parameters using the `from_model_id` method."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from langchain_huggingface.llms import HuggingFacePipeline\n",
    "\n",
    "hf = HuggingFacePipeline.from_model_id(\n",
    "    model_id=\"gpt2\",\n",
    "    task=\"text-generation\",\n",
    "    pipeline_kwargs={\"max_new_tokens\": 10},\n",
    "    backend=\"ipex\",\n",
    ")"
   ]
  },
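  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Passing `backend=\"ipex\"` lets 🤗 Optimum handle loading the model for you. As a rough sketch of the equivalent explicit construction (the exact internals may differ), you can load an `IPEXModelForCausalLM` yourself, wrap it in a 🤗 Transformers pipeline, and hand that pipeline to `HuggingFacePipeline`:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from optimum.intel import IPEXModelForCausalLM\n",
    "from transformers import AutoTokenizer, pipeline\n",
    "\n",
    "# Load the model through IPEX and wrap it in a standard Transformers pipeline\n",
    "model = IPEXModelForCausalLM.from_pretrained(\"gpt2\")\n",
    "tokenizer = AutoTokenizer.from_pretrained(\"gpt2\")\n",
    "pipe = pipeline(\"text-generation\", model=model, tokenizer=tokenizer, max_new_tokens=10)\n",
    "\n",
    "hf = HuggingFacePipeline(pipeline=pipe)"
   ]
  },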
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Create Chain\n",
    "\n",
    "With the model loaded into memory, you can compose it with a prompt to form a chain."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from langchain_core.prompts import PromptTemplate\n",
    "\n",
    "template = \"\"\"Question: {question}\n",
    "\n",
    "Answer: Let's think step by step.\"\"\"\n",
    "prompt = PromptTemplate.from_template(template)\n",
    "\n",
    "chain = prompt | hf\n",
    "\n",
    "question = \"What is electroencephalography?\"\n",
    "\n",
    "print(chain.invoke({\"question\": question}))"
   ]
  },
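  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Since the chain is a LangChain `Runnable`, it also supports running several inputs in one call with `batch` (a minimal sketch; the second question is just an illustrative input):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "questions = [\n",
    "    {\"question\": \"What is electroencephalography?\"},\n",
    "    {\"question\": \"What is a neuron?\"},\n",
    "]\n",
    "\n",
    "# batch() runs the chain once per input and returns outputs in the same order\n",
    "for generation in chain.batch(questions):\n",
    "    print(generation)"
   ]
  },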
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "To get the response without the prompt echoed back, you can bind `skip_prompt=True` to the LLM."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "chain = prompt | hf.bind(skip_prompt=True)\n",
    "\n",
    "question = \"What is electroencephalography?\"\n",
    "\n",
    "print(chain.invoke({\"question\": question}))"
   ]
  },
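  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "`bind` can carry other per-call arguments too. Assuming your langchain-huggingface version also forwards `pipeline_kwargs` at call time (treat this as a sketch rather than a guaranteed API), you can override generation settings per chain:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Assumption: pipeline_kwargs bound here override those set in from_model_id\n",
    "chain = prompt | hf.bind(skip_prompt=True, pipeline_kwargs={\"max_new_tokens\": 50})\n",
    "\n",
    "print(chain.invoke({\"question\": question}))"
   ]
  },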
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "To stream the response chunk by chunk:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "for chunk in chain.stream(question):\n",
    "    print(chunk, end=\"\", flush=True)"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.10.14"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 4
}