
Llama 3.1 Launch capabilities


This is our second release of Llama technical assets. We are focusing on highlighting the new Llama 3.1 models' capabilities, including tool/function calling, the long context window, and multilingual abilities. We are also releasing recipes for deploying Llama 3.1 with NVIDIA NIM and performing inference against that deployment. In addition, we are calling out the notebook recipes focusing on synthetic data generation and model distillation with the Llama 3.1 405B model (released as part of the Llama 3.1 launch and covered in this blog).
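To give a sense of what inference against a NIM deployment looks like, here is a minimal sketch that assumes a Llama 3.1 NIM container is already running and exposing its OpenAI-compatible endpoint locally; the endpoint URL and served model name are illustrative placeholders, not values taken from the recipes.

```python
from openai import OpenAI

# Assumptions: the NIM container is reachable at localhost:8000 and serves a
# model named "meta/llama-3.1-8b-instruct"; adjust both to match your deployment.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-used")

completion = client.chat.completions.create(
    model="meta/llama-3.1-8b-instruct",
    messages=[{"role": "user", "content": "Give me one sentence on Llama 3.1's context window."}],
    max_tokens=128,
)
print(completion.choices[0].message.content)
```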

More details on the Jupyter notebook recipes:

  • Function and tool calling with the new Llama 3.1 prompt recipe: This notebook recipe focuses on how to use Amazon Bedrock's Converse API, which supports Llama 3.1. The notebook highlights (a) how to go back and forth between the messages API format and Llama 3.1's prompt format, and (b) how to use JSON-based and custom tool calling with Llama 3.1 through the Converse API (see the tool-calling sketch after this list).
  • Multi-lingual recipe notebook: This notebook provides a cookie-cutter starting point for building multilingual solutions with Llama 3.1. Eight languages are officially supported by Llama 3.1: English, German, French, Italian, Portuguese, Hindi, Spanish, and Thai (a short multilingual sketch follows the list).
  • Long context window: The long context window of the Llama 3.1 models makes them well suited for working with long documents. This notebook provides a step-by-step approach to answering questions over a large document with LangChain and compares it with the Llama 3 approach, where the context window was only 8K tokens (a long-context sketch is included below).
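As a quick illustration of the tool-calling flow on the Converse API, here is a minimal sketch; the region, model ID, and the get_weather tool are illustrative assumptions, not values taken from the notebook.

```python
import json
import boto3

client = boto3.client("bedrock-runtime", region_name="us-west-2")  # assumed region

# Hypothetical tool definition the model can choose to call.
tool_config = {
    "tools": [
        {
            "toolSpec": {
                "name": "get_weather",  # hypothetical tool name
                "description": "Get the current weather for a city.",
                "inputSchema": {
                    "json": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    }
                },
            }
        }
    ]
}

response = client.converse(
    modelId="meta.llama3-1-70b-instruct-v1:0",  # assumed model ID
    messages=[{"role": "user", "content": [{"text": "What's the weather in Seattle?"}]}],
    toolConfig=tool_config,
)

# When the model decides to call the tool, the response contains a toolUse
# block with the tool name and the JSON arguments the model generated.
for block in response["output"]["message"]["content"]:
    if "toolUse" in block:
        print(block["toolUse"]["name"], json.dumps(block["toolUse"]["input"]))
```

For the multilingual recipe, a sketch along these lines sends the same question in a few of the officially supported languages; the model ID, region, and prompts are assumptions for illustration.

```python
import boto3

client = boto3.client("bedrock-runtime", region_name="us-west-2")  # assumed region

# Ask the same question in several of the officially supported languages.
prompts = {
    "German": "Fasse die Vorteile eines langen Kontextfensters in zwei Sätzen zusammen.",
    "Hindi": "लंबे कॉन्टेक्स्ट विंडो के लाभ दो वाक्यों में बताइए।",
    "Thai": "สรุปข้อดีของหน้าต่างบริบทที่ยาวในสองประโยค",
}

for language, prompt in prompts.items():
    response = client.converse(
        modelId="meta.llama3-1-8b-instruct-v1:0",  # assumed model ID
        messages=[{"role": "user", "content": [{"text": prompt}]}],
        inferenceConfig={"maxTokens": 256, "temperature": 0.2},
    )
    print(language, "->", response["output"]["message"]["content"][0]["text"])
```

For the long-context notebook, the general pattern looks roughly like the sketch below, assuming the langchain-aws ChatBedrockConverse integration and a placeholder transcript.txt document; the notebook's actual document, model, and prompts may differ.

```python
from pathlib import Path

from langchain_aws import ChatBedrockConverse
from langchain_core.messages import HumanMessage

# Assumptions: model ID, region, and the transcript.txt path are placeholders.
llm = ChatBedrockConverse(
    model="meta.llama3-1-70b-instruct-v1:0",
    region_name="us-west-2",
    max_tokens=512,
)

# With a 128K-token context window, a long document can be passed directly in
# the prompt instead of being chunked, as was needed with Llama 3's 8K window.
document = Path("transcript.txt").read_text()
question = "What are the three main decisions recorded in this document?"
prompt = (
    "Answer the question using only the document below.\n\n"
    f"<document>\n{document}\n</document>\n\nQuestion: {question}"
)

answer = llm.invoke([HumanMessage(content=prompt)])
print(answer.content)
```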

More releases and demos coming soon, including:

  • Advanced tool and function calling with Llama 3.1
  • Fine-tuning recipe updates with Llama 3.1
  • RAG implementations with the long context window
  • Distributed inference across multiple nodes for Llama 3.1 models on SageMaker
  • And more...