From 3d29af7c02ecf8675cd68ffb5fc0ba0c00f0d479 Mon Sep 17 00:00:00 2001
From: Esha Lakhotia <72316944+eshalakhotia@users.noreply.github.com>
Date: Tue, 5 Sep 2023 11:50:22 -0700
Subject: [PATCH] Update README.md

---
 README.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/README.md b/README.md
index 762a79a8..89274de1 100644
--- a/README.md
+++ b/README.md
@@ -1,6 +1,6 @@
 Transformers Neuron for Trn1 and Inf2 is a software package that enables
 PyTorch users to perform large language model (LLM) inference on
-second-generation Neuron hardware (See: [NeuronCore-v2](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/general/arch/neuron-hardware/neuron-core-v2.html)).
+second-generation Neuron hardware (See: [NeuronCore-v2](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/general/arch/neuron-hardware/neuron-core-v2.html)). The [Neuron performance page](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/general/benchmarks/inf2/inf2-performance.html#large-language-models-inference-performance) lists expected inference performance for commonly used Large Language Models.
 # Transformers Neuron (``transformers-neuronx``) Documentation
 Please refer to the [Transformers Neuron documentation](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/libraries/transformers-neuronx/) for setup
 and developer guides.
@@ -29,8 +29,8 @@ Please refer to the [transformers-neuronx release notes](https://awsdocs-neuron.
 
 # Troubleshooting
 
-Please refer to our [Contact
-Us](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/general/contact.html)
+Please refer to our [Support
+Page](https://awsdocs-neuron.readthedocs-hosted.com/en/latest/general/support.html)
 page for additional information and support resources. If you intend to file a
 ticket and you can share your model artifacts, please re-run your failing
 script with ``NEURONX_DUMP_TO=./some_dir``. This will dump
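
For the troubleshooting guidance in the patched README, ``NEURONX_DUMP_TO`` is an environment variable, so it can be exported in the shell or set at the top of the failing script before any Neuron compilation is triggered. The sketch below shows one minimal way to do that in Python; only ``NEURONX_DUMP_TO=./some_dir`` comes from the README text above, while the script structure and the placeholder function are hypothetical.

```python
import os

# Sketch only: set the dump directory from the README's troubleshooting note
# (NEURONX_DUMP_TO=./some_dir) before any Neuron compilation happens.
# Equivalent shell form: NEURONX_DUMP_TO=./some_dir python failing_script.py
os.environ["NEURONX_DUMP_TO"] = "./some_dir"


def run_failing_inference():
    """Hypothetical placeholder for the model load + inference that
    reproduces the issue; not an actual transformers-neuronx API call."""
    pass


if __name__ == "__main__":
    run_failing_inference()
```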