I followed the rocm-benchmark instructions step by step. The last step failed to run; the log is below.
$python -m mlc_chat.cli.benchmark --model ${PATH_TEST}/params --device "rocm:0" --prompt "What is the meaning of life?" --generate-length 256
The key line of the error log is "[2024-01-17 08:53:37] ERROR model_metadata.py:93: FAILED to read metadata section in legacy model lib."
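For anyone triaging similar reports, it can help to pull the level and source location out of log lines like the one above. This is a minimal sketch, assuming the `[timestamp] LEVEL file.py:line: message` layout shown in the quoted line; the regex and the `parse_log_line` helper are hypothetical, not part of mlc_chat:

```python
import re

# Assumed log layout, inferred from the single line quoted in this report:
#   [timestamp] LEVEL source_file.py:lineno: message
LOG_RE = re.compile(
    r"\[(?P<timestamp>[^\]]+)\]\s+"   # [2024-01-17 08:53:37]
    r"(?P<level>[A-Z]+)\s+"           # ERROR
    r"(?P<source>\S+:\d+):\s+"        # model_metadata.py:93
    r"(?P<message>.*)"                # FAILED to read metadata ...
)

def parse_log_line(line: str) -> dict:
    """Split one log line into timestamp, level, source, and message."""
    m = LOG_RE.match(line)
    if m is None:
        raise ValueError(f"unrecognized log line: {line!r}")
    return m.groupdict()

line = ("[2024-01-17 08:53:37] ERROR model_metadata.py:93: "
        "FAILED to read metadata section in legacy model lib.")
fields = parse_log_line(line)
print(fields["level"], fields["source"])
```

The "legacy model lib" wording suggests the compiled model library predates the metadata section, so rebuilding the model lib with a current mlc_chat/mlc-llm toolchain is a plausible first thing to try.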
Any updates on this? I have a system with 3x W7900 + 1x W6800, and it works fine with llama.cpp. I'm hoping to get 4x W7900 working soon and to try that in MLC.