
More flexible matmul test #476

Merged
merged 4 commits into from
Feb 11, 2025
Conversation

maxtremblay
Collaborator

Prepare the matmul tests to be ready for quantization. However, this PR currently skips all tests related to quantization, since quantization is not implemented yet. The strategy will be to partially enable quantization tests during development using this method:

https://github.com/tracel-ai/cubecl/blob/c1c333c56461aeea2bba6aac9f72c897ad725d01/crates/cubecl-linalg/src/matmul/tests/test_utils.rs#L148C1-L152C6
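The skip mechanism can be sketched roughly as below. This is a hypothetical sketch only: `TestParams` and `should_skip` are illustrative names, not the actual cubecl test utilities linked above.

```rust
// Illustrative sketch of skipping quantized test cases until the kernel lands.
pub struct TestParams {
    pub quantized: bool,
}

/// Returns true while the quantized matmul is unimplemented, so quantized
/// test cases can early-return; flipped per-test during development.
pub fn should_skip(params: &TestParams) -> bool {
    params.quantized
}
```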

Comment on lines +379 to +425
// Raw integer matmul: accumulate widened lhs * rhs products.
for row in 0..m {
    for col in 0..n {
        for middle in 0..k {
            let lhs_index = row * k + middle;
            let rhs_index = middle * n + col;
            let out_index = row * n + col;

            // Widen to u16 before multiplying so 255 * 255 cannot overflow.
            let l = lhs[batch_lhs + lhs_index] as u16;
            let r = rhs[batch_rhs + rhs_index] as u16;
            let prod = l * r;

            out[batch_out + out_index] += prod as i32;
        }
    }
}

// Subtract rhs_zero_offset * sum_rows(lhs)
for row in 0..m {
    let mut sum = 0;
    for col in 0..k {
        sum += lhs[batch_lhs + row * k + col] as i32;
    }
    sum *= rhs_zero_offset;
    for col in 0..n {
        out[batch_out + row * n + col] -= sum;
    }
}

// Subtract lhs_zero_offset * sum_cols(rhs)
for col in 0..n {
    let mut sum = 0;
    for row in 0..k {
        sum += rhs[batch_rhs + row * n + col] as i32;
    }
    sum *= lhs_zero_offset;
    for row in 0..m {
        out[batch_out + row * n + col] -= sum;
    }
}

// Add final constant term
for row in 0..m {
    for col in 0..n {
        out[batch_out + row * n + col] += (k as i32) * lhs_zero_offset * rhs_zero_offset;
    }
}
Collaborator Author
Here you can see the flow for the quantized matmul.
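The four passes above are the expanded form of the zero-point product: each output accumulates `(lhs - lhs_zero_offset) * (rhs - rhs_zero_offset)` over `k`. A minimal Rust check of that identity on illustrative values (not taken from the test suite):

```rust
/// Direct quantized dot product: subtract the zero offsets first.
fn direct_dot(lhs: &[i32], rhs: &[i32], z_lhs: i32, z_rhs: i32) -> i32 {
    lhs.iter().zip(rhs).map(|(l, r)| (l - z_lhs) * (r - z_rhs)).sum()
}

/// Expanded form mirroring the four passes: raw products, minus
/// z_rhs * sum(lhs), minus z_lhs * sum(rhs), plus k * z_lhs * z_rhs.
fn expanded_dot(lhs: &[i32], rhs: &[i32], z_lhs: i32, z_rhs: i32) -> i32 {
    let raw: i32 = lhs.iter().zip(rhs).map(|(l, r)| l * r).sum();
    let sum_l: i32 = lhs.iter().sum();
    let sum_r: i32 = rhs.iter().sum();
    raw - z_rhs * sum_l - z_lhs * sum_r + (lhs.len() as i32) * z_lhs * z_rhs
}
```

Both forms agree; the expanded form is what the reference code computes, since it lets the raw products stay in unsigned arithmetic.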

Member

Another way to test it is to dequantize the tensors, perform the matmul in floating point, requantize the result, and check that the quantized matmul produces the same output.
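That strategy could be sketched as follows. Hedged sketch only: the affine `scale`/`zero` parameters and helper names are assumptions for illustration, not the cubecl API.

```rust
/// Affine dequantization: q -> (q - zero) * scale.
fn dequantize(q: &[u8], scale: f32, zero: i32) -> Vec<f32> {
    q.iter().map(|&v| (v as i32 - zero) as f32 * scale).collect()
}

/// Affine quantization with rounding and saturation to the u8 range.
fn quantize(x: &[f32], scale: f32, zero: i32) -> Vec<u8> {
    x.iter()
        .map(|&v| ((v / scale).round() as i32 + zero).clamp(0, 255) as u8)
        .collect()
}
```

A test built this way would dequantize both inputs, run the existing floating-point matmul, requantize the result, and compare it element-wise against the quantized kernel's output.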

@maxtremblay maxtremblay merged commit 47f31fb into main Feb 11, 2025
5 checks passed