
aten._log_softmax.default failed with shape [19, 256008] #669

Open
swimdi opened this issue Dec 23, 2024 · 0 comments
swimdi (Contributor) commented Dec 23, 2024

aten._log_softmax.default failed with this input variation:

aten__log_softmax_default_blocklist = [["Tensor<[19, 256008]> self = ?", "int dim = 1", "bool half_to_float = False"]]

Steps to reproduce:

import torch
import ttnn

# aten._log_softmax.default decomposes to softmax followed by log.
device = ttnn.open_device(device_id=0)
input = torch.rand([19, 256008])
ttnn_from_torch = ttnn.from_torch(input, device=device, layout=ttnn.TILE_LAYOUT, dtype=ttnn.bfloat16)
ttnn_softmax = ttnn.softmax(ttnn_from_torch, 1, numeric_stable=True)
ttnn_log = ttnn.log(ttnn_softmax)
ttnn_to_torch = ttnn.to_torch(ttnn_log, dtype=torch.bfloat16)
ttnn.close_device(device)

# Fails with:
# Statically allocated circular buffers on core range [(x=0,y=0) - (x=7,y=7)] grow to 49277728 B which is beyond max L1 size of 1499136 B
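The circular-buffer overflow suggests the full 256008-wide row is being materialized on-core at once. For illustration, log-softmax along dim 1 only needs two row reductions that can be accumulated chunk by chunk (a running max and a sum of exponentials), so no buffer ever has to hold a whole row. Below is a minimal host-side PyTorch sketch of the identity log_softmax(x) = x - max(x) - log(sum(exp(x - max(x)))); the helper name and the chunk size of 32768 are illustrative assumptions, not a ttnn fix:

import torch

def chunked_log_softmax(x: torch.Tensor, chunk: int = 32768) -> torch.Tensor:
    # Pass 1: running row max, accumulated one chunk at a time.
    row_max = torch.full((x.shape[0], 1), float("-inf"), dtype=x.dtype)
    for start in range(0, x.shape[1], chunk):
        row_max = torch.maximum(row_max, x[:, start:start + chunk].amax(dim=1, keepdim=True))
    # Pass 2: running sum of exp(x - max), also chunk-accumulable.
    sum_exp = torch.zeros((x.shape[0], 1), dtype=x.dtype)
    for start in range(0, x.shape[1], chunk):
        sum_exp += torch.exp(x[:, start:start + chunk] - row_max).sum(dim=1, keepdim=True)
    # log_softmax(x) = x - max(x) - log(sum(exp(x - max(x))))
    return x - row_max - torch.log(sum_exp)

x = torch.rand([19, 256008])
assert torch.allclose(chunked_log_softmax(x), torch.log_softmax(x, dim=1), atol=1e-5)

Subtracting the running max before exponentiating mirrors the numeric_stable=True behavior requested from ttnn.softmax in the repro above.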
swimdi changed the title from "aten._softmax.default failed with shape [19, 256008]" to "aten._log_softmax.default failed with shape [19, 256008]" on Dec 23, 2024