🐛 Bug

torch.Tensor.view with a dtype argument is not supported, even though it is used in, e.g., torchao's to_mx function here -- https://github.com/pytorch/ao/blob/v0.10.0/torchao/prototype/mx_formats/mx_tensor.py#L146-L329.

To Reproduce

Code sample
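A minimal sketch of the reproducer, reconstructed from the traceback below; the input shape and dtype are assumptions taken from the split module shown further down:

import torch
import thunder

def f(t):
    # view(dtype) reinterprets the tensor's raw bytes as another dtype of the same element size
    return t.view(torch.float8_e8m0fnu)

jitted = thunder.jit(f)
x = torch.randint(-128, 127, (4, 4), dtype=torch.int8)  # assumed input: i8[4, 4]
out = jitted(x)

This fails as follows: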
Traceback (most recent call last):
File "/path/to/c.py", line 18, in <module>
out = jitted(x)
^^^^^^^^^
File "/path/to/thunder/__init__.py", line 830, in wrapped
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/path/to/thunder/__init__.py", line 870, in fn_
cache_entry, inps, pro_to_epi = get_computation_and_inputs(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/path/to/thunder/__init__.py", line 809, in wrapped
cache_entry, inps, pro_to_epi = get_computation_and_inputs_fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/path/to/thunder/core/langctxs.py", line 136, in _fn
result = fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/path/to/thunder/__init__.py", line 236, in cache_info_wrapper
res = fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/path/to/thunder/__init__.py", line 774, in get_computation_and_inputs
prologue_trc, computation_trc, epilogue_trc = acquire_initial_trace(fn, args, kwargs, cd, cs, ad_hoc_executor)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/path/to/thunder/__init__.py", line 434, in acquire_initial_trace
jit_results: TraceResults = thunder_general_jit(
^^^^^^^^^^^^^^^^^^^^
File "/path/to/thunder/core/jit_ext.py", line 2133, in thunder_general_jit
result = jfn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^
File "/path/to/thunder/core/interpreter.py", line 7567, in fn_
raise e
File "/path/to/thunder/core/interpreter.py", line 7526, in fn_2
return fn(*args, **kwargs)
File "/path/to/c.py", line 7, in f
return t.view(torch.float8_e8m0fnu)
^^^^^^^^^^^^^^^^^^
File "/path/to/thunder/core/interpreter.py", line 6952, in partial_call_impl
return partial_function.func(*(partial_function.args + args), **(partial_function.keywords | kwargs))
^^^^^^^^^^
File "/path/to/thunder/core/interpreter.py", line 1302, in wrapping_wrapper
res = ufn(*uargs, **ukwargs)
^^^^^^^^^^^^^^^^^^^^^^
File "/path/to/thunder/core/jit_ext.py", line 388, in wrapper
return fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/path/to/thunder/core/symbol.py", line 320, in __call__
result = self.meta(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/path/to/thunder/core/langctxs.py", line 136, in _fn
result = fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/path/to/thunder/torch/__init__.py", line 1398, in view
return reshape(a, shape)
^^^^^^^^^^^^^^^^^
File "/path/to/thunder/core/symbol.py", line 320, in __call__
result = self.meta(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/path/to/thunder/core/langctxs.py", line 136, in _fn
result = fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/path/to/thunder/torch/__init__.py", line 1138, in reshape
return clang.reshape(a, shape)
^^^^^^^^^^^^^^^^^^^^^^^
File "/path/to/thunder/core/langctxs.py", line 136, in _fn
result = fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/path/to/thunder/clang/__init__.py", line 1069, in reshape
if l >= 0:
^^^^^^
TypeError: '>=' not supported between instances of 'torch.dtype' and 'int'
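The error comes from Thunder lowering torch.Tensor.view to reshape(a, shape) (see thunder/torch/__init__.py in the traceback), so the dtype argument is treated as a shape and clang.reshape ends up comparing a torch.dtype with an int. In eager PyTorch, by contrast, Tensor.view(dtype) is a zero-copy reinterpretation of the tensor's bytes when the element sizes match; a small eager-mode example, assuming a PyTorch build that provides torch.float8_e8m0fnu:

import torch

t = torch.randint(-128, 127, (4, 4), dtype=torch.int8)
v = t.view(torch.float8_e8m0fnu)  # bitwise reinterpretation, no copy
print(v.dtype, v.shape)           # torch.float8_e8m0fnu torch.Size([4, 4])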
With thunderfx, thanks to its splitter, it works, but the view op falls back to the inductor backend instead of being executed by Thunder. The module after the split is as follows:
class GraphModule(torch.nn.Module):
    def forward(self, l_t_: "i8[4, 4]"):
        # No stacktrace found for following nodes
        inductor_0 = self.inductor_0(l_t_); l_t_ = None
        return (inductor_0,)

class inductor_0(torch.nn.Module):
    def forward(self, l_t_: "i8[4, 4]"):
        view: "f8e4m3fn[4, 4]" = l_t_.view(torch.float8_e4m3fn); l_t_ = None
        return view

class _orig_mod(torch.nn.Module):
    def forward(self, l_t_: "i8[4, 4]"):
        view: "f8e4m3fn[4, 4]" = l_t_.view(torch.float8_e4m3fn); l_t_ = None
        return view
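For completeness, a sketch of the thunderfx path that produces the split above, assuming the ThunderCompiler backend exposed by thunder.dynamo (the exact entry point may differ between versions):

import torch
from thunder.dynamo import ThunderCompiler

def f(t):
    return t.view(torch.float8_e4m3fn)  # matches the split module above

cfn = torch.compile(f, backend=ThunderCompiler())
out = cfn(torch.randint(-128, 127, (4, 4), dtype=torch.int8))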
Expected behavior
just works.