Replies: 2 comments 3 replies
- This feature will likely be tightly coupled to the operator version; we suggest that developers add version-check logic in this area as a fool-proofing safeguard.
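A minimal sketch of the kind of fool-proofing gate meant here (the support table, op names, and version scheme are all hypothetical illustrations, not real torch_npu data):

```python
# Hypothetical fool-proofing gate: only allow NZ-format conversion when the op
# appears in a support table AND the current operator-package version is at
# least the version that introduced NZ support for it.

# Hypothetical support table: op name -> minimum operator version with NZ support.
NZ_SUPPORT_SINCE = {
    "MatMul": (7, 0, 0),
    "BatchMatMul": (7, 0, 0),
}

def parse_version(v: str) -> tuple:
    """Turn '7.0.0' into (7, 0, 0) for tuple comparison."""
    return tuple(int(p) for p in v.split("."))

def can_convert_to_nz(op_name: str, op_version: str) -> bool:
    """Return True only when the op is known to support NZ at this version."""
    since = NZ_SUPPORT_SINCE.get(op_name)
    if since is None:          # unknown op: refuse rather than guess
        return False
    return parse_version(op_version) >= since

# Unknown ops and too-old versions are rejected instead of silently converted.
print(can_convert_to_nz("MatMul", "7.1.0"))     # True
print(can_convert_to_nz("MatMul", "6.0.0"))     # False
print(can_convert_to_nz("SomeNewOp", "7.1.0"))  # False
```

The point of the design is to fail closed: any op or version not explicitly marked as supported is left in its original format.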
- @wwens7 Any progress on the corresponding MR?
- Currently, the implementation in torchair (shipped with torch_npu) is here: https://gitee.com/ascend/torchair/blob/master/python/torchair/experimental/inference.py#L12
  We can use that implementation as a reference and unconditionally convert every op that supports the NZ format.
  NZ is essentially off-chip padding: if we can complete it at compile time, it effectively becomes an offline padding pass, which generally always yields a gain. Whether a given operator supports it, and its specific benefit curve, still needs to be confirmed with each operator's developers one by one.
  For the list of torch_npu ops that support the NZ format, see https://gitee.com/ascend/op-plugin/blob/7.0.0/op_plugin/config/op_plugin_functions.yaml and search for the `internal_format_opapi` field to see the supported versions.
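To illustrate why compile-time conversion amounts to offline padding: NZ-style layouts store the last two dimensions in fixed-size tiles, so both get padded up to a tile multiple. A tile size of 16 is assumed below (typical for fp16 FRACTAL_NZ; other dtypes may tile differently), and all function names are illustrative:

```python
# Sketch: NZ is essentially padding of the last two dims up to tile multiples.
# Tile size 16 is an assumption (typical for fp16 FRACTAL_NZ).
from math import prod

def align_up(x: int, tile: int = 16) -> int:
    """Round x up to the next multiple of tile."""
    return (x + tile - 1) // tile * tile

def nz_padded_shape(shape: tuple, tile: int = 16) -> tuple:
    """Pad the last two dimensions to tile multiples; leading dims unchanged."""
    if len(shape) < 2:
        raise ValueError("NZ format needs at least 2 dims")
    *lead, h, w = shape
    return (*lead, align_up(h, tile), align_up(w, tile))

def padding_overhead(shape: tuple, tile: int = 16) -> float:
    """Fraction of extra elements introduced by the padding."""
    return prod(nz_padded_shape(shape, tile)) / prod(shape) - 1.0

# Doing this once at compile time avoids repeating the pad on every launch.
print(nz_padded_shape((8, 30, 20)))              # (8, 32, 32)
print(round(padding_overhead((8, 30, 20)), 3))   # 0.707
```

The overhead figure also hints at why per-op confirmation matters: shapes far from a tile multiple pay a large memory cost, so the conversion is not uniformly profitable.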
