
Conversation

@janbernloehr
Contributor

This reopens #1727 by @fthielke to fix the remaining issues so it can be merged.

Original Description
In opsets 13 and higher, the axis of the operation is arbitrary and can simply be changed according to the permutation of the Transpose.
In lower opsets, Softmax always coerces its inputs to a 2D tensor, making Transpose operations necessary if the permutation moves axes between the coerced batch and feature dimensions.
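For illustration, here is a minimal numpy sketch of the opset ≥ 13 case (not the tf2onnx optimizer code itself): pushing the Transpose below the Softmax only requires remapping the axis attribute through the permutation, i.e. new_axis = perm[old_axis]. The shapes, permutation, and softmax helper below are illustrative assumptions.

```python
# Minimal sketch: swapping Transpose -> Softmax(axis) into
# Softmax(perm[axis]) -> Transpose gives the same result for opset >= 13,
# where Softmax operates on a single arbitrary axis.
import numpy as np

def softmax(x, axis):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
x = rng.random((2, 3, 4, 5)).astype(np.float32)  # example input
perm = [0, 2, 3, 1]                               # example Transpose permutation
axis = 3                                          # Softmax axis in the transposed layout

# Original graph: Transpose -> Softmax(axis)
before = softmax(np.transpose(x, perm), axis=axis)

# Optimized graph: Softmax(perm[axis]) -> Transpose
after = np.transpose(softmax(x, axis=perm[axis]), perm)

assert np.allclose(before, after)
```

For opsets below 13, no such simple remapping exists because Softmax flattens the input into a 2D [batch, features] tensor at the given axis, so a Transpose whose permutation moves axes across that split cannot be folded away.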

Signed-off-by: fthielke <fthielke@fb3.uni-bremen.de>
@janbernloehr force-pushed the softmax_transpose_opt2 branch 2 times, most recently from c8440da to a2e40fb on June 10, 2022 22:08
Signed-off-by: janbernloehr <jan@bernloehrs.de>
@janbernloehr force-pushed the softmax_transpose_opt2 branch from a2e40fb to f512d32 on June 10, 2022 22:16
@janbernloehr
Contributor Author

@fatcat-z, @TomWildenhain-Microsoft: maybe you can take another look. The pylint issues have been fixed.

Collaborator

@fatcat-z left a comment


LGTM, thanks for your contributions!
