Quantize slice op #37630
Conversation
Thanks for your contribution!
wozna left a comment
LGTM
@lidanqing-intel @sfraczek please review
lidanqing-vv left a comment
LGTM
Hi @baoachun, do you think this PR could be approved? This PR added new attributes with AsExtra. Slice is an op without weights, so its quantization is easier.
XieYunshen left a comment
LGTM for set_tests_properties(test_analyzer_ernie_int8 PROPERTIES TIMEOUT 120)
}

static const std::initializer_list<std::string> variable_names_slice = {
    "a", "w1", "b", "c", "d", "e", "f"};
It seems you are not using 3 of the variables: w1, e, f.
paddle/fluid/operators/slice_op.cc Outdated
| "(string, default \"float32\"). Data type of mkldnn kernel") | ||
| .SetDefault("float32") | ||
| .InEnum({"float32", "bfloat16"}) | ||
| .InEnum({"float32", "bfloat16", "int8"}) |
GET_IR_NODE_FROM_SUBGRAPH(next_op, next_op, slice_pattern);

// skip if prev op and next op is not quantized
if (!(IsOpDequantized(prev_op)) && !(IsOpQuantized(next_op))) {
I think one parenthesis would be enough but it's okay to have them too if you prefer.
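The two spellings are equivalent, since `!` already applies to each call's result; a minimal self-contained check (the stub predicates below are hypothetical stand-ins, not the pass's real helpers):

```cpp
#include <cassert>

// Hypothetical stand-ins for the pass helpers, for illustration only.
static bool IsOpDequantized(int) { return false; }
static bool IsOpQuantized(int) { return true; }

int main() {
  int prev_op = 0, next_op = 0;
  // ! binds to the call result either way, so the inner parentheses add nothing.
  bool with_parens = !(IsOpDequantized(prev_op)) && !(IsOpQuantized(next_op));
  bool without_parens = !IsOpDequantized(prev_op) && !IsOpQuantized(next_op);
  assert(with_parens == without_parens);
  return 0;
}
```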
748a37d
sfraczek left a comment
LGTM!
wozna left a comment
LGTM
@XieYunshen Could you please review again?
lidanqing-vv left a comment
LGTM
Hi @Aganlengzi, could you please help approve this PR? It does not add any new attribute; it just adds one possible value for the existing attribute "mkldnn_data_type". It was approved before by XieYunshen in #37630 (review), but because there was one more review the developer updated the code, so it needs approval again. We will avoid similar situations in the future.
LGTM
Hi @Aganlengzi, sorry, but this PR needs XieYunshen's approval; you can see this in the log. Thanks.
@XieYunshen
XieYunshen left a comment
LGTM for set_tests_properties(test_analyzer_ernie_int8 PROPERTIES TIMEOUT 120)
* quantize slice op
* correct test
* fix code formatting

PR types
New features
PR changes
OPs
Describe
Quantization of the slice operator.
In the Ernie model (test_quant2_int8_ernie_mkldnn), 1 slice op was quantized, which reduced latency by 0.595%.
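As background for why slice is the easier case to quantize: it only copies a sub-range of its input, so the output can reuse the input's quantization scale and no weight scales are involved, which is what the earlier comment about slice being an op "without weights" refers to. A minimal standalone sketch of that property (illustrative C++, not Paddle internals; the value range and scale are assumptions):

```cpp
#include <cmath>
#include <cstdint>
#include <iostream>
#include <vector>

int main() {
  // Assume the input values lie in [-2, 2]; symmetric s8 scale (assumption).
  const float scale = 127.0f / 2.0f;
  std::vector<float> x = {-1.5f, -0.5f, 0.0f, 0.5f, 1.0f, 1.5f};

  // Quantize: f32 -> s8 with a single scale for the whole tensor.
  std::vector<int8_t> q(x.size());
  for (size_t i = 0; i < x.size(); ++i)
    q[i] = static_cast<int8_t>(std::lround(x[i] * scale));

  // Slice in the int8 domain: a pure copy of elements [2, 5),
  // so the output keeps the input's scale unchanged.
  std::vector<int8_t> q_slice(q.begin() + 2, q.begin() + 5);

  // Dequantize the sliced data with the same scale.
  for (int8_t v : q_slice)
    std::cout << v / scale << ' ';  // ~0 ~0.5 ~1, up to rounding error
  std::cout << '\n';
  return 0;
}
```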