
Conversation

@sfraczek

@sfraczek sfraczek commented Feb 3, 2022

PR types

Bug fixes

PR changes

Others

Describe

  • Added code that prevents squashing dequantize->quantize pairs when the first (dequantize) is uint8 and the second (quantize) is int8; a sketch of the intended guard follows below.
  • Fixed the post-training quantization check for unsigned output; FC and Conv2d use different attribute names for their fused activation.
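
The first fix can be pictured with a minimal sketch. The function name, arguments, and structure below are assumptions made for illustration, not the actual code of the PaddlePaddle squash pass: the point is only that a dequantize/quantize pair may be removed solely when both ops agree on signedness, so a u8 dequantize followed by an s8 quantize stays in the graph.

    // Illustrative guard only; CanSquashDequantQuant and its signature are
    // assumptions for this sketch, not PaddlePaddle code.
    bool CanSquashDequantQuant(bool dequant_is_unsigned, bool quant_is_unsigned,
                               float dequant_scale, float quant_scale) {
      // u8 dequantize feeding an s8 quantize (or vice versa): the value
      // ranges differ, so dropping the pair would change the data. Keep it.
      if (dequant_is_unsigned != quant_is_unsigned) return false;
      // Same signedness and matching scales: the pair is a no-op and can be
      // squashed away.
      return dequant_scale == quant_scale;
    }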
@paddle-bot-old

paddle-bot-old bot commented Feb 3, 2022

Thanks for your contribution!
Please wait for the result of CI first. See the Paddle CI Manual for details.

@sfraczek sfraczek requested review from lidanqing-vv and wozna and removed request for wozna February 3, 2022 14:09
@sfraczek sfraczek requested a review from wozna February 3, 2022 18:01
@lidanqing-vv lidanqing-vv changed the title prevent squashing pair u8 dequantize -> s8 quantize [Bug fix] prevent squashing pair u8 dequantize -> s8 quantize Feb 7, 2022
@sfraczek
Author

sfraczek commented Feb 7, 2022

Please do not merge yet; I want to add unit tests. There seems to be a perf regression.

@sfraczek sfraczek force-pushed the patch-concat-int8-input branch from 0450a68 to 2e7192d February 9, 2022 14:25
    if (dequant_in->inputs[0]->IsOp()) {
      auto prev_op = dequant_in->inputs[0]->Op();
      std::string act_name;
      std::cout << "prev_op->Type(): " << prev_op->Type() << std::endl;
Contributor

Please don't use std::cout.
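
For reference, a hedged sketch of the same debug print expressed with the glog-style verbose logging used throughout the codebase (assuming VLOG is available in this file, as it is in most framework sources):

    // Verbosity level 4 is an arbitrary choice for this sketch.
    VLOG(4) << "prev_op->Type(): " << prev_op->Type();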

@sfraczek
Author

I was comparing to the wrong baseline. There shouldn't be a problem with this PR.

Contributor

@baoachun baoachun left a comment

LGTM

Contributor

@wozna wozna left a comment

Great job, LGTM

@jczaja jczaja self-requested a review February 14, 2022 09:18
Contributor

@jczaja jczaja left a comment

LGTM

@jczaja jczaja merged commit 66b5348 into PaddlePaddle:develop Feb 14, 2022
@baoachun
Contributor

Hi @sfraczek, QA found that this PR caused the test_quant2_int8_lstm_mkldnn unit test to fail. Could you please take a look?
https://xly.bce.baidu.com/paddlepaddle/paddle/newipipe/detail/4797233/job/11924197

@sfraczek
Author

Thank you very much for pointing this out; I will investigate this test. I am already looking into the issues this PR caused. I made one follow-up fix in #39593 for an error that was not caught by our unit tests.

@sfraczek
Author

I have reverted the change that caused this PTQ LSTM problem in PR #39593.

