
Conversation

Contributor

@Xing-lil Xing-lil commented Apr 1, 2025

PR Category

Auto Parallel

PR Types

Bug fixes

Description

Fix the static-save/dynamic-load bug in PIR.
Remove the redundant _is_initialized check that skipped param_state matching:
the parameter becomes initialized once param.set_value(state) runs a few lines later.
Pcard-70448
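
In rough terms, the change removes an early-exit guard from the parameter-matching loop. A simplified sketch of the fixed method is below (not the exact Paddle source; names other than _state_dict_impl, set_value, and set_state_dict are assumptions):

```python
def set_state_dict(self, state_dict, use_structured_name=True):
    # Simplified sketch of the matching loop (not the exact Paddle source).
    for key, param in self._state_dict_impl(use_hook=False).items():
        # Removed by this PR: this guard skipped param_state matching for
        # uninitialized parameters, even though set_value below is exactly
        # what initializes them.
        # if not param._is_initialized():
        #     continue
        if key in state_dict:
            param.set_value(state_dict[key])  # param becomes initialized here
```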


paddle-bot bot commented Apr 1, 2025

Your PR has been submitted. Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

@Xing-lil Xing-lil requested a review from Copilot April 2, 2025 02:41
Contributor

Copilot AI left a comment


Pull Request Overview

This PR fixes a bug in the static save and dynamic load process by removing a redundant _is_initialized check during parameter state matching in the Auto Parallel module.

  • Removed the unnecessary condition that skipped uninitialized parameter tensors
  • Ensured that parameter state is applied via param.set_value(state), which initializes the parameter
Comments suppressed due to low confidence (1)

python/paddle/nn/layer/layers.py:2227

for key, param in self._state_dict_impl(use_hook=False).items():

  • Consider adding tests to verify that removing the _is_initialized() check does not lead to issues when uninitialized parameters are present.
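
A round-trip test along the lines Copilot suggests might look like the sketch below. It is illustrative only, not the PR's actual test: the test function, layer, and path are assumptions, and a faithful regression test would save from a static PIR program rather than from dygraph, so this only exercises the set_state_dict matching loop on the load side:

```python
import paddle

def test_state_dict_roundtrip(tmp_path):  # hypothetical pytest test
    # Save a layer's parameters, then load them into a freshly built layer,
    # driving the set_state_dict matching loop touched by this PR.
    src = paddle.nn.Linear(4, 8)
    path = str(tmp_path / "linear.pdparams")
    paddle.save(src.state_dict(), path)

    dst = paddle.nn.Linear(4, 8)
    dst.set_state_dict(paddle.load(path))

    for key, value in src.state_dict().items():
        assert paddle.allclose(dst.state_dict()[key], value)
```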
@Xing-lil Xing-lil requested a review from liym27 April 2, 2025 02:42
Contributor

@liym27 liym27 left a comment


LGTM

@liym27 liym27 merged commit b1ade74 into PaddlePaddle:develop Apr 2, 2025
34 checks passed
YqGe585 pushed a commit to YqGe585/Paddle that referenced this pull request May 7, 2025