Conversation

@Baibaifan
Contributor

@Baibaifan Baibaifan commented Mar 1, 2022

PR types

Function optimization

PR changes

APIs

Describe

  1. Change the ASP sharding option, removing unnecessary distributed options and improving ease of use.
  2. Confirmed with minghaoBD.
```python
import paddle
from paddle.static import sparsity

paddle.enable_static()

main_program = paddle.static.Program()
startup_program = paddle.static.Program()

with paddle.static.program_guard(main_program, startup_program):
    input_data = paddle.static.data(name='data', shape=[None, 128])
    label = paddle.static.data(name='label', shape=[None, 10])
    hidden = paddle.static.nn.fc(x=input_data, num_flatten_dims=-1, size=32,
                                 activation=None, name="need_sparse_fc")
    hidden = paddle.static.nn.fc(x=hidden, num_flatten_dims=-1, size=32,
                                 activation=None, name="need_dense_fc")
    prob = paddle.static.nn.fc(x=hidden, num_flatten_dims=-1, size=10, activation=None)
    loss = paddle.mean(paddle.nn.functional.square_error_cost(prob, label))

    # Set up excluded layers out from the ASP workflow.
    # Please note, excluded_layers must be set before calling `optimizer.minimize()`.
    sparsity.set_excluded_layers(main_program, ["need_dense_fc"])

    optimizer = paddle.optimizer.SGD(learning_rate=0.1)
    optimizer = paddle.static.amp.decorate(optimizer)
    # Calling sparsity.decorate() to wrap minimize() in optimizer, which
    # will insert necessary masking operations for the ASP workflow.
    optimizer = sparsity.decorate(optimizer)
    optimizer.minimize(loss, startup_program)

device = paddle.device.get_device()
place = paddle.set_device(device)
exe = paddle.static.Executor(place)
exe.run(startup_program)

# Must call `exe.run(startup_program)` first before calling `sparsity.prune_model`.
sparsity.prune_model(main_program, mask_algo='mask_2d_best')
```
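For background on what `sparsity.prune_model` produces: ASP enforces an n:m fine-grained structured sparsity pattern (commonly 2:4, i.e. at most 2 nonzeros in every group of 4 consecutive weights). The sketch below is an illustrative NumPy reconstruction of a magnitude-based 2:4 mask, not Paddle's actual mask algorithm (`mask_2d_best` above uses a 2D variant); the function name `prune_2to4_mask` is hypothetical.

```python
import numpy as np

def prune_2to4_mask(w):
    # Illustrative 2:4 pruning: in each group of 4 consecutive weights
    # (along the flattened last axis), keep the 2 largest-magnitude
    # entries and zero the other 2.
    flat = w.reshape(-1, 4)
    mask = np.zeros_like(flat, dtype=bool)
    top2 = np.argsort(np.abs(flat), axis=1)[:, -2:]  # indices of 2 largest per group
    rows = np.arange(flat.shape[0])[:, None]
    mask[rows, top2] = True
    return mask.reshape(w.shape)

w = np.arange(1.0, 9.0).reshape(2, 4)   # [[1, 2, 3, 4], [5, 6, 7, 8]]
mask = prune_2to4_mask(w)
pruned = w * mask                        # [[0, 0, 3, 4], [0, 0, 7, 8]]
```

Every group keeps exactly half its weights, which is the structure that sparse tensor hardware (and the masking ops inserted by `sparsity.decorate`) can exploit.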
@paddle-bot-old

paddle-bot-old bot commented Mar 1, 2022

Thanks for your contribution!
Please wait for the result of CI firstly. See Paddle CI Manual for details.

@Baibaifan Baibaifan force-pushed the change_ASP_sharding_option branch from 77014e6 to 5d78200 Compare March 1, 2022 11:15
@Baibaifan Baibaifan force-pushed the change_ASP_sharding_option branch from 5d78200 to e8b47a7 Compare March 1, 2022 13:14
@Baibaifan Baibaifan force-pushed the change_ASP_sharding_option branch from e8b47a7 to 397a226 Compare March 2, 2022 06:32
Contributor

@XiaoguangHu01 XiaoguangHu01 left a comment


LG API

@Baibaifan Baibaifan merged commit 815f7a6 into PaddlePaddle:develop Mar 3, 2022

Labels

None yet

3 participants