Conversation

@zhwesky2010 (Contributor) commented Nov 12, 2021

PR types

New features

PR changes

APIs

Describe

paddle.nn.initializer.Orthogonal


weight_attr = paddle.ParamAttr(initializer=paddle.nn.initializer.Orthogonal())
linear = paddle.nn.Linear(10, 15, weight_attr=weight_attr)
# linear.weight: X * X' = I, the rows are mutually orthogonal unit vectors

weight_attr = paddle.ParamAttr(initializer=paddle.nn.initializer.Orthogonal())
linear = paddle.nn.Linear(15, 10, weight_attr=weight_attr)
# linear.weight: X' * X = I, the columns are mutually orthogonal unit vectors

weight_attr = paddle.ParamAttr(initializer=paddle.nn.initializer.Orthogonal())
linear = paddle.nn.Linear(10, 10, weight_attr=weight_attr)
# linear.weight: rows and columns are all mutually orthogonal unit vectors
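The orthogonality property described in the comments above can be checked with a framework-independent sketch. The QR-based construction below is the standard technique for building such matrices; it is not necessarily Paddle's exact implementation, and `orthogonal_init` is an illustrative name:

```python
import numpy as np

def orthogonal_init(rows, cols, gain=1.0, seed=0):
    """Sketch of orthogonal initialization via QR decomposition
    (the usual technique; not necessarily Paddle's exact code)."""
    rng = np.random.default_rng(seed)
    # QR of a Gaussian matrix yields orthonormal columns.
    flat = rng.standard_normal((max(rows, cols), min(rows, cols)))
    q, r = np.linalg.qr(flat)
    # Fix signs using diag(r) so the result is uniformly distributed.
    q *= np.sign(np.diag(r))
    if rows < cols:
        q = q.T
    return gain * q[:rows, :cols]

w = orthogonal_init(10, 15)
# Rows are orthonormal: w @ w.T is (numerically) the identity.
print(np.allclose(w @ w.T, np.eye(10)))  # True
```

For a 10x15 matrix the rows come out orthonormal, and for 15x10 the columns do, matching the three cases shown above.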

paddle.nn.initializer.calculate_gain


import paddle
gain = paddle.nn.initializer.calculate_gain('tanh')
initializer = paddle.nn.initializer.Orthogonal(gain)
@paddle-bot-old

Thanks for your contribution!
Please wait for the result of CI firstly. See Paddle CI Manual for details.

        param = 0.01
    else:
        assert isinstance(param, (bool, int, float))
    recommended_gain = {
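The `recommended_gain` mapping is truncated in the diff above. The self-contained sketch below shows the lookup such a function performs; the values follow the widely used gain recommendations (tanh: 5/3, relu: sqrt(2), selu: 3/4), but the exact contents of Paddle's dict are an assumption here and should be checked against the source:

```python
import math

def calculate_gain_sketch(nonlinearity, param=None):
    """Sketch of a gain lookup; not Paddle's exact implementation."""
    if param is None:
        param = 0.01  # default negative slope for leaky_relu
    # Widely used recommended gains per nonlinearity.
    recommended_gain = {
        'sigmoid': 1.0,
        'linear': 1.0,
        'tanh': 5.0 / 3,
        'relu': math.sqrt(2.0),
        'leaky_relu': math.sqrt(2.0 / (1 + param ** 2)),
        'selu': 3.0 / 4,
    }
    if nonlinearity not in recommended_gain:
        raise ValueError(f"Unsupported nonlinearity: {nonlinearity}")
    return recommended_gain[nonlinearity]

print(calculate_gain_sketch('tanh'))  # 1.6666666666666667 (= 5/3)
```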
Contributor

Shall we support calculating the gain of selu?

Contributor Author

done



class TestConstantInitializer(unittest.TestCase):
    def test_calculate_gain(self):
Contributor

Shall we add test cases for relu and selu?
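A self-contained sketch of what such test cases could look like. The `calculate_gain` below is a hypothetical stub carrying the widely used gain values, so the sketch runs without Paddle installed; the real test would call `paddle.nn.initializer.calculate_gain` instead:

```python
import math
import unittest

def calculate_gain(nonlinearity, param=None):
    # Hypothetical stub standing in for paddle.nn.initializer.calculate_gain.
    gains = {'tanh': 5.0 / 3, 'relu': math.sqrt(2.0), 'selu': 3.0 / 4}
    return gains[nonlinearity]

class TestCalculateGain(unittest.TestCase):
    def test_relu(self):
        self.assertAlmostEqual(calculate_gain('relu'), math.sqrt(2.0))

    def test_selu(self):
        self.assertAlmostEqual(calculate_gain('selu'), 0.75)

# Run with: python -m unittest <module>
```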

Contributor Author

done, thx

jeff41404 previously approved these changes Nov 18, 2021

Contributor

@jeff41404 left a comment

lgtm

XiaoguangHu01 previously approved these changes Nov 18, 2021

Contributor

@XiaoguangHu01 left a comment

LG API

Args:
    nonlinearity: the nonlinearity function.
    param: optional parameter for some nonlinearity functions.
Contributor

param(type, optional);
default is None, means xxx

Contributor Author

done, thx

import paddle
gain = paddle.nn.initializer.calculate_gain('tanh')  # gain = 5.0 / 3
initializer = paddle.nn.initializer.Orthogonal(gain)
Contributor

It would be best to show the result of running this as a comment.

Contributor Author

done

TCChenlong previously approved these changes Nov 19, 2021

Contributor

@TCChenlong left a comment

LGTM for API Docs

Contributor

@TCChenlong left a comment

LGTM for API docs

Contributor

@XiaoguangHu01 left a comment

LG API

@zhwesky2010 zhwesky2010 merged commit 62ad359 into PaddlePaddle:develop Nov 19, 2021

Labels: none yet
4 participants