Cell class for SimpleRNN.
Inherits From: `Layer`, `Operation`

```python
tf.keras.layers.SimpleRNNCell(
    units,
    activation='tanh',
    use_bias=True,
    kernel_initializer='glorot_uniform',
    recurrent_initializer='orthogonal',
    bias_initializer='zeros',
    kernel_regularizer=None,
    recurrent_regularizer=None,
    bias_regularizer=None,
    kernel_constraint=None,
    recurrent_constraint=None,
    bias_constraint=None,
    dropout=0.0,
    recurrent_dropout=0.0,
    seed=None,
    **kwargs
)
```
This class processes one step within the whole time sequence input, whereas `keras.layers.SimpleRNN` processes the whole sequence.
Example:
```python
inputs = np.random.random([32, 10, 8]).astype(np.float32)
rnn = keras.layers.RNN(keras.layers.SimpleRNNCell(4))
output = rnn(inputs)  # The output has shape `(32, 4)`.
rnn = keras.layers.RNN(
    keras.layers.SimpleRNNCell(4),
    return_sequences=True,
    return_state=True
)
# whole_sequence_output has shape `(32, 10, 4)`.
# final_state has shape `(32, 4)`.
whole_sequence_output, final_state = rnn(inputs)
```
Methods
from_config
```python
@classmethod
from_config(config)
```
Creates a layer from its config.
This method is the reverse of `get_config`, capable of instantiating the same layer from the config dictionary. It does not handle layer connectivity (handled by `Network`), nor weights (handled by `set_weights`).
| Args | |
|---|---|
| `config` | A Python dictionary, typically the output of `get_config`. |

| Returns |
|---|
| A layer instance. |
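As a sketch of the round trip described above (assuming a standard TensorFlow install), a cell's config can be captured with `get_config` and an equivalent cell rebuilt with `from_config`:

```python
from tensorflow import keras

# Build a cell and capture its configuration dictionary.
cell = keras.layers.SimpleRNNCell(4, activation="relu")
config = cell.get_config()

# Recreate an equivalent cell from the config.
# Note: only hyperparameters are restored, not weights.
restored = keras.layers.SimpleRNNCell.from_config(config)
assert restored.units == 4
```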
get_dropout_mask
```python
get_dropout_mask(step_input)
```

Returns the cached input dropout mask, creating one shaped like `step_input` if none is cached yet.
get_initial_state
```python
get_initial_state(batch_size=None)
```

Builds the zero-filled initial state for the given batch size.
get_recurrent_dropout_mask
```python
get_recurrent_dropout_mask(step_input)
```

Returns the cached recurrent dropout mask, creating one shaped like `step_input` if none is cached yet.
reset_dropout_mask
```python
reset_dropout_mask()
```
Reset the cached dropout mask, if any. The RNN layer invokes this in its `call()` method so that the cached mask is cleared after calling `cell.call()`. The mask should be cached across all timesteps within the same batch, but shouldn't be cached between batches.
reset_recurrent_dropout_mask
```python
reset_recurrent_dropout_mask()
```

Reset the cached recurrent dropout mask, if any.
symbolic_call
```python
symbolic_call(*args, **kwargs)
```