A layer that uses einsum as the backing computation.

Inherits From: Layer, Operation
tf.keras.layers.EinsumDense(
    equation,
    output_shape,
    activation=None,
    bias_axes=None,
    kernel_initializer='glorot_uniform',
    bias_initializer='zeros',
    kernel_regularizer=None,
    bias_regularizer=None,
    kernel_constraint=None,
    bias_constraint=None,
    lora_rank=None,
    **kwargs
)
This layer can perform einsum calculations of arbitrary dimensionality.
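At its core, the layer's forward pass is just an einsum over the input and a kernel whose shape is inferred from the equation, plus an optional bias broadcast over the axes named in bias_axes. A rough numpy sketch of the first example below (the kernel and bias names here are illustrative, not the layer's internal variable names):

```python
import numpy as np

# Illustrative sketch of what EinsumDense("ab,bc->ac", output_shape=64,
# bias_axes="c") computes: an einsum against the kernel, plus a bias
# broadcast over the "c" axis.
rng = np.random.default_rng(0)
x = rng.standard_normal((2, 32))        # (batch=2, features=32)
kernel = rng.standard_normal((32, 64))  # shape inferred from the equation
bias = np.zeros(64)                     # one bias entry per "c" index

y = np.einsum("ab,bc->ac", x, kernel) + bias
print(y.shape)  # (2, 64)
```

For this simple equation the result is identical to an ordinary matrix multiply, `x @ kernel + bias`; the einsum form becomes useful when the equation involves more axes.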
Examples:
Biased dense layer with einsums
This example shows how to instantiate a standard Keras dense layer using einsum operations. This example is equivalent to keras.layers.Dense(64, use_bias=True).
layer = keras.layers.EinsumDense("ab,bc->ac",
output_shape=64,
bias_axes="c")
input_tensor = keras.Input(shape=[32])
output_tensor = layer(input_tensor)
output_tensor.shape
(None, 64)
Applying a dense layer to a sequence
This example shows how to instantiate a layer that applies the same dense operation to every element in a sequence. Here, the output_shape has two values (since there are two non-batch dimensions in the output); the first dimension in the output_shape is None, because the sequence dimension b has an unknown shape.
layer = keras.layers.EinsumDense("abc,cd->abd",
output_shape=(None, 64),
bias_axes="d")
input_tensor = keras.Input(shape=[32, 128])
output_tensor = layer(input_tensor)
output_tensor.shape
(None, 32, 64)
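The equation "abc,cd->abd" applies one shared projection to every sequence position. This can be checked directly in numpy by comparing the einsum against an explicit loop over timesteps (a sketch, not the layer's implementation):

```python
import numpy as np

# "abc,cd->abd" projects the feature axis c -> d at every sequence
# position b, exactly like looping over the sequence with one shared
# weight matrix.
rng = np.random.default_rng(0)
x = rng.standard_normal((2, 32, 128))    # (batch, seq, features)
kernel = rng.standard_normal((128, 64))

y = np.einsum("abc,cd->abd", x, kernel)
y_loop = np.stack([x[:, t, :] @ kernel for t in range(32)], axis=1)
print(np.allclose(y, y_loop))  # True
```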
Applying a dense layer to a sequence using ellipses
This example shows how to instantiate a layer that applies the same dense operation to every element in a sequence, but uses the ellipsis notation instead of specifying the batch and sequence dimensions.
Because we are using ellipsis notation and have specified only one axis, the output_shape arg is a single value. When instantiated in this way, the layer can handle any number of sequence dimensions, including the case where no sequence dimension exists.
layer = keras.layers.EinsumDense("...x,xy->...y",
output_shape=64,
bias_axes="y")
input_tensor = keras.Input(shape=[32, 128])
output_tensor = layer(input_tensor)
output_tensor.shape
(None, 32, 64)
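The rank-independence of the ellipsis form can be seen with plain numpy: the same equation and kernel work for 2D, 3D, and 4D inputs alike (a sketch of the einsum semantics, not of the layer itself):

```python
import numpy as np

# With "...x,xy->...y" the ellipsis absorbs however many leading axes
# the input has, so one kernel serves inputs of any rank.
rng = np.random.default_rng(0)
kernel = rng.standard_normal((128, 64))

for shape in [(2, 128), (2, 32, 128), (2, 4, 32, 128)]:
    x = rng.standard_normal(shape)
    y = np.einsum("...x,xy->...y", x, kernel)
    print(shape, "->", y.shape)
```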
Methods
enable_lora
enable_lora(rank, a_initializer='he_uniform', b_initializer='zeros')
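Enabling LoRA freezes the existing kernel and adds two trainable low-rank factors whose product is added to it, which greatly reduces the number of trainable parameters. The arithmetic can be sketched in numpy (the variable names W, A, B here are illustrative, not the layer's internal attribute names):

```python
import numpy as np

# Sketch of the LoRA idea: the frozen kernel W is augmented with a
# trainable low-rank product A @ B, so the effective kernel is W + A @ B.
rng = np.random.default_rng(0)
in_dim, out_dim, rank = 128, 64, 4

W = rng.standard_normal((in_dim, out_dim))  # frozen pretrained kernel
A = rng.standard_normal((in_dim, rank))     # trainable factor a
B = np.zeros((rank, out_dim))               # trainable factor b, zero-init

# With B initialized to zeros, the effective kernel starts equal to W,
# so enabling LoRA does not change the layer's initial outputs.
W_eff = W + A @ B

full_params = in_dim * out_dim            # trainable params without LoRA
lora_params = rank * (in_dim + out_dim)   # trainable params with LoRA
print(full_params, lora_params)  # 8192 768
```

Zero-initializing the second factor (here matching the default b_initializer='zeros') is what makes the update a no-op at the moment LoRA is enabled.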
from_config
@classmethod
from_config(config)
Creates a layer from its config.
This method is the reverse of get_config, capable of instantiating the same layer from the config dictionary. It does not handle layer connectivity (handled by Network), nor weights (handled by set_weights).
| Args | |
|---|---|
| config | A Python dictionary, typically the output of get_config. |

| Returns | |
|---|---|
| A layer instance. |
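The get_config/from_config pair round-trips a layer through a plain dictionary. A minimal sketch of the contract using a stand-in class (not the real keras Layer implementation):

```python
# Minimal sketch of the get_config / from_config contract, shown with a
# toy stand-in class rather than the real keras Layer.
class ToyLayer:
    def __init__(self, units, activation=None):
        self.units = units
        self.activation = activation

    def get_config(self):
        # Everything needed to rebuild the layer, as a plain dict.
        return {"units": self.units, "activation": self.activation}

    @classmethod
    def from_config(cls, config):
        # The default behavior: unpack the config dict into __init__.
        return cls(**config)

layer = ToyLayer(64, activation="relu")
clone = ToyLayer.from_config(layer.get_config())
print(clone.units, clone.activation)  # 64 relu
```

Note that only the constructor arguments survive the round trip; weights and graph connectivity are restored separately, as the description above says.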
quantized_build
quantized_build(input_shape, mode)
symbolic_call
symbolic_call(*args, **kwargs)
| Class Variables | |
|---|---|
| QUANTIZATION_MODE_ERROR_TEMPLATE | "Invalid quantization mode. Expected one of ('int8', 'float8'). Received: quantization_mode={mode}" |