To write such a layer, you can simply add a mask=None argument in your call signature. The mask associated with the inputs will be passed to your layer whenever it is available.

Here's a simple example: a layer that computes a softmax over the time dimension (axis 1) of an input sequence, while discarding masked timesteps.
```python
class TemporalSoftmax(keras.layers.Layer):
    def call(self, inputs, mask=None):
        broadcast_float_mask = tf.expand_dims(tf.cast(mask, "float32"), -1)
        inputs_exp = tf.exp(inputs) * broadcast_float_mask
        # Normalize by the sum of the (masked) exponentials, not the raw inputs
        inputs_sum = tf.reduce_sum(
            inputs_exp * broadcast_float_mask, axis=1, keepdims=True
        )
        return inputs_exp / inputs_sum
```
```python
inputs = keras.Input(shape=(None,), dtype="int32")
x = layers.Embedding(input_dim=10, output_dim=32, mask_zero=True)(inputs)
x = layers.Dense(1)(x)
outputs = TemporalSoftmax()(x)

model = keras.Model(inputs, outputs)
y = model(np.random.randint(0, 10, size=(32, 100)))
```
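Note that outside a Functional or Sequential model, masks are not propagated for you. A mask-generating layer's compute_mask() method lets you retrieve the mask it would produce and pass it along manually; here is a minimal sketch (the LSTM consumer is just an illustration):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

embedding = layers.Embedding(input_dim=10, output_dim=32, mask_zero=True)
lstm = layers.LSTM(16)

token_ids = np.random.randint(0, 10, size=(4, 6))
embedded = embedding(token_ids)

# Ask the embedding layer for the mask it generates for these inputs
mask = embedding.compute_mask(token_ids)  # boolean tensor of shape (4, 6)

# Pass the mask to a mask-consuming layer explicitly
outputs = lstm(embedded, mask=mask)  # shape (4, 16)
```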
Summary

That is all you need to know about padding & masking in Keras. To recap:
- "Masking" is how layers are able to know when to skip / ignore certain timesteps in sequence inputs.
- Some layers are mask-generators: Embedding can generate a mask from input values (if mask_zero=True), and so can the Masking layer.
- Some layers are mask-consumers: they expose a mask argument in their __call__ method. This is the case for RNN layers.
- In the Functional API and Sequential API, mask information is propagated automatically.
- When using layers in a standalone way, you can pass the mask arguments to layers manually.
- You can easily write layers that modify the current mask, that generate a new mask, or that consume the mask associated with the inputs.

The Sequential model
Author: fchollet
Date created: 2020/04/12
Last modified: 2020/04/12
Description: Complete guide to the Sequential model.
Setup

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers
```
When to use a Sequential model

A Sequential model is appropriate for a plain stack of layers where each layer has exactly one input tensor and one output tensor.

Schematically, the following Sequential model:
```python
# Define Sequential model with 3 layers
model = keras.Sequential(
    [
        layers.Dense(2, activation="relu", name="layer1"),
        layers.Dense(3, activation="relu", name="layer2"),
        layers.Dense(4, name="layer3"),
    ]
)
# Call model on a test input
x = tf.ones((3, 3))
y = model(x)
```
is equivalent to this function:
```python
# Create 3 layers
layer1 = layers.Dense(2, activation="relu", name="layer1")
layer2 = layers.Dense(3, activation="relu", name="layer2")
layer3 = layers.Dense(4, name="layer3")

# Call layers on a test input
x = tf.ones((3, 3))
y = layer3(layer2(layer1(x)))
```
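One way to check this equivalence concretely is to copy the Sequential model's weights into the standalone layers and compare the two outputs; a minimal sketch:

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential(
    [
        layers.Dense(2, activation="relu", name="layer1"),
        layers.Dense(3, activation="relu", name="layer2"),
        layers.Dense(4, name="layer3"),
    ]
)

# Standalone copies of the same three layers
layer1 = layers.Dense(2, activation="relu")
layer2 = layers.Dense(3, activation="relu")
layer3 = layers.Dense(4)

x = tf.ones((3, 3))
y_model = model(x)  # calling the model builds its weights
y_layers = layer3(layer2(layer1(x)))  # ditto for the standalone layers

# Copy the model's weights into the standalone layers, then recompute
for src, dst in zip(model.layers, [layer1, layer2, layer3]):
    dst.set_weights(src.get_weights())
y_layers = layer3(layer2(layer1(x)))

print(np.allclose(y_model.numpy(), y_layers.numpy()))  # True
```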
A Sequential model is not appropriate when:

- Your model has multiple inputs or multiple outputs
- Any of your layers has multiple inputs or multiple outputs
- You need to do layer sharing
- You want non-linear topology (e.g. a residual connection, a multi-branch model)
Creating a Sequential model

You can create a Sequential model by passing a list of layers to the Sequential constructor:
```python
model = keras.Sequential(
    [
        layers.Dense(2, activation="relu"),
        layers.Dense(3, activation="relu"),
        layers.Dense(4),
    ]
)
```
Its layers are accessible via the layers attribute:

```python
model.layers
```

```
[<tensorflow.python.keras.layers.core.Dense at 0x1024e6710>,
 <tensorflow.python.keras.layers.core.Dense at 0x13d632ed0>,
 <tensorflow.python.keras.layers.core.Dense at 0x14c6ddb50>]
```
You can also create a Sequential model incrementally via the add() method:

```python
model = keras.Sequential()
model.add(layers.Dense(2, activation="relu"))
model.add(layers.Dense(3, activation="relu"))
model.add(layers.Dense(4))
```
Note that there's also a corresponding pop() method to remove layers: a Sequential model behaves very much like a list of layers.

```python
model.pop()
print(len(model.layers))  # 2
```

```
2
```
Also note that the Sequential constructor accepts a name argument, just like any layer or model in Keras. This is useful to annotate TensorBoard graphs with semantically meaningful names.
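For instance (the name here is arbitrary):

```python
from tensorflow import keras
from tensorflow.keras import layers

# A named Sequential model; the name shows up in summaries and TensorBoard
model = keras.Sequential(
    [
        layers.Dense(2, activation="relu"),
        layers.Dense(3),
    ],
    name="my_sequential",
)
print(model.name)  # my_sequential
```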