initial_model = keras.Sequential(
    [
        keras.Input(shape=(250, 250, 3)),
        layers.Conv2D(32, 5, strides=2, activation="relu"),
        layers.Conv2D(32, 3, activation="relu", name="my_intermediate_layer"),
        layers.Conv2D(32, 3, activation="relu"),
    ]
)

feature_extractor = keras.Model(
    inputs=initial_model.inputs,
    outputs=initial_model.get_layer(name="my_intermediate_layer").output,
)

# Call feature extractor on test input.
x = tf.ones((1, 250, 250, 3))
features = feature_extractor(x)
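As a quick sanity check (a sketch, not part of the original snippet), you can verify the shape of the extracted features. With the default "valid" padding, the two convolutions leading up to my_intermediate_layer shrink the 250x250 input to 121x121:

# Spatial size: (250 - 5) // 2 + 1 = 123, then 123 - 3 + 1 = 121.
print(features.shape)  # Expected: (1, 121, 121, 32)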
Transfer learning with a Sequential model

Transfer learning consists of freezing the bottom layers in a model and only training the top layers. If you aren't familiar with it, make sure to read our guide to transfer learning.

Here are two common transfer learning blueprints involving Sequential models.

First, let's say that you have a Sequential model, and you want to freeze all layers except the last one. In this case, you would simply iterate over model.layers and set layer.trainable = False on each layer, except the last one. Like this:
model = keras.Sequential([
    keras.Input(shape=(784,)),
    layers.Dense(32, activation='relu'),
    layers.Dense(32, activation='relu'),
    layers.Dense(32, activation='relu'),
    layers.Dense(10),
])

# Presumably you would want to first load pre-trained weights.
model.load_weights(...)

# Freeze all layers except the last one.
for layer in model.layers[:-1]:
    layer.trainable = False

# Recompile and train (this will only update the weights of the last layer).
model.compile(...)
model.fit(...)
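To confirm the freeze worked, you can count the trainable weights. This is a minimal check assuming the four-layer model above, where only the final Dense layer's kernel and bias should remain trainable:

# Only the last Dense layer contributes trainable weights (kernel + bias).
assert len(model.trainable_weights) == 2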
Another common blueprint is to use a Sequential model to stack a pre-trained model and some freshly initialized classification layers. Like this:

# Load a convolutional base with pre-trained weights
base_model = keras.applications.Xception(
    weights='imagenet',
    include_top=False,
    pooling='avg')

# Freeze the base model
base_model.trainable = False

# Use a Sequential model to add a trainable classifier on top
model = keras.Sequential([
    base_model,
    layers.Dense(1000),
])

# Compile & train
model.compile(...)
model.fit(...)
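Once the new classifier has converged, a common follow-up step (sketched here with an assumed optimizer and learning rate; see the transfer learning guide for the full recipe) is to unfreeze the base model and fine-tune the whole network at a low learning rate:

# Unfreeze the base model and recompile with a small learning rate,
# so the pre-trained weights are only adjusted slightly.
base_model.trainable = True
model.compile(
    optimizer=keras.optimizers.Adam(1e-5),  # assumed: low LR for fine-tuning
    loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
model.fit(...)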
If you do transfer learning, you will probably find yourself frequently using these two patterns.

That's about all you need to know about Sequential models!

To find out more about building models in Keras, see:

Guide to the Functional API
Guide to making new Layers & Models via subclassing

The Functional API
Author: fchollet
Date created: 2019/03/01
Last modified: 2020/04/12
Description: Complete guide to the functional API.
Setup

import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers
Introduction

The Keras functional API is a way to create models that are more flexible than the tf.keras.Sequential API. The functional API can handle models with non-linear topology, shared layers, and even multiple inputs or outputs.

The main idea is that a deep learning model is usually a directed acyclic graph (DAG) of layers. So the functional API is a way to build graphs of layers.

Consider the following model:
(input: 784-dimensional vectors)
       |
[Dense (64 units, relu activation)]
       |
[Dense (64 units, relu activation)]
       |
[Dense (10 units, softmax activation)]
       |
(output: probability distribution over 10 classes)
This is a basic graph with three layers. To build this model using the functional API, start by creating an input node:

inputs = keras.Input(shape=(784,))

The shape of the data is set as a 784-dimensional vector. The batch size is always omitted since only the shape of each sample is specified.
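You can see this by inspecting the symbolic tensor that keras.Input returns; its leading batch dimension shows up as None:

print(inputs.shape)  # (None, 784) -- the batch dimension is unspecified
print(inputs.dtype)  # float32, the default dtype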
If, for example, you have an image input with a shape of (32, 32, 3), you would use:
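# Input node for 32x32 RGB images; the batch size is again omitted.
img_inputs = keras.Input(shape=(32, 32, 3))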