Keras implementation
In Keras, activations can be specified either through an Activation layer or through the activation argument supported by all forward layers:
from keras.models import Sequential
from keras.layers import Activation, Dense
model = Sequential()
model.add(Dense(32))             # fully connected layer with 32 units
model.add(Activation('tanh'))    # separate Activation layer applying tanh element-wise
This is equivalent to the following command:
model.add(Dense(32, activation='tanh'))
You can also pass an element-wise TensorFlow/Theano/CNTK function as an activation:
from keras import backend as K
model.add(Dense(32, activation=K.tanh))   # backend tanh applied element-wise
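More generally, any element-wise function built from backend operations can be passed as an activation. As a minimal sketch (not from the original text, and assuming a TensorFlow, Theano, or CNTK backend), the hypothetical scaled_tanh function below rescales tanh to roughly the unit interval and is used like any built-in activation:
from keras import backend as K
from keras.models import Sequential
from keras.layers import Dense
# Hypothetical custom activation built from backend ops (illustrative assumption)
def scaled_tanh(x):
    return 0.5 * (K.tanh(x) + 1)
model = Sequential()
model.add(Dense(32, input_shape=(10,), activation=scaled_tanh))
model.summary()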