What does `layer.get_weights()` return?
Here is a working example.
import numpy as np
from keras.models import Sequential
from keras.layers import Dense

X_train = np.random.rand(1, 10)
Y_train = 2 * X_train
input_dim = X_train.shape[1]

model = Sequential()
model.add(Dense(20, input_dim=input_dim))
model.add(Dense(10, activation='softmax'))

# Snapshot each layer's kernel before training
weight_origin_0 = model.layers[0].get_weights()[0]
weight_origin_1 = model.layers[1].get_weights()[0]

model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
model.fit(X_train, Y_train, batch_size=1, epochs=10, verbose=1)  # nb_epoch was renamed to epochs in Keras 2

print(weight_origin_0 - model.layers[0].get_weights()[0])  # the first layer
print(weight_origin_1 - model.layers[1].get_weights()[0])  # the second layer
For the question about layer.get_weights():

I ran some tests on this and checked the source code. I found that Dense is a subclass of Layer, and that its weights are a Python list with two elements: the weight matrix (kernel) of the layer is stored at layer.get_weights()[0], and the bias is stored at layer.get_weights()[1].
One thing to note is that the bias can be disabled when defining the layer: model.add(Dense(503, kernel_initializer='normal', activation='relu', use_bias=False)). In that case, the list layer.get_weights() has only one element. If you set the bias attribute to False after the layer has been defined, there will still be an element for the bias, and it will be updated when you fit the model.
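As a minimal sketch of this (assuming a Keras 2 style API where Dense takes use_bias; the toy shapes here are illustrative):

import numpy as np
from keras.layers import Dense

x = np.zeros((1, 3), dtype="float32")

# Layer with a bias: get_weights() returns [kernel, bias]
with_bias = Dense(4)
with_bias(x)  # call the layer once so its weights are built
print(len(with_bias.get_weights()))       # 2
print(with_bias.get_weights()[0].shape)   # (3, 4)  kernel
print(with_bias.get_weights()[1].shape)   # (4,)    bias

# Layer without a bias: get_weights() returns [kernel] only
no_bias = Dense(4, use_bias=False)
no_bias(x)
print(len(no_bias.get_weights()))         # 1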
For the question of the weights not updating:

I set up a Sequential model with only one Dense layer:

def mlp_2():
    model = Sequential()
    model.add(Dense(10, input_dim=784, activation='softmax', use_bias=False))
    return model
Then I compiled and fit it the same way as above. The printed difference still looks like all zeros, yet we can tell the weights have definitely changed, because the accuracy keeps increasing. I think the only explanation is that the updates on the first Dense layer (the one where you define input_dim) are too small for the default printout to show. I didn't check the more precise values of the weights; it would be great if someone could confirm this.