PyTorch, nn.Sequential(), access weights of a specific module in nn.Sequential()
An easy way to access the weights is to use the state_dict()
of your model.
This should work in your case:
for k, v in model_2.state_dict().items():  # .iteritems() is Python 2; use .items() in Python 3
    print("Layer {}".format(k))
    print(v)
Another option is to iterate over modules().
If you know the type of your layers beforehand, this also works:
for layer in model_2.modules():
    if isinstance(layer, nn.Linear):
        print(layer.weight)
From the PyTorch forum, the recommended way is to index the container directly, since nn.Sequential supports integer indexing. If model_2 itself is the nn.Sequential, use model_2[0].weight; if the Sequential is stored in an attribute named layer, use model_2.layer[0].weight.
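A minimal, self-contained sketch tying the three approaches together. The model definition (layer sizes, and model_2 being an nn.Sequential directly) is an assumption for illustration:

```python
import torch.nn as nn

# Assumed toy model for illustration; your model_2 may differ.
model_2 = nn.Sequential(
    nn.Linear(4, 8),
    nn.ReLU(),
    nn.Linear(8, 2),
)

# 1) state_dict(): keys are "0.weight", "0.bias", "2.weight", ...
#    (the index of the submodule, then the parameter name)
for k, v in model_2.state_dict().items():
    print("Layer {}: shape {}".format(k, tuple(v.shape)))

# 2) modules(): filter by layer type
linears = [m for m in model_2.modules() if isinstance(m, nn.Linear)]

# 3) direct indexing into the Sequential
first_weight = model_2[0].weight  # weight of the first nn.Linear
```

Note that state_dict() returns detached tensor copies suitable for saving, while modules() and direct indexing give you the live nn.Parameter objects, which is what you want if you intend to inspect gradients or modify weights in place.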