Inspecting intermediate values in a neural network
You can also tap off intermediate layers by attaching extra output ports. For example:
net = NetInitialize@NetGraph[
   {LongShortTermMemoryLayer[5], SequenceLastLayer[], LinearLayer[2], SoftmaxLayer[]},
   {NetPort["Input"] -> 1 -> 2 -> 3 -> 4 -> NetPort["Output"],
    1 -> NetPort["layer1"], 2 -> NetPort["layer2"], 3 -> NetPort["layer3"]},
   "Input" -> {"Varying", 3}]
Then:
net[{{1, 2, 3}, {0, 1, 3}}]
(* <|"Output" -> {0.716093, 0.283907},
"layer1" -> {{0.0242801, 0.00128676, -0.220564, 0.010784,
0.369972}, {-0.0774157, 0.0137297, -0.456915, 0.0116218,
0.665373}},
"layer2" -> {-0.0774157, 0.0137297, -0.456915, 0.0116218, 0.665373},
"layer3" -> {0.748411, -0.176754}|> *)
You don't have to use all the output ports in the training process. For example, this uses only the original input and output ports:
NetTrain[net,
  {<|"Input" -> {{1, 2, 3}, {0, 1, 3}},
     "Output" -> {0.5281522274017334, 0.471847802400589}|>},
  "Output" -> MeanSquaredLossLayer[]]
This works for NetGraph as well.
Here is a complete example using MNIST where the 10th layer is tapped off:
lenet = NetGraph[
  {ConvolutionLayer[20, 5], Ramp, PoolingLayer[2, 2],
   ConvolutionLayer[50, 5], Ramp, PoolingLayer[2, 2],
   FlattenLayer[], 500, Ramp, 10, SoftmaxLayer[]},
  {NetPort["Input"] ->
    1 -> 2 -> 3 -> 4 -> 5 -> 6 -> 7 -> 8 -> 9 -> 10 -> 11 -> NetPort["Output"],
   10 -> NetPort["layer10"]},
  "Input" -> NetEncoder[{"Image", {28, 28}, "Grayscale"}],
  "Output" -> NetDecoder[{"Class", Range[0, 9]}]]
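(The MNIST data itself is not defined above; I'm assuming trainingData is a list of image -> label rules, which you can get for instance from ExampleData:)
trainingData = ExampleData[{"MachineLearning", "MNIST"}, "TrainingData"];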
trainingData2 = <|"Input" -> #[[1]], "Output" -> #[[2]]|> & /@
RandomSample[trainingData, 1000];
trained = NetTrain[lenet, trainingData2, "Output" -> CrossEntropyLossLayer["Index"]]
trained[trainingData2[[1, 1]]]
(* <|"Output" -> 5,
"layer10" -> {-2.34726, -9.10603, -0.501655, -0.602975, 0.368083,
6.48835, -5.21255, -1.2072, 1.68743, 5.18232}|> *)
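Here too you can ask for just the tapped port, and since nets map over lists of inputs you can grab the layer-10 activations for a whole batch in one call (handy if you want to visualize or cluster them):
trained[trainingData2[[All, "Input"]], NetPort["layer10"]]
(* should give a 1000×10 array: one length-10 vector per training image *)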
Not really "convenient", but you can get the layers of a NetChain using Normal:
net = NetInitialize@NetChain[{3, Ramp, 4, Ramp, 1}, "Input" -> "Real"];
layers = Normal[net]
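(* layers is now an ordinary list of the underlying layers; the integer
   shorthands in the NetChain spec come back as LinearLayer objects and
   Ramp as ElementwiseLayer[Ramp] *)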
and then construct a NetGraph with the same layers, and an output port for each layer:
g = NetGraph[layers,
  Join[
   (* connections between adjacent layers *)
   # -> # + 1 & /@ Range[Length[layers] - 1],
   (* an output port for each layer *)
   # -> NetPort[StringTemplate["Layer_``"][#]] & /@ Range[Length[layers]]]]
Result:
net[1]
(* {0.168289} *)

g[1]
(* <|"Layer_1" -> {-0.792354, 0.213135, 0.565724},
     "Layer_2" -> {0., 0.213135, 0.565724},
     "Layer_3" -> {-0.422488, 0.126484, 0.000964101, 0.0990122},
     "Layer_4" -> {0., 0.126484, 0.000964101, 0.0990122},
     "Layer_5" -> {0.168289}|> *)