Staged training of Neural Networks
So actually this is already implied by the documentation for NetInitialize, which says it "gives a net in which all uninitialized learnable parameters in net have been given initial values". So when you retrain, you're starting from the existing weights, and the internal NetInitialize does nothing (you can pass All as the second argument of NetInitialize to force it to overwrite existing weights).
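To make the workflow concrete, here is a minimal sketch of staged training; the toy net, the data, and the MaxTrainingRounds values are just illustrative assumptions, not anything from the original question:

    (* toy setup for illustration: a small net that learns x + y *)
    net = NetChain[{LinearLayer[16], Ramp, LinearLayer[1]}, "Input" -> 2];
    data = Flatten@Table[{x, y} -> {x + y}, {x, 0., 1., 0.2}, {y, 0., 1., 0.2}];

    (* stage 1: NetTrain initializes the (so far uninitialized) weights and trains them *)
    stage1 = NetTrain[net, data, MaxTrainingRounds -> 50];

    (* stage 2: the weights already exist, so the internal NetInitialize is a no-op
       and training continues from where stage 1 left off *)
    stage2 = NetTrain[stage1, data, MaxTrainingRounds -> 50];

    (* to start from scratch instead, force re-initialization of all learnable parameters *)
    fresh = NetInitialize[stage1, All];
    stage3 = NetTrain[fresh, data, MaxTrainingRounds -> 50];

The point is just that stage2 continues from stage1's weights, while stage3 starts over from freshly initialized ones.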
But in any case, I've added a note to the NetTrain documentation to mention that it doesn't re-initialize pre-existing weights. Thanks for drawing my attention to this possible confusion.