Causal padding in Keras

Here is a great, concise explanation of what "causal" padding is:

One thing that Conv1D does allow us to specify is padding="causal". This simply pads the layer's input with zeros at the front, so that we can also predict the values of the early time steps in the frame:
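A minimal sketch of this behaviour (assuming TensorFlow/Keras is available): with `kernel_size=3`, `padding="causal"` prepends two zeros, so the output keeps the input's length and `output[t]` never looks at `input[t+1:]`. Perturbing only the last input step leaves every earlier output unchanged, which is exactly the causality guarantee:

```python
import tensorflow as tf

# padding="causal" zero-pads kernel_size - 1 = 2 steps at the FRONT,
# so the output has the same length as the input.
layer = tf.keras.layers.Conv1D(filters=1, kernel_size=3, padding="causal")

x = tf.random.normal((1, 8, 1))   # (batch, time, channels)
y1 = layer(x)

# Perturb only the LAST time step; all earlier outputs stay identical.
x2 = tf.concat([x[:, :-1, :], x[:, -1:, :] + 1.0], axis=1)
y2 = layer(x2)

print(y1.shape)  # (1, 8, 1): length preserved by the front padding
```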

*[image: causal padding — zeros prepended to the input sequence]*

Dilation just means skipping nodes. Unlike a stride, which tells you where to apply the kernel next, dilation tells you how to spread the kernel out. In a sense, it is equivalent to a stride in the previous layer.

*[image: dilated convolution — kernel taps spread across the lower layer]*

In the image above, if the lower layer had a stride of 2, we would skip nodes (2, 3, 4, 5), which would give us the same result.
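That tap spacing can be made visible with a small sketch (assuming TensorFlow/Keras; the kernel is fixed to all ones purely so the taps are readable). With `kernel_size=3` and `dilation_rate=2`, the first output sums inputs 1, 3, 5 — every other node, exactly as if the lower layer had stride 2:

```python
import numpy as np
import tensorflow as tf

# dilation_rate=2 makes the kernel tap every OTHER input node.
layer = tf.keras.layers.Conv1D(
    filters=1, kernel_size=3, dilation_rate=2, padding="valid", use_bias=False
)

x = tf.constant(np.arange(1.0, 9.0).reshape(1, 8, 1), dtype=tf.float32)  # 1..8
layer.build(x.shape)
layer.set_weights([np.ones((3, 1, 1), dtype=np.float32)])  # all-ones kernel

y = layer(x)
# First output taps inputs 1, 3, 5 -> 9; the taps are spaced 2 apart.
print(y.numpy().ravel())  # [ 9. 12. 15. 18.]
```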

Credit: Kilian Batzner, Convolutions in Autoregressive Neural Networks


With this convolution type, the output at time t depends only on the current and earlier time steps; future time steps are never used when computing the Conv output. See the animated figure in the WaveNet paper for an illustration.
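The WaveNet idea combines both tricks. A minimal sketch (hypothetical hyperparameters, assuming TensorFlow/Keras — not the paper's exact architecture): stacking causal convolutions with doubling dilation rates grows the receptive field exponentially while every output remains strictly causal:

```python
import tensorflow as tf

# WaveNet-style stack: causal convolutions with dilation rates 1, 2, 4, 8
# and kernel_size=2. Receptive field = 1 + (2 - 1) * (1 + 2 + 4 + 8) = 16
# time steps, and each output depends only on current and past inputs.
model = tf.keras.Sequential(
    [tf.keras.layers.Conv1D(filters=8, kernel_size=2, padding="causal",
                            dilation_rate=d, activation="relu")
     for d in (1, 2, 4, 8)]
    + [tf.keras.layers.Conv1D(filters=1, kernel_size=1)]  # per-step output
)

x = tf.random.normal((1, 32, 1))  # (batch, time, channels)
y = model(x)
print(y.shape)  # (1, 32, 1): causal padding preserves the sequence length
```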