All convolutions in a dense block are ReLU-activated and use batch normalization. Channel-wise concatenation is only possible if the height and width of the feature maps remain unchanged, so every convolution in a dense block uses a stride of one. Pooling layers are inserted between dense blocks for further dimensionality reduction.
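The shape constraints above can be sketched with some simple arithmetic. The helper names below (`conv2d_out`, `dense_block_channels`) are illustrative assumptions, not part of any framework: a stride-one 3×3 convolution with padding one preserves height and width, so concatenating along the channel axis is valid, and channels grow linearly with the number of layers in the block.

```python
def conv2d_out(size, kernel=3, stride=1, padding=1):
    """Spatial output size of a convolution along one dimension."""
    return (size + 2 * padding - kernel) // stride + 1

def dense_block_channels(in_channels, growth_rate, num_layers):
    # Each layer emits growth_rate channels, and its output is
    # concatenated with all preceding feature maps, so the channel
    # count grows linearly across the block.
    return in_channels + growth_rate * num_layers

# Stride-one 3x3 conv with padding one keeps H and W unchanged,
# which is what makes channel-wise concatenation possible:
assert conv2d_out(32, kernel=3, stride=1, padding=1) == 32

# Hypothetical block: 4 layers, growth rate 12, 24 input channels.
print(dense_block_channels(24, 12, 4))  # 24 + 4 * 12 = 72

# A 2x2 pooling layer with stride 2 between blocks halves H and W:
print(conv2d_out(32, kernel=2, stride=2, padding=0))  # 16
```

Between blocks the pooling (stride two) is what shrinks the spatial dimensions; inside a block everything stays at stride one precisely so the concatenations line up.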