All convolutions inside a dense block are ReLU-activated and use batch normalization. Channel-wise concatenation is only possible if the height and width dimensions of the data remain unchanged, so the convolutions inside a dense block all have stride 1 (with matching padding). Pooling layers are inserted between dense blocks to downsample the feature maps.
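To make this concrete, here is a minimal sketch of a dense block in PyTorch. It uses the BN → ReLU → conv pre-activation ordering from the original DenseNet paper; the layer count, growth rate, and tensor shapes are illustrative assumptions, not values from the text:

```python
import torch
import torch.nn as nn

class DenseLayer(nn.Module):
    """One BN -> ReLU -> 3x3 conv step; its output is concatenated
    channel-wise with its input. growth_rate is an illustrative choice."""
    def __init__(self, in_channels, growth_rate=12):
        super().__init__()
        self.bn = nn.BatchNorm2d(in_channels)
        self.conv = nn.Conv2d(in_channels, growth_rate,
                              kernel_size=3, stride=1, padding=1, bias=False)

    def forward(self, x):
        # stride 1 + padding 1 keep height and width unchanged,
        # which is what makes the channel-wise concatenation legal
        out = self.conv(torch.relu(self.bn(x)))
        return torch.cat([x, out], dim=1)

class DenseBlock(nn.Module):
    """Stack of dense layers; channel count grows by growth_rate per layer."""
    def __init__(self, in_channels, num_layers=4, growth_rate=12):
        super().__init__()
        layers = []
        channels = in_channels
        for _ in range(num_layers):
            layers.append(DenseLayer(channels, growth_rate))
            channels += growth_rate
        self.block = nn.Sequential(*layers)
        self.out_channels = channels

    def forward(self, x):
        return self.block(x)

# Spatial size is preserved inside the block; pooling between
# blocks is what performs the downsampling.
x = torch.randn(1, 16, 32, 32)
block = DenseBlock(16, num_layers=4, growth_rate=12)
y = block(x)                 # (1, 64, 32, 32): channels grew, H and W did not
pooled = nn.AvgPool2d(2)(y)  # (1, 64, 16, 16): pooling halves H and W
print(y.shape, pooled.shape)
```

Note how the channel count grows by `growth_rate` after every layer while height and width stay fixed, so each subsequent layer sees all earlier feature maps; only the pooling step between blocks changes the spatial resolution.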