How Dilated Convolutions Preserve Image Size While Expanding Receptive Field
This article explains the concept, mathematics, and practical PyTorch implementation of dilated (or atrous) convolutions, showing how to keep image dimensions unchanged while dramatically increasing the receptive field and discussing their advantages and typical applications.
1. Concept of Dilated Convolution
Dilated convolutions, also called atrous convolutions, introduce a dilation rate parameter that expands the kernel without increasing the number of parameters, thereby enlarging the receptive field.
2. Diagram
Examples:
Standard 3×3 convolution, dilation=1: receptive field 3×3=9
Dilated 3×3 convolution, dilation=2, stacked on the previous layer: receptive field 7×7=49
Dilated 3×3 convolution, dilation=4, stacked again: receptive field 15×15=225
When stacked layers double the dilation rate each time, the receptive field grows exponentially while the parameter count per layer stays fixed.
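The numbers above can be reproduced with a short helper, a sketch using the standard receptive-field recurrence for a stack of stride-1 convolutions (the function name is my own):

```python
def receptive_field(kernel_size, dilations):
    """Receptive field after each layer in a stack of stride-1 convolutions."""
    rf = 1  # a single input pixel sees itself
    fields = []
    for d in dilations:
        k_eff = d * (kernel_size - 1) + 1  # effective (dilated) kernel size
        rf += k_eff - 1                    # with stride 1, each layer adds k_eff - 1
        fields.append(rf)
    return fields

print(receptive_field(3, [1, 2, 4]))  # [3, 7, 15]
```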
3. Receptive Field Concept
The receptive field of a layer is the region of the original image that influences a single output pixel.
Key Formula
The effective kernel size after dilation is calculated as:
Dilated kernel size = dilation × (kernel_size‑1) + 1
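As a quick check of this formula (the helper name is illustrative, not a library function):

```python
def dilated_kernel_size(kernel_size, dilation):
    # effective kernel size = dilation * (kernel_size - 1) + 1
    return dilation * (kernel_size - 1) + 1

print(dilated_kernel_size(3, 1))  # 3  (standard convolution)
print(dilated_kernel_size(3, 6))  # 13
```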
PyTorch Example
PyTorch versions before 1.9 have no padding='same' option (and even in later versions it only works with stride 1), so the padding usually has to be set manually. The output size follows the same rule as TensorFlow's VALID padding, where F is the effective (dilated) kernel size:
Output = ⌊(W − F + 2P) / S⌋ + 1
For a 19×19 input, a 3×3 kernel, stride = 1, and dilation = 6, the dilated kernel size becomes 13. Solving (19‑13+2P)/1 + 1 = 19 yields P = 6 , which keeps the spatial dimensions unchanged.
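Assuming PyTorch is installed, this worked example can be checked directly with nn.Conv2d:

```python
import torch
import torch.nn as nn

# 3x3 kernel with dilation=6 -> effective kernel size 13; padding=6 keeps 19x19.
conv = nn.Conv2d(in_channels=1, out_channels=1, kernel_size=3,
                 stride=1, padding=6, dilation=6)
x = torch.randn(1, 1, 19, 19)
y = conv(x)
print(y.shape)  # torch.Size([1, 1, 19, 19])
```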
4. Advantages of Dilated Convolution
It expands the receptive field without increasing the number of parameters, allowing the network to capture broader context while preserving computational cost.
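The parameter count really is unchanged, since dilation only spreads the same weights over a wider area. A small check (the 64-channel sizes are arbitrary, chosen for illustration):

```python
import torch.nn as nn

conv_d1 = nn.Conv2d(64, 64, kernel_size=3, dilation=1)
conv_d4 = nn.Conv2d(64, 64, kernel_size=3, dilation=4)

def param_count(m):
    return sum(p.numel() for p in m.parameters())

# Both layers hold 64*64*3*3 weights + 64 biases = 36928 parameters.
print(param_count(conv_d1), param_count(conv_d4))
```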
5. Application Areas
Typical uses include image inpainting, semantic segmentation, and speech synthesis.