Tag

weight initialization

1 view collected around this technical thread.

IT Services Circle
May 2, 2025 · Artificial Intelligence

Understanding Gradient Vanishing in Deep Neural Networks and How to Mitigate It

The article explains why deep networks suffer from gradient vanishing—especially when using sigmoid or tanh activations—covers the underlying mathematics, compares activation functions, and presents practical techniques such as proper weight initialization, batch normalization, residual connections, and code examples to visualize the phenomenon.
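The decay the summary describes can be demonstrated without any framework: backpropagation multiplies the local derivative of each layer's activation, and the sigmoid's derivative never exceeds 0.25, so the product shrinks geometrically with depth. A minimal pure-Python sketch (the function names `sigmoid_grad` and `chain_gradient` are illustrative, not from the article):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1.0 - s)  # peaks at 0.25 when x = 0

def chain_gradient(depth, x=0.0, weight=1.0):
    """Product of per-layer local derivatives along a chain of
    `depth` sigmoid layers — a toy model of the backpropagated
    gradient magnitude."""
    grad = 1.0
    for _ in range(depth):
        grad *= weight * sigmoid_grad(x)
    return grad

for depth in (1, 5, 10, 20):
    print(depth, chain_gradient(depth))
```

Even in the sigmoid's best case (input 0, derivative exactly 0.25), ten layers already scale the gradient by roughly 1e-6, which is why the article turns to ReLU-family activations, careful weight initialization, batch normalization, and residual connections as mitigations.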

Deep Learning · Neural Networks · ResNet
0 likes · 7 min read
Python Programming Learning Circle
Dec 23, 2021 · Artificial Intelligence

Extracting and Initializing Parameters in a PyTorch CNN Model

This article explains how to use PyTorch's named_parameters() and parameters() to retrieve a network's layer names and weights, demonstrates printing each parameter, and shows several common weight‑initialization techniques for convolutional and linear layers within a CNN architecture.
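The workflow the summary outlines can be sketched in a few lines: iterate over `named_parameters()` to see each layer's name and weight shape, then apply common initializers per layer type. The `SmallCNN` module below is a hypothetical stand-in for the article's network, and the specific initializer choices (Kaiming for conv layers, Xavier for linear layers) are one conventional combination, not necessarily the article's exact recipe:

```python
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(1, 8, kernel_size=3, padding=1)
        self.fc = nn.Linear(8 * 28 * 28, 10)

    def forward(self, x):
        x = torch.relu(self.conv(x))
        return self.fc(x.flatten(1))

model = SmallCNN()

# named_parameters() yields (name, tensor) pairs, e.g. "conv.weight".
for name, p in model.named_parameters():
    print(name, tuple(p.shape))

# Common per-layer-type initialization for conv and linear layers.
for m in model.modules():
    if isinstance(m, nn.Conv2d):
        nn.init.kaiming_normal_(m.weight, nonlinearity="relu")
        nn.init.zeros_(m.bias)
    elif isinstance(m, nn.Linear):
        nn.init.xavier_uniform_(m.weight)
        nn.init.zeros_(m.bias)
```

`parameters()` yields the same tensors without names; the named variant is the convenient one for inspecting or selectively re-initializing specific layers.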

CNN · Deep Learning · PyTorch
0 likes · 6 min read