Pruning Neural Networks (Based on Concepts from Statistical Physics such as Free Energy and Entropy), 2025
Project Supervisor: Sergey Koltsov
Participants: Anton Surkov, Ksenia Kupitman, Vera Ignatenko, Rayeesa Mehmood
This project focuses on the research and development of neural network pruning algorithms that use quantities from statistical physics, such as free energy and entropy. Neural network pruning reduces the number of parameters in a network, thereby decreasing the computational load and the memory required for storage and inference, while maintaining an acceptable level of accuracy. The project examines pruning algorithms such as unstructured magnitude-based weight pruning (removing individual weights with the smallest absolute values) and activation pruning of neural networks. The pruning procedure is analyzed through the concept of free energy.
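As an illustration of the magnitude-based weight pruning mentioned above, here is a minimal NumPy sketch that zeroes out a given fraction of the smallest-magnitude weights in a layer. This is a generic textbook version of the technique, not the project's actual implementation; the function name and the sparsity parameter are illustrative choices.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float = 0.5) -> np.ndarray:
    """Unstructured magnitude pruning: zero the smallest-magnitude
    fraction `sparsity` of the entries of `weights`."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of weights to remove
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value serves as the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

# Usage: prune half the weights of a small layer
w = np.arange(1.0, 17.0).reshape(4, 4)
w_pruned = magnitude_prune(w, sparsity=0.5)
```

In practice such pruning is applied layer by layer (or globally over all layers), often followed by fine-tuning to recover accuracy.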
Project's publications:
1. Anton Surkov, Sergei Koltcov, Vera Ignatenko, Rayeesa Mehmood, Ksenia Kupitman. Free energy of neural network can predict accuracy after pruning // Physica A: Statistical Mechanics and its Applications. 2025. Vol. 681. Article 131085.
2. Rayeesa Mehmood, Sergei Koltcov, Anton Surkov, Vera Ignatenko. Modeling Pruning as a Phase Transition: A Thermodynamic Analysis of Neural Activations // Computers, Materials and Continua. 2026. Vol. 86. No. 3. Article 99