RUDN University mathematicians reduced neural network size six times without post-training

A team of mathematicians from RUDN University found a way to reduce the size of a trained neural network six times without spending additional resources on re-training it. The approach is based on finding a correlation between the weights of neural connections in the initial system and its simplified version.

Artificial neural networks and the networks of neurons in a living organism are built on the same principles. Nodes in a network are interconnected: some receive a signal, and others transmit it by activating or suppressing the next element in the chain. Processing any signal (for example, an image or a sound) requires many network elements and the connections between them. However, computers have limited processing power and memory. To work with large volumes of data, specialists have to devise ways of lowering these requirements, including so-called quantization. Quantization reduces resource consumption but normally requires re-training the network. A team of mathematicians from RUDN University found that this last step can be avoided.

“Several years ago we carried out efficient and cost-effective quantization of weights in a Hopfield network. It is an associative memory network with symmetrical connections between elements that are formed following Hebb’s rule. In the course of its operation, the activity of the network is reduced to a certain equilibrium state, and when it is reached, a task is considered solved. The insights obtained in that study were later applied to feedforward deep learning networks that are very popular in image recognition today. As a rule, these networks require re-training after quantization, but we found a way to avoid it,” said Iakov Karandashev, PhD, an Assistant Professor at the Nikolskii Mathematical Institute, RUDN University.

The main idea behind simplifying an artificial neural network is the so-called quantization of its weights, i.e. reducing the number of bits used to store each weight. Quantization amounts to averaging the signal: for example, when applied to an image, all pixels representing different shades of the same color become identical. Mathematically, it means that neural connections that are similar in certain parameters are assigned the same weight (or importance), expressed as a single number.
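The idea can be sketched with a simple uniform quantizer. This is a minimal illustration under assumed details (the function name, the 4-bit width, and the uniform scheme are illustrative choices), not the RUDN team's actual algorithm:

```python
import numpy as np

def quantize_weights(w, bits=4):
    """Uniformly map an array of weights onto 2**bits discrete levels.

    Illustrative only: real quantization schemes differ in how the
    levels are chosen, but the effect is the same -- many similar
    weights collapse onto one shared value.
    """
    levels = 2 ** bits
    w_min, w_max = float(w.min()), float(w.max())
    step = (w_max - w_min) / (levels - 1)
    codes = np.round((w - w_min) / step)   # integer codes 0 .. levels-1
    return codes * step + w_min            # de-quantized weight values

# Quantize 1,000 random "weights" from 32 bits down to 4 bits
rng = np.random.default_rng(0)
w = rng.normal(size=1000)
wq = quantize_weights(w, bits=4)

# The quantized weights remain strongly correlated with the originals,
# which is the property the article's argument relies on
corr = np.corrcoef(w, wq)[0, 1]
print(len(np.unique(wq)), corr)
```

Because each weight moves by at most half a quantization step, the original and quantized weights stay almost perfectly correlated, which is why accuracy can survive without re-training.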

The RUDN University team carried out calculations and derived formulae that establish correlations between the weights of a neural network before and after quantization. Based on these, the scientists developed algorithms with which a trained neural network could classify images. In their experiment, the mathematicians used a test set of 50,000 photos divided into 1,000 classes. After training, the network was quantized with the new method and was not re-trained; the results were then compared against other quantization algorithms.

“After quantization, the classification accuracy decreased by only 1%, but the required storage volume was reduced six times. Experiments show that our network doesn’t need re-training due to a strong correlation between initial and quantized weights. This approach could help save resources when completing time-sensitive tasks or working on mobile devices,” added Iakov Karandashev from RUDN University.

The results of the work were published in the journal Optical Memory and Neural Networks.
