RUDN University mathematicians reduced neural network size six times without post-training

A team of mathematicians from RUDN University found a way to reduce the size of a trained neural network six times without spending additional resources on re-training it. The approach is based on finding a correlation between the weights of neural connections in the initial system and its simplified version.

The structures of artificial neural networks and of neurons in a living organism are based on the same principles. Nodes in a network are interconnected; some of them receive a signal, and some transmit it by activating or suppressing the next element in the chain. Processing any signal (for example, an image or a sound) requires many network elements and the connections between them. However, computer models have limited capacity and storage volume. To work with large data volumes, specialists have to invent ways to lower these requirements, one of which is so-called quantization. It reduces resource consumption but normally requires re-training the system. A team of mathematicians from RUDN University found that this last step can be avoided.

“Several years ago we carried out efficient and cost-effective quantization of weights in a Hopfield network. It is an associative memory network with symmetrical connections between elements that are formed following Hebb’s rule. In the course of its operation, the activity of the network is reduced to a certain equilibrium state, and when it is reached, a task is considered solved. The insights obtained in that study were later applied to feedforward deep learning networks that are very popular in image recognition today. As a rule, these networks require re-training after quantization, but we found a way to avoid it,” said Iakov Karandashev, PhD, an Assistant Professor at the Nikolskii Mathematical Institute, RUDN University.

The main idea behind the simplification of artificial neural networks is the so-called quantization of weights, i.e. reducing the number of bits used to store each weight. Quantization averages the signal: for example, when applied to an image, all pixels representing different shades of the same color become identical. Mathematically, it means that neural connections that are similar in certain parameters should share the same weight (or importance) expressed as a number.
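To make the idea concrete, here is a minimal sketch of uniform weight quantization in pure Python. It is an illustrative toy, not the authors' exact scheme: each weight is snapped to one of 2^bits evenly spaced levels spanning the observed weight range, so far fewer distinct values (and hence fewer bits per weight) need to be stored.

```python
import random

def quantize(weights, bits=4):
    """Snap each weight to one of 2**bits evenly spaced levels."""
    lo, hi = min(weights), max(weights)
    step = (hi - lo) / (2 ** bits - 1)
    # round((w - lo) / step) picks the nearest level index in 0 .. 2**bits - 1.
    return [lo + round((w - lo) / step) * step for w in weights]

random.seed(0)
w = [random.gauss(0.0, 1.0) for _ in range(1000)]
wq = quantize(w, bits=4)

# At most 2**4 = 16 distinct values remain instead of ~1000.
print(len(set(wq)))
```

Real schemes differ in how the levels are placed (uniform vs. non-uniform, per-layer vs. per-channel), but the storage saving always comes from this collapse of many distinct float values onto a small codebook.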

A team of mathematicians from RUDN University carried out calculations and derived formulae that establish the correlation between the weights of a neural network before and after quantization. Based on these formulae, the scientists developed algorithms that let an already-trained neural network classify images. In their experiment, the mathematicians used a test set of 50 thousand photos divided into 1,000 classes. After training, the network was quantized with the new method and not re-trained, and the results were compared to those of other quantization algorithms.
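The key property the team relies on can be checked directly in a toy setting (our own illustrative setup, not the authors' code): the quantized weights remain strongly correlated with the originals. The closer the Pearson correlation is to 1, the less the network's behavior changes, which is what makes skipping re-training plausible.

```python
import math
import random

def quantize(weights, bits=4):
    """Uniform quantization: snap each weight to one of 2**bits levels."""
    lo, hi = min(weights), max(weights)
    step = (hi - lo) / (2 ** bits - 1)
    return [lo + round((w - lo) / step) * step for w in weights]

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(var_x * var_y)

random.seed(1)
w = [random.gauss(0.0, 1.0) for _ in range(10_000)]
r = pearson(w, quantize(w, bits=4))
print(round(r, 3))  # close to 1: quantized weights track the originals
```

Even at 4 bits per weight, the correlation in this toy example stays very close to 1, mirroring the strong pre/post-quantization correlation the paper exploits.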

“After quantization, the classification accuracy decreased by only 1%, but the required storage volume was reduced six times. Experiments show that our network doesn’t need re-training due to a strong correlation between initial and quantized weights. This approach could help save resources when completing time-sensitive tasks or working on mobile devices,” added Iakov Karandashev from RUDN University.
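A back-of-envelope check makes the six-fold figure plausible, assuming the baseline stores weights as standard 32-bit floats (the article does not state the exact bit widths): going from 32 bits to roughly 5 bits per weight shrinks storage by about 6.4 times.

```python
# Hypothetical sizes for illustration only; the 32-bit baseline and
# 5-bit target are assumptions, not figures from the paper.
n_weights = 10_000_000          # weights in some trained network
bits_full = 32                  # float32 baseline
bits_quantized = 5              # ~5 bits per weight after quantization

full_size = n_weights * bits_full
quantized_size = n_weights * bits_quantized
print(full_size / quantized_size)  # 6.4
```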

The results of the work were published in the journal Optical Memory and Neural Networks.
