Mathematicians from RUDN University and the Free University of Berlin proposed a new way of using neural networks for working with noisy high-dimensional data

Mathematicians from RUDN University and the Free University of Berlin have proposed a new approach to studying the probability distributions of observed data with artificial neural networks. The new approach handles so-called outliers better, i.e. data points that deviate significantly from the rest of the sample.

Restoring the probability distribution of observed data with artificial neural networks is one of the central tasks of machine learning. The probability distribution not only allows us to predict the behaviour of the system under study, but also to quantify the uncertainty of those predictions. The main difficulty is that, as a rule, only the data themselves are observed, while their exact probability distribution is not available. To solve this problem, Bayesian and similar approximate methods are used, but they increase the complexity of the neural network and therefore make it harder to train.
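To make the idea concrete, here is a minimal, illustrative sketch, not the authors' actual model: a regression network with ordinary deterministic weights that outputs a mean and a variance for each input and is trained with the Gaussian negative log-likelihood, so a full predictive distribution rather than a point estimate is learned. The architecture, layer sizes, and the use of PyTorch's GaussianNLLLoss are all assumptions made for illustration.

```python
import torch
import torch.nn as nn

# Hypothetical sketch: a small regression network with deterministic weights
# that predicts both a mean and a variance for each input, so the network
# models a full Gaussian predictive distribution instead of a point estimate.
class MeanVarianceNet(nn.Module):
    def __init__(self, in_dim: int, hidden: int = 64):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.mean_head = nn.Linear(hidden, 1)      # predicted mean
        self.log_var_head = nn.Linear(hidden, 1)   # predicted log-variance

    def forward(self, x):
        h = self.backbone(x)
        mean = self.mean_head(h)
        var = torch.exp(self.log_var_head(h))      # exponentiate to keep variance positive
        return mean, var

# Training minimises the Gaussian negative log-likelihood, fitting the mean
# and the variance jointly rather than only the mean squared error.
def train_step(model, optimiser, x, y):
    criterion = nn.GaussianNLLLoss()
    mean, var = model(x)
    loss = criterion(mean, y, var)
    optimiser.zero_grad()
    loss.backward()
    optimiser.step()
    return loss.item()
```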

Mathematicians from RUDN University and the Free University of Berlin used neural networks with deterministic weights, which helps overcome the limitations of Bayesian methods. They derived a formula that correctly estimates the variance of the distribution of the observed data. The proposed model was tested on synthetic and real data, both containing outliers and with the outliers removed. The new method restores probability distributions with previously unattainable accuracy.

The mathematicians of RUDN University and the Free University of Berlin used deterministic weights in the neural networks and used the networks' outputs to encode the distribution of latent variables for the desired marginal distribution. An analysis of the training dynamics of such networks allowed them to obtain a formula that correctly estimates the variance of the observed data despite the presence of outliers. The proposed model was tested on both synthetic and real data. The new method restores probability distributions more accurately than other modern methods. Accuracy was assessed with an AUC score (area under the curve): the area under a graph that relates the mean squared error of the predictions to the fraction of the sample that the network rates as "reliable"; the higher the AUC score, the better the predictions.
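The exact definition of the AUC score used in the paper is not spelled out in the release. One common reading is a retention (sparsification) curve: samples are ranked by the variance the network assigns to them, and the mean squared error is computed over increasingly large "reliable" subsets; the area under that curve is then taken. The sketch below follows this assumed reading, with illustrative synthetic data; note that under this particular convention a smaller area indicates better-calibrated uncertainty, which may differ in sign from the convention quoted above.

```python
import numpy as np

def retention_auc(errors: np.ndarray, predicted_var: np.ndarray) -> float:
    """Area under an error-retention curve (assumed, simplified reading).

    Samples are ranked by the variance the network assigns to them
    (low variance = "reliable"). For every retained fraction of the most
    reliable samples the mean squared error is computed, and the curve of
    MSE versus retained fraction is averaged over a uniform grid on [0, 1],
    which approximates the area under the curve.
    """
    order = np.argsort(predicted_var)           # most confident samples first
    sorted_sq_err = errors[order] ** 2
    # MSE over the k most reliable samples, for k = 1 .. N
    cumulative_mse = np.cumsum(sorted_sq_err) / np.arange(1, len(errors) + 1)
    return float(cumulative_mse.mean())

# Illustration: informative variances yield a smaller retention AUC than
# uninformative (shuffled) ones.
rng = np.random.default_rng(0)
true_var = rng.uniform(0.1, 2.0, size=1000)
errors = rng.normal(0.0, np.sqrt(true_var))
print(retention_auc(errors, true_var))                    # calibrated variances
print(retention_auc(errors, rng.permutation(true_var)))   # uninformative variances
```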

The article was published in the journal Artificial Intelligence.
