STRUCTURAL-PARAMETRIC SYNTHESIS OF NEURAL NETWORK ENSEMBLE BASED ON THE ESTIMATION OF INDIVIDUAL CONTRIBUTION

Authors

  • O. I. Chumachenko National Technical University of Ukraine “Ihor Sikorsky Kyiv Polytechnic Institute”
  • K. D. Riazanovskiy National Technical University of Ukraine “Ihor Sikorsky Kyiv Polytechnic Institute”

DOI:

https://doi.org/10.18372/1990-5548.59.13642

Keywords:

Structural-parametric synthesis, neural networks, ensemble, individual contribution, classification

Abstract

The article presents the structural-parametric synthesis of an ensemble of neural networks of various architectures based on their individual contributions. The topologies and learning algorithms of each classifier are considered. The algorithm for calculating the individual contribution of each network is described, together with the algorithm for selecting networks for the ensemble according to the criteria of accuracy and diversity. To simplify the structure of the ensemble, the Complementary Measure method was used. The results of training the classifiers on bootstrap training samples are presented. The results obtained by the ensemble are compared with the corresponding results of each neural network in the ensemble taken separately.
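
As an illustration of the selection scheme described above, the Python sketch below trains base classifiers on bootstrap samples, scores each member's individual contribution as the drop in majority-vote accuracy on a held-out set when that member is removed, and keeps only members with a positive contribution. This is a minimal sketch of one plausible reading of the abstract: the identical MLP base learners, the leave-one-out contribution measure, and all names are assumptions, not the authors' implementation (the paper uses heterogeneous architectures and the Complementary Measure for pruning).

    # Minimal sketch (assumed, not the authors' exact algorithm): bootstrap
    # training, leave-one-out individual-contribution scoring, and pruning.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3,
                                                random_state=0)

    def majority_vote(preds):
        # preds: (n_members, n_samples) array of predicted class labels
        return np.apply_along_axis(lambda c: np.bincount(c).argmax(), 0, preds)

    # Train each candidate network on its own bootstrap sample.
    members = []
    for i in range(7):
        idx = rng.integers(0, len(X_tr), len(X_tr))  # bootstrap indices
        net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                            random_state=i)
        members.append(net.fit(X_tr[idx], y_tr[idx]))

    preds = np.array([m.predict(X_val) for m in members])
    full_acc = np.mean(majority_vote(preds) == y_val)

    # Individual contribution of member i: accuracy drop when i is removed.
    kept = []
    for i in range(len(members)):
        rest = np.delete(preds, i, axis=0)
        contribution = full_acc - np.mean(majority_vote(rest) == y_val)
        if contribution > 0:  # keep only members that improve the ensemble
            kept.append(i)
    print("kept members:", kept)

A diversity criterion (e.g., pairwise disagreement among the kept members) would be checked in the same loop; it is omitted here for brevity.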

Author Biographies

O. I. Chumachenko, National Technical University of Ukraine “Ihor Sikorsky Kyiv Polytechnic Institute”

Technical Cybernetics Department

Candidate of Science (Engineering), Associate Professor

orcid.org/0000-0003-3006-7460

K. D. Riazanovskiy, National Technical University of Ukraine “Ihor Sikorsky Kyiv Polytechnic Institute”

Technical Cybernetics Department

Undergraduate student

References

C. M. Bishop, "Feed-forward Network Functions," in Pattern Recognition and Machine Learning. Springer, 2006, pp. 227–232.

S. Nikolenko, A. Kadurin, and E. Archangelskaya, "Preliminaries, or the course of the young fighter," in Deep Learning. St. Petersburg: Piter, 2018, ch. 2.3, pp. 63–69.

D. Rumelhart, G. Hinton, and R. Williams, "Learning representations by back-propagating errors," Nature, vol. 323, pp. 533–536, 1986.

D. P. Kingma and J. Ba, "Adam: A Method for Stochastic Optimization." [Online]. Available: https://arxiv.org/abs/1412.6980

D. S. Broomhead and D. Lowe, "Radial basis functions, multi-variable functional interpolation and adaptive networks," Royal Signals and Radar Establishment, United Kingdom, 1988.

E. V. Bodyanskiy and O. G. Rudenko, "Radial basis networks," in Artificial Neural Networks: Architecture, Training, Applications, pp. 35–40.

D. MacKay, "An Example Inference Task: Clustering," in Information Theory, Inference and Learning Algorithms. Cambridge University Press, 2003, ch. 20, pp. 284–292.

E. V. Bodyanskiy and O. G. Rudenko, "Counterpropagation neural networks," in Artificial Neural Networks: Architecture, Training, Applications, pp. 275–281.

V. V. Kruglov and V. V. Borisov, "Basic concepts of neural networks," in Artificial Neural Networks: Theory and Practice, 2nd ed., 2002, ch. 2.3, pp. 58–63.

D. F. Specht, "Probabilistic neural networks," Neural Networks, vol. 3, pp. 109–118, 1990.

E. V. Bodyanskiy and O. G. Rudenko, "Probabilistic neural networks," in Artificial Neural Networks: Architecture, Training, Applications, pp. 176–179.

Y. P. Zaychenko, "Fuzzy neural networks in classification tasks," in Fuzzy Models and Methods in Intelligent Systems. Kyiv: Izdatelskiy dom "Slovo", 2008, pp. 156–194.

P. Domingos and M. Pazzani, "On the optimality of the simple Bayesian classifier under zero-one loss," Machine Learning, 1997, pp. 103–137.

Section

COMPUTER-AIDED DESIGN SYSTEMS