Hecht–Nielsen theorem for a modified neural network with diagonal synaptic connections

dc.citation.epage108
dc.citation.issue1
dc.citation.spage101
dc.contributor.affiliationДрогобицький державний педагогічний університет ім. І. Франка
dc.contributor.affiliationНаціональний університет “Львівська політехніка”
dc.contributor.affiliationIvan Franko Drogobych State Pedagogical University
dc.contributor.affiliationLviv Polytechnic National University
dc.contributor.authorПелещак, Р.
dc.contributor.authorЛитвин, В.
dc.contributor.authorПелещак, І.
dc.contributor.authorДорошенко, М.
dc.contributor.authorОливко, Р.
dc.contributor.authorPeleshchak, R.
dc.contributor.authorLytvyn, V.
dc.contributor.authorPeleshchak, I.
dc.contributor.authorDoroshenko, M.
dc.contributor.authorOlyvko, R.
dc.coverage.placenameЛьвів
dc.coverage.placenameLviv
dc.date.accessioned2020-02-27T09:45:16Z
dc.date.available2020-02-27T09:45:16Z
dc.date.created2019-02-26
dc.date.issued2019-02-26
dc.description.abstractУ роботі запропоновано модифіковану тришарову нейронну мережу з архітектурою, яка має тільки діагональні синаптичні зв’язки між нейронами, внаслідок чого отримано трансформовану теорему Хехт–Нільсена. Така архітектура тришарової нейронної мережі (m = 2n + 1 - кількість нейронів прихованого шару нейромережі, n - кількість вхідних образів) дає змогу апроксимувати функцію від n змінних із заданою точністю ε > 0 за допомогою однієї операції агрегування. Тришарова нейронна мережа, яка має як діагональні, так і недіагональні синаптичні зв’язки між нейронами, апроксимує функцію від n змінних за допомогою двох операцій агрегування. Крім цього, діагоналізація матриці синаптичних зв’язків приводить до зменшення обчислювального ресурсу і відповідно до зменшення часу налаштування вагових коефіцієнтів синаптичних зв’язків під час навчання нейронної мережі.
dc.description.abstractThe work suggests a modified three-layer neural network with an architecture that has only diagonal synaptic connections between neurons; as a result, we obtain a transformed Hecht–Nielsen theorem. This architecture of a three-layer neural network (m = 2n + 1 is the number of neurons in the hidden layer of the neural network, n is the number of input signals) allows us to approximate a function of n variables, with a given accuracy ε > 0, using one aggregation operation, whereas a three-layer neural network that has both diagonal and non-diagonal synaptic connections between neurons approximates a function of n variables by means of two aggregation operations. In addition, diagonalization of the synaptic connection matrix reduces the required computing resources and, accordingly, the time needed to adjust the weight coefficients during the training of the neural network.
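The counting of aggregation operations described in the abstract can be illustrated with a toy sketch. This is an assumption-laden illustration, not the paper's construction: the function names, the cyclic input-to-hidden assignment `idx`, and the tanh activation are all hypothetical; only the hidden-layer size m = 2n + 1 comes from the theorem.

```python
import numpy as np

def full_forward(x, W, b, v):
    """Full synaptic matrix: the hidden layer aggregates over all inputs
    (W @ x, aggregation #1) and the output sums over all hidden neurons
    (v @ h, aggregation #2) -- two aggregation operations in total."""
    h = np.tanh(W @ x + b)
    return v @ h

def diag_forward(x, w_diag, b, v, idx):
    """Diagonal connections: hidden neuron k sees only the single input
    x[idx[k]], so no summation occurs in the hidden layer; the one
    aggregation operation is the output sum v @ h."""
    h = np.tanh(w_diag * x[idx] + b)  # element-wise, no hidden-layer sum
    return v @ h                      # the only aggregation operation

rng = np.random.default_rng(0)
n = 4              # number of input signals
m = 2 * n + 1      # hidden-layer size from the theorem

x = rng.standard_normal(n)

# Full network: two aggregation operations.
y_full = full_forward(x, rng.standard_normal((m, n)),
                      rng.standard_normal(m), rng.standard_normal(m))

# Diagonal network: each of the m hidden neurons is wired to one input
# (cyclically here, a hypothetical assignment); one aggregation operation.
idx = np.arange(m) % n
y_diag = diag_forward(x, rng.standard_normal(m),
                      rng.standard_normal(m), rng.standard_normal(m), idx)
```

The sketch only makes the operation count concrete: with a diagonal (one-input-per-hidden-neuron) wiring the hidden layer is purely element-wise, so fewer multiply-accumulate sums are needed per forward pass, which is the source of the computational saving the abstract mentions.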
dc.format.extent101-108
dc.format.pages8
dc.identifier.citationHecht–Nielsen theorem for a modified neural network with diagonal synaptic connections / R. Peleshchak, V. Lytvyn, I. Peleshchak, M. Doroshenko, R. Olyvko // Mathematical Modeling and Computing. — Lviv : Lviv Politechnic Publishing House, 2019. — Vol 6. — No 1. — P. 101–108.
dc.identifier.citationenHecht–Nielsen theorem for a modified neural network with diagonal synaptic connections / R. Peleshchak, V. Lytvyn, I. Peleshchak, M. Doroshenko, R. Olyvko // Mathematical Modeling and Computing. — Lviv : Lviv Politechnic Publishing House, 2019. — Vol 6. — No 1. — P. 101–108.
dc.identifier.urihttps://ena.lpnu.ua/handle/ntb/46145
dc.language.isoen
dc.publisherLviv Politechnic Publishing House
dc.relation.ispartofMathematical Modeling and Computing, 1 (6), 2019
dc.relation.references1. Kolmogorov A. N. On the representation of continuous functions of several variables by superpositions of continuous functions of a smaller number of variables. DAN USSR. 108, 2 (1956).
dc.relation.references2. Kolmogorov A. N. On the representation of continuous functions of several variables by superpositions of continuous functions of one variable and addition. DAN USSR. 114, 953–956 (1957).
dc.relation.references3. Vitushkin A. G., Khenkin G. M. Linear superposition of functions. Russ. Math. Surv. 22, 77–125 (1967).
dc.relation.references4. Slupecki J. The criterion of completeness of many-valued systems of propositional logic. C. R. Classe III. 32 (1–3), 102–109 (1939).
dc.relation.references5. McCulloch W. S., Pitts W. A logical calculus of the ideas immanent in nervous activity. Bulletin of Mathematical Biophysics. 5 (4), 115–133 (1943).
dc.relation.references6. Rosenblatt F. Principles of Neurodynamics: Perceptrons and the theory of brain mechanisms. Spartan Books, Washington, DC (1961).
dc.relation.references7. Minsky M., Papert S. Perceptrons. M.I.T. Press, Oxford, England (1969).
dc.relation.references8. Cybenko G. Approximation by superpositions of a sigmoidal function. Math. Control Signals Systems. 2, 303–314 (1989).
dc.relation.references9. Funahashi K. On the approximate realization of continuous mappings by neural networks. Neural Networks. 2 (3), 183–192 (1989).
dc.relation.references10. Hornik K., Stinchcombe M., White H. Multilayer feedforward networks are universal approximators. Neural Networks. 2 (5), 359–366 (1989).
dc.relation.references11. Hecht-Nielsen R. Kolmogorov’s mapping neural network existence theorem. IEEE First Annual Int. Conf. on Neural Networks, San Diego. 3, 11–13 (1987).
dc.relation.references12. Alekseev D. V. Approximation of functions of several variables by neural networks. Fundamental and Applied Mathematics. 15 (3), 9–21 (2009).
dc.relation.references13. Penrose R. Shadows of the Mind: A Search for the Missing Science of Consciousness. Oxford University Press, Inc., New York, NY, USA (1994).
dc.relation.references14. Lytvyn V., Vysotska V., Peleshchak I., Rishnyak I., Peleshchak R. Time dependence of the output signal morphology for nonlinear oscillator neuron based on Van der Pol model. International Journal of Intelligent Systems and Applications. 4, 8–17 (2018).
dc.relation.references15. Peleshchak R. M., Lytvyn V. V., Peleshchak I. R. Dynamics of a nonlinear oscillatory neuron under the action of an external non-stationary signal. Radioelectronics, Computer Science, Management. 4, 97–105 (2017).
dc.relation.references16. Lytvyn V., Peleshchak I., Peleshchak R. The compression of the input images in neural network that using method diagonalization the matrices of synaptic weight connections. 2017 2nd International Conference on Advanced Information and Communication Technologies (AICT). 66–70 (2017).
dc.relation.references17. Lytvyn V., Peleshchak I., Peleshchak R. Increase the speed of detection and recognition of computer attacks in combined diagonalized neural networks. 2017 4th International Scientific-Practical Conference “Problems of Infocommunications. Science and Technology”. 152–155 (2017).
dc.rights.holderCMM IAPMM NAS
dc.rights.holder© 2019 Lviv Polytechnic National University
dc.subjectнейронна мережа
dc.subjectдіагоналізація матриці
dc.subjectоперація агрегування
dc.subjectапроксимація функції
dc.subjectneural network
dc.subjectmatrix diagonalization
dc.subjectaggregation operation
dc.subjectfunction approximation
dc.subject.udc004.81
dc.subject.udc004.891.2
dc.subject.udc004.891.3
dc.titleHecht–Nielsen theorem for a modified neural network with diagonal synaptic connections
dc.title.alternativeТеорема Хехт–Нільсена для модифікованої нейронної мережі з діагональними синаптичними зв’язками
dc.typeArticle

Files

Original bundle

Name: 2019v6n1_Peleshchak_R-Hechth-Nielsen_theorem_101-108.pdf
Size: 927.85 KB
Format: Adobe Portable Document Format

Name: 2019v6n1_Peleshchak_R-Hechth-Nielsen_theorem_101-108__COVER.png
Size: 436.84 KB
Format: Portable Network Graphics

License bundle

Name: license.txt
Size: 3.09 KB
Format: Plain Text