Метод синтезу логічних дерев класифікації на підставі селекції елементарних ознак
dc.citation.epage | 32 | |
dc.citation.issue | 2 | |
dc.citation.journalTitle | Український журнал інформаційних технологій | |
dc.citation.spage | 25 | |
dc.citation.volume | 4 | |
dc.contributor.affiliation | Ужгородський національний університет | |
dc.contributor.affiliation | Uzhhorod National University | |
dc.contributor.author | Повхан, І. Ф. | |
dc.contributor.author | Povkhan, I. F. | |
dc.coverage.placename | Львів | |
dc.coverage.placename | Lviv | |
dc.date.accessioned | 2024-03-27T08:56:56Z | |
dc.date.available | 2024-03-27T08:56:56Z | |
dc.date.created | 2022-02-28 | |
dc.date.issued | 2022-02-28 | |
dc.description.abstract | Розглянута загальна задача побудови логічних дерев класифікації та розпізнавання дискретних об'єктів. Об'єктом даного дослідження є логічні дерева класифікації. Предметом дослідження є актуальні методи та алгоритми побудови логічних дерев класифікації. Метою роботи є створення простого та ефективного методу побудови моделей розпізнавання на підставі дерев класифікації для навчальних вибірок дискретної інформації, який характеризується елементарними ознаками в структурі синтезованих логічних дерев класифікації. Запропоновано загальний метод побудови логічних дерев класифікації, який для заданої початкової навчальної вибірки будує деревоподібну структуру, яка складається з набору елементарних ознак, оцінених на кожному кроці побудови моделі за даною вибіркою. Розроблено метод побудови логічного дерева, основна ідея якого полягає в апроксимації початкової вибірки довільного об'єму набором елементарних ознак. Під час формування поточної вершини логічного дерева його вузол забезпечує виділення найбільш інформативних, якісних елементарних ознак з початкового набору. Такий підхід при побудові остаточного дерева класифікації дає змогу значно скоротити розмір та складність дерева, загальну кількість гілок та ярусів структури, підвищити якість його подальшого аналізу. Запропонований метод побудови логічного дерева класифікації дає змогу будувати деревоподібні моделі розпізнавання для широкого класу задач теорії штучного інтелекту. Розроблений та наведений в роботі метод отримав програмну реалізацію та був досліджений під час розв'язання задачі класифікації даних геологічного типу. Проведені в роботі експерименти підтвердили працездатність запропонованого математичного забезпечення та показують можливість його використання для розв'язання широкого спектру практичних задач розпізнавання та класифікації. Перспективи подальших досліджень полягають у створенні обмеженого методу логічного дерева класифікації, який полягає у введенні критерію зупинки процедури його побудови за глибиною структури, оптимізації його програмних реалізацій, а також проведенні експериментального дослідження цього методу на більш широкому колі практичних задач. | |
dc.description.abstract | The general problem of constructing logical classification and recognition trees for discrete objects is considered. The object of this study is logical classification trees. The subject of the research is current methods and algorithms for constructing logical classification trees. The aim of the work is to create a simple and effective method for building recognition models based on classification trees for training samples of discrete information, a method characterized by the use of elementary features in the structure of the synthesized logical classification trees. A general method for constructing logical classification trees is proposed which, for a given initial training sample, builds a tree structure consisting of a set of elementary features evaluated at each step of model construction on this sample. A method for constructing a logical tree is developed whose main idea is to approximate an initial sample of arbitrary size with a set of elementary features. When the current vertex of the logical tree is formed, the node selects the most informative, highest-quality elementary features from the original set. In constructing the resulting classification tree, this approach significantly reduces the size and complexity of the tree as well as the total number of branches and tiers of its structure, and improves the quality of its subsequent analysis. The proposed method makes it possible to build tree-like recognition models for a wide class of problems in the theory of artificial intelligence. The method developed and presented in this paper was implemented in software and investigated on the problem of classifying geological data. The experiments carried out confirm the operability of the proposed mathematical tools and show that they can be used to solve a wide range of practical recognition and classification problems. Prospects for further research lie in creating a restricted logical classification tree method by introducing a criterion that stops the tree-construction procedure at a given depth of the structure, in optimizing its software implementations, and in experimentally studying the method on a wider range of practical tasks. | |
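The abstract outlines a greedy synthesis scheme: at each vertex of the logical tree, the most informative elementary feature is selected from the initial set and used to split the current subsample. The following minimal Python sketch illustrates this general idea under assumptions of our own choosing (binary elementary features and an information-gain branching criterion); it is not the author's published implementation, and all identifiers are illustrative.

```python
# Illustrative sketch only: a greedy logical classification tree over binary
# "elementary features". At each node the feature with the highest information
# gain on the current subsample is selected (assumed criterion, not the
# author's published one).
from collections import Counter
from math import log2


def entropy(labels):
    """Shannon entropy of a list of class labels."""
    total = len(labels)
    return -sum((c / total) * log2(c / total) for c in Counter(labels).values())


def best_feature(X, y, features):
    """Return the elementary (binary) feature with the highest information gain."""
    base = entropy(y)
    best, best_gain = None, -1.0
    for f in features:
        left = [y[i] for i, x in enumerate(X) if x[f] == 0]
        right = [y[i] for i, x in enumerate(X) if x[f] == 1]
        if not left or not right:          # feature does not split the subsample
            continue
        split = (len(left) * entropy(left) + len(right) * entropy(right)) / len(y)
        if base - split > best_gain:
            best, best_gain = f, base - split
    return best


def build_tree(X, y, features):
    """Recursively build a logical classification tree as nested dictionaries."""
    if len(set(y)) == 1:                   # pure node: stop and emit the class
        return y[0]
    f = best_feature(X, y, features)
    if f is None:                          # no informative feature left: majority class
        return Counter(y).most_common(1)[0][0]
    node = {"feature": f}
    for value in (0, 1):                   # one branch per value of the selected feature
        idx = [i for i, x in enumerate(X) if x[f] == value]
        node[value] = build_tree([X[i] for i in idx], [y[i] for i in idx],
                                 [g for g in features if g != f])
    return node


# Tiny usage example with three binary elementary features.
X = [(0, 1, 0), (1, 1, 0), (1, 0, 1), (0, 0, 1)]
y = ["A", "A", "B", "B"]
print(build_tree(X, y, features=[0, 1, 2]))   # e.g. {'feature': 1, 0: 'B', 1: 'A'}
```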
dc.format.extent | 25-32 | |
dc.format.pages | 8 | |
dc.identifier.citation | Повхан І. Ф. Метод синтезу логічних дерев класифікації на підставі селекції елементарних ознак / І. Ф. Повхан // Український журнал інформаційних технологій. — Львів : Видавництво Львівської політехніки, 2022. — Том 4. — № 2. — С. 25–32. | |
dc.identifier.citationen | Povkhan I. F. Method for synthesizing logical classification trees based on the selection of elementary features / I. F. Povkhan // Ukrainian Journal of Information Technology. — Lviv : Lviv Polytechnic Publishing House, 2022. — Vol. 4. — No. 2. — P. 25–32. | |
dc.identifier.issn | 2707-1898 | |
dc.identifier.uri | https://ena.lpnu.ua/handle/ntb/61552 | |
dc.language.iso | uk | |
dc.publisher | Видавництво Львівської політехніки | |
dc.publisher | Lviv Polytechnic Publishing House | |
dc.relation.ispartof | Український журнал інформаційних технологій, 2 (4), 2022 | |
dc.relation.ispartof | Ukrainian Journal of Information Technology, 2 (4), 2022 | |
dc.relation.references | [1] Bodyanskiy, Y., Vynokurova, O., Setlak, G. & Pliss, I. (2015). Hybrid neuro-neo-fuzzy system and its adaptive learning algorithm, Xth Scien. and Tech. Conf. "Computer Sciences and Information Technologies" (CSIT), Lviv, 111–114. https://doi.org/10.1109/STC-CSIT.2015.7325445 | |
dc.relation.references | [2] Breiman, L., Friedman, J. H., Olshen, R. A., & Stone, C. J. (1984). Classification and regression trees. Boca Raton, Chapman and Hall/CRC, 368 p. | |
dc.relation.references | [3] De Mántaras, R. L. (1991). A distance-based attribute selection measure for decision tree induction. Machine learning, 6(1), 81–92. https://doi.org/10.1023/A:1022694001379 | |
dc.relation.references | [4] Deng, H., Runger, G., & Tuv, E. (2011). Bias of importance measures for multi-valued attributes and solutions. Proceedings of the 21st International Conference on Artificial Neural Networks (ICANN), 293–300. https://doi.org/10.1007/978-3-642-21738-8_38 | |
dc.relation.references | [5] Hastie, T., Tibshirani, R., & Friedman, J. (2008). The Elements of Statistical Learning. Berlin, Springer, 768 p. https://doi.org/10.1007/978-0-387-84858-7 | |
dc.relation.references | [6] Kamiński, B., Jakubczyk, M., & Szufel, P. (2017). A framework for sensitivity analysis of decision trees. Central European Journal of Operations Research, 26 (1), 135–159. https://doi.org/10.1007/s10100-017-0479-6 | |
dc.relation.references | [7] Karimi, K., & Hamilton, H. J. (2011). Generation and Interpretation of Temporal Decision Rules. International Journal of Computer Information Systems and Industrial Management Applications, 3, 314–323. | |
dc.relation.references | [8] Koskimaki, H., Juutilainen, I., Laurinen, P., & Roning, J. (2008). Two-level clustering approach to training data instance selection: a case study for the steel industry, Neural Networks: International Joint Conference (IJCNN-2008), Hong Kong, 1-8 June 2008: proceedings. Los Alamitos, IEEE, 3044–3049. https://doi.org/10.1109/IJCNN.2008.4634228 | |
dc.relation.references | [9] Kotsiantis, S. B. (2007). Supervised Machine Learning: A Review of Classification Techniques. Informatica, 31, 249–268. | |
dc.relation.references | [10] Laver, V. O., & Povkhan, I. F. (2019). The algorithms for constructing a logical tree of classification in pattern recognition problems. Scientific notes of the Tauride national University. Series: technical Sciences, 30(69)(4), 100–106. https://doi.org/10.32838/2663-5941/2019.4-1/18 | |
dc.relation.references | [11] Miyakawa, M. (1989). Criteria for selecting a variable in the construction of efficient decision trees. IEEE Transactions on Computers, 38(1), 130–141. https://doi.org/10.1109/12.8736 | |
dc.relation.references | [12] Painsky, A., & Rosset, S. (2017). Cross-validated variable selection in tree-based methods improves predictive performance, IEEE Transactions on Pattern Analysis and Machine Intelligence, 39(11), 2142–2153. https://doi.org/10.1109/TPAMI.2016.2636831 | |
dc.relation.references | [13] Povhan, I. (2016). Designing of recognition system of discrete objects, IEEE First International Conference on Data Stream Mining & Processing (DSMP), Lviv, Ukraine, 226–231. | |
dc.relation.references | [14] Povhan, I. (2019). General scheme for constructing the most complex logical tree of classification in pattern recognition discrete objects. Electronics and Information Technologies, 11, 112–117. https://doi.org/10.30970/eli.11.7 | |
dc.relation.references | [15] Povhan, I. F. (2019). The problem of general estimation of the complexity of the maximum constructed logical classification tree. Bulletin of the national technical University Kharkiv Polytechnic Institute, 13, 104–117. https://doi.org/10.20998/2411-0558.2019.13.10 | |
dc.relation.references | [16] Povkhan, I. F. (2018). The problem of functional evaluation of a training sample in discrete object recognition problems. Scientific notes of the Tauride national University. Series: technical Sciences, 29(68)(6), 217–222. | |
dc.relation.references | [17] Povkhan, I. F. (2019). Features of synthesis of generalized features in the construction of recognition systems using the logical tree method, Materials of the international scientific and practical conference "Information technologies and computer modeling ІТКМ-2019". Ivаnо-Frаnkivsk, 169–174. | |
dc.relation.references | [18] Povkhan, I. F. (2019). Features random logic of the classification trees in the pattern recognition problems. Scientific notes of the Tauride national University. Series: technical Sciences, 30(69)(5), 152–161. https://doi.org/10.32838/2663-5941/2019.5-1/22 | |
dc.relation.references | [19] Quinlan, J. R. (1986). Induction of Decision Trees. Machine Learning, 1, 81–106. https://doi.org/10.1007/BF00116251 | |
dc.relation.references | [20] Srikant, R., & Agrawal, R. (1997). Mining generalized association rules. Future Generation Computer Systems, 13(2), 161–180. https://doi.org/10.1016/S0167-739X(97)00019-8 | |
dc.relation.references | [21] Subbotin, S. (2013). The neuro-fuzzy network synthesis and simplification on precedents in problems of diagnosis and pattern recognition. Optical Memory and Neural Networks (Information Optics), 22(2), 97–103. https://doi.org/10.3103/S1060992X13020082 | |
dc.relation.references | [22] Subbotin, S. A. (2013). Methods of sampling based on exhaustive and evolutionary search. Automatic Control and Computer Sciences, 47(30), 113–121. https://doi.org/10.3103/S0146411613030073 | |
dc.relation.references | [23] Subbotin, S. A. (2014). Methods and characteristics of locality-preserving transformations in the problems of computational intelligence. Radio Electronics, Computer Science, Control, (1), 120–128. https://doi.org/10.15588/1607-3274-2014-1-17 | |
dc.relation.references | [24] Subbotin, S. A. (2019). Construction of decision trees for the case of low-information features. Radio Electronics, Computer Science, Control, (1), 121–130. https://doi.org/10.15588/1607-3274-2019-1-12 | |
dc.relation.references | [25] Subbotin, S., & Oliinyk, A. (2017). The dimensionality reduction methods based on computational intelligence in problems of object classification and diagnosis, Recent Advances in Systems, Control and Information Technology, [eds.: R. Szewczyk, M. Kaliczyńska]. Cham, Springer, 11–19. (Advances in Intelligent Systems and Computing, 543). https://doi.org/10.1007/978-3-319-48923-0_2 | |
dc.relation.references | [26] Vasilenko, Y. A., Vashuk, F. G., & Povkhan, I. F. (2011). The problem of estimating the complexity of logical trees recognition and a general method for optimizing them. Scientific and technical journal "European Journal of Enterprise Technologies", 6/4(54), 24–28. | |
dc.relation.references | [27] Vasilenko, Y. A., Vashuk, F. G., & Povkhan, I. F. (2012). General estimation of minimization of tree logical structures. European Journal of Enterprise Technologies, 1/4(55), 29–33. | |
dc.relation.references | [28] Vasilenko, Y. A., Vashuk, F. G., Povkhan, I. F., Kovach, M. Y., & Nikarovich, O. D. (2004). Minimizing logical tree structures in image recognition tasks. European Journal of Enterprise Technologies, 3(9), 12–16. | |
dc.relation.references | [29] Vasilenko, Y. A., Vasilenko, E. Y., & Povkhan, I. F. (2002). Defining the concept of a feature in pattern recognition theory. Artificial Intelligence, 4, 512–517. | |
dc.relation.references | [30] Vasilenko, Y. A., Vasilenko, E. Y., & Povkhan, I. F. (2003). Branched feature selection method in mathematical modeling of multi-level image recognition systems. Artificial Intelligence, 7, 246–249. | |
dc.relation.references | [31] Vasilenko, Y. A., Vasilenko, E. Y., & Povkhan, I. F. (2004). Conceptual basis of image recognition systems based on the branched feature selection method. European Journal of Enterprise Technologies, 7(1), 13–15. | |
dc.relation.references | [32] Utgoff, P. E. (1989). Incremental Induction of Decision Trees. Machine Learning, 4, 161–186. | |
dc.relation.uri | https://doi.org/10.1109/STC-CSIT.2015.7325445 | |
dc.relation.uri | https://doi.org/10.1023/A:1022694001379 | |
dc.relation.uri | https://doi.org/10.1007/978-3-642-21738-8_38 | |
dc.relation.uri | https://doi.org/10.1007/978-0-387-84858-7 | |
dc.relation.uri | https://doi.org/10.1007/s10100-017-0479-6 | |
dc.relation.uri | https://doi.org/10.1109/IJCNN.2008.4634228 | |
dc.relation.uri | https://doi.org/10.32838/2663-5941/2019.4-1/18 | |
dc.relation.uri | https://doi.org/10.1109/12.8736 | |
dc.relation.uri | https://doi.org/10.1109/TPAMI.2016.2636831 | |
dc.relation.uri | https://doi.org/10.30970/eli.11.7 | |
dc.relation.uri | https://doi.org/10.20998/2411-0558.2019.13.10 | |
dc.relation.uri | https://doi.org/10.32838/2663-5941/2019.5-1/22 | |
dc.relation.uri | https://doi.org/10.1007/BF00116251 | |
dc.relation.uri | https://doi.org/10.1016/S0167-739X(97)00019-8 | |
dc.relation.uri | https://doi.org/10.3103/S1060992X13020082 | |
dc.relation.uri | https://doi.org/10.3103/S0146411613030073 | |
dc.relation.uri | https://doi.org/10.15588/1607-3274-2014-1-17 | |
dc.relation.uri | https://doi.org/10.15588/1607-3274-2019-1-12 | |
dc.relation.uri | https://doi.org/10.1007/978-3-319-48923-0_2 | |
dc.rights.holder | © Національний університет “Львівська політехніка”, 2022 | |
dc.subject | логічне дерево | |
dc.subject | селекція ознак | |
dc.subject | критерій розгалуження | |
dc.subject | дискретний об'єкт | |
dc.subject | logical tree | |
dc.subject | feature selection | |
dc.subject | branching criterion | |
dc.subject | discrete object | |
dc.title | Метод синтезу логічних дерев класифікації на підставі селекції елементарних ознак | |
dc.title.alternative | Method for synthesizing logical classification trees based on the selection of elementary features | |
dc.type | Article |