Проблема збіжності процедури побудови класифікаторів у схемах логічних і алгоритмічних дерев класифікації

dc.citation.epage36
dc.citation.issue1
dc.citation.journalTitleУкраїнський журнал інформаційних технологій
dc.citation.spage29
dc.citation.volume4
dc.contributor.affiliationУжгородський національний університет
dc.contributor.affiliationUzhhorod National University
dc.contributor.authorПовхан, І. Ф.
dc.contributor.authorPovkhan, I. F.
dc.coverage.placenameЛьвів
dc.coverage.placenameLviv
dc.date.accessioned2024-03-20T09:41:07Z
dc.date.available2024-03-20T09:41:07Z
dc.date.created2022-02-28
dc.date.issued2022-02-28
dc.description.abstractРозглядається проблема збіжності процедури синтезу схем класифікаторів у методах логічних і алгоритмічних дерев класифікації. Запропонована верхня оцінка складності схеми дерева алгоритмів у задачі апроксимації масиву реальних даних набором узагальнених ознак з фіксованим критерієм зупинки процедури розгалуження на етапі побудови дерева класифікації. Даний підхід дає змогу забезпечити необхідну точність моделі, оцінити її складність, знизити кількість розгалужень та досягти необхідних показників ефективності. Вперше для методів побудови структур логічних і алгоритмічних дерев класифікації дано верхню оцінку збіжності побудови дерев класифікації. Запропонована оцінка збіжності процедури побудови класифікаторів для структур ЛДК/АДК дає можливість будувати економні та ефективні моделі класифікації заданої точності. Метод побудови алгоритмічного дерева класифікації базується на поетапній апроксимації навчальної вибірки довільного об'єму та структури набором незалежних алгоритмів класифікації. Даний метод при формуванні поточної вершини алгоритмічного дерева (вузла, узагальненої ознаки) забезпечує виділення найбільш ефективних, якісних автономних алгоритмів класифікації з початкового набору. Методи синтезу логічних і алгоритмічних дерев класифікації були реалізовані в бібліотеці алгоритмів програмної системи "ОРІОН ІІІ" для розв'язання різноманітних прикладних задач штучного інтелекту. Проведені практичні застосування підтвердили працездатність побудованих моделей дерев класифікації та розробленого програмного забезпечення. В роботі наведена оцінка збіжності процедури побудови схем розпізнавання для випадків логічних і алгоритмічних дерев класифікації в умовах слабкого та сильного розділення класів початкової навчальної вибірки.
dc.description.abstractThe problem of convergence of the procedure for synthesizing classifier schemes in the methods of logical and algorithmic classification trees is considered. An upper estimate of the complexity of the algorithm tree scheme is proposed for the problem of approximating an array of real data with a set of generalized features under a fixed criterion for stopping the branching procedure at the stage of constructing a classification tree. This approach makes it possible to ensure the necessary accuracy of the model, assess its complexity, reduce the number of branches, and achieve the required performance indicators. For the first time, an upper convergence estimate is given for methods of constructing logical and algorithmic classification tree structures. The proposed convergence estimate of the procedure for constructing classifiers for LCT/ACT structures makes it possible to build economical and efficient classification models of a given accuracy. The method of constructing an algorithmic classification tree is based on a step-by-step approximation of an initial sample of arbitrary volume and structure by a set of independent classification algorithms. When forming the current vertex (node, generalized feature) of the algorithmic tree, this method selects the most efficient, high-quality autonomous classification algorithms from the initial set. This approach to constructing the resulting classification tree can significantly reduce the size and complexity of the tree and the total number of branches, vertices, and tiers of the structure, and can improve the quality of its subsequent analysis, its interpretability, and its ability to be decomposed. Methods for synthesizing logical and algorithmic classification trees were implemented in the algorithm library of the “Orion III” software system for solving various applied problems of artificial intelligence. Practical applications have confirmed the operability of the constructed classification tree models and the developed software. The paper estimates the convergence of the procedure for constructing recognition schemes for logical and algorithmic classification trees under conditions of weak and strong class separation in the initial training sample. Prospects for further research and testing include evaluating the convergence of the ACT synthesis procedure in the constrained algorithmic classification tree method, which maintains a criterion for stopping the construction of the tree model by the depth of the structure, as well as optimizing its software implementations, introducing new types of algorithmic trees, and experimentally studying the method on a wider range of practical problems.
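The abstract describes the ACT construction only at a high level: at each new vertex, the most accurate autonomous classifier from a candidate pool is selected for the data reaching that vertex, and branching stops under a fixed criterion. A minimal toy sketch of that greedy idea follows; all function names, data, and the error-rate threshold are invented for illustration and do not reproduce the paper's actual procedure.

```python
# Hedged sketch only: a greedy chain of classifiers that step by step
# approximates the training sample, picking the best candidate classifier
# per vertex, with a fixed stopping criterion (error rate eps, depth cap).
# Every name and parameter here is an assumption, not the paper's method.

def build_act(samples, labels, classifiers, eps=0.05, depth=0, max_depth=3):
    """Build a chain of nodes approximating (samples, labels).

    classifiers: candidate functions sample -> predicted label.
    Branching stops once the node's error rate falls below eps or the
    depth limit is reached.
    """
    # Select the candidate with the fewest errors on this node's data.
    best = min(classifiers,
               key=lambda c: sum(c(s) != y for s, y in zip(samples, labels)))
    misses = [(s, y) for s, y in zip(samples, labels) if best(s) != y]
    node = {"clf": best, "child": None}
    # Keep approximating only the still-misclassified part of the sample.
    if misses and len(misses) / len(samples) > eps and depth < max_depth:
        xs, ys = zip(*misses)
        node["child"] = build_act(list(xs), list(ys), classifiers,
                                  eps, depth + 1, max_depth)
    return node

# Toy data: class 1 for x >= 5, with one exception at x = 2.
samples = list(range(10))
labels = [1 if x >= 5 else 0 for x in samples]
labels[2] = 1

c_threshold = lambda x: 1 if x >= 5 else 0   # invented candidate classifiers
c_parity = lambda x: 1 if x % 2 == 0 else 0
c_zero = lambda x: 0

tree = build_act(samples, labels, [c_threshold, c_parity, c_zero])
```

On the toy data, the threshold rule wins at the root, and a second vertex is added only for the single sample it misclassifies, mirroring how each new ACT vertex targets the residual error of the partial tree.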
dc.format.extent29-36
dc.format.pages8
dc.identifier.citationПовхан І. Ф. Проблема збіжності процедури побудови класифікаторів у схемах логічних і алгоритмічних дерев класифікації / І. Ф. Повхан // Український журнал інформаційних технологій. — Львів : Видавництво Львівської політехніки, 2022. — Том 4. — № 1. — С. 29–36.
dc.identifier.citationenPovkhan I. F. Convergence problem schemes for constructing structures of logical and algorithmic classification trees / I. F. Povkhan // Ukrainian Journal of Information Technology. — Lviv : Lviv Polytechnic Publishing House, 2022. — Vol 4. — No 1. — P. 29–36.
dc.identifier.doihttps://doi.org/10.23939/ujit2022.01.029
dc.identifier.issn2707-1898
dc.identifier.urihttps://ena.lpnu.ua/handle/ntb/61519
dc.language.isouk
dc.publisherВидавництво Львівської політехніки
dc.publisherLviv Polytechnic Publishing House
dc.relation.ispartofУкраїнський журнал інформаційних технологій, 1 (4), 2022
dc.relation.ispartofUkrainian Journal of Information Technology, 1 (4), 2022
dc.relation.references[1] Hastie, T., Tibshirani, R., & Friedman, J. (2008). The Elements of Statistical Learning. Berlin, Springer. https://doi.org/10.1007/978-0-387-84858-7
dc.relation.references[2] Quinlan, J. R. (1986). Induction of Decision Trees, Machine Learning, 1, 81–106. https://doi.org/10.1007/BF00116251
dc.relation.references[3] Breiman, L. L., Friedman, J. H., Olshen, R. A., & Stone, C. J. (1984). Classification and regression trees. Boca Raton, Chapman and Hall/CRC.
dc.relation.references[4] Lupei, M., Mitsa, A., Repariuk, V., & Sharkan, V. (2020). Identification of authorship of Ukrainian-language texts of journalistic style using neural networks. Eastern-European Journal of Enterprise Technologies, 1-2(103), 30–36. https://doi.org/10.15587/1729-4061.2020.195041
dc.relation.references[5] Subbotin, S. A., & Oliinyk, A. A. (2017). The Dimensionality Reduction Methods Based on Computational Intelligence in Problems of Object Classification and Diagnosis. Szewczyk, R., Kaliczyńska, M. (eds) Recent Advances in Systems, Control and Information Technology. SCIT 2016. Advances in Intelligent Systems and Computing, vol 543, 11–19. Springer, Cham. https://doi.org/10.1007/978-3-319-48923-0_2
dc.relation.references[6] Miyakawa, M. (1989). Criteria for selecting a variable in the construction of efficient decision trees, IEEE Transactions on Computers, 38(1), 130–141. https://doi.org/10.1109/12.8736
dc.relation.references[7] Koskimaki, H., Juutilainen, I., Laurinen, P., & Roning, J. (2008). Two-level clustering approach to training data instance selection: a case study for the steel industry, Neural Networks: International Joint Conference (IJCNN-2008), Hong Kong, 1–8 June 2008: proceedings. Los Alamitos, IEEE, 3044–3049. https://doi.org/10.1109/IJCNN.2008.4634228
dc.relation.references[8] Subbotin, S. (2013). The neuro-fuzzy network synthesis and simplification on precedents in problems of diagnosis and pattern recognition, Optical Memory and Neural Networks, 22(2), 97–103. https://doi.org/10.3103/S1060992X13020082
dc.relation.references[9] Subbotin, S. A. (2013). Methods of sampling based on exhaustive and evolutionary search, Automatic Control and Computer Sciences, 47(3), 113–121. https://doi.org/10.3103/S0146411613030073
dc.relation.references[10] De Mántaras, R. L. (1991). A distance-based attribute selection measure for decision tree induction, Machine Learning, 6(1), 81–92. https://doi.org/10.1023/A:1022694001379
dc.relation.references[11] Karimi, K., & Hamilton, H.J. (2011). Generation and Interpretation of Temporal Decision Rules, International Journal of Computer Information Systems and Industrial Management Applications, 3, 314–323.
dc.relation.references[12] Kamiński, B., Jakubczyk, M., & Szufel, P. (2017). A framework for sensitivity analysis of decision trees, Central European Journal of Operations Research, 26(1), 135–159. https://doi.org/10.1007/s10100-017-0479-6
dc.relation.references[13] Deng, H., Runger, G., & Tuv, E. (2011). Bias of importance measures for multi-valued attributes and solutions, Proceedings of the 21st International Conference on Artificial Neural Networks (ICANN), 293–300. https://doi.org/10.1007/978-3-642-21738-8_38
dc.relation.references[14] Subbotin, S. A. (2019). Construction of decision trees for the case of low-information features, Radio Electronics, Computer Science, Control, 1, 121–130. https://doi.org/10.15588/1607-3274-2019-1-12
dc.relation.references[15] Deng, H., Runger, G., & Tuv, E. (2011). Bias of importance measures for multi-valued attributes and solutions, 21st International Conference on Artificial Neural Networks (ICANN), Espoo, 14–17 June 2011: proceedings. Berlin, Springer-Verlag, 2, 293–300. https://doi.org/10.1007/978-3-642-21738-8_38
dc.relation.references[16] Painsky, A., & Rosset, S. (2017). Cross-validated variable selection in tree-based methods improves predictive performance, IEEE Transactions on Pattern Analysis and Machine Intelligence, 39(11), 2142–2153. https://doi.org/10.1109/TPAMI.2016.2636831
dc.relation.references[17] Subbotin, S. A. (2014). Methods and characteristics of locality preserving transformations in the problems of computational intelligence, Radio Electronics, Computer Science, Control, 1, 120–128. https://doi.org/10.15588/1607-3274-2014-1-17
dc.relation.references[18] Kotsiantis, S. B. (2007). Supervised Machine Learning: A Review of Classification Techniques, Informatica, 31, 249–268.
dc.relation.references[19] Zhuravlev, Yu. I., & Nikiforov, V. V. (1971). Recognition algorithms based on the calculation of estimates, Cybernetics, 3, 1–11.
dc.relation.references[20] Vasilenko, Y. A., Vasilenko, E. Y., & Povkhan, I. F. (2003). Branched feature selection method in mathematical modeling of multi-level image recognition systems, Artificial Intelligence, 7, 246−249.
dc.relation.references[21] Povkhan, I. (2020). A constrained method of constructing the logic classification trees on the basis of elementary attribute selection, CEUR Workshop Proceedings: Proceedings of the Second International Workshop on Computer Modeling and Intelligent Systems (CMIS-2020), Zaporizhzhia, Ukraine, April 15–19, 2020. Zaporizhzhia, 2608, 843–857. https://doi.org/10.32782/cmis/2608-63
dc.relation.references[22] Vasilenko, Y. A., Vasilenko, E. Y., & Povkhan, I. F. (2004). Conceptual basis of image recognition systems based on the branched feature selection method, European Journal of Enterprise Technologies, 7(1), 13–15.
dc.relation.references[23] Povkhan, I., & Lupei, M. (2020). The algorithmic classification trees. Proceedings of the "2020 IEEE Third International Conference on Data Stream Mining & Processing (DSMP)", August 21–25, Lviv, Ukraine, 37–44. https://doi.org/10.1109/DSMP47368.2020.9204198
dc.relation.references[24] Povkhan, I., Lupei, M., Kliap, M., & Laver, V. (2020). The issue of efficient generation of generalized features in algorithmic classification tree methods. International Conference on Data Stream Mining and Processing: DSMP Data Stream Mining & Processing, Springer, Cham, 98–113. https://doi.org/10.1007/978-3-030-61656-4_6
dc.relation.references[25] Povkhan, I. (2020). Classification models of flood-related events based on algorithmic trees. Eastern-European Journal of Enterprise Technologies, 6(4), 58–68. https://doi.org/10.15587/1729-4061.2020.219525
dc.relation.references[26] Rabcan, J., Levashenko, V., Zaitseva, E., Kvassay, M., & Subbotin, S. (2019). Application of Fuzzy Decision Tree for Signal Classification. IEEE Transactions on Industrial Informatics, 15(10), 5425–5434. https://doi.org/10.1109/TII.2019.2904845
dc.relation.references[27] Utgoff, P. E. (1989). Incremental induction of decision trees. Machine Learning, 4(2), 161–186. https://doi.org/10.1023/A:1022699900025
dc.relation.references[28] Hyafil, L., & Rivest, R. L. (1976). Constructing optimal binary decision trees is NP-complete. Information Processing Letters, 5(1), 15–17. https://doi.org/10.1016/0020-0190(76)90095-8
dc.relation.references[29] Wang, H., & Hong, M. (2019). Online ad effectiveness evaluation with a two-stage method using a Gaussian filter and decision tree approach. Electronic Commerce Research and Applications. 35, Article 100852. https://doi.org/10.1016/j.elerap.2019.100852
dc.relation.references[30] Kaftannikov, I. L., & Parasich, A. V. (2015). Decision Trees Features of Application in Classification Problems. Bulletin of the South Ural State University. Ser. Computer Technologies, Automatic Control, Radio Electronics, 15(3), 26–32. https://doi.org/10.14529/ctcr150304
dc.relation.references[31] Povhan, I. F. (2020). Logical recognition tree construction on the basis a step-to-step elementary attribute selection. Radio Electronics, Computer Science, Control, 2, 95–106. https://doi.org/10.15588/1607-3274-2020-2-10
dc.relation.references[32] Bodyanskiy, Y., Vynokurova, O., Setlak, G., & Pliss, I. (2015). Hybrid neuro-neo-fuzzy system and its adaptive learning algorithm, Xth Scien. and Tech. Conf. "Computer Sciences and Information Technologies" (CSIT), 111–114. https://doi.org/10.1109/STC-CSIT.2015.7325445
dc.relation.references[33] Srikant, R., Agrawal, R. (1997). Mining generalized association rules, Future Generation Computer Systems, 13(2), 161–180. https://doi.org/10.1016/S0167-739X(97)00019-8
dc.relation.references[34] Vasilenko, Y. A., & Vashuk, F. G. (2012). General estimation of minimization of tree logical structures, European Journal of Enterprise Technologies, 1/4(55), 29–33.
dc.relation.references[35] Kushneryk, P., Kondratenko, Y., & Sidenko, I. (2019). Intelligent dialogue system based on deep learning technology. 15th International Conference on ICT in Education, Research, and Industrial Applications: PhD Symposium (ICTERI 2019: PhD Symposium), Kherson, Ukraine, 2403, 53–62.
dc.relation.references[36] Kotsovsky, V., Geche, F., & Batyuk, A. (2018). Finite generalization of the offline spectral learning. Proceedings of the 2018 IEEE Second International Conference on Data Stream Mining & Processing (DSMP), Lviv, Ukraine August 21–25, 356–360. https://doi.org/10.1109/DSMP.2018.8478584
dc.relation.urihttps://doi.org/10.1007/978-0-387-84858-7
dc.relation.urihttps://doi.org/10.1007/BF00116251
dc.relation.urihttps://doi.org/10.15587/1729-4061.2020.195041
dc.relation.urihttps://doi.org/10.1007/978-3-319-48923-0_2
dc.relation.urihttps://doi.org/10.1109/12.8736
dc.relation.urihttps://doi.org/10.1109/IJCNN.2008.4634228
dc.relation.urihttps://doi.org/10.3103/S1060992X13020082
dc.relation.urihttps://doi.org/10.3103/S0146411613030073
dc.relation.urihttps://doi.org/10.1023/A:1022694001379
dc.relation.urihttps://doi.org/10.1007/s10100-017-0479-6
dc.relation.urihttps://doi.org/10.1007/978-3-642-21738-8_38
dc.relation.urihttps://doi.org/10.15588/1607-3274-2019-1-12
dc.relation.urihttps://doi.org/10.1109/TPAMI.2016.2636831
dc.relation.urihttps://doi.org/10.15588/1607-3274-2014-1-17
dc.relation.urihttps://doi.org/10.32782/cmis/2608-63
dc.relation.urihttps://doi.org/10.1109/DSMP47368.2020.9204198
dc.relation.urihttps://doi.org/10.1007/978-3-030-61656-4_6
dc.relation.urihttps://doi.org/10.15587/1729-4061.2020.219525
dc.relation.urihttps://doi.org/10.1109/TII.2019.2904845
dc.relation.urihttps://doi.org/10.1023/A:1022699900025
dc.relation.urihttps://doi.org/10.1016/0020-0190(76)90095-8
dc.relation.urihttps://doi.org/10.1016/j.elerap.2019.100852
dc.relation.urihttps://doi.org/10.14529/ctcr150304
dc.relation.urihttps://doi.org/10.15588/1607-3274-2020-2-10
dc.relation.urihttps://doi.org/10.1109/STC-CSIT.2015.7325445
dc.relation.urihttps://doi.org/10.1016/S0167-739X(97)00019-8
dc.relation.urihttps://doi.org/10.1109/DSMP.2018.8478584
dc.rights.holder© Національний університет “Львівська політехніка”, 2022
dc.subjectлогічне дерево
dc.subjectалгоритмічне дерево
dc.subjectкласифікатор
dc.subjectрозпізнавання образів
dc.subjectознака
dc.subjectнавчальна вибірка
dc.subjectlogical tree
dc.subjectalgorithmic tree
dc.subjectclassifier
dc.subjectpattern recognition
dc.subjectattribute
dc.subjecttraining sample
dc.titleПроблема збіжності процедури побудови класифікаторів у схемах логічних і алгоритмічних дерев класифікації
dc.title.alternativeConvergence problem schemes for constructing structures of logical and algorithmic classification trees
dc.typeArticle

Files

Original bundle
2022v4n1_Povkhan_I_F-Convergence_problem_schemes_29-36.pdf (6.26 MB, Adobe Portable Document Format)
2022v4n1_Povkhan_I_F-Convergence_problem_schemes_29-36__COVER.png (1.86 MB, Portable Network Graphics)

License bundle
license.txt (1.76 KB, Plain Text)