Diabetes prediction using an improved machine learning approach
dc.citation.epage | 735 | |
dc.citation.issue | 4 | |
dc.citation.spage | 726 | |
dc.contributor.affiliation | Університет Султана Мулая Слімана | |
dc.contributor.affiliation | University Sultan Moulay Slimane | |
dc.contributor.author | Лякіні, С. | |
dc.contributor.author | Нахауі, М. | |
dc.contributor.author | Lyaqini, S. | |
dc.contributor.author | Nachaoui, M. | |
dc.coverage.placename | Львів | |
dc.coverage.placename | Lviv | |
dc.date.accessioned | 2023-11-01T07:49:24Z | |
dc.date.available | 2023-11-01T07:49:24Z | |
dc.date.created | 2021-03-01 | |
dc.date.issued | 2021-03-01 | |
dc.description.abstract | У статті розглядається модель машинного навчання, що походить з області охорони здоров’я, а саме: прогресування діабету. Модель переформульовується в регуляризовану задачу оптимізації. Член правдоподібності — це норма L1, а оптимізаційний простір мінімуму побудований за допомогою відтворюючого ядра гільбертового простору (ВЯГП). Чисельне наближення моделі реалізується методом Адама, який є успішним у чисельних експериментах (порівняно з алгоритмом стохастичного градієнтного спуску (СГС)). | |
dc.description.abstract | This paper deals with a machine-learning model arising from the healthcare sector, namely diabetes progression. The model is reformulated as a regularized optimization problem. The fidelity term is the L1 norm, and the minimization space is constructed by a reproducing kernel Hilbert space (RKHS). The numerical approximation of the model is realized by the Adam method, which proves successful in the numerical experiments (compared with the stochastic gradient descent (SGD) algorithm). | |
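The abstract describes minimizing an L1 fidelity term over an RKHS with the Adam optimizer. The following is a minimal illustrative sketch of that kind of model, not the authors' exact formulation: by the representer theorem the minimizer is a kernel expansion f(x) = Σᵢ αᵢ k(xᵢ, x), so the objective mean|Kα − y| + λ αᵀKα can be optimized over the coefficients α with Adam on a subgradient. The RBF kernel, toy data, and all hyperparameter values below are assumptions chosen for the demonstration.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel -- an assumed kernel choice
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_l1_rkhs_adam(X, y, lam=1e-3, gamma=1.0, lr=0.05, steps=2000):
    """Minimize mean|K a - y| + lam * a^T K a over RKHS coefficients a,
    applying Adam to a subgradient of the non-smooth L1 fidelity term."""
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    a = np.zeros(n)
    m, v = np.zeros(n), np.zeros(n)      # Adam first/second moment estimates
    b1, b2, eps = 0.9, 0.999, 1e-8
    for t in range(1, steps + 1):
        r = K @ a - y
        # subgradient: K is symmetric, sign(r) is a subgradient of |r|
        g = K @ np.sign(r) / n + 2 * lam * (K @ a)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g * g
        mh = m / (1 - b1 ** t)           # bias-corrected moments
        vh = v / (1 - b2 ** t)
        a -= lr * mh / (np.sqrt(vh) + eps)
    return a, K

# toy 1-D regression problem standing in for the diabetes data
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=80)
a, K = fit_l1_rkhs_adam(X, y)
mae = np.mean(np.abs(K @ a - y))
print(f"training MAE: {mae:.3f}")
```

A smoothed surrogate for the absolute value (as in the smoothing-function references below) would make the objective differentiable; the plain subgradient is used here only to keep the sketch short.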
dc.format.extent | 726-735 | |
dc.format.pages | 10 | |
dc.identifier.citation | Lyaqini S. Diabetes prediction using an improved machine learning approach / S. Lyaqini, M. Nachaoui // Mathematical Modeling and Computing. — Lviv : Lviv Politechnic Publishing House, 2021. — Vol 8. — No 4. — P. 726–735. | |
dc.identifier.citationen | Lyaqini S. Diabetes prediction using an improved machine learning approach / S. Lyaqini, M. Nachaoui // Mathematical Modeling and Computing. — Lviv : Lviv Politechnic Publishing House, 2021. — Vol 8. — No 4. — P. 726–735. | |
dc.identifier.doi | 10.23939/mmc2021.04.726 | |
dc.identifier.uri | https://ena.lpnu.ua/handle/ntb/60437 | |
dc.language.iso | en | |
dc.publisher | Видавництво Львівської політехніки | |
dc.publisher | Lviv Politechnic Publishing House | |
dc.relation.ispartof | Mathematical Modeling and Computing, 4 (8), 2021 | |
dc.relation.references | [1] Ricci P., Blotière P. O., Weill A., Simon D., Tuppin P., Ricordeau P., Allemand H. Diabète traité: quelles évolutions entre 2000 et 2009 en France. Bull. Epidemiol. Hebd. 42 (42–43), 425–431 (2010). | |
dc.relation.references | [2] Isnard R., Legrand L., Pousset F. Insuffisance cardiaque et diabète: données épidémiologiques, phénotype et impact sur le pronostic. Médecine des Maladies Métaboliques. 15 (3), 246–251 (2021). | |
dc.relation.references | [3] Kavakiotis I., Tsave O., Salifoglou A., Maglaveras N., Vlahavas I., Chouvarda I. Machine learning and data mining methods in diabetes research. Computational and Structural Biotechnology Journal. 15, 104–116 (2017). | |
dc.relation.references | [4] Perveen S., Shahbaz M., Keshavjee K., Guergachi A. Metabolic syndrome and development of diabetes mellitus: Predictive modeling based on machine learning techniques. IEEE Access. 7, 1365–1375 (2018). | |
dc.relation.references | [5] Luo G. Automatically explaining machine learning prediction results: a demonstration on type 2 diabetes risk prediction. Health Information Science and Systems. 4 (1), Article number: 2 (2016). | |
dc.relation.references | [6] Benhamou P. Y., Lablanche S. Diabète de type 1: perspectives technologiques. Mise Au Point. 11–16 (2018). | |
dc.relation.references | [7] Hofmann T., Schölkopf B., Smola A. J. Kernel methods in machine learning. The Annals of Statistics. 36 (3), 1171–1220 (2008). | |
dc.relation.references | [8] Aronszajn N. Theory of Reproducing Kernels. Transactions of the American Mathematical Society. 68, 337–404 (1950). | |
dc.relation.references | [9] Rosasco L., De Vito E., Caponnetto A., Piana M., Verri A. Are Loss Functions All the Same? Neural Computation. 16 (5), 1063–1076 (2004). | |
dc.relation.references | [10] Lyaqini S., Quafafou M., Nachaoui M., Chakib A. Supervised learning as an inverse problem based on non-smooth loss function. Knowledge and Information Systems. 62, 3039–3058 (2020). | |
dc.relation.references | [11] Lyaqini S., Nachaoui M., Quafafou M. Non-smooth classification model based on new smoothing technique. Journal of Physics: Conference Series. 1743, 012025 (2021). | |
dc.relation.references | [12] Nachaoui M. Parameter learning for combined first and second order total variation for image reconstruction. Advanced Mathematical Models & Applications. 5 (1), 53–69 (2020). | |
dc.relation.references | [13] El Mourabit I., El Rhabi M., Hakim A., Laghrib A., Moreau E. A new denoising model for multi-frame super-resolution image reconstruction. Signal Processing. 132, 51–65 (2017). | |
dc.relation.references | [14] Chen C., Mangasarian O. L. A class of smoothing functions for nonlinear and mixed complementarity problems. Computational Optimization and Applications. 5 (2), 97–138 (1996). | |
dc.relation.references | [15] Lee Y. J., Hsieh W. F., Huang C. M. ε-SSVR: a smooth support vector machine for ε-insensitive regression. IEEE Transactions on Knowledge & Data Engineering. 17 (5), 678–685 (2005). | |
dc.relation.references | [16] Hajewski J., Oliveira S., Stewart D. Smoothed Hinge Loss and ℓ1 Support Vector Machines. 2018 IEEE International Conference on Data Mining Workshops (ICDMW). 1217–1223 (2018). | |
dc.relation.references | [17] Défossez A., Bottou L., Bach F., Usunier N. On the convergence of Adam and Adagrad. arXiv preprint arXiv:2003.02395 (2020). | |
dc.relation.references | [18] Fei Z., Wu Z., Xiao Y., Ma J., He W. A new short-arc fitting method with high precision using Adam optimization algorithm. Optik. 212, 164788 (2020). | |
dc.relation.references | [19] Rosales R., Schmidt M., Fung G. Fast Optimization Methods for L1 Regularization: A Comparative Study and Two New Approaches (2007). | |
dc.relation.references | [20] Hadamard J. Lectures on Cauchy’s problem in linear partial differential equations. New Haven, Yale University Press (1923). | |
dc.relation.references | [21] Girosi F., Jones M., Poggio T. Regularization theory and neural networks architectures. Neural Computation. 7 (2), 219–269 (1995). | |
dc.relation.references | [22] Schölkopf B., Herbrich R., Smola A. J. A generalized representer theorem. International Conference on Computational Learning Theory. 416–426 (2001). | |
dc.relation.references | [23] Boyd S., Vandenberghe L. Convex Optimization. Cambridge University Press, New York, USA (2004). | |
dc.relation.references | [24] Ruder S. An overview of gradient descent optimization algorithms. arXiv preprint arXiv:1609.04747 (2016). | |
dc.relation.references | [25] Efron B., Hastie T., Johnstone I., Tibshirani R. Least angle regression. Annals of Statistics. 32 (2), 407–499 (2004). | |
dc.rights.holder | © Національний університет “Львівська політехніка”, 2021 | |
dc.subject | контрольоване навчання | |
dc.subject | гладке наближення | |
dc.subject | алгоритм Адама | |
dc.subject | діагностика діабету | |
dc.subject | регуляризація Тихонова | |
dc.subject | гладка оптимізація | |
dc.subject | supervised learning | |
dc.subject | smooth approximation | |
dc.subject | Adam algorithm | |
dc.subject | diabetes diagnosis | |
dc.subject | Tikhonov regularization | |
dc.subject | smooth optimization | |
dc.title | Diabetes prediction using an improved machine learning approach | |
dc.title.alternative | Прогнозування діабету за допомогою вдосконаленого підходу машинного навчання | |
dc.type | Article |