Show the item's main metadata

dc.contributor.author: Abbasi, Mahmoud
dc.contributor.author: Shahraki, Amin
dc.contributor.author: Prieto Tejedor, Javier
dc.contributor.author: González Arrieta, María Angélica
dc.contributor.author: Corchado Rodríguez, Juan Manuel
dc.date.accessioned: 2025-07-04T10:33:54Z
dc.date.available: 2025-07-04T10:33:54Z
dc.date.issued: 2024-01-31
dc.identifier.citation: M. Abbasi, A. Shahraki, J. Prieto, A. G. Arrieta and J. M. Corchado, "Unleashing the Potential of Knowledge Distillation for IoT Traffic Classification," in IEEE Transactions on Machine Learning in Communications and Networking, vol. 2, pp. 221-239, 2024, doi: 10.1109/TMLCN.2024.3360915
dc.identifier.uri: http://hdl.handle.net/10366/166338
dc.description.abstract: [EN] The Internet of Things (IoT) has revolutionized our lives by generating large amounts of data; however, this data needs to be collected, processed, and analyzed in real time. Network Traffic Classification (NTC) in IoT is a crucial step for optimizing network performance, enhancing security, and improving user experience. Various methods have been introduced for NTC, and Machine Learning (ML) solutions have recently received considerable attention in this field. However, traditional ML methods struggle with the complexity and heterogeneity of IoT traffic, as well as with the limited resources of IoT devices. Deep learning shows promise but is computationally intensive for resource-constrained IoT devices. Knowledge distillation addresses this by compressing complex models into smaller ones suitable for IoT devices. In this paper, we examine the use of knowledge distillation for IoT traffic classification. Through experiments, we show that the student model achieves a balance between accuracy and efficiency: it exhibits accuracy similar to that of the larger teacher model while maintaining a smaller size. This makes it a suitable alternative for resource-constrained scenarios such as mobile or IoT traffic classification. We find that the knowledge distillation technique effectively transfers knowledge from the teacher model to the student model, even with reduced training data. The results also demonstrate the robustness of the approach, as the student model performs well even when certain classes are removed. Additionally, we highlight the trade-off between model capacity and computational cost, suggesting that increasing model size beyond a certain point may not be beneficial. The findings emphasize the value of soft labels in training student models with limited data resources.
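The teacher-to-student transfer via soft labels that the abstract describes follows the standard knowledge-distillation objective (Hinton et al.): a weighted sum of the usual hard-label cross-entropy and the KL divergence between temperature-softened teacher and student outputs. The sketch below is a minimal stdlib-only illustration of that objective, not code from the paper; the temperature `T`, weight `alpha`, and the toy logits are illustrative assumptions.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T produces softer distributions,
    # exposing the teacher's "dark knowledge" about non-target classes.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label,
                      T=4.0, alpha=0.5):
    # Hard-label term: cross-entropy of the student at temperature 1.
    p_student = softmax(student_logits)
    hard_loss = -math.log(p_student[true_label])
    # Soft-label term: KL(teacher || student) at temperature T,
    # scaled by T^2 to keep gradient magnitudes comparable.
    q_teacher = softmax(teacher_logits, T)
    q_student = softmax(student_logits, T)
    soft_loss = sum(t * math.log(t / s)
                    for t, s in zip(q_teacher, q_student))
    return alpha * hard_loss + (1 - alpha) * (T ** 2) * soft_loss
```

Because the soft-label term carries the teacher's full output distribution rather than a single class index, it remains informative even when the hard-label training data is reduced, which is consistent with the abstract's finding on limited data resources.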
dc.description.sponsorship: European Union
dc.language.iso: eng
dc.publisher: Institute of Electrical and Electronics Engineers Inc.
dc.rights: Attribution-NonCommercial-NoDerivatives 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject: Network traffic classification (NTC)
dc.subject: IoT
dc.subject: Machine learning
dc.subject: Network management
dc.subject: Knowledge distillation
dc.subject: IoT traffic classification
dc.title: Unleashing the Potential of Knowledge Distillation for IoT Traffic Classification
dc.type: info:eu-repo/semantics/article
dc.relation.publishversion: https://ieeexplore.ieee.org/abstract/document/10417087
dc.subject.unesco: 1203.04 Artificial Intelligence
dc.identifier.doi: 10.1109/TMLCN.2024.3360915
dc.relation.projectID: info:eu-repo/grantAgreement/EC/H2020/953442/EU
dc.rights.accessRights: info:eu-repo/semantics/openAccess
dc.identifier.essn: 2831-316X
dc.journal.title: IEEE Transactions on Machine Learning in Communications and Networking
dc.volume.number: 2
dc.page.initial: 221
dc.page.final: 239
dc.type.hasVersion: info:eu-repo/semantics/publishedVersion


Files in this item


This item appears in the following collections

Attribution-NonCommercial-NoDerivatives 4.0 International
Except where otherwise noted, the item's license is described as Attribution-NonCommercial-NoDerivatives 4.0 International