Distilling Knowledge or Transferring Weights? An Experimental Perspective on Classifiers

dc.contributor.author: Öğ, Merve
dc.contributor.author: Yıldızlı, Beyza
dc.contributor.author: Kuş, Zeki
dc.contributor.author: Aydın, Musa
dc.date.accessioned: 2026-04-24T09:17:04Z
dc.date.issued: 2025
dc.department: FSM Vakıf Üniversitesi, Mühendislik Fakültesi, Bilgisayar Mühendisliği Bölümü
dc.description.abstract: This study presents a systematic comparative analysis of knowledge distillation and transfer learning methodologies applied to image classification on the CIFAR-10 dataset. Using ResNet-18 architectures as the baseline, we investigate the trade-offs between model complexity, computational efficiency, and classification performance under various optimization strategies. Results demonstrate that knowledge distillation consistently outperforms transfer learning across all tested configurations. Most notably, a lightweight ResNet-18 student model (2.84M parameters) guided by a ResNet-18 teacher achieved 89.03% accuracy, significantly exceeding transfer learning's 86.36% maximum accuracy despite using only 25% of the parameters. These findings indicate that soft-target knowledge transfer can overcome the conventional trade-off between model size and performance, making distillation particularly attractive for resource-constrained deployments.
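The abstract's "soft targets" refer to the standard knowledge-distillation objective, in which the student matches the teacher's temperature-softened output distribution alongside the hard labels. A minimal sketch of that loss follows; the temperature `T`, weight `alpha`, and function names are illustrative assumptions, not the paper's reported hyperparameters.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label, T=4.0, alpha=0.7):
    """Soft-target KL term plus hard-label cross-entropy.

    T softens both distributions; alpha weights the soft term.
    Values here are illustrative, not taken from the paper.
    """
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # KL(teacher || student), scaled by T^2 as in Hinton et al. (2015)
    soft = (T * T) * sum(pt * math.log(pt / ps)
                         for pt, ps in zip(p_teacher, p_student))
    # standard cross-entropy on the hard label (temperature 1)
    hard = -math.log(softmax(student_logits)[true_label])
    return alpha * soft + (1 - alpha) * hard
```

When student and teacher logits coincide, the KL term vanishes and only the weighted hard-label cross-entropy remains, which is why the soft targets add gradient signal precisely where the student disagrees with the teacher.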
dc.identifier.citation: ÖĞ, Merve, Beyza YILDIZLI, Zeki KUŞ & Musa AYDIN. "Distilling Knowledge or Transferring Weights? An Experimental Perspective on Classifiers". 2025 16th International Conference on Electrical and Electronics Engineering, (2025): 1-5.
dc.identifier.doi: 10.1109/ELECO69582.2025.11329365
dc.identifier.endpage: 5
dc.identifier.orcid: 0000-0001-8762-7233
dc.identifier.orcid: 0000-0002-5825-2230
dc.identifier.scopus: 2-s2.0-105034868150
dc.identifier.scopusquality: N/A
dc.identifier.startpage: 1
dc.identifier.uri: https://hdl.handle.net/11352/6086
dc.indekslendigikaynak: Scopus
dc.language.iso: en
dc.publisher: ELECO
dc.relation.ispartof: 2025 16th International Conference on Electrical and Electronics Engineering
dc.relation.publicationcategory: Konferans Öğesi - Uluslararası - Kurum Öğretim Elemanı
dc.rights: info:eu-repo/semantics/embargoedAccess
dc.title: Distilling Knowledge or Transferring Weights? An Experimental Perspective on Classifiers
dc.type: Conference Object

Files

Original bundle

Name: Öğ.pdf
Size: 1.06 MB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 1.17 KB
Format: Item-specific license agreed upon to submission