
Publications

A new strategy for curriculum learning using model distillation

Published in Global Journal of Computer Sciences: Theory and Research, 2020

This study introduces a curriculum learning strategy for deep neural networks, inspired by human and animal learning patterns. By leveraging transfer learning with an Xception model pre-trained on ImageNet and a novel sample-sorting methodology, we demonstrate improved training efficiency on the CIFAR-10 and CIFAR-100 datasets. Our approach consistently achieves over 1% higher per-epoch accuracy than randomly ordered training, highlighting the benefits of a structured learning progression in neural network training.
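The paper details its own sorting methodology; as a minimal sketch of the general easy-to-hard idea it builds on, the snippet below orders samples by a per-sample difficulty score and grows the training subset in stages. Here the scores are simulated random numbers for illustration; in the paper's setting they would come from a teacher model (e.g., the pre-trained Xception's loss on each training sample). The function and variable names are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for per-sample difficulty scores; in practice these would be
# produced by a teacher model scoring each training sample.
n_samples = 10
difficulty = rng.random(n_samples)

# Curriculum ordering: easier samples (lower difficulty) come first.
order = np.argsort(difficulty)

def curriculum_schedule(order, n_stages):
    """Split an easiest-first ordering into nested stages: at stage k the
    model trains on the first k/n_stages fraction of the sorted data, so
    each stage adds harder samples and the final stage is the full set."""
    stages = []
    for k in range(1, n_stages + 1):
        cutoff = int(len(order) * k / n_stages)
        stages.append(order[:cutoff])
    return stages

stages = curriculum_schedule(order, n_stages=3)
```

Each stage is a superset of the previous one, so training proceeds from a small easy subset to the complete dataset.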

Recommended citation: K. Karakose and M. Bilgin (2020). "A new strategy for curriculum learning using model distillation." Global Journal of Computer Sciences: Theory and Research. https://un-pub.eu/ojs/index.php/gjcs/article/view/5810