Analysis of the Effects of Dataset Augmentation and Fine-Tuning on Pre-Trained Convolutional Neural Network (CNN) Algorithms

Authors

  • Theopilus Bayu Sasongko, Universitas Amikom Yogyakarta, Yogyakarta
  • Haryoko Haryoko, Universitas Amikom Yogyakarta, Yogyakarta
  • Agit Amrullah, Universitas Amikom Yogyakarta, Yogyakarta

DOI:

https://doi.org/10.25126/jtiik.20241046583

Abstract

Advances in deep learning technology are often tied to methods that can be relied on only when large amounts of data are available. The Convolutional Neural Network (CNN) is currently one of the most popular deep learning algorithms for image processing. Complex modern CNN models face new challenges, including vanishing gradients, overfitting caused by limited datasets, parameter optimization, and hardware constraints. This study measures the effect of fine-tuning and dataset augmentation techniques on the CNN transfer learning models MobileNet, EfficientNet, and NasNetMobile, using datasets of varying but limited sizes. Across the three datasets used to train these efficient transfer learning models (MobileNet, EfficientNet, and NasNetMobile), zoom-range and random-erase augmentation improved accuracy on the datasets of 56 and 222 images, while on the 500-image dataset all augmentation techniques improved accuracy for the MobileNetV2 and NasNetMobile architectures. Fine-tuning proved effective at improving accuracy at every small data scale.
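Random erasing, one of the two augmentation techniques that improved accuracy here, masks a randomly sized and positioned rectangle of each training image with noise (Zhong et al., 2020, cited below). As an illustration only, not the authors' implementation, a minimal NumPy sketch:

```python
import numpy as np

def random_erase(image, p=0.5, scale=(0.02, 0.33), ratio=(0.3, 3.3), rng=None):
    """Randomly erase a rectangular region of an H x W (x C) image.

    Follows the idea of Zhong et al. (2020): with probability p, sample a
    rectangle whose relative area comes from `scale` and whose aspect ratio
    comes from `ratio`, then fill it with uniform noise. Returns a copy;
    the input image is never modified.
    """
    rng = np.random.default_rng() if rng is None else rng
    out = image.copy()
    if rng.random() > p:          # skip augmentation with probability 1 - p
        return out
    h, w = image.shape[:2]
    area = h * w
    for _ in range(10):           # retry until the sampled rectangle fits
        target_area = rng.uniform(*scale) * area
        aspect = rng.uniform(*ratio)
        eh = int(round(np.sqrt(target_area * aspect)))
        ew = int(round(np.sqrt(target_area / aspect)))
        if 0 < eh < h and 0 < ew < w:
            top = rng.integers(0, h - eh)
            left = rng.integers(0, w - ew)
            region = out[top:top + eh, left:left + ew]
            out[top:top + eh, left:left + ew] = rng.uniform(0.0, 1.0, size=region.shape)
            return out
    return out                    # no valid rectangle found; return unchanged copy
```

Applied per image during training, this forces the network to rely on more than one local region, which helps against overfitting on small datasets.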

 

Abstract

Deep learning methods are often reliable only when large amounts of data are available. Within deep learning, the Convolutional Neural Network (CNN) plays a central role and is widely used to analyze (classify or recognize) visual images. Complex modern CNN models face new challenges, including vanishing gradients, overfitting due to dataset limitations, parameter optimization, and hardware constraints. The MobileNet architecture, introduced in 2017 by Howard et al., is one CNN architecture designed to avoid excessive demand for computing resources. This study measures the effect of fine-tuning and dataset augmentation techniques on the MobileNet, EfficientNet, and NasNetMobile CNN transfer learning models with very small datasets. The results show that, across the three datasets used to train these efficient transfer learning models (MobileNet, EfficientNet, and NasNetMobile), random-erase and zoom-range augmentation dominate the accuracy improvements, with gains after augmentation of about 0.03% to 0.1%.
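The zoom-range augmentation referred to above randomly rescales each training image about its center, in the style of Keras's `ImageDataGenerator(zoom_range=...)`. A minimal nearest-neighbor sketch in NumPy (an illustration under assumed conventions, not the paper's implementation; factors above 1 zoom in, factors below 1 zoom out with edge replication):

```python
import numpy as np

def random_zoom(image, zoom_range=(0.8, 1.2), rng=None):
    """Zoom an H x W (x C) image about its center by a random factor.

    A factor z > 1 samples a narrower central region (zoom in); z < 1
    samples a wider region, with out-of-bounds pixels clipped to the
    nearest edge. Output shape always matches the input shape.
    """
    rng = np.random.default_rng() if rng is None else rng
    z = rng.uniform(*zoom_range)
    h, w = image.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    # Map each output index back to a source index, nearest-neighbor style.
    rows = np.clip(np.round(cy + (np.arange(h) - cy) / z), 0, h - 1).astype(int)
    cols = np.clip(np.round(cx + (np.arange(w) - cx) / z), 0, w - 1).astype(int)
    return image[np.ix_(rows, cols)]
```

Like random erasing, this is applied independently to each image per epoch, so the model never sees exactly the same framing twice.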


References

FUKUSHIMA, K., 1980. Neocognitron: A self-organizing neural network model for a mechanism of pattern recognition unaffected by shift in position. Biological Cybernetics, 36(4), 193–202. https://doi.org/10.1007/BF00344251

HOWARD, A. G., ZHU, M., CHEN, B., KALENICHENKO, D., WANG, W., WEYAND, T., ANDREETTO, M., & ADAM, H., 2017. MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications. http://arxiv.org/abs/1704.04861

HUSSAIN, Z., GIMENEZ, F., YI, D., & RUBIN, D., 2017. Differential Data Augmentation Techniques for Medical Imaging Classification Tasks. AMIA ... Annual Symposium Proceedings. AMIA Symposium, 2017, 979–984.

JIANG, M. T.-J., WU, S.-H., CHEN, Y.-K., GU, Z.-X., CHIANG, C.-J., WU, Y.-C., HUANG, Y.-C., CHIU, C.-H., SHAW, S.-R., & DAY, M.-Y., 2020. Fine-tuning techniques and data augmentation on transformer-based models for conversational texts and noisy user-generated content. 2020 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining (ASONAM), 919–925. https://doi.org/10.1109/ASONAM49781.2020.9381329

PAN, S. J., & YANG, Q., 2010. A survey on transfer learning. IEEE Transactions on Knowledge and Data Engineering, 22(10), 1345–1359. https://doi.org/10.1109/TKDE.2009.191

RADENOVIC, F., TOLIAS, G., & CHUM, O., 2019. Fine-Tuning CNN Image Retrieval with No Human Annotation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 41(7), 1655–1668. https://doi.org/10.1109/TPAMI.2018.2846566

SHIJIE, J., PING, W., PEIYI, J., & SIPING, H., 2017. Research on data augmentation for image classification based on convolution neural networks. 2017 Chinese Automation Congress (CAC), 4165–4170. https://doi.org/10.1109/CAC.2017.8243510

TAN, M., & LE, Q. V., 2019. EfficientNet: Rethinking model scaling for convolutional neural networks. 36th International Conference on Machine Learning, ICML 2019, 2019-June, 10691–10700.

TAORMINA, V., CASCIO, D., ABBENE, L., & RASO, G., 2020. Performance of Fine-Tuning Convolutional Neural Networks for HEp-2 Image Classification. Applied Sciences, 10(19). https://doi.org/10.3390/app10196940

YILMAZ, F., & DEMİR, A., 2020. Cutting Effect on Classification Using Nasnet Architecture. 2020 Medical Technologies Congress (TIPTEKNO), 1–3. https://doi.org/10.1109/TIPTEKNO50054.2020.9299313

ZHONG, Z., ZHENG, L., KANG, G., LI, S., & YANG, Y., 2020. Random erasing data augmentation. AAAI 2020 - 34th AAAI Conference on Artificial Intelligence, 13001–13008. https://doi.org/10.1609/aaai.v34i07.7000

Published

30-08-2023

Issue

Section

Computer Science

How to Cite

Analisis Efek Augmentasi Dataset dan Fine Tune pada Algoritma Pre-Trained Convolutional Neural Network (CNN). (2023). Jurnal Teknologi Informasi Dan Ilmu Komputer, 10(4), 763-768. https://doi.org/10.25126/jtiik.20241046583