Enhancing Food Image Classification with Particle Swarm Optimization on NutriFoodNet and Data Augmentation Parameters

Authors

  • Sreetha E S (Research Scholar)
  • G Naveen Sundar
  • D Narmadha

DOI:

https://doi.org/10.22399/ijcesen.493

Keywords:

Food recognition, Data augmentation, Convolutional Neural Network, Particle Swarm Optimization, NutriFoodNet

Abstract

This paper proposes NutriFoodNet, a convolutional neural network (CNN) architecture for food image recognition in which Particle Swarm Optimization (PSO) is used to tune the data augmentation parameters and key hyperparameters. Accurate food image classification plays a vital role in applications such as nutrition management, dietary assessment, and healthcare, where it supports the automated recognition and analysis of food items from images. The implementation aimed to improve classification accuracy on the Food101 dataset. The baseline NutriFoodNet model achieved an accuracy of 97.3%; applying PSO refined the model further, raising accuracy to 98.5%. The optimized system was benchmarked against state-of-the-art architectures, including ResNet-18, ResNet-50, and Inception V3, and outperformed them. These results demonstrate the effectiveness of PSO in fine-tuning augmentation parameters and CNN hyperparameters, yielding significant accuracy gains in food image classification tasks. This advancement underscores the potential of improved food image classification systems to contribute to better dietary monitoring and healthcare outcomes.
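
To make the optimization loop concrete, the following minimal sketch (not the authors' released code) runs a vanilla PSO over a hypothetical four-dimensional search space covering three augmentation parameters (rotation angle, zoom range, horizontal shift) and one hyperparameter (learning rate). The bounds, swarm coefficients, and the surrogate fitness function are illustrative assumptions; in the paper's setting, a particle's fitness would be NutriFoodNet's validation accuracy on Food101 under the candidate settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical search space: [rotation_deg, zoom_range, width_shift, learning_rate]
LOW = np.array([0.0, 0.0, 0.0, 1e-5])
HIGH = np.array([45.0, 0.5, 0.3, 1e-2])

def fitness(x):
    """Placeholder objective. In the paper's setting this would train NutriFoodNet
    on Food101 with augmentation/hyperparameter vector x and return validation
    accuracy; here a smooth surrogate with a made-up optimum stands in for it."""
    target = np.array([20.0, 0.2, 0.1, 1e-3])   # illustrative optimum, not from the paper
    return -np.sum(((x - target) / (HIGH - LOW)) ** 2)

n_particles, n_iters = 10, 30
dim = LOW.size
w, c1, c2 = 0.7, 1.5, 1.5                       # inertia, cognitive, and social coefficients

pos = rng.uniform(LOW, HIGH, size=(n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([fitness(p) for p in pos])
gbest = pbest[np.argmax(pbest_val)].copy()

for _ in range(n_iters):
    r1 = rng.random((n_particles, dim))
    r2 = rng.random((n_particles, dim))
    # Standard PSO velocity update: pull each particle toward its personal best and the global best.
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, LOW, HIGH)         # keep candidates inside the bounds
    vals = np.array([fitness(p) for p in pos])
    improved = vals > pbest_val
    pbest[improved] = pos[improved]
    pbest_val[improved] = vals[improved]
    gbest = pbest[np.argmax(pbest_val)].copy()

print("Best augmentation/hyperparameter vector found:", gbest)
```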

References

He, K., Zhang, X., Ren, S., & Sun, J. (2016). Deep residual learning for image recognition. In IEEE Conference on Computer Vision and Pattern Recognition (CVPR) (pp. 770–778). IEEE. https://doi.org/10.1109/CVPR.2016.90

Huang, G., Liu, Z., Van Der Maaten, L., & Weinberger, K. Q. (2017). Densely connected convolutional networks. In IEEE Conference on Computer Vision and Pattern Recognition (CVPR) (pp. 2261–2269). IEEE. https://doi.org/10.1109/CVPR.2017.243

Alzubaidi, L., Zhang, J., Humaidi, A. J., Al-Dujaili, A., Duan, Y., Al-Shamma, O., … Farhan, L. (2021). Review of deep learning: Concepts, CNN architectures, challenges, applications, future directions. Journal of Big Data, 8(1). https://doi.org/10.1186/s40537-021-00444-8

Schoenauer, M., & Ronald, E. (1994). Neuro-genetic truck backer-upper controller. In Proceedings of the First IEEE Conference on Evolutionary Computation, IEEE World Congress on Computational Intelligence (pp. 720–723). IEEE. https://doi.org/10.1109/ICEC.1994.349969

Shorten, C., & Khoshgoftaar, T. M. (2019). A survey on Image Data Augmentation for Deep Learning. Journal of Big Data, 6(1). https://doi.org/10.1186/s40537-019-0197-0

Raiaan, M. A. K., Sakib, S., Fahad, N. M., Mamun, A. A., Rahman, M. A., Shatabda, S., & Mukta, M. S. H. (2024). A systematic review of hyperparameter optimization techniques in convolutional neural networks. Decision Analytics Journal, 11, 100470. https://doi.org/10.1016/j.dajour.2024.100470

Nguyen, H.-P., Liu, J., & Zio, E. (2020). A long-term prediction approach based on long short-term memory neural networks with automatic parameter optimization by Tree-structured Parzen Estimator and applied to time-series data of NPP steam generators. Applied Soft Computing, 89, 106116. https://doi.org/10.1016/j.asoc.2020.106116

Mirjalili, S., Mirjalili, S. M., & Lewis, A. (2014). Grey wolf optimizer. Advances in Engineering Software, 69, 46–61. https://doi.org/10.1016/j.advengsoft.2013.12.007

Yoon, J. H., & Geem, Z. W. (2021). Empirical convergence theory of harmony search algorithm for box-constrained discrete optimization of convex function. Mathematics, 9(5), 545. https://doi.org/10.3390/math9050545

Storn, R., & Price, K. (1997). Differential evolution–a simple and efficient heuristic for global optimization over continuous spaces. Journal of Global Optimization, 11, 341–359.

Katoch, S., Chauhan, S. S., & Kumar, V. (2021). A review on genetic algorithm: Past, present, and future. Multimedia Tools and Applications, 80, 8091–8126. https://doi.org/10.1007/s11042-020-10139-6

Dorigo, M., Birattari, M., & Stutzle, T. (2006). Ant colony optimization. IEEE Computational Intelligence Magazine, 1(4), 28–39. https://doi.org/10.1109/MCI.2006.190679

Kennedy, J., & Eberhart, R. (1995). Particle swarm optimization. In Proceedings of ICNN'95 - International Conference on Neural Networks (Vol. 4, pp. 1942–1948). IEEE. https://doi.org/10.1109/ICNN.1995.488968

Yang, X.-S. (2010). Nature-inspired metaheuristic algorithms. Luniver Press.

Ozaki, Y., Yano, M., & Onishi, M. (2017). Effective hyperparameter optimization using Nelder-Mead method in deep learning. IPSJ Transactions on Computer Vision and Applications, 9, 1–12. https://doi.org/10.1186/s41045-017-0042-1

Frazier, P. I. (2018). A tutorial on Bayesian optimization. arXiv preprint arXiv:1807.02811. Retrieved from https://arxiv.org/abs/1807.02811

Lee, W.-Y., Park, S.-M., & Sim, K.-B. (2018). Optimal hyperparameter tuning of convolutional neural networks based on the parameter-setting-free harmony search algorithm. Optik, 172, 359–367. https://doi.org/10.1016/j.ijleo.2018.07.044

Singh, P., Chaudhury, S., & Panigrahi, B. K. (2021). Hybrid MPSO-CNN: Multi-level particle swarm optimized hyperparameters of convolutional neural network. Swarm and Evolutionary Computation, 63, 100863. https://doi.org/10.1016/j.swevo.2021.100863

Bacanin, N., Bezdan, T., Tuba, E., Strumberger, I., & Tuba, M. (2020). Optimizing convolutional neural network hyperparameters by enhanced swarm intelligence metaheuristics. Algorithms, 13(3), 67. https://doi.org/10.3390/a13030067

Liu, D., Ouyang, H., Li, S., Zhang, C., & Zhan, Z.-H. (2023). Hyperparameters optimization of convolutional neural network based on local autonomous competition harmony search algorithm. Journal of Computational Design and Engineering. https://doi.org/10.1093/jcde/qwad050

Lankford, S., & Grimes, D. (2020). Neural architecture search using particle swarm and ant colony optimization. In Proceedings of the AICS 2020 (pp. 229–240).

Yeh, W.-C., Lin, Y.-P., Liang, Y.-C., Lai, C.-M., & Huang, C.-L. (2023). Simplified swarm optimization for hyperparameters of convolutional neural networks. Computers and Industrial Engineering, 177, 109076. https://doi.org/10.1016/j.cie.2023.109076

Serizawa, T., & Fujita, H. (2020). Optimization of convolutional neural network using the linearly decreasing weight particle swarm optimization. arXiv preprint arXiv:2001.05670. Retrieved from https://arxiv.org/abs/2001.05670

Sharaf, A. I., & Radwan, E. F. (2019). An automated approach for developing a convolutional neural network using a modified firefly algorithm for image classification. In Applications of Firefly Algorithm and Its Variants: Case Studies and New Developments (pp. 99–118). Springer.

Albelwi, S., & Mahmood, A. (2016). Automated optimal architecture of deep convolutional neural networks for image recognition. In 2016 15th IEEE International Conference on Machine Learning and Applications (ICMLA) (pp. 53–60). IEEE.

Rosa, G., Papa, J., Marana, A., Scheirer, W., & Cox, D. (2015). Finetuning convolutional neural networks using harmony search. In Progress in Pattern Recognition, Image Analysis, Computer Vision, and Applications: 20th Iberoamerican Congress, CIARP 2015 (pp. 683–690). Springer.

Huang, Y.-F., & Liu, J.-S. (2019). Optimizing convolutional neural network architecture using a self-adaptive harmony search algorithm. In International Conference on Natural Computation, Fuzzy Systems and Knowledge Discovery (pp. 3–12). Springer.

Sun, Y., Xue, B., Zhang, M., & Yen, G. G. (2018). A particle swarm optimization-based flexible convolutional autoencoder for image classification. IEEE Transactions on Neural Networks and Learning Systems, 30(8), 2295–2309. https://doi.org/10.1109/TNNLS.2018.2803384

Gudise, V., & Venayagamoorthy, G. (2003). Comparison of particle swarm optimization and backpropagation as training algorithms for neural networks. In Proceedings of the 2003 IEEE Swarm Intelligence Symposium (SIS’03) (Vol. 2, pp. 110–117). IEEE. https://doi.org/10.1109/SIS.2003.1202255

Carvalho, M., & Ludermir, T. (2006). Particle swarm optimization of feed-forward neural networks with weight decay. In 2006 Sixth International Conference on Hybrid Intelligent Systems (HIS’06) (pp. 5–5). IEEE. https://doi.org/10.1109/HIS.2006.264888

Carvalho, M., & Ludermir, T. B. (2007). Particle swarm optimization of neural network architectures and weights. In 7th International Conference on Hybrid Intelligent Systems (HIS 2007) (pp. 336–339). IEEE. https://doi.org/10.1109/HIS.2007.45

Kiranyaz, S., Ince, T., Yildirim, A., & Gabbouj, M. (2009). Evolutionary artificial neural networks by multi-dimensional particle swarm optimization. Neural Networks, 22(10), 1448–1462. https://doi.org/10.1016/j.neunet.2009.05.013

Zhang, J.-R., Zhang, J., Lok, T.-M., & Lyu, M. R. (2007). A hybrid particle swarm optimization–back-propagation algorithm for feedforward neural network training. Applied Mathematics and Computation, 185(2), 1026–1037. https://doi.org/10.1016/j.amc.2006.07.025

Qolomany, B., Maabreh, M., Al-Fuqaha, A., Gupta, A., & Benhaddou, D. (2017). Parameters optimization of deep learning models using particle swarm optimization. In 2017 13th International Wireless Communications and Mobile Computing Conference (IWCMC) (pp. 1285–1290). IEEE. https://doi.org/10.1109/IWCMC.2017.7986470

Kenny, A., & Li, X. (2017). A study on pre-training deep neural networks using particle swarm optimization. In Simulated Evolution and Learning: 11th International Conference, SEAL 2017 (pp. 361–372). https://doi.org/10.1007/978-3-319-68759-9_30

Fernandes Junior, F. E., & Yen, G. (2019). Particle swarm optimization of deep neural networks architectures for image classification. Swarm and Evolutionary Computation, 49, 62–74. https://doi.org/10.1016/j.swevo.2019.05.010

Aguerchi, K., Jabrane, Y., Habba, M., & El Hassani, A. H. (2024). A CNN hyperparameters optimization based on particle swarm optimization for mammography breast cancer classification. Journal of Imaging, 10(2), 30. https://doi.org/10.3390/jimaging10020030

Alhudhaif, A., Saeed, A., Imran, T., Kamran, M., & Alghamdi, A. S. (2022). A particle swarm optimization based deep learning model for vehicle classification. Computer Systems Science and Engineering, 40(1), 223–235.

Saleem, M. A., Aamir, M., Ibrahim, R., Senan, N., & Alyas, T. (2022). An optimized convolutional neural network architecture for paddy disease classification. Computers, Materials & Continua, 71(3), 6053–6067. https://doi.org/10.32604/cmc.2022.022215

Liu, X., Zhang, C., Cai, Z., Yang, J., Zhou, Z., & Gong, X. (2021). Continuous particle swarm optimization-based deep learning architecture search for hyperspectral image classification. Remote Sensing, 13(6), 1082. https://doi.org/10.3390/rs13061082

Zhou, C., & Xiong, A. (2023). Fast image super-resolution using particle swarm optimization-based convolutional neural networks. Sensors, 23(4), 1923. https://doi.org/10.3390/s23041923

Nistor, S., & Czibula, G. (2021). IntelliSwAS: Optimizing deep neural network architectures using a particle swarm-based approach. Expert Systems with Applications, 187, 115945. https://doi.org/10.1016/j.eswa.2021.115945

Sreetha, E. S., Sundar, G. N., Narmadha, D., Sagayam, K. M., & Elngar, A. A. (2023). Technologies for Healthcare 4.0: From AI and IoT to blockchain.

Bossard, L., Guillaumin, M., & Van Gool, L. (2014). Food-101 – Mining discriminative components with random forests. In D. Fleet, T. Pajdla, B. Schiele, & T. Tuytelaars (Eds.), Computer Vision – ECCV 2014, Lecture Notes in Computer Science, Vol. 8694. Springer, Cham. https://doi.org/10.1007/978-3-319-10599-4_29

Mumuni, A., & Mumuni, F. (2022). Data augmentation: A comprehensive survey of modern approaches. Array, 16, 100258. https://doi.org/10.1016/j.array.2022.100258

Bacak, A., Şenel, M., & Günay, O. (2023). Convolutional Neural Network (CNN) Prediction on Meningioma, Glioma with Tensorflow. International Journal of Computational and Experimental Science and Engineering, 9(2), 197–204. Retrieved from https://ijcesen.com/index.php/ijcesen/article/view/210

Gaikwad, P. P., & Venkatesan, M. (2024). KWHO-CNN: A Hybrid Metaheuristic Algorithm Based Optimized Attention-Driven CNN for Automatic Clinical Depression Recognition. International Journal of Computational and Experimental Science and Engineering, 10(3), 491–506. https://doi.org/10.22399/ijcesen.359

Agnihotri, A., & Kohli, N. (2024). A novel lightweight deep learning model based on SqueezeNet architecture for viral lung disease classification in X-ray and CT images. International Journal of Computational and Experimental Science and Engineering, 10(4), 592–613. https://doi.org/10.22399/ijcesen.425

Jha, K., Srivastava, S., & Jain, A. (2024). A Novel Texture based Approach for Facial Liveness Detection and Authentication using Deep Learning Classifier. International Journal of Computational and Experimental Science and Engineering, 10(3), 323–331. https://doi.org/10.22399/ijcesen.369

Radhi, M., & Tahseen, I. (2024). An Enhancement for Wireless Body Area Network Using Adaptive Algorithms. International Journal of Computational and Experimental Science and Engineering, 10(3), 388–396. https://doi.org/10.22399/ijcesen.409

Sreedharan, S. E., Sundar, G. N., & Narmadha, D. (2024). NutriFoodNet: A high-accuracy convolutional neural network for automated food image recognition and nutrient estimation. Traitement du Signal, 41(4), 1953–1965. https://doi.org/10.18280/ts.410425

Published

2024-10-16

How to Cite

Sreetha E S, G Naveen Sundar, & D Narmadha. (2024). Enhancing Food Image Classification with Particle Swarm Optimization on NutriFoodNet and Data Augmentation Parameters. International Journal of Computational and Experimental Science and Engineering, 10(4). https://doi.org/10.22399/ijcesen.493

Issue

Vol. 10 No. 4 (2024)

Section

Research Article