A Bio-inspired Approach to Sentiment Analysis using Ant Colony Optimization
Keywords:
Sentiment Analysis, Ant Colony Optimization, Feature Selection, NLP, Machine Learning, Deep Learning, ACO
Abstract
Sentiment analysis is a crucial natural language processing (NLP) task, particularly for customer reviews. Most classical sentiment analysis models handle high-dimensional text data poorly, which increases computational complexity and reduces efficiency. In this paper, a new bio-inspired sentiment analysis approach is proposed that employs Ant Colony Optimization (ACO) for feature selection. By imitating the foraging behavior of ant colonies, ACO can effectively discover and select the most promising features of text data, improving the accuracy and efficiency of classification models. We tested the ACO-based approach on Amazon product reviews with both standard machine learning classifiers (Logistic Regression - 80.5%, SVM - 80.5%, Naive Bayes - 79.5%, Random Forest - 79.0%) and deep learning models (LSTM - 78.5%, BERT - 80.0%). ACO-based feature selection improved accuracy and reduced computational cost across all models, most notably for the Logistic Regression classifier. The present study validates ACO as a scalable and effective method for real-world applications in e-commerce, business intelligence, and social media monitoring.
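The ACO feature-selection loop summarized above can be sketched as follows. This is a minimal illustrative sketch, not the authors' exact implementation: the pheromone-update rule, the fixed subset size, the parameter values, and the use of scikit-learn's LogisticRegression with 3-fold cross-validated accuracy as the fitness function are all assumptions made for illustration.

# Minimal sketch of ACO-based feature selection for sentiment classification.
# Assumptions: X is a (samples x features) TF-IDF matrix, y holds 0/1 sentiment labels,
# and cross-validated Logistic Regression accuracy serves as the ant fitness measure.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def aco_feature_selection(X, y, n_ants=20, n_iters=30, subset_size=50,
                          evaporation=0.1, seed=0):
    """Select `subset_size` feature columns with a simple ACO loop (subset_size <= n_features)."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    pheromone = np.ones(n_features)            # equal initial pheromone on every feature
    best_subset, best_score = None, -np.inf

    for _ in range(n_iters):
        subsets, scores = [], []
        for _ in range(n_ants):
            # Each ant samples a feature subset with probability proportional to pheromone.
            probs = pheromone / pheromone.sum()
            subset = rng.choice(n_features, size=subset_size, replace=False, p=probs)
            # Fitness = cross-validated accuracy of a classifier on the selected features.
            score = cross_val_score(LogisticRegression(max_iter=1000),
                                    X[:, subset], y, cv=3).mean()
            subsets.append(subset)
            scores.append(score)
            if score > best_score:
                best_subset, best_score = subset, score
        # Evaporate old pheromone, then deposit new pheromone proportional to each ant's fitness.
        pheromone *= (1.0 - evaporation)
        for subset, score in zip(subsets, scores):
            pheromone[subset] += score
    return best_subset, best_score

In this sketch, evaporation keeps early, mediocre subsets from dominating the search, while fitness-proportional deposits gradually concentrate pheromone on features that repeatedly appear in high-accuracy subsets; the returned column indices would then be used to train the downstream classifiers (e.g., SVM, Naive Bayes, LSTM, BERT) on the reduced feature set.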
License

This work is licensed under a Creative Commons Attribution 4.0 International License.