Sentiment Analysis in Retail Leveraging BERT and NLP Techniques for Customer Insights

Authors

  • Sreepal Reddy Bolla, Independent Researcher, Charlotte, NC, USA

DOI:

https://doi.org/10.32628/CSEIT2425482

Keywords:

Sentiment Analysis, BERT, Natural Language Processing (NLP), Retail Analytics, Customer Insights, Deep Learning, Machine Learning, Text Classification, Transformer Models, E-Commerce

Abstract

The rapid expansion of e-commerce and digital retail has produced an explosion of customer-generated content, including online reviews, social network posts, and feedback surveys. Organizations seeking to maximize customer satisfaction, optimize marketing plans, and expand product lines must analyze consumer sentiment in this unstructured data. Traditional sentiment analysis systems, whether classical machine learning models or rule-based algorithms, struggle to capture contextual meaning, detect sarcasm, and resolve ambiguous language. This research applies Natural Language Processing (NLP) and Bidirectional Encoder Representations from Transformers (BERT) to sentiment analysis of retail consumer feedback. BERT achieves strong text classification results because it captures sentiment and deep contextual relationships with greater precision. The research explores ways to boost sentiment prediction by fine-tuning BERT on retail-specific datasets. The proposed methodology comprises data preprocessing, feature extraction, and classification with BERT-based architectures, including DistilBERT, RoBERTa, and ALBERT. Performance is assessed through comparison with traditional machine learning models, namely Naïve Bayes and Support Vector Machines (SVM), as well as LSTMs. The paper also addresses key obstacles, including noisy data management, handling of language variation, and computational cost.

References

Liu, B. (2012). Sentiment Analysis and Opinion Mining. Morgan & Claypool Publishers.

Cambria, E., Schuller, B., Xia, Y., & Havasi, C. (2013). New avenues in opinion mining and sentiment analysis. IEEE Intelligent Systems, 28(2), 15-21.

Devlin, J., Chang, M. W., Lee, K., & Toutanova, K. (2019). BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. arXiv preprint arXiv:1810.04805.

Liu, Y., Ott, M., Goyal, N., Du, J., Joshi, M., Chen, D., ... & Stoyanov, V. (2019). RoBERTa: A Robustly Optimized BERT Pretraining Approach. arXiv preprint arXiv:1907.11692.

Kannan, R., & Gurusamy, P. (2019). Data Cleaning Framework for Enhancing Data Quality. Journal of Data Science and Analytics, 1(1), 30-41.

Kim, Y. (2014). Convolutional Neural Networks for Sentence Classification. arXiv preprint arXiv:1408.5882.

Biyani, P., Bhatia, S., & Mitra, P. (2014). Using Subjectivity Analysis to Improve Thread Retrieval in Online Forums. Proceedings of the 37th International ACM SIGIR Conference on Research & Development in Information Retrieval.

Lan, Z., Chen, M., Goodman, S., Gimpel, K., Sharma, P., & Soricut, R. (2019). ALBERT: A Lite BERT for Self-supervised Learning of Language Representations. arXiv preprint arXiv:1909.11942.

Sanh, V., Debut, L., Chaumond, J., & Wolf, T. (2019). DistilBERT, a distilled version of BERT: Smaller, faster, cheaper and lighter. arXiv preprint arXiv:1910.01108.

Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., ... & Polosukhin, I. (2017). Attention Is All You Need. Advances in Neural Information Processing Systems, 30.

Sun, C., Qiu, X., Xu, Y., & Huang, X. (2020). How to Fine-Tune BERT for Text Classification? Chinese Computational Linguistics, 194-206.

Zhang, Y., Yang, Q., & Yang, Y. (2021). Improving Sentiment Analysis in Retail by Combining BERT and CNN Models. Journal of Artificial Intelligence Research, 72, 150-167.

Gupta, A., & Singh, R. (2022). Enhancing Retail Sentiment Analysis with RoBERTa and Attention Mechanisms. IEEE Transactions on Neural Networks and Learning Systems, 33(5), 2503-2517.

Chen, D., Lin, J., & Xu, F. (2023). Fine-tuning BERT for Retail Sentiment Analysis Using Domain Adaptation Techniques. Proceedings of the 2023 International Conference on AI in Business and E-commerce.

Kumar, R., & Patel, S. (2023). Continual Learning for Dynamic Sentiment Analysis in E-Commerce Platforms. Journal of Machine Learning Research, 24(1), 30-45.

Li, M., & Wang, P. (2023). Robust Noise Reduction Techniques for Improved Sentiment Analysis in Retail. International Journal of Data Science and Analytics, 15(3), 123-140.

Ahmad, I., & Khan, M. (2023). Hybrid BERT-LSTM Architecture for Contextual Sentiment Analysis in Retail Data. IEEE Access, 11, 14589-14602.

Brown, T., Mann, B., Ryder, N., Subbiah, M., Kaplan, J., Dhariwal, P., ... & Amodei, D. (2020). Language Models are Few-Shot Learners. Advances in Neural Information Processing Systems, 33.

Wu, X., Zhang, Q., & Liu, H. (2024). Reinforcement Learning for Dynamic Sentiment Analysis in Retail Data Streams. ACM Transactions on Information Systems, 42(2), 1-20.

Published

20-12-2024

Section

Research Articles