Development of a Web Application for Intelligent Analysis of Customer Reviews Using a Modified seq2seq Model with an Attention Mechanism

Abstract

Machine learning, and neural networks in particular, have a major impact on business and marketing by providing convenient tools for analytics and for processing customer feedback. The article proposes an intelligent analysis of customer reviews based on a modified seq2seq deep learning model. The basic seq2seq model has a significant drawback: it cannot concentrate on the most relevant parts of the input sequence, so its output may assess a customer review inadequately. This drawback is eliminated by augmenting the model with an attention mechanism, as proposed in this work. The resulting model formed the basis of a web application that supports flexible interaction with customers by parsing new reviews, analyzing them, and generating a response to each review with the neural network.
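The attention step described above can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' implementation: the additive (Bahdanau-style) scoring form, the function names, and all dimensions are assumptions chosen for clarity. At each decoding step, the mechanism scores every encoder hidden state against the current decoder state, normalizes the scores with a softmax, and returns the weighted sum of encoder states as a context vector, which lets the decoder concentrate on the most relevant input tokens.

```python
import numpy as np

def softmax(scores):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(scores - scores.max())
    return e / e.sum()

def attention_context(encoder_states, decoder_state, W_enc, W_dec, v):
    """One attention step: score each encoder state against the current
    decoder state, normalize, and return the context vector and weights."""
    # Additive scoring: score_t = v . tanh(W_enc @ h_t + W_dec @ s)
    scores = np.array([v @ np.tanh(W_enc @ h + W_dec @ decoder_state)
                       for h in encoder_states])
    weights = softmax(scores)  # how much each input token matters now
    context = (weights[:, None] * encoder_states).sum(axis=0)
    return context, weights

# Toy run: a 5-token review encoded into hidden vectors of size 4.
rng = np.random.default_rng(0)
encoder_states = rng.normal(size=(5, 4))   # one hidden state per input token
decoder_state = rng.normal(size=4)         # current decoder hidden state
W_enc = rng.normal(size=(3, 4))            # attention projection matrices
W_dec = rng.normal(size=(3, 4))
v = rng.normal(size=3)                     # scoring vector

ctx, w = attention_context(encoder_states, decoder_state, W_enc, W_dec, v)
```

The weights form a probability distribution over the input tokens, and the context vector is fed to the decoder alongside its own hidden state when generating the next output word.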


About the author

Evgeny Budaev

Financial University under the Government of the Russian Federation

Author for correspondence.
Email: esbudaev@fa.ru
ORCID iD: 0000-0002-3718-0282

Cand. Sci. (Eng.), Associate Professor, Department of Data Analysis and Machine Learning, Faculty of Information Technology and Big Data Analysis

Moscow, Russian Federation


Supplementary files

1. JATS XML
2. Fig. 1. The basic architecture of the seq2seq model
3. Fig. 2. Classical architecture of GRU
4. Fig. 3. Architecture of the seq2seq model with an attention mechanism
5. Fig. 4. The organization's goals tree
6. Fig. 5. Use case diagram "Selling cosmetics on the marketplace"
7. Fig. 6. Algorithm for the server part of the program
8. Fig. 7. Part of the training sample
9. Fig. 8. The result of neural network training
10. Fig. 9. The algorithm of the server side of the application, with indication of frameworks and libraries
11. Fig. 10. Part of the upload for testing the operability of the server side of the application
12. Fig. 11. The result of the program's operation when testing the operability of the server side of the application
13. Fig. 12. Generated file

