September 4, 2020

Successful brands focus on delivering the best possible customer experience, and to do so they frequently need to measure brand perception. Sentiment analysis is a great tool for analysing reviews to determine whether customers are happy with a product or service. It can be used to analyse movie reviews, customer feedback, or general tweets. Basically, sentiment analysis classifies reviews into different classes, such as positive or negative. It reads the emotion behind a sentence and classifies the sentence accordingly.

BERT is a state-of-the-art model for NLP tasks, offering much better accuracy and solutions to many different NLP problems. So we have tried the BERT model for the sentiment analysis task.

BERT Introduction

When we came to know about the open-source BERT (Bidirectional Encoder Representations from Transformers) model developed by Google researchers, we started implementing it for many Natural Language Processing tasks, and it has been giving state-of-the-art results. In any NLP task, the main effort is training the base language model on a large dataset, which is a big job and needs high computational power. So we decided to use this open-source BERT model with 110M parameters.

Fine-Tuning of BERT for the Sentiment Analysis Task:

TensorFlow Hub is a library of trained machine learning models. We have used the BERT-base uncased pre-trained model available on TF Hub, which has 110M parameters, 12 layers, 768 hidden units, and 12 attention heads.

To achieve the classification task, we have fine-tuned the pre-trained model. In fine-tuning, we removed the last layer of the model and added a new neural network layer that classifies the text into two classes.
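As a rough illustration, the sketch below builds such a classifier from a BERT-base uncased encoder on TF Hub with a new two-class head on top. The TF Hub handles, dropout rate, and learning rate are typical choices for this setup, not necessarily the exact ones we used.

```python
import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text as text  # registers the ops used by the BERT preprocessor

# Publicly available TF Hub handles for BERT-base uncased
# (12 layers, 768 hidden units, 12 attention heads).
PREPROCESS_URL = "https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3"
ENCODER_URL = "https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/4"

def build_classifier():
    # Raw review text goes in as a single string per example.
    text_input = tf.keras.layers.Input(shape=(), dtype=tf.string, name="review")
    # Tokenize and pack the text into BERT's input format.
    encoder_inputs = hub.KerasLayer(PREPROCESS_URL, name="preprocessing")(text_input)
    # Pre-trained BERT encoder; trainable=True so it is fine-tuned end to end.
    outputs = hub.KerasLayer(ENCODER_URL, trainable=True, name="bert_encoder")(encoder_inputs)
    # The pooled [CLS] representation feeds a new two-class classification head.
    x = tf.keras.layers.Dropout(0.1)(outputs["pooled_output"])
    logits = tf.keras.layers.Dense(2, name="classifier")(x)  # negative vs. positive
    return tf.keras.Model(text_input, logits)

model = build_classifier()
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=2e-5),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
```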

Dataset

The base BERT model was pre-trained on Wikipedia and BookCorpus, a dataset containing 10,000+ books of different genres. So it gives the benefit of pre-training on a much larger dataset, which improves the final results.

The pre-trained model can then be fine-tuned on a sentiment analysis dataset, resulting in substantial accuracy improvements compared to training on such datasets from scratch. For the fine-tuning task, we have used the IMDb Large Movie Review dataset hosted by Stanford. This binary classification dataset contains 25,000 movie reviews for training and 25,000 for testing.
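As a minimal sketch, the same dataset is also available through TensorFlow Datasets under the name imdb_reviews, so fine-tuning the classifier built above could look roughly like this (batch size and number of epochs are assumed, typical values):

```python
import tensorflow as tf
import tensorflow_datasets as tfds

# IMDb Large Movie Review dataset: 25,000 training and 25,000 test reviews,
# labelled 0 (negative) or 1 (positive).
train_ds, test_ds = tfds.load(
    "imdb_reviews",
    split=["train", "test"],
    as_supervised=True,  # yields (text, label) pairs
)

BATCH_SIZE = 32  # assumed value; adjust to the available memory
train_ds = train_ds.shuffle(10_000).batch(BATCH_SIZE).prefetch(tf.data.AUTOTUNE)
test_ds = test_ds.batch(BATCH_SIZE).prefetch(tf.data.AUTOTUNE)

# 'model' is the BERT classifier from the previous sketch;
# a few epochs of fine-tuning are usually enough.
model.fit(train_ds, validation_data=test_ds, epochs=3)
```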

Demo of BERT Based Sentiment Analysis

So that users can experiment with the BERT based sentiment analysis system, we have made a demo available.

Try our BERT Based Sentiment Analysis demo.

Give input sentences separated by newlines. Due to the large model size and limited CPU/RAM resources, it will take a few seconds, so kindly be patient. In the results, it will show the sentiment for each input sentence.
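For reference, classifying a batch of sentences with the fine-tuned model from the sketches above could look like this (the example sentences and label names are illustrative):

```python
import tensorflow as tf

reviews = [
    "The movie was absolutely wonderful, I loved every minute of it.",
    "A dull plot and terrible acting. Complete waste of time.",
]

# Run the fine-tuned classifier on raw sentences and map logits to labels.
logits = model.predict(tf.constant(reviews))
labels = ["negative", "positive"]
for review, pred in zip(reviews, tf.argmax(logits, axis=-1).numpy()):
    print(f"{labels[pred]:8s} | {review}")
```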

Do check out the demo and let us know your feedback on BERT based sentiment analysis of text in the comments section.

Accuracy/Results and Further roadmap:

The validation accuracy we obtained after training the model on the IMDb sentiment analysis dataset is 89.29%.

As with any machine learning based model, it will not always give the correct answer. Sometimes it gives the wrong sentiment for a given input sentence. We would like to fine-tune it further to improve its results.

We will also try to fine-tune this model on other large sentiment analysis datasets that are available and check whether that improves the results.

Feel free to comment your doubts/questions. We would be glad to help you.

If you are looking for Chatbot Development or Natural Language Processing services then do contact us or send your requirement at letstalk@pragnakalp.com. We would be happy to offer our expert services. 

