Closed-Domain BERT-based Chatbot

If you would like to set up a demo similar to our Closed Domain Chatbot Using BERT, we can provide the source code, the fine-tuned model (PyTorch-based), and all required setup instructions for a nominal charge.

English Version

  • Source code: Python + Flask (PyTorch-based); a minimal sketch of this kind of endpoint follows the pricing below
  • BERT model fine-tuned on the SQuAD 2.0 English dataset
  • Easy-to-set-up code

USD $299 Only
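
For reference, the sketch below shows what a Python + Flask question-answering endpoint of this kind typically looks like. It is illustrative only: the checkpoint name (deepset/bert-base-cased-squad2), the /ask route, and the request format are assumptions, not the purchased code or fine-tuned model.

```python
# Illustrative sketch of a Flask + PyTorch BERT QA endpoint.
# The model checkpoint and route name are assumptions, not the purchased code.
from flask import Flask, request, jsonify
from transformers import pipeline

app = Flask(__name__)

# Load a SQuAD 2.0 fine-tuned BERT checkpoint once at startup.
qa = pipeline("question-answering", model="deepset/bert-base-cased-squad2")

@app.route("/ask", methods=["POST"])
def ask():
    data = request.get_json()
    # "context" is the closed-domain text; "question" is the user's query.
    result = qa(question=data["question"], context=data["context"])
    return jsonify({"answer": result["answer"], "score": result["score"]})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```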

Other Languages

  • Source code: Python + Flask (PyTorch-based)
  • BERT model fine-tuned on thousands of questions in the language you purchase
  • Languages: Hindi, Arabic, Spanish, French, German, Chinese (Simplified)
  • Easy-to-set-up code

USD $399 Only

Disclaimer
  • Please test the results thoroughly on the demo for the language you plan to purchase.
  • Answers may vary slightly from the demo, as the models were fine-tuned with different computational resources.
  • You agree not to resell or freely distribute the code and fine-tuned model; they are for personal use only.
  • The model's performance has been improved, so it can be used in projects where the length of the input text is limited.

The table below shows how prediction time varies between the BERT English model and the BERT Multilingual model on systems with the following configurations:

  • 4 Cores CPU & 15 GB RAM (Google Cloud)
  • 8th Generation i3 processor & 8GB RAM (Local System)
  • 1 x RTX 2080 Ti (GPU)

Configuration                                          | English          | Multilingual
-------------------------------------------------------|------------------|------------------
4 Cores CPU & 15 GB RAM (Google Cloud)                 | 3-4 seconds      | 1-2 seconds
8th Generation i3 processor & 8 GB RAM (Local System)  | 1-2 seconds      | Within a second
1 x RTX 2080 Ti (GPU)                                  | Few milliseconds | Few milliseconds

NOTE: Prediction time can also be reduced by tuning hyperparameters such as batch size or max_seq_length, but be careful when changing them, as they can also affect accuracy. An illustration follows below.
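
As an illustration of that note, the sketch below shows how max sequence length and batching might be adjusted at prediction time. It assumes a standard Hugging Face Transformers setup; the checkpoint name, variable names, and parameter values are assumptions for demonstration and are not taken from the purchased code.

```python
# Illustrative only: tuning max sequence length and batch size at inference time.
# Checkpoint and variable names are assumptions, not the purchased code.
import torch
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

tokenizer = AutoTokenizer.from_pretrained("deepset/bert-base-cased-squad2")
model = AutoModelForQuestionAnswering.from_pretrained("deepset/bert-base-cased-squad2")
model.eval()

questions = ["What is BERT?", "What dataset was used?"]
context = "BERT is a transformer model fine-tuned here on SQuAD 2.0."

# A shorter max_length truncates the context and speeds up prediction,
# but it can also cut off the span that contains the answer.
inputs = tokenizer(
    questions,
    [context] * len(questions),
    max_length=256,            # e.g. try 128 / 256 / 384; smaller is faster
    truncation="only_second",  # truncate the context, never the question
    padding=True,
    return_tensors="pt",
)

# Batching several questions into one forward pass trades memory for speed.
with torch.no_grad():
    outputs = model(**inputs)

start = outputs.start_logits.argmax(dim=-1)
end = outputs.end_logits.argmax(dim=-1)
for i, q in enumerate(questions):
    span = inputs["input_ids"][i][start[i] : end[i] + 1]
    print(q, "->", tokenizer.decode(span, skip_special_tokens=True))
```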

Interested in purchasing? Please email:

letstalk@pragnakalp.com