September 9, 2021

Experimenting With OpenAI Codex

If AI can write articles, then why can’t it write programming code? 🙂 Well, it can! OpenAI Codex is an NLP (Natural Language Processing) model that translates natural language into code. It is a direct descendant of GPT-3 that has been trained on both natural language and billions of lines of code. It is the strongest in […]
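To give a rough feel for the idea, here is a minimal sketch of asking a Codex-style model to turn a natural-language description into code via the openai Python library's Completion endpoint. The engine name "davinci-codex" and the prompt are placeholders, not the post's own setup; use whichever Codex model your account has access to.

```python
import openai

openai.api_key = "YOUR_API_KEY"  # replace with your own key

# Describe the desired program in plain English inside a docstring-style
# prompt and let the model complete it with code.
prompt = '"""\nWrite a Python function that returns the n-th Fibonacci number.\n"""\n'

response = openai.Completion.create(
    engine="davinci-codex",  # assumption: the Codex engine available to your account
    prompt=prompt,
    max_tokens=150,
    temperature=0,
    stop=['"""'],
)

print(response["choices"][0]["text"])
```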

May 31, 2021

How To Implement Semantic Search Using OpenAI GPT-3

Generative Pre-trained Transformer 3 (GPT-3) is an autoregressive language model for text generation created by OpenAI. GPT-3 showed the potential of a large language model to generate text and to perform tasks such as question answering, summarization, semantic search, chatbots, and writing poetry or essays. Among them, we have […]
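The excerpt does not show which endpoint the post used (the 2021-era Search endpoint has since been deprecated), so here is a minimal sketch of the embeddings-based approach: embed the documents and the query, then rank by cosine similarity. The model name "text-embedding-ada-002" and the sample documents are assumptions for illustration.

```python
import numpy as np
import openai

openai.api_key = "YOUR_API_KEY"

documents = [
    "OpenAI Codex translates natural language into programming code.",
    "GPT-3 can summarize long articles into a few sentences.",
    "Semantic search ranks documents by meaning rather than keywords.",
]
query = "find documents by meaning, not exact words"

# Embed documents and query with the same model (model name is an assumption),
# then score each document by cosine similarity to the query.
resp = openai.Embedding.create(model="text-embedding-ada-002", input=documents + [query])
vectors = np.array([item["embedding"] for item in resp["data"]])
doc_vecs, query_vec = vectors[:-1], vectors[-1]

scores = doc_vecs @ query_vec / (
    np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(query_vec)
)

# Print documents from most to least relevant.
for score, doc in sorted(zip(scores, documents), reverse=True):
    print(f"{score:.3f}  {doc}")
```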

March 28, 2021

Question Answering System Using GPT-3

Since OpenAI launched GPT-3, we have seen numerous applications with various functionalities built on it. Recently OpenAI added a Question Answering feature to GPT-3, which we took for a spin to see how it works. In our experimentation with small data, the system looks quite promising and fetches answers accurately. In future, […]
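The excerpt does not show the exact API the post used (it likely refers to OpenAI's dedicated question-answering endpoint of that era), so here is a minimal prompt-based sketch of the same pattern with the Completion endpoint: place the reference text in the prompt and ask the model to answer only from it. The engine name, context, and prompt wording are placeholders.

```python
import openai

openai.api_key = "YOUR_API_KEY"

# Hypothetical reference text and question for illustration only.
context = (
    "Acme Corp was founded in 2004 in Austin, Texas. "
    "It opened its first European office in Berlin in 2016."
)
question = "When did Acme Corp open its Berlin office?"

# Ask the model to answer strictly from the supplied context.
prompt = (
    "Answer the question using only the context below. "
    "If the answer is not in the context, say \"I don't know\".\n\n"
    f"Context: {context}\n\n"
    f"Question: {question}\n"
    "Answer:"
)

response = openai.Completion.create(
    engine="davinci",  # assumption: any GPT-3 completion engine
    prompt=prompt,
    max_tokens=64,
    temperature=0,
    stop=["\n"],
)

print(response["choices"][0]["text"].strip())
```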

January 19, 2021

Intent Classification & Paraphrasing Examples Using GPT-3

GPT-3 is a neural network trained by OpenAI with far more parameters than earlier-generation models. The main difference between GPT-3 and GPT-2 is its size: 175 billion parameters. At the time of its release it was the largest language model, trained on a very large dataset. The model responds better to different types of input, such as a […]
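The post's own prompts are not shown in the excerpt, so here is a minimal few-shot sketch of intent classification with the Completion endpoint: a handful of labelled examples in the prompt, followed by the new utterance to classify. The intent labels, examples, and engine name are illustrative assumptions.

```python
import openai

openai.api_key = "YOUR_API_KEY"

# Few-shot prompt: labelled examples followed by the message to classify.
prompt = """Classify the intent of each message as BookFlight, CancelOrder, or CheckWeather.

Message: I need a flight to Boston next Friday.
Intent: BookFlight

Message: Please cancel order #4521.
Intent: CancelOrder

Message: Will it rain in Pune tomorrow?
Intent: CheckWeather

Message: Get me two tickets to London in May.
Intent:"""

response = openai.Completion.create(
    engine="davinci",  # assumption: any GPT-3 completion engine
    prompt=prompt,
    max_tokens=5,
    temperature=0,
    stop=["\n"],
)

print(response["choices"][0]["text"].strip())
```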