GPT-3 is a neural network language model trained by OpenAI, with far more parameters than earlier-generation models. The main difference between GPT-3 and GPT-2 is its size: 175 billion parameters, compared to GPT-2’s 1.5 billion. At its release, it was the largest language model ever trained. The model responds well to many types of input, such as a question & answer format, long-form writing, or translation between human languages (e.g. English to French). The much larger parameter count makes GPT-3 significantly better at natural language processing and text generation than its predecessor, GPT-2.

If you want to try GPT-3, you need to request API access to the OpenAI beta. After a long wait, we were finally lucky enough to get our API access approved.

Since the launch, we have seen surprising demo applications of GPT-3, so we were quite excited to try it ourselves. One of GPT-3’s main strengths is that it performs extremely well without any fine-tuning: you just provide a few samples or some context, and it starts working on the task.

In this blog, we will see how to use GPT-3 for an intent classification task and for text paraphrasing using the OpenAI Playground, which is accessible online from their website.

Intent Classification

Log in to your OpenAI account.

Open the Playground in OpenAI.

Set the parameters according to your task; each task needs a different set of values, and you can adjust them to see which combination works best. For intent classification, we have set the parameters below.

Now we will add the sample data. You can add the data given below or some other data of your choice.

Sample Data:

listen to WestBam album allergic on google music: PlayMusic
give me a list of movie times for films in the area: SearchScreeningEvent
show me the picture creatures of light and darkness: SearchCreativeWork
I would like to go to the popular bistro in oh: BookRestaurant
what is the weather like in the city of Frewen in the country of Venezuela: GetWeather

The lines above are the context we give to the GPT-3 model, from which it learns what type of result we expect. Now it’s time to test it. For that, we need to provide one sentence whose intent we want to find.

For example, our test sentence is “I want to book a flight for Delhi”.

To mark it as input, we simply leave it blank after the colon (:), and GPT-3 will understand that it has to find the intent for the input.

After hitting the Submit button, we can see that GPT-3 has successfully classified our input sentence as “BookFlight”!

The “Stop Sequences” setting lets GPT-3 know where it should stop. In our case, a new line is the stop sequence, so GPT-3 will stop as soon as the classification has been generated.

The surprising part is that our samples didn’t even include a “BookFlight” class, yet GPT-3 still classified the sentence correctly. That’s the power of GPT-3!

You can also save this sample as a Preset, which you can easily reuse in the future. Click on the ‘Save Preset’ icon as shown below.

Give your preset a name and click the “Save as New” button.

Once you save the preset, you can load it anytime you want.

You can also use this in your Python code or make a cURL request.

There is a Code icon; clicking it provides sample Python and cURL code that you can copy easily.

Since GPT-3 is a hosted solution, you access it via the API; you don’t need to download or set up the model on your own system.
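As a rough sketch, the few-shot prompt above can be reproduced in Python. The prompt-building part below runs as-is; the API call itself is commented out because it needs an API key, and the engine name `davinci` is an assumption based on the engines available at the time, so adjust it to whatever your account exposes.

```python
# Few-shot intent classification prompt, mirroring the Playground setup above.
# The examples are the same five "sentence: Intent" pairs from the sample data.

EXAMPLES = [
    ("listen to WestBam album allergic on google music", "PlayMusic"),
    ("give me a list of movie times for films in the area", "SearchScreeningEvent"),
    ("show me the picture creatures of light and darkness", "SearchCreativeWork"),
    ("I would like to go to the popular bistro in oh", "BookRestaurant"),
    ("what is the weather like in the city of Frewen in the country of Venezuela",
     "GetWeather"),
]

def build_intent_prompt(query: str) -> str:
    """Join the labelled examples and end with the bare query and a colon,
    so the model completes the missing intent after the colon."""
    lines = [f"{text}: {intent}" for text, intent in EXAMPLES]
    lines.append(f"{query}:")
    return "\n".join(lines)

# Hypothetical API call (requires an API key; engine name is an assumption):
# import openai
# response = openai.Completion.create(
#     engine="davinci",
#     prompt=build_intent_prompt("I want to book a flight for Delhi"),
#     temperature=0,
#     max_tokens=6,
#     stop=["\n"],   # newline as the stop sequence, as in the Playground
# )
# print(response["choices"][0]["text"].strip())
```

Note how the prompt ends with the query followed by a bare colon; that blank slot after the colon is exactly what GPT-3 fills in with the intent.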

Paraphrasing

Paraphrasing means rewriting a sentence so that it expresses the same meaning with a different choice of words. You can perform paraphrasing with a language model like BART, T5, or Pegasus, but those pre-trained models must first be fine-tuned on a large dataset. With GPT-3, you don’t need to fine-tune any model to perform this task; it’s quick and easy. Let’s see how to do it.

Open the playground from OpenAI

Set the following parameters; you can change these and compare the results.

For this example, we are going to use two more parameters: “Inject Start Text” and “Inject Restart Text”.

Inject Start Text automatically adds the output label after the user’s input.

Inject Restart Text automatically appends the input label after the output is generated, so that the pattern of input and output continues.

Now add examples in the following way, as shown in the image.
Sample Data:

Article: Searching a specific search tree for a binary key can be programmed recursively or iteratively.
Paraphrase: Searching a specific search tree according to a binary key can be recursively or iteratively programmed.

Article: It was first released as a knapweed biocontrol in the 1980s in Oregon , and it is currently established in the Pacific Northwest.
Paraphrase: It was first released as Knopweed Biocontrol in Oregon in the 1980s , and is currently established in the Pacific Northwest.

Article: 4-OHT binds to ER , the ER / tamoxifen complex recruits other proteins known as co-repressors and then binds to DNA to modulate gene expression.
Paraphrase: The ER / Tamoxifen complex binds other proteins known as co-repressors and then binds to DNA to modulate gene expression.

Article: In mathematical astronomy, his fame is due to the introduction of the astronomical globe, and his early contributions to understanding the movement of the planets.
Paraphrase: His fame is due in mathematical astronomy to the introduction of the astronomical globe and to his early contributions to the understanding of the movement of the planets.

Test it by providing your input. Here are a few samples we tried.

Test Data: Can I get a glass of water?

Result: May I get a glass of water?

Test Data: The U.S. President Donald Trump came to visit Ahmedabad first time at Motera Stadium with our Prime Minister Narendra Modi in February 2020.

Result: In February 2020, Donald Trump, President of the United States, came to Ahmedabad to visit the city and to meet the Prime Minister of India, Narendra Modi, for the first time.

Test Data: Google was founded in 1998 by Larry Page and Sergey Brin while they were Ph.D. students at Stanford University in California. Together they own about 14 percent of its shares and control 56 percent of the stockholder voting power through supervoting stock. They incorporated Google as a privately held company on September 4, 1998.

Result: Google was incorporated as a privately held company on September 4, 1998 by Larry Page and Sergey Brin, while they were Ph.D. students at Stanford University in California. They own about 14 percent of its shares and control 56 percent of the stockholder voting power through supervoting stock.

Here, too, we get very good results! However, the results are not consistent: we get a different result for the same input each time we try, and sometimes the quality is not very good.
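For completeness, here is a minimal sketch of the same paraphrasing prompt built in code. The “Article:” and “Paraphrase:” labels play the roles of the Inject Restart Text and Inject Start Text settings; only the first example pair is included for brevity, the API call is commented out, and the engine name is again an assumption.

```python
# Paraphrasing prompt in the Article:/Paraphrase: pattern used above.
# Only the first of the four example pairs is shown here, for brevity.

PAIRS = [
    ("Searching a specific search tree for a binary key can be programmed "
     "recursively or iteratively.",
     "Searching a specific search tree according to a binary key can be "
     "recursively or iteratively programmed."),
]

def build_paraphrase_prompt(sentence: str) -> str:
    """End the prompt with a bare "Paraphrase:" label so the model fills it
    in (this is what Inject Start Text does automatically in the Playground)."""
    parts = [f"Article: {article}\nParaphrase: {paraphrase}"
             for article, paraphrase in PAIRS]
    parts.append(f"Article: {sentence}\nParaphrase:")
    return "\n\n".join(parts)

# Hypothetical API call (requires an API key; engine name is an assumption):
# import openai
# response = openai.Completion.create(
#     engine="davinci",
#     prompt=build_paraphrase_prompt("Can I get a glass of water?"),
#     temperature=0.7,   # a higher temperature gives more varied rewrites
#     max_tokens=60,
#     stop=["\n"],
# )
# print(response["choices"][0]["text"].strip())
```

The non-zero temperature is one reason the outputs vary between runs; lowering it makes the results more repeatable at the cost of less varied paraphrases.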


So, that’s our little experiment with GPT-3. There are tons of possibilities for building applications with GPT-3, and we’ll keep exploring them and publishing our findings on our site.


Do contact us at letstalk@pragnakalp.com if you are looking to integrate GPT-3 into your application. We would be happy to provide our consultancy services.

