If AI can write articles then why can’t it write programming code? 🙂

Well, it can!

OpenAI Codex is an NLP (Natural Language Processing) model that translates natural language into code. It is a direct descendant of GPT-3 that has been trained on both natural language and billions of lines of code. It is strongest in Python and works well in over a dozen other languages, including JavaScript, Go, Perl, PHP, Ruby, Swift, TypeScript, SQL, and even Shell.

Codex was created to accelerate the work of professional programmers, as well as to help newcomers get started in coding.

Codex can be used for a variety of purposes, including:

  • Converting your comments into code
  • Completing your next line or function in context
  • Providing useful information, such as a relevant library or API call for an application
  • Adding comments to existing code
  • Rewriting code to make it more efficient

We can use comments, data, or code to get Codex to generate code for us.

Just by describing in a comment what we need, we can get Codex to write that program or function.

We did some experiments, and the results were pretty good whenever Codex understood the input properly. So, we have to be careful in drafting the input; the output depends heavily on it.

Let’s go through some of the successful experiments we did.

In our experiments, we took the generated code and ran it on a local system or on Google Colab to make sure that it works as expected. Below we have provided the input, the generated code, and the output of the generated code.

(Currently, Codex is still in beta and public access is not available. Join the waitlist by sending a request here. If you already have GPT-3 access, it may be faster to get Codex access.)

Experiments

To start using Codex yourself, try the given examples in the playground. Make sure you select davinci-codex as the engine.

Sample 1:

In this sample experiment, we have tried to create a Question Answering System using the transformers pipeline.

Input Prompt:

"""
Use transformers pipeline to create Question Answering System
"""

Code Generated by Codex:
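The original post shows the generated code as a screenshot. A minimal sketch of the kind of code Codex typically produces for this prompt, using the Hugging Face transformers pipeline API (the question and context below are our own illustrative examples, not from the original), would be:

```python
from transformers import pipeline

def answer_question(question, context):
    # Build the default question-answering pipeline (this downloads a
    # pretrained SQuAD-finetuned model on first use) and run it.
    qa = pipeline("question-answering")
    result = qa(question=question, context=context)
    return result["answer"]

if __name__ == "__main__":
    context = ("OpenAI Codex is a model that translates natural "
               "language into code. It is a descendant of GPT-3.")
    print(answer_question("What does Codex translate?", context))
```

Note that the pipeline downloads the model weights on first use, so the first run takes a while.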

We tested the above code on Colab and were pleasantly surprised by how well it performed. The code worked without any changes.

Output of generated code:

Sample 2:

Here, we have tried to send an email with a message.

Input Prompt:

"""
Python program that sends an email to pragnakalp.dev7@gmail saying "Hello, We are live now" from pragnakalp.dev17@gmail.com.
"""

Code Generated by Codex:
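The exact generated code appears as a screenshot in the original post. A sketch of a typical smtplib-based program for this prompt looks like the following; the addresses, SMTP server, and password here are placeholders you would need to replace with real values:

```python
import smtplib
from email.mime.text import MIMEText

def build_message(sender, recipient, body):
    # Compose a plain-text email message with the required headers.
    msg = MIMEText(body)
    msg["Subject"] = "Hello"
    msg["From"] = sender
    msg["To"] = recipient
    return msg

def send_email(msg, password):
    # Connect to Gmail's SMTP server over TLS and send the message.
    with smtplib.SMTP("smtp.gmail.com", 587) as server:
        server.starttls()
        server.login(msg["From"], password)
        server.send_message(msg)

if __name__ == "__main__":
    message = build_message(
        "sender@example.com",       # placeholder sender
        "recipient@example.com",    # placeholder recipient
        "Hello, We are live now",
    )
    print(message.as_string())
    # To actually send, supply real credentials:
    # send_email(message, "your-app-password")
```

With Gmail you would need an app password (or an account setting that permits SMTP login) for the login step to succeed.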

Output of generated code:

Sample 3:

In this experiment, we have tried to generate code to scrape a web page. It worked pretty well.

Input Prompt:

"""
Write a program to scrap yahoo.com homepage using beautifulsoup
"""

Code Generated by Codex:
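The generated code is shown as a screenshot in the original post. A sketch of what such a scraper typically looks like follows; we have factored the parsing into a helper (our own choice, not necessarily Codex's) so it can be exercised on any HTML, and the live fetch is kept separate. It assumes the requests and beautifulsoup4 packages are installed:

```python
import requests
from bs4 import BeautifulSoup

def extract_title_and_links(html):
    # Parse the HTML and pull out the page title and all link URLs.
    soup = BeautifulSoup(html, "html.parser")
    title = soup.title.string if soup.title else None
    links = [a["href"] for a in soup.find_all("a", href=True)]
    return title, links

if __name__ == "__main__":
    sample = ('<html><head><title>Yahoo</title></head>'
              '<body><a href="https://mail.yahoo.com">Mail</a></body></html>')
    print(extract_title_and_links(sample))
    # Live version (requires network access):
    # html = requests.get("https://www.yahoo.com").text
    # print(extract_title_and_links(html))
```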

Output of generated code:

Sample 4:

We have even tried to perform NER (Named Entity Recognition) using spaCy.

Input Prompt:

"""
Use Spacy large model for Named Entity Recognition
"""

Code Generated by Codex:
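The generated code appears as a screenshot in the original post. A sketch of the usual spaCy recipe for this prompt follows; it assumes the large English model has been downloaded with `python -m spacy download en_core_web_lg`, and falls back to a blank pipeline (which finds no entities) if it has not:

```python
import spacy

# Load spaCy's large English model; fall back to a blank English
# pipeline if the model is not downloaded (entities will then be empty).
try:
    nlp = spacy.load("en_core_web_lg")
except OSError:
    nlp = spacy.blank("en")

text = "Apple is looking at buying a U.K. startup for $1 billion."
doc = nlp(text)

# Print each recognized entity with its label.
for ent in doc.ents:
    print(ent.text, ent.label_)
```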

Output of generated code:

Sample 5:

After the experiments with the Python language, we tried our hand at HTML, and it worked great. We built a registration webpage.

Input Prompt:

<!-- Create a registration web page using Bootstrap -->
<!DOCTYPE html>

Code Generated by Codex:

Output of generated code:

Sample 6:

We even tried to scrape the webpage using Selenium. We ran this code on a local system.

Input Prompt:

"""
Write a program to scrap yahoo.com homepage using selenium and beautifulsoup
"""

Code Generated by Codex:

Output of generated code:

You just need to change the webdriver path: in executable_path, specify the proper path to chromedriver, and make sure that your chromedriver version is compatible with your browser.

Failed experiments

The experiments we have seen above all worked quite well. While trying to get a good response, however, we also found that some prompts didn't work well; how the prompt is written affects the quality of the result. For example, to create NER with spaCy we tried the prompts below. They didn't work correctly: Codex just generated import statement lines, up to 1600 tokens.

"""
Use Spacy large model for NER
"""
"""
Use Spacy large model and create NER
"""

So, writing a proper prompt is of high importance to get the desired output from Codex (which is true for GPT-3 models as well).

Hope you liked our experiments. If you would like us to try something more, do send your request in the comments. We'll try it and post the results.


We offer Natural Language Processing consultation services. Do reach out to us at letstalk@pragnakalp.com for any GPT-J, GPT-3, BERT or NLP related project.

