Last updated on May 22nd, 2021 at 06:31 am
Texting becomes easy with autocomplete suggestions on your keyboard. But imagine writing just a heading and having a machine generate an entire new piece of text for it, one that has never been published or written by anyone!
It’s been quite some time since the official release of the OpenAI API for its new AI model, GPT-3, the Generative Pre-trained Transformer, the third generation of the machine learning model. The Internet is abuzz about GPT-3, and Twitter is flooded with its interesting and peculiar use cases.
What is OpenAI?
Elon Musk, Sam Altman, and four colleagues co-founded OpenAI, an artificial intelligence research laboratory, in 2015. In recent years it has released multiple research projects, such as OpenAI Gym and OpenAI Universe, for reinforcement learning algorithms, gaming, websites, and other applications.
OpenAI’s most recent announcement is GPT-3, a language model trained on hundreds of billions of words from the Internet. For now, it supports English language tasks. It is far better than the previous generative models, GPT and GPT-2, and the number of training parameters is huge: about 175 billion trainable parameters.
GPT-3 works on a “text in, text out” basis: you feed it some initial text (a prompt), and based on that text it predicts and generates the text that follows.
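As a rough illustration of this text-in, text-out interface, here is a minimal sketch of how a request to the OpenAI completions endpoint could be assembled. The endpoint URL, engine name, and parameter names are assumptions based on the public API as it was documented at the time, and you would need your own API key to actually send the request:

```python
import json

# Assumed endpoint for the "davinci" engine of the GPT-3 API.
API_URL = "https://api.openai.com/v1/engines/davinci/completions"

def build_completion_request(prompt, max_tokens=64, temperature=0.7):
    """Build the JSON body for a 'text in, text out' completion call.

    GPT-3 takes the prompt text as input and returns generated text
    that continues it; max_tokens caps the length of the continuation.
    """
    return {
        "prompt": prompt,
        "max_tokens": max_tokens,
        # Higher temperature = more creative, lower = more predictable.
        "temperature": temperature,
    }

payload = build_completion_request("Once upon a time,")
print(json.dumps(payload))
```

Sending this payload with an `Authorization: Bearer <your-api-key>` header returns a JSON response whose generated continuation sits in the `choices` field.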
Satoshi Nakaboto said, “OpenAI’s GPT-3 may be the biggest thing since bitcoin.”
What are the present features and use cases of GPT-3?
With such a huge number of training parameters, GPT-3 takes keyboard word prediction to a whole new scale.
Using GPT-3 you can create amazing creative fiction. Gwern Branwen has shared a detailed blog post on GPT-3-generated creative fiction, with varied examples including dialogue, humor, poetry, criticism, and much more.
Paul Katsen demonstrated GPT-3 as a spreadsheet function:

=GPT3()… the spreadsheet function to rule them all.
Impressed with how well it pattern matches from a few examples.
The same function looked up state populations, peoples’ twitter usernames and employers, and did some math. pic.twitter.com/W8FgVAov2f
— Paul Katsen (@pavtalk), July 21, 2020
Another tweet, from Lawder, showed GPT-3 working with images: he scanned a product’s nutrition label and got back a detailed breakdown of all the ingredients.
Turns out #GPT3 can do vision too 😉
Built an ingredient parser: take a pic of any nutrition label (google to extract text), and GPT-3 will identify ingredients, find an emoji, determine if it’s unhealthy, and give a definition 🤯 pic.twitter.com/ohBQI3qki3
— Lawder (@lawderpaul), July 19, 2020
You can even generate code using GPT-3. Sharif Shameem built a layout generator that produces JSX code from a plain-English description of the layout, using the GPT-3 API.
This is mind blowing.
With GPT-3, I built a layout generator where you just describe any layout you want, and it generates the JSX code for you.
W H A T pic.twitter.com/w8JkrZO4lk
— Sharif Shameem (@sharifshameem), July 13, 2020
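Tools like this typically work through few-shot prompting: the prompt contains a couple of description-to-code examples, and GPT-3 continues the pattern for a new description. Here is a minimal sketch of how such a prompt might be assembled; the prompt format and the example pairs are my own illustration, not Shameem’s actual prompt:

```python
# Hypothetical worked examples for a description -> JSX few-shot prompt.
EXAMPLES = [
    ("a red button that says Stop",
     '<button style={{color: "red"}}>Stop</button>'),
    ("a large heading that says Welcome",
     "<h1>Welcome</h1>"),
]

def build_prompt(description):
    """Assemble a few-shot prompt: worked examples, then the new task.

    The model is expected to continue the pattern and emit JSX
    after the final 'code:' line.
    """
    parts = []
    for desc, code in EXAMPLES:
        parts.append(f"description: {desc}\ncode: {code}")
    # The unfinished final entry is what GPT-3 completes.
    parts.append(f"description: {description}\ncode:")
    return "\n\n".join(parts)

print(build_prompt("a blue link that says Home"))
```

The completion returned by the model would then be the JSX for the new description, which the tool renders in the browser.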
GPT-3 / @OpenAI is starting to feel like “Google as an API”.
Not as a search engine, but as a general intelligence.
Not for personal use, but as a building block for a new category of apps.
Very exciting. https://t.co/0unWmDRwTb
— Guillermo Rauch (@rauchg), July 12, 2020
Guillermo, in his tweet, compared GPT-3 to “Google as an API”. I am actually more intrigued by this. We have been using Google as a search engine for two decades, and now we may have another “Google” that can produce answers that have never been written before. This is also the next big test case for GPT-3 in SEO and in content writing.
I am most fascinated by its content generation capacity and how it will change the job landscape for content writers in the future. Just as machines replaced factory workers, will GPT-3 replace content writers in the same way?
Let’s hope for the best. Drop your comments and share if you like it.
Please subscribe to my newsletter for more such updates.