A massive leap forward
OpenAI announced in early 2019 that it had created a neural network for natural language processing called GPT-2, which it later released as open source in stages over the course of the year.
OpenAI GPT is a language model introduced by Alec Radford and colleagues at OpenAI. It is a generative model: a neural network acquires knowledge about the world by reading enormous amounts of text and learning to predict the next word, and that same prediction ability lets it write new material. Sources: 4
More details are available on the project’s GitHub, and the accompanying papers are on arXiv.
OpenAI decided not to publish the full text generator immediately because its creators worried it would give bad actors an easy way to spread fake news and propaganda at scale. Instead, it concentrated on building a massive neural network, seeing what it learned, and testing all sorts of theories. Sources: 6, 7
GPT-3 is the latest version of the language model from OpenAI, the research lab led by CEO Sam Altman. Sources: 2 Altman has said he believes GPT-3 has a disruptive potential comparable to blockchain technology. Sources: 0, 7
GPT-3 surpasses its predecessor with a generative language model an order of magnitude larger than GPT-2. The larger model has a much bigger vocabulary and more capable features than its predecessor, though its output is still not always accurate.
Under the hood
GPT-3 is based on a neural network architecture called the Transformer. The Transformer relies on self-attention, which allows the model to generate very compelling and coherent text. This general-purpose language algorithm uses machine learning methods to translate text, answer questions, and predict the next word. GPT-3 has 175 billion parameters and was trained on roughly 45 TB of text data drawn from across the internet (including books, Wikipedia, and coding tutorials), an unprecedented amount.
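The self-attention step at the heart of the Transformer can be sketched in a few lines. This is a minimal illustration, not GPT-3's actual implementation: the weights are random, the sequence is tiny, and multi-head attention, masking, and layer stacking are all omitted.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over a sequence of token vectors."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v           # project tokens to queries, keys, values
    scores = q @ k.T / np.sqrt(k.shape[-1])       # pairwise similarity of queries and keys
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: each token's attention sums to 1
    return weights @ v                            # each output is a weighted mix of all values

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                       # 4 tokens, 8-dim embeddings (illustrative)
w_q, w_k, w_v = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)                                  # same shape as the input sequence
```

Because every token attends to every other token, the model can pull in context from anywhere in the input when predicting the next word, which is what makes the generated text coherent over long spans.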
In mid-2020, OpenAI released an API that allows users to try the new AI model for themselves. The API runs models of the GPT-3 family at several sizes, is designed to be easy for everyone to use, and aims to make machine learning more productive. The models behind it were trained on a very large text dataset. Sources: 1, 3
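Using the API amounts to sending a prompt and a few sampling parameters over HTTP. The sketch below only builds the request body without sending it; the endpoint path and parameter names follow OpenAI's publicly documented beta completions API, and the key is a placeholder.

```python
import json

# Build (but do not send) a text-completion request for the beta API.
API_URL = "https://api.openai.com/v1/engines/davinci/completions"  # beta-era endpoint
payload = {
    "prompt": "Translate English to French: cheese =>",
    "max_tokens": 16,      # cap on how many tokens the model may generate
    "temperature": 0.7,    # higher values make sampling more random
}
headers = {
    "Authorization": "Bearer YOUR_API_KEY",  # placeholder; use your own key
    "Content-Type": "application/json",
}
body = json.dumps(payload)
print(body)
```

In practice this body would be POSTed with any HTTP client; the response contains the model's completion as text.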
OpenAI also announced an advanced language-processing algorithm. This artificial intelligence model made headlines for its mind-boggling skills and was unveiled to selected users in 2020.
This is a game-changer
The question now arises: what makes GPT-3 so unique, and can it produce useful content?
Given the 175 billion parameters, one might expect the GPT-3 model to be slow. Max Woolf, a data scientist at BuzzFeed, writes that while the OpenAI API was shown off enthusiastically on social media, the potential pitfalls of the model and the API were not highlighted. Pointing to the demo videos, he notes that even though users face no hardware challenge of training a large model themselves, it can still take some time for the model to return a response. Sources: 4
Max also writes that no one outside OpenAI knows how GPT-3 runs on OpenAI’s servers or how well it can scale. Sources: 4
The main question mark is cost, and I don’t expect it to be cheap. It is quite possible that a GPT-based start-up built on the OpenAI API cannot be turned into a viable economic entity. With that caveat, everything depends on how they finish the beta and price the API in production. I wouldn’t count on making money from it, let alone basing a start-up’s business model on it. Sources: 3
In other words, you can send a sentence to the AI and it will answer you, but you should be careful: there is a good chance the answer is confidently wrong. The API is designed to significantly reduce the barrier to producing useful AI products, and thus to enable tools and services that are hard to imagine today. Sources: 1, 6
GPT-3 can only answer questions that have already been answered online; it cannot develop innovative solutions that require unique ideas. But it could eliminate the need to create variations of the same design, or to build simple web pages based on standard (and therefore unoriginal) principles. Sources: 1 It generates its output word by word, just as the original GPT did, rather than from a consistent mental model, and its output can feed into other machine learning systems.
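The word-by-word generation loop can be illustrated with a toy stand-in for the real model. GPT-3 samples each next token from a learned distribution over its whole vocabulary; the hand-written bigram table below is an invented miniature of that idea, not anything from OpenAI.

```python
import random

# Toy autoregressive generation: each next word is sampled conditioned on the
# current context, mirroring GPT's token-by-token decoding. The bigram table
# is an invented stand-in for a trained 175-billion-parameter model.
bigram = {
    "the": ["model", "internet"],
    "model": ["writes", "predicts"],
    "writes": ["text"],
    "predicts": ["text"],
}

def generate(prompt, steps, seed=0):
    random.seed(seed)                 # fixed seed for repeatability
    words = prompt.split()
    for _ in range(steps):
        options = bigram.get(words[-1])
        if not options:               # no known continuation: stop early
            break
        words.append(random.choice(options))
    return " ".join(words)

print(generate("the", 3))
```

The real model differs in scale, not in shape: the loop is the same, but the "table" is a Transformer scoring every token in a ~50,000-entry vocabulary at each step.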
GPT-3 has been in the news lately, praised by experts for its ability to write text and even code. As a Generative Pre-trained Transformer (GPT), it can generate written content based on just a few input words or sentences.
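Steering the model with "just a few input words or sentences" is usually done by packing worked examples into the prompt, so the model infers the task from context alone. The sketch below builds such a few-shot prompt as a plain string; the task, examples, and `=>` format are illustrative choices, not a fixed API.

```python
# Few-shot prompting: prepend a handful of solved examples so the model
# picks up the task pattern from context alone. Examples are illustrative.
examples = [
    ("sea otter", "loutre de mer"),
    ("cheese", "fromage"),
]

def build_prompt(examples, query):
    lines = ["Translate English to French:"]          # task description
    lines += [f"{en} => {fr}" for en, fr in examples]  # solved demonstrations
    lines.append(f"{query} =>")                        # the model completes this line
    return "\n".join(lines)

prompt = build_prompt(examples, "bread")
print(prompt)
```

Sent to the API, a prompt like this typically makes the model continue the pattern with a translation, with no fine-tuning or gradient updates involved.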
General-purpose technologies are the ones that really transform the world, and GPT-3 has the potential to do just that. It has the ability to write code from simple natural-language inputs… this is a game-changer for humanity on the scale of discovering fire.
- https://maraoz.com/2020/07/18/openai-gpt3/ 0
- https://medium.com/@Synced/openai-unveils-175-billion-parameter-gpt-3-language-model-3d3f453124cd 1
- https://www.analyticsinsight.net/can-ai-write-article-complete-image-yes-says-openais-gpt-3/ 2
- https://vanrijmenam.nl/gpt-3-model-what-mean-chatbots-customer-service/ 3
- https://mspoweruser.com/what-is-gpt-3-and-how-it-will-affect-your-current-job/ 4
- https://www.newsbytesapp.com/timeline/science/63646/299302/all-about-openai-s-phenomenal-gpt-3-text-generating-ai 5
- https://thegradient.pub/gpt2-and-the-nature-of-intelligence/ 6
- https://singularityhub.com/2020/06/18/openais-new-text-generator-writes-even-more-like-a-human/ 7
- https://www.xcubelabs.com/blog/open-ais-gpt-3-the-artificial-intelligence-creating-all-the-buzz/ 8
- https://www.wired.co.uk/article/gpt-3-openai-examples 0
- https://nesslabs.com/gpt-3-future-productivity 1
- https://venturebeat.com/2020/07/24/ai-weekly-the-promise-and-shortcomings-of-openais-gpt-3/ 2
- https://minimaxir.com/2020/07/gpt3-expectations/ 3
- https://analyticsindiamag.com/gpt-3-is-great-but-not-without-shortcomings/ 4
- http://jalammar.github.io/illustrated-gpt2/ 5
- https://diagram.news/text-summary-on-demand/ 6