The amazing world of GPT-3

No-Code Jan 25, 2021

We live in a world where humans are surrounded by machines. We rely on them for many things, but they also limit us to their functions and domains. Artificial intelligence is becoming more prominent as time goes on, which has led to the development of GPT-3: an open source natural language processing library that is now the most powerful NLP ever built by humans.

NLP is critical to incredible innovations such as chatbots, voice assistants like Siri, and translation sites. GPT-3 will provide an increased understanding of various linguistic features. It can process text in 100+ languages, something that its predecessors couldn’t do--potentially making it the preeminent NLP in use by humans today.

And, as a matter of fact, the last two paragraphs were written entirely by GPT-3, using Headlime. Astounding, right?! Not only are they well written, but they also saved me from having to write about (and, in order to do that, deeply understand) the technical stuff. Sure, the tone is a bit neutral, but this was done in a matter of seconds and I didn't even try to convey tone. If I trained the model with some tone examples, the results would surely be different.

But back to GPT-3: we are facing some serious leaps in technology. This model, created by OpenAI, has 175 billion parameters. 175 BILLION! GPT-2, its predecessor, had 1.5 billion, and the next in line, Microsoft's Turing-NLG, has 17 billion. But what does that mean? What do we mean when we talk about parameters? Well, models of this type are trained on a huge amount of text from the internet, and the parameters are the internal values the model learns from all those words. The number of parameters is a rough measure of a model's complexity and of how accurate it can be. So if it has 175 billion, well, you can imagine: it's advanced stuff.
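To put those numbers in perspective, here's a quick back-of-the-envelope comparison in Python, using the parameter counts mentioned above (the fp16 storage figure is just a rough estimate at 2 bytes per parameter):

```python
# Parameter counts for the three models mentioned above.
PARAMS = {
    "GPT-2": 1_500_000_000,        # 1.5 billion
    "Turing-NLG": 17_000_000_000,  # 17 billion
    "GPT-3": 175_000_000_000,      # 175 billion
}

# How many times larger is GPT-3 than the others?
for name, count in PARAMS.items():
    ratio = PARAMS["GPT-3"] / count
    print(f"GPT-3 is {ratio:,.0f}x the size of {name} ({count:,} parameters)")

# Rough storage estimate: at 2 bytes per parameter (fp16),
# GPT-3's weights alone would take about 350 GB.
gpt3_fp16_gb = PARAMS["GPT-3"] * 2 / 1e9
print(f"~{gpt3_fp16_gb:.0f} GB just to store the weights in fp16")
```

Even before you consider training costs, a model that big simply doesn't fit on ordinary hardware, which is part of why it's offered as a service rather than as a download.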

In fact, GPT-3 excels with text. It can write creatively, it can chat with you realistically, and it can summarize text for you. In one demo, someone points the model at the Wikipedia entry on bread and asks why bread is so fluffy; the model searches the entry and points to where this is explained, even when the exact words don't match. It will make today's assistants, like Siri or Alexa, seem very limited. Because, well, compared to this, they actually are.
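For the curious: access to GPT-3 is through a simple HTTP API. The snippet below is only a sketch of what a request might look like. The endpoint, engine name, and parameters reflect OpenAI's beta API and may change, and an actual call would need an API key from the beta program, so here we just build and inspect the request payload rather than send it:

```python
import json

# Hypothetical sketch of a GPT-3 completion request (OpenAI beta API).
# A real call needs an API key and network access, so we only construct
# the payload here instead of sending it.
API_URL = "https://api.openai.com/v1/engines/davinci/completions"

payload = {
    "prompt": "Why is bread so fluffy?",
    "max_tokens": 64,    # cap on the length of the generated answer
    "temperature": 0.7,  # higher = more varied, lower = more focused
}

# With the `requests` library, the call itself would look something like
# (not executed here):
#   requests.post(API_URL,
#                 headers={"Authorization": f"Bearer {API_KEY}"},
#                 json=payload)
print(json.dumps(payload, indent=2))
```

The striking part is that this is the whole interface: you send plain text in, and you get plain text back.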

Of course, it isn't perfect, and it is actually in beta testing right now (about 19,000 people are using it). It can struggle with common sense, and, being as powerful as it is, it would be dangerous if it fell into the wrong hands: it could, for example, encourage hate speech or produce tons and tons of fake news. So there's that caveat. But of course, they are working on it. And we hope that when it becomes available to the public, we see it being used only, or at least for the most part, for the betterment of humanity.

I can't help but imagine this model turning into a kind of C-3PO, in the funniest of cases; a Samantha from Her, in the weirdest of cases; and a HAL 9000, in the worst of cases. Either way, it is literally the stuff of movies, and with each day it gets closer and closer. It is exciting, it is scary, but mostly, it is amazing to see where we are and to wonder what awaits us in the future.

And who knows, perhaps there's a way in which GPT-3 and No-Code could work together. We'll let you know!
