Hardly anything has occupied the world's tech communities over the last couple of weeks as much as GPT-3 - a language generation tool that is said to understand both human and computer languages. While the tool's capabilities have not yet been tested to their limits, tech enthusiasts worldwide are already launching startups built on GPT-3, betting that the model will become a full-fledged substitute for developers in the near future. Even entrepreneurs and experienced developers themselves believe that GPT-3 is where the future of programming and text generation lies.

WIRED recently shared the story of Sharif Shameem, an entrepreneur from the U.S. who tested GPT-3's code-generation aptitude. He wrote a short description of the app he wanted - a simple to-do list program that would tick off completed tasks - submitted it to GPT-3, and voilà: working code was ready to use within a matter of minutes.

It is not only Mr. Shameem who seems 'petrified' in the face of the opportunities GPT-3 reveals; the whole tech world appears to be standing on the brink of a massive revolution. GPT-3 has "swallowed the entire Internet," as some folks on social media put it, including the programming tutorials. Hence, it may seem that OpenAI's brainchild is on its way to replacing copious human beings in their workplaces, since it can automate most code- and text-writing processes. Is it truly so? Let's find out!
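To make Shameem's experiment concrete, here is a minimal sketch of what submitting such an app description to GPT-3 looks like. The endpoint, engine name, and parameters below follow OpenAI's public beta API as of mid-2020 and should be treated as assumptions for illustration, not a tested integration; the snippet only builds the request body and does not call the network.

```python
import json

# Assumed beta-API endpoint (mid-2020); verify against OpenAI's docs.
API_URL = "https://api.openai.com/v1/engines/davinci/completions"

def build_request(description: str, max_tokens: int = 256) -> str:
    """Build the JSON body for a code-generation prompt."""
    payload = {
        # GPT-3 is steered purely by the natural-language prompt.
        "prompt": f"Write the code for the following app:\n{description}\n",
        "max_tokens": max_tokens,   # upper bound on the generated completion
        "temperature": 0.3,         # lower temperature -> more deterministic code
    }
    return json.dumps(payload)

body = build_request("a simple to-do list that ticks off completed tasks")
```

The body would then be POSTed to `API_URL` with an `Authorization: Bearer <api-key>` header; the returned completion is the generated code.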

Quick to Impress: Breakthrough AI-Language Tool

The beta version of GPT-3 launched in June 2020 and took the world of IT by storm. The AI generator went viral within a matter of days: investors and entrepreneurs from all around the world started proclaiming it the future platform for their businesses to thrive on, and eager users swiftly flooded Twitter with poems, memes, short stories, and even guitar tabs written by GPT-3. That was, in fact, OpenAI's biggest point of interest - to see what tech experts with no specific prowess in the AI domain would be able to do with GPT-3. The results cannot be called unexpected, and yet there were some moments of sheer surprise. Nobody expected GPT-3 to be able to transform a verbal command into visual elements created with code. The very same Sharif Shameem supplied the engine with a prompt saying "buttons looking like a watermelon," and the language model started coding a pink round shape with dark green borders.

As the word "watermelon" went viral as a synonym for thousands of potentially lost jobs in the IT sector, some of the world's most influential investors argued that the risk of mass unemployment is justified. Delian Asparouhov - an investor at Founders Fund, an early backer of SpaceX and Facebook - claims that GPT-3 is going to be huge for developing and refining the health care system: it might automate a great many processes, saving billions of dollars that could instead be spent on research.

The model's technical characteristics are truly impressive: GPT-3 has 175 billion parameters, ten times more than any previous language model of its kind. Its predecessor, GPT-2, could also produce semantically complete passages of text from an opening prompt, but it had only 1.5 billion parameters - GPT-3 is more than a hundred times larger.
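As a hedged illustration of the "watermelon" anecdote - this is not GPT-3's actual output, just roughly the kind of markup the model is reported to have produced for that prompt: a pink rounded shape with a dark green border.

```python
def watermelon_button_html() -> str:
    """Return button markup in the spirit of the reported GPT-3 output.

    Illustrative sketch only: the colors and sizes here are assumptions,
    not the model's real completion.
    """
    style = (
        "background: lightpink; "       # pink "flesh" of the watermelon
        "border: 8px solid darkgreen; " # dark green "rind"
        "border-radius: 50%; "          # round shape
        "width: 120px; height: 120px;"
    )
    return f'<button style="{style}">watermelon</button>'

html = watermelon_button_html()
```

The point of the anecdote is that GPT-3 mapped a loose visual metaphor ("looks like a watermelon") to concrete style properties like these without being told any of them explicitly.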
The tool is already capable of producing text in the style of the world's most famous authors. Mario Klingemann, an artist who works closely with machine learning, has used GPT-3 to generate "The Importance of Being on Twitter," a short story written the way Jerome K. Jerome would have done it. GPT-3 has already proved it can perform a number of exciting tasks - it can even make our world a bit more polite by rephrasing rude comments on social media. In fact, the model can be used to generate any kind of text: creating web page layouts is no harder a task for GPT-3 than writing a short story. The potential it holds is impressive, and OpenAI plans to turn the tool into a full-fledged commercial product by the end of 2020. So, what do we have? As of now, GPT-3 is more than a mere NLP model; it is also a:
  • Promising text generation tool to be used in commercial and technical domains;
  • Up-and-coming coding agent that is capable of creating custom apps based on verbal commands and instructions.

Swift to Depress 

However, while using GPT-3 seems like a step straight into a future where artificial intelligence is used ubiquitously, OpenAI themselves admit there is still a lot of work to be done to refine this natural language processing model. It is still quite raw, and developers from all around the world are welcome to join the exploration process. Despite impressive results on extensive NLP benchmarks - answering questions, writing and translating texts, commonsense reasoning - GPT-3 still raises questions about the methods by which it trains itself on web-scale data, and about how little of that process is understood.

Regardless of how groundbreaking and revolutionary GPT-3 might seem, intelligence - even artificial intelligence - has to feel the context and syntax of a language. Such a feature is of the utmost importance in both cases: when writing a text that must meet the expectations of a particular target audience, and when coding an app that must meet customer requirements that are never the same. It should be acknowledged that GPT-3 is not as flawless as it seems. It cannot escape sexism and racism in the texts it generates, and it can just as easily turn polite comments into insults. While its coding abilities seem truly impressive, it cannot be classified as a pure manifestation of artificial intelligence. It is still too soon to say what actually happens inside this NLP model. Of course, its ability to swallow coding tutorials and put them into practice is incredible. Still, it lacks syntactic sensitivity for both programming and human languages - and when it comes to building an understandable, usable text or piece of code, that is nothing short of a crucial element.

Let’s Sum It Up

OpenAI's achievement is surely of paramount importance to today's tech industry, and it is going to be big before long. Yet, as of now, we all need to cool down and start thinking rationally again. Sam Altman - who co-founded OpenAI with Elon Musk - has cautioned that the hype is overblown: the mistakes GPT-3 still makes should temper the compliments it gets from people who see only the surface of the tool, while the wider picture remains blurry. Anyway, the greatest trick AI has ever pulled on humanity is making people believe that it actually exists.