You have probably already seen articles like "A robot wrote an entire article. Aren't you scared yet, human?" So, who is the robot here? It's the GPT-3 model, a transformer-based language model. GPT stands for Generative Pre-trained Transformer. The model was developed by OpenAI, which had previously released GPT-2 and other models. GPT-3 was released in May 2020 and is more capable than its predecessors, though architecturally it is not that different. Given some context, GPT-3 can write articles, poems, and even working code for you*. There are some limitations, which I am going to explain later in this article.

Being a language model means that, given a piece of text, it probabilistically predicts which tokens from a known vocabulary are likely to come next in that string. So it's a bit like the autocomplete we see on a phone keyboard: we type a word, and the keyboard suggests a word that could come next. What sets GPT...
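To make the next-token idea concrete, here is a minimal sketch of a toy bigram model: a hypothetical miniature that only counts which word follows which, nothing like GPT-3's actual transformer architecture, but it illustrates the same "predict the next token from a known vocabulary" principle.

```python
from collections import Counter, defaultdict

def train_bigram(tokens):
    """Count, for each token, which tokens follow it in the corpus."""
    counts = defaultdict(Counter)
    for current, nxt in zip(tokens, tokens[1:]):
        counts[current][nxt] += 1
    return counts

def predict_next(counts, token):
    """Return candidate next tokens with their estimated probabilities."""
    followers = counts[token]
    total = sum(followers.values())
    return {tok: n / total for tok, n in followers.items()}

# A tiny made-up corpus for illustration
corpus = "the cat sat on the mat and the cat slept".split()
model = train_bigram(corpus)
print(predict_next(model, "the"))  # {'cat': 0.666..., 'mat': 0.333...}
```

GPT-3 does the same kind of prediction, but over a vocabulary of tens of thousands of tokens, conditioned on a long context rather than a single previous word, and using 175 billion learned parameters instead of raw counts.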