Is ChatGPT Actually Intelligent?


Maybe not…

If you have been on any social media platform in the past few months, you have almost certainly heard about ChatGPT, Google Bard, Microsoft Bing and a myriad of new language models. All of these new models, one could argue, are better writers than you and me, and their English is definitely much better than mine 🥲 Every few years, somebody invents something so remarkable that it makes you completely reconsider what is possible. In this article, we will be talking about the kind of invention that is rocking everyone in the world — yes, you guessed it — ChatGPT.

Image generated with Bing image creator. Artistic representation of human intelligence.

As we increasingly rely on AI to do things for us and to make decisions for us, it is natural to ask whether AI is truly intelligent in the sense that its understanding of language mirrors our own, or whether it is fundamentally different.

To make sense of it all, we will first look into how the Generative Pre-trained Transformer (GPT) and ChatGPT work, and then discuss what it means for an AI to be intelligent.

The GPT model, first proposed by OpenAI in their paper Improving Language Understanding by Generative Pre-Training, uses unsupervised pre-training followed by supervised fine-tuning on various language tasks.

Source: Improving Language Understanding by Generative Pre-Training (OpenAI)

The model architecture is based on Transformers, which have demonstrated robust performance on tasks such as machine translation and document generation. This architecture was first introduced in the paper “Attention Is All You Need” by Google researchers. It provides a more structured memory for handling long-term dependencies in text than recurrent neural networks and convolutional neural networks, leading to better performance across a wide range of tasks.
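The core operation behind that “structured memory” is attention, in which every token in a sequence looks at every other token and decides how much to borrow from it. Below is a minimal sketch of scaled dot-product attention, the building block described in “Attention Is All You Need”; it is an illustrative toy implementation (the shapes and the random input are made up for the example), not the code of any particular GPT model.

```python
# Minimal sketch of scaled dot-product attention (illustrative only).
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K, V: arrays of shape (seq_len, d_k). Returns one attended vector per token."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # how strongly each token attends to every other token
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)  # softmax over the sequence
    return weights @ V  # weighted mix of value vectors

# Toy example: 4 tokens, each represented by an 8-dimensional vector
rng = np.random.default_rng(0)
Q = K = V = rng.normal(size=(4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # (4, 8)
```

Because every token can attend directly to any earlier token, the model does not have to squeeze a long history through a single recurrent state, which is what gives Transformers their edge over RNNs on long-range dependencies.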

It’s a prediction game

You can think of the GPT model as a machine that is good at guessing what comes next. For example, if you give it the phrase “Instead of turning right, she turns…”, GPT might predict “left” or “back” or something else as the next word. How did it learn this? As you train a model on lots of text data, it learns how to…
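To make the prediction game concrete, here is a minimal sketch that asks a language model for its most likely next words. It assumes the Hugging Face transformers library and the public GPT-2 checkpoint, a smaller relative of the models behind ChatGPT; the printed continuations are only illustrative.

```python
# Minimal sketch: next-word prediction with GPT-2 (assumes `pip install transformers torch`).
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "Instead of turning right, she turns"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, seq_len, vocab_size)

# The scores at the last position are the model's guesses for the next token.
next_token_logits = logits[0, -1]
top_ids = torch.topk(next_token_logits, k=5).indices
print([tokenizer.decode(i) for i in top_ids])  # e.g. " left", " around", " back", ...
```

Everything the model "knows" is distilled into these next-token scores: training simply adjusts the weights so that the token which actually appeared in the training text gets a higher score.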



