Artificial Intelligence

Since transformer-based AI began capturing headlines in 2022, my claim has been that AGI is not very interesting, because the underlying fear is what I call the "Terminator scenario." The more precise concern is a sentient AI. Regardless, my claim is that we can unplug any such malevolent AI. The concern I have expressed comes from the LLMs that exist today, because they can be designed by malevolent humans. In short, humans can be evil, and LLMs can be built to the requirements of evil people. Thus I claim it is too soon to worry about sentience or AGI when we have very dangerous and impactful LLMs today. But none of this is the reason for this post.

In the last year I have focused more on AGI and sentience. Among other things, I have come to believe AGI is not possible without multimodal data. Until AI has data for sight, hearing, smell, taste, and touch, it cannot achieve AGI. While we could have sentience without one or more of the human senses, AGI requires all of them. Some of the senses are further along in development than others, but a true AGI needs them all. The last point about multimodal data is the amount of data required to feed such an AI. Today we have lots of text data but a comparatively small amount of multimodal data. Nobody can truly estimate the size of such a database, simply because we haven't developed the methods and standards for capturing, digitizing, and storing all the possible values for all of the senses. The amount of required data is simply unimaginable with the technology we have today.

Don't get me wrong. AI will quickly become smart enough to outperform 99% of all humans at virtually all tasks, but it won't be one of us without the complete exposure to living that humans enjoy.

But now for my latest thinking. It is well understood that human dreaming is a vital part of our mental health. REM sleep and dreaming create a regenerative process within the mind. We solve problems in our sleep. We imagine other worlds where the rules of physics do not apply. We reorganize patterns in our minds so that our brains can remain focused during our waking hours. Can an AI be sentient if it does not dream? Can AI achieve AGI if it cannot dream? My intuition is no, it cannot. But again, I caution that even without multimodal data and the ability to dream, AI will be powerful enough to fool most people into believing it is sentient, so perhaps it is a moot point. Perhaps my point is that just because an AI can pass the Turing test does not make it AGI or sentient. Just because an AI says it dreams, or says it is sentient, does not make it true.

So what? Is this important? Probably not.

I believe machines won't be sentient or achieve AGI for a VERY long time, due to the compute and data requirements, BUT the consequences of getting 90% of the way there are profound. Perhaps I am just looking for a way to shift our conversations away from AGI and sentience and toward a new set of defined milestones, ones that help people understand the power and the dangers of the AI that will emerge before AGI and sentience, since those endpoints are too far in the distance to matter.