AI - #4 (AI vs Human Brain)
I was originally planning to continue developing a simple AI LLM and to start down the path of merging the various AI data sets required to create a form of "intelligence." This is sometimes referred to as data fusion: essentially, running algorithms against multiple data sets to make sense of, or refine, the information for a particular or desired purpose. The next step was simply to understand the various components of an AI system. First, however, I wanted to tackle some definitions.
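To make the data fusion idea concrete, here is a minimal sketch in Python. All of the data, names, and the `fuse` function are hypothetical, invented purely for illustration: two small data sets sharing a key are joined, and a simple algorithm refines the combined records for one purpose (averages per device).

```python
# A minimal, hypothetical sketch of "data fusion": merge two data
# sets on a shared key, then refine the combined records.

# Hypothetical source A: temperature readings keyed by device id
readings = {
    "dev1": [21.0, 22.5, 23.0],
    "dev2": [18.0, 18.5],
}

# Hypothetical source B: device metadata keyed by the same id
metadata = {
    "dev1": {"location": "lab"},
    "dev2": {"location": "warehouse"},
}

def fuse(readings, metadata):
    """Join the two data sets and refine them into one record per device."""
    fused = []
    for dev_id, temps in readings.items():
        meta = metadata.get(dev_id, {})
        fused.append({
            "device": dev_id,
            "location": meta.get("location", "unknown"),
            "avg_temp": sum(temps) / len(temps),
        })
    return fused

print(fuse(readings, metadata))
```

Real data fusion pipelines of course deal with conflicting sources, missing keys, and noise; the point here is only the shape of the operation: join, then refine.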
Before we continue on the current path of "demonstrating" how AI data models might work artificially in the "real" world, I wanted to explore a few more "general" definitions, just for fun.
Definitions – Source: Webster's Online Dictionary
1 – Artificial Intelligence (AI):
1: a branch of computer science dealing with the simulation of intelligent behavior in computers
2: the capability of a machine to imitate intelligent human behavior
Notice how in the basic definition of AI above, there is no implication that AI machines are anything more than a poor imitation or simulation of the human brain. This is a far cry from claims of "General AI," which I will explain below. In fact, even though current software and hardware technologies are smaller, better, faster (and cheaper) and more impressive than ever, what was developed in the past is technically executed in pretty much the same way today: all machines are still based on machine instructions in a binary environment.
Furthermore, the never-ending battle of centralized (cloud) computing versus distributed (local or edge) computing has been ongoing since the invention of computers. I was there and worked on most of the platforms that were part of these evolutions, from the early '80s through today's more modern architectures and on to mobile. Anyone who worked on mainframes, mini-computers, workstations, PCs, and mobile devices, and dealt with dumb terminals, client-server, and the cloud, knows that we really have not improved things that much since the '70s or '80s. Software has only incrementally improved. Hardware has made compelling advancements, but the foundations of CPU/GPU computing and storage technology are relatively the same. Even connectivity is based on the Internet Protocol, which was invented in 1974. Will AI use a 1974 technology to work? That's the same year as the explodable Ford Pinto, by the way, which is perhaps no coincidence?
2 – Artificial General Intelligence (AGI):
AGI is a type of hypothetical intelligent agent. The concept is that an AGI could learn to accomplish any intellectual task that human beings or animals can perform.
Emphasis in this definition is on "hypothetical." The idea that compiled code can "learn" and "operate" like the human brain has been the holy grail of AI since the term was first coined at Dartmouth in 1956: a singularity, so to speak, where the machine and the man cannot be differentiated AND the machine can learn as well as, or even better than, the man. Creating a computer that could "model" a single human brain "sounds" straightforward: if we could just "build" a machine with the necessary number of circuits and the storage capacity of the human brain, then that machine should be able to function with similar or equal efficiency to the human brain. In reality, this is impossible today and likely impossible tomorrow; we are not even close. We are decades away, if not centuries. Simple, basic functions of the brain like sight, hearing, and speech can be imitated by machines to some degree, but they cannot be duplicated in the capability or function that a single brain has by its innate nature; and of course the brain develops itself over time as a human grows. Software and hardware simply cannot do this. How do I know this?
Next time you purchase something on Amazon, like five pairs of underwear (or let that fact slip while your "smart" speaker is "listening"), watch how you will be inundated with options to buy underwear: in fun-paks, in glowing colors, at a discount, in plastic, with special delivery options, etc., ad nauseam, for the next eight weeks. THIS IS AI!!
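The underwear barrage above is largely pattern matching over purchase histories. A toy co-occurrence recommender (not Amazon's actual system; all product names and the `recommend` function are made up for illustration) might look like this:

```python
from collections import Counter

# Toy purchase histories; every product name here is invented.
orders = [
    {"underwear", "socks"},
    {"underwear", "socks", "belt"},
    {"underwear", "fun-pak underwear"},
]

def recommend(item, orders, top_n=2):
    """Count what else was bought alongside `item` and rank by frequency."""
    co_counts = Counter()
    for basket in orders:
        if item in basket:
            co_counts.update(basket - {item})
    return [product for product, _ in co_counts.most_common(top_n)]

print(recommend("underwear", orders))  # socks co-occur most often
```

Buy underwear once and the counter tilts; the system then shows you more underwear, which is exactly the behavior described above. Real recommenders add weighting, decay, and much larger models, but the core idea is this simple.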
Next example: dial any customer service number and speak with your friendly neighborhood robot. Before you throw that phone through the wall, just hang up and save yourself a few bits of your data plan. So don't worry, this crap is not taking over the world anytime soon.
Despite quantum and shared network computing, gazillions of MIPS, and software trickery, mimicking even a single brain is still not in the realm of the possible today. Even very simple brains, like mine or that of an amoeba, or any low-level form of intelligence, cannot be modeled successfully, even in simulation; not in any working or "living, learning" form. This is despite the vast amount of super compute power we have at our fingertips, which, by the way, does not diminish our accomplishments in this realm; they obviously remain formidable.
Sorry folks, please don't be depressed that we won't have your favorite fantasy: intelligent sex-kitten robots typing your term paper, building your next corporate PowerPoint deck, cooking your meals, writing your emails, and making sure the dogs get walked, while you stay locked in your Mom's basement immersed in your Vision Pro for days at a time. OK, so we won't have any fun with AI for quite a while, but let's see what it CAN do!