Artificial Intelligence is neither.
Let me explain. There has been growing interest in, and commentary on, Artificial Intelligence (AI) over the last several years, and as a wannabe writer, relapsed polyglot, software engineer in remission and AI enthusiast, I would like to share some insights, stop budding AI luddites in their tracks and start a movement to reduce the popular use of terms such as AI and Machine Learning (ML) – because, well, they’re neither. For the sake of clarity, I will continue to use those terms in this article.
Will we create true Artificial Intelligence (or Sentient) systems? The answer to that question, as to any other, is yes – given time, energy and resources. If we can imagine it, we can make it happen. But we’re definitely not there yet, and we don’t have the hardware or software for it. It’s zeroes and ones, and even if it’s fuzzy logic or quantum compute, it’s still simulated via zeroes and ones. Intelligence isn’t bound to binary.
Today’s AI systems are typically (but not always) statistical model number crunchers, taking advantage of utility compute power and freely available oceans of data to find patterns and trends that meet some desired outcome, in the hopes of being able to predict those outcomes. One great example of this is ThyssenKrupp’s MAX Predictive Elevator Maintenance Program. It’s a bunch of wicked statistical math and some really cool modelling. There are, of course, other types of AI systems – as discussed below.
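To make the “statistical model number cruncher” point concrete, here is a deliberately tiny sketch (my own toy, not ThyssenKrupp’s actual MAX system, whose internals I don’t know): fit a baseline from historical sensor readings, then flag any new reading that falls far outside that baseline. The sensor name and numbers are made up for illustration.

```python
import statistics

# Toy predictive maintenance: summarize "normal" behavior statistically,
# then flag outliers. The desired outcome (catch failures early) is
# baked into the model up front -- it's statistics, not intelligence.
def fit_baseline(history):
    """Summarize normal behavior as (mean, standard deviation)."""
    return statistics.mean(history), statistics.stdev(history)

def needs_maintenance(reading, baseline, threshold=3.0):
    """Flag readings more than `threshold` std devs from the mean."""
    mean, stdev = baseline
    return abs(reading - mean) > threshold * stdev

# Hypothetical door-cycle times (seconds) from a healthy elevator.
door_cycle_times = [1.9, 2.0, 2.1, 2.0, 1.95, 2.05, 2.0, 1.98]
baseline = fit_baseline(door_cycle_times)

print(needs_maintenance(2.02, baseline))  # False -- within normal range
print(needs_maintenance(3.5, baseline))   # True -- send a technician
```

Real systems use far richer models, but the shape is the same: known inputs, a fitted model, and a pre-specified outcome to predict.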
In fact, nearly every example you can think of today isn’t intelligence – it really isn’t. Even when it looks like it. For example, autocomplete isn’t intelligent or smart (but it is cool!). Something like autocomplete is number crunching through immense amounts of data (thanks, Cloud Compute and everyone who accepted EULAs without reading them) to generate likely outcomes.
But it learned what you really mean to type when it autocompletes duck!
What it did was notice that you kept changing duck to puck when texting with your hockey buddies. Then duck & puck to cluck with your 4H alumni. Then duck & cluck back to puck again. So now, sure, it seems like it’s learning, and sure, it seems smart, because now it knows to send puck to Wayne and cluck to Jimmy instead of the ever-confusing duck to exactly no one ever. But was that a process of learning? Pedantically, yes, but it’s not really learning when you’re applying it to a narrow problem in a single domain. What you’re doing is (math & models!) changing desired outcomes based on who you’re interacting with. There is a discrete, bounded box it operates in. There is no knowledge or skills acquisition.
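The duck/puck/cluck story above boils down to a frequency table keyed by contact. Here is a toy sketch of that idea (not any real keyboard’s implementation; Wayne and Jimmy are the names from the example):

```python
from collections import Counter, defaultdict

# Contact-conditioned "autocorrect": count which replacement the user
# picks per contact, suggest the most frequent one. It looks like
# learning, but it's a lookup table scoped to one narrow domain --
# no knowledge or skills acquisition.
class ToyAutocorrect:
    def __init__(self):
        # corrections[contact][typed] -> Counter of chosen replacements
        self.corrections = defaultdict(lambda: defaultdict(Counter))

    def observe(self, contact, typed, chosen):
        """Record that the user changed `typed` to `chosen` for `contact`."""
        self.corrections[contact][typed][chosen] += 1

    def suggest(self, contact, typed):
        """Return the most common replacement, or the word unchanged."""
        counts = self.corrections[contact][typed]
        if not counts:
            return typed
        return counts.most_common(1)[0][0]

ac = ToyAutocorrect()
for _ in range(3):
    ac.observe("Wayne", "duck", "puck")   # hockey buddy
for _ in range(2):
    ac.observe("Jimmy", "duck", "cluck")  # 4H alumnus

print(ac.suggest("Wayne", "duck"))  # puck
print(ac.suggest("Jimmy", "duck"))  # cluck
print(ac.suggest("Mom", "duck"))    # duck -- no data, no magic
```

Swap the counting for a fancier statistical model and you have the real thing: impressive, useful, and entirely inside its discrete, bounded box.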
Are there companies that have/are creating AI platforms that you can use? Absolutely! After you train them (lots of data!) and specify (bias) towards your ideal outcomes, they’re a cost-effective on-ramp to leveraging the power of these tools. They’re still not AI, and it’s a stretch to call them ML. Look behind the curtain and you’ll find … Oscar Diggs! Same thing. But what a fantastic business model. Now, if I can take one of those systems trained for predictive maintenance for Sunbelt’s construction equipment (made that up) and drop it into Pinch-a-Penny’s proactive summer pool maintenance program (made that up too) and it remembers both … ? Well, it’s not going to – and really, nor should it, because these are tools and platforms we use to solve problems in a specific domain. And because these systems cannot inherently gather knowledge and skills, it’s not quite right to call them AI or ML systems. Those platforms are great for democratizing those tools, but they’re still neither AI nor ML.
But they’ve created Art! Music! Faces! Again, let’s think about this. They had to have a massive set of data to ingest, real-world approximations and a model (there’s that word again) to work within to create something that sounds or looks better than what a room full of monkeys smashing away at keyboards, easels and instruments would create.
Which, if you really think about it, is sort of like the real creative process only with less discordance and poo flinging. I’m pretty sure that’s not how Da Vinci, Prince or Austen created.
So what should we call them? Well, Enormous Cloud Compute Assets Statistically Calculating and Modelling Things is a bit of a mouthful. And there’s a collection of nicely descriptive names for simulacra “AI” systems such as Expert Systems, Neural Networks, Genetic Algorithms, Blackboards, etc. But they’re still, generally, the same thing at the end of the day. Massive gobs of data, statistical or simulated modelling or metaheuristics to achieve some desired (and therefore previously known) optimal solution.
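That “previously known optimal solution” point is easy to demonstrate. Below is a toy hill climber (the simplest cousin of a genetic algorithm, written for illustration, not drawn from any particular library) that “evolves” a string. The giveaway is right there in the code: the target is specified up front, and the search merely converges on it.

```python
import random

random.seed(42)  # deterministic for reproducibility

# The "optimal solution" is known before the search even starts.
TARGET = "DESIRED OUTCOME"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def score(candidate):
    """Fitness = number of positions that already match the target."""
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(candidate):
    """Randomly replace one character -- the 'genetic' variation."""
    i = random.randrange(len(candidate))
    return candidate[:i] + random.choice(ALPHABET) + candidate[i + 1:]

# Start from random noise; keep any mutation that doesn't score worse.
current = "".join(random.choice(ALPHABET) for _ in range(len(TARGET)))
while score(current) < len(TARGET):
    child = mutate(current)
    if score(child) >= score(current):
        current = child

print(current)  # DESIRED OUTCOME
```

It looks like creation, but it’s optimization toward an objective someone already wrote down – which is exactly the pattern shared by the techniques named above.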
I propose we just abbreviate it all down to Simulated Modeling (SM).
Because Artificial Intelligence is neither.
(… with apologies to our future sentient machine overlords, please don’t skynet us)