Artificial intelligence (AI) vs. machine learning (ML): 8 common misunderstandings

IT and business leaders will run into some false notions about artificial intelligence and machine learning and what each can do. Here's how to articulate the truth on AI vs. ML.

Some people use the terms artificial intelligence (AI) and machine learning (ML) interchangeably. The distinction between the two may seem trivial – after all, machine learning is a subset of AI.

[ Do you understand the main types of AI? Read also: 5 artificial intelligence (AI) types, defined. ]

However, IT leaders and line-of-business leaders need to understand and be able to articulate the differences between AI and ML. As business interest in AI solutions grows, so too does the number of vendors flooding the market with "intelligent" solutions.

Without clarity on AI and ML, enterprises can end up pursuing misguided – and ultimately disappointing – projects, or falling for fake AI solutions.

Many organizations fall into one of two camps: overconfident or overwhelmed about AI and ML. Neither is a good place to start, AI industry analysts say. So let’s clear up some misunderstandings you may encounter and may have to debunk.

1. The big misunderstanding: How AI relates to ML

Picture a set of Russian nesting dolls: AI is the big one, ML sits just inside it, and other cognitive capabilities nest within it as well. “AI is the broad container term describing the various tools and algorithms that enable machines to replicate human behavior and intelligence,” explains JP Baritugo, director at management and IT consultancy Pace Harmon. There are numerous flavors of AI. Machine learning is one, but there’s also natural language processing (NLP), deep learning, computer vision, and more.

[ Get our quick-scan primer on 10 key artificial intelligence terms for IT and business leaders: Cheat sheet: AI glossary. ]

For those who prefer analogies, Timothy Havens, the William and Gloria Jackson Associate Professor of Computer Systems in the College of Computing at Michigan Technological University and director of the Institute of Computing and Cybersystems, likens the way AI works to learning to ride a bike: “You don’t tell a child to move their left foot in a circle on the left pedal in the forward direction while moving their right foot in a circle… You give them a push and tell them to keep the bike upright and pointed forward: the overall objective. They fall a few times, honing their skills each time they fail,” Havens says. “That’s AI in a nutshell.”

Machine learning is one way to accomplish that. It uses statistical analysis to learn autonomously and improve its function, explains Sarah Burnett, executive vice president and distinguished analyst at management consultancy and research firm Everest Group.

“[ML] uses various algorithms to analyze data, discern patterns, and generate the requisite outputs,” says Pace Harmon’s Baritugo, adding that machine learning is the capability that drives predictive analytics and predictive modeling.
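
To make that concrete, here is a minimal sketch in Python, assuming the open source scikit-learn library is available; the synthetic dataset is a stand-in for real business records such as past customer outcomes. No rules are hand-coded: the model infers patterns statistically from labeled examples and then applies them to data it has never seen.

    # A minimal sketch of "learning from data" with scikit-learn (assumed installed).
    # The synthetic dataset stands in for labeled business data, such as past
    # customer records marked churned or retained.
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # Toy labeled data: 1,000 examples, each with 5 numeric features.
    X, y = make_classification(n_samples=1000, n_features=5, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # No hand-written rules: the model infers patterns statistically from examples.
    model = LogisticRegression()
    model.fit(X_train, y_train)

    # The learned patterns then drive predictions on data the model has never seen.
    print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")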

[ What’s next? Read also: 10 AI trends to watch in 2020 and How big data and AI work together. ]

2. AI itself is not a single thing

“AI is a collection of hundreds of different strands,” says Wayne Butterfield, director of cognitive automation and innovation at ISG. “ML is a core component of many AI uses. It is the part of AI that enables a strand of AI to improve, whether that’s improving an image recognition algorithm to recognize a cat versus a car, or speech recognition being able to understand multiple accents in a given language.

“While people in your organization may be sold to by salespeople who position AI as a single thing, you need to push back on that notion.” 

This isn’t easy, Butterfield notes. But the more IT leaders can clarify within their organizations what AI, ML, and other branches of cognitive capabilities are and aren’t, and – even better – do so within the context of business solutions, the fewer misunderstandings will recur.

“It will come as AI becomes more commonly understood,” Butterfield says. “The catch-all terms become less relevant as the nuances of the AI spectrum become much wider-known, and it is these aspects that we start to discuss in the future.”

3. You don't need data scientists to begin exploring AI or ML

There are numerous off-the-shelf solutions that incorporate ML or another form of AI that organizations can take advantage of to get more familiar with the capabilities, according to Burnett.
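
As one hedged illustration, the open source Hugging Face transformers library wraps pre-trained models behind a one-line API, so a team can try an ML capability without building or training anything themselves. The sketch below assumes that library is installed; on first use it downloads a default pre-trained sentiment model.

    # Off-the-shelf ML in a few lines: a ready-made sentiment model, no data
    # scientists required. Assumes the Hugging Face transformers package is installed.
    from transformers import pipeline

    # Downloads and caches a default pre-trained sentiment model on first use.
    classifier = pipeline("sentiment-analysis")

    result = classifier("The onboarding process was quick and the support team was helpful.")
    print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]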

4. Factor in plenty of time for early experimentation

“Initially, it is important to learn about AI through trial and error and through different proofs of concept,” Burnett says.
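
One lightweight way to structure that trial and error, sketched below with scikit-learn and synthetic stand-in data, is to pit a couple of candidate models against each other with cross-validation and let the scores guide the next round of experimentation.

    # A small proof-of-concept pattern: compare a few candidate models on the
    # same data and keep what works. Assumes scikit-learn is installed; the
    # synthetic dataset is a placeholder for real project data.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=1000, n_features=10, random_state=0)

    candidates = {
        "logistic_regression": LogisticRegression(max_iter=1000),
        "random_forest": RandomForestClassifier(random_state=0),
    }

    # Five-fold cross-validation gives a quick, honest read on each candidate.
    for name, model in candidates.items():
        scores = cross_val_score(model, X, y, cv=5)
        print(f"{name}: mean accuracy {scores.mean():.2f}")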

5. Start with the business problem

The main culprit behind unsuccessful or subpar digital transformation initiatives such as AI or ML projects tends to be a technology-first approach, Baritugo says. “Instead, organizations need to determine what they are transforming to, how, and with whom. Articulating the organizational aspirations to enhance services, delivery, and/or the customer engagement model with AI or ML will help define the digital strategy.”

When it’s time to move beyond experimentation, set the tools aside. “You must be clear about what you want AI to do for you, what questions you want it to answer, what business problem you want it to solve,” Burnett says. “Collaboration with businesspeople is critical to make sure that you fully understand the business problem that you’re trying to address with AI.”

6. Don't underestimate data requirements

ML requires good data – and lots of it. “You need to work out what data you need, explore your data, and check and validate it, ensuring that the data provides a good sample for AI to learn and analyze,” Burnett says.
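
That validation step can start small. The sketch below uses pandas for a few basic sanity checks; the file name and the “churned” label column are hypothetical stand-ins for whatever data the business problem actually calls for.

    # Basic data sanity checks with pandas (assumed installed). The file and
    # column names here are illustrative placeholders, not a real dataset.
    import pandas as pd

    df = pd.read_csv("customer_records.csv")

    # How much data is there, and how much of it is missing?
    print(df.shape)
    print(df.isna().mean().sort_values(ascending=False).head())

    # Is the label reasonably balanced, or will the model mostly see one outcome?
    print(df["churned"].value_counts(normalize=True))

    # Quick distribution check to catch obviously bad or out-of-range values.
    print(df.describe())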

7. Keep your eye on the bigger picture

IT leaders need to identify how effectively AI or ML solutions scale within the enterprise and consider the technology stack required to enable them. “This process also includes addressing the organizational talent and ways of working to drive this change,” Baritugo points out.

8. Expect plenty of fine-tuning

“Be prepared for iterations,” Burnett advises. “The first attempt seldom delivers the solution. It is also important to not assume that AI will always be right. Always check the output for bias and address any problems by using better data sets or a larger sample size, for example.”
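
One basic version of that output check, sketched below with scikit-learn and an illustrative “group” attribute (this is not a full fairness audit), is to compare accuracy across subgroups rather than trusting a single overall number; a markedly weaker group is a signal to revisit the data set or the sample size.

    # A simple bias check: measure accuracy per subgroup instead of overall only.
    # The random "group" labels below are illustrative stand-ins for a real
    # attribute such as region or customer segment.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=2000, n_features=8, random_state=0)
    group = np.random.default_rng(0).choice(["A", "B"], size=len(y))

    X_tr, X_te, y_tr, y_te, g_tr, g_te = train_test_split(X, y, group, random_state=0)
    model = LogisticRegression().fit(X_tr, y_tr)
    preds = model.predict(X_te)

    # If one group scores markedly worse, better data or a larger sample may help.
    for g in ("A", "B"):
        mask = g_te == g
        accuracy = (preds[mask] == y_te[mask]).mean()
        print(f"group {g}: accuracy {accuracy:.2f}")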

[ How can automation free up more staff time for innovation? Get the free eBook: Managing IT with Automation. ] 

Stephanie Overby is an award-winning reporter and editor with more than twenty years of professional journalism experience. For the last decade, her work has focused on the intersection of business and technology. She lives in Boston, Mass.