AI, AR, VR, ML, DL... AI vs Machine Learning: is there a difference between these two technologies? If so, which one(s)?

Or do these 2 acronyms refer to the very same tech?

Keeping up with which technology does what, and parsing the differences between all the fancy two-letter acronyms emerging these days, is becoming increasingly challenging.

Especially when the terms are often used interchangeably, like artificial intelligence and machine learning.

Now that's frustrating: how could you possibly draw a clear-cut demarcation line between such a broad, “catch-all” term as AI (or “machine intelligence”) and machine learning?

Time to shed some light here:
 

1. What Is Artificial Intelligence?

A succinct, yet descriptive enough definition would go something like this:

The capability of a machine to perform tasks that require human intelligence.

And here I'm referring to tasks such as:
 

  • recognizing images/voices
  • understanding languages, translating
  • planning
  • problem-solving
  • learning
     

In short: once a computer system reaches a level where it understands, analyzes, distinguishes between objects and makes decisions all by itself, based on learned criteria, then we can already talk about artificial (or machine) intelligence.

Now, a more detailed definition of artificial intelligence would be:

The theory and development of machines that mimic intelligent human behavior: machines that carry out tasks requiring human intelligence in a more human-like way. They can reflect, make decisions, interact with humans and perform various complex tasks.
 

2. AI: Types and Applications

We couldn't talk about a complete and accurate “AI vs machine learning” analysis without focusing on the artificial intelligence typology and its specific applications.

Therefore, you should know that AI comes in two different “flavors”:
 

2.1. General AI

It involves broader applications:

A machine that learns to perform a wide range of complex tasks (that require human intelligence) and gains the ability to solve various problems in a human-like way.

Therefore, being broader in scope, general AI is harder to achieve than the “applied AI” alternative:

In fact, we don't yet have systems or devices capable of successfully handling any task that a human being can: machines that can mimic the human brain, and understand, interpret and respond to various stimuli.
 

2.2. Applied AI (or “Vertical” or “Weak” or “Narrow”)

Defining the applied or “weak” AI is crucial for properly identifying the clear-cut differences between AI and machine learning:

It's that type of artificial intelligence — of “smart” system — that addresses a specific need. That focuses on handling one single predefined task (e.g. personalizing ads or trading stocks).

But maybe a few examples would be more appropriate for you to grasp the full meaning of applied AI:
 

  • LinkedIn messaging
  • Netflix recommendations
  • Spotify discovery mode
  • Siri
     

3. AI vs Machine Learning: What Is Machine Learning More Precisely?

First of all, we should make it clear that:

Machine learning is a subset of artificial intelligence.

And if we are to detail this statement a bit:

Machine learning is that subcategory of AI that uses statistical techniques to identify recurring patterns in data. Once properly trained, it can analyze huge data sets, predict accurate outputs and classify new inputs all by itself (e.g. voice search).

For instance, after going through huge volumes of customer data, it can recommend the most appropriate products, based on online shoppers' past choices and search history.

Or it can trigger certain functionalities of a software based on a particular user's voice. 
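To make this pattern-finding idea concrete, here's a minimal, illustrative sketch of a co-occurrence recommender in Python. The purchase histories and item names are invented for the example; real recommendation engines use far more sophisticated statistical models:

```python
from collections import Counter

# Hypothetical purchase histories -- illustrative data only.
histories = [
    ["laptop", "mouse", "keyboard"],
    ["laptop", "mouse", "monitor"],
    ["phone", "case", "charger"],
    ["laptop", "keyboard", "monitor"],
]

def recommend(purchased, histories, top_n=2):
    """Suggest items that most often co-occur with what the shopper already bought."""
    counts = Counter()
    for basket in histories:
        if purchased & set(basket):                 # basket shares an item with the shopper
            counts.update(set(basket) - purchased)  # count the items they don't own yet
    return [item for item, _ in counts.most_common(top_n)]

print(recommend({"laptop"}, histories))
```

The "pattern" here is simply which items tend to appear together in past baskets; a production system would weight items by frequency, recency and similarity between shoppers.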

“But what do you mean by ‘training’ a machine learning model?”

Here, I'm referring to “neural networks”.

Basically, behind each machine learning model there's a neural network programmer (or a team of them) who builds these networks for training and learning. More precisely, they choose specific factors of importance that determine the outcome of a given situation.

And they keep “polishing” and further adjusting these factors (or “weights”) until the network produces the proper result for the given input.

Once the model reaches the level where it's capable of understanding and adjusting the factors of importance on its own, delivering accurate results in real time, it will keep improving itself.

It will keep “learning” how to deliver more and more accurate results without any human intervention.

In short: you “feed” the algorithm with huge volumes of data and it will then learn, adjust itself and continuously evolve when it comes to determining the most accurate outcome of a situation.
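The weight-adjusting loop described above can be sketched in a few lines. This is a deliberately tiny, illustrative example, not a real neural network: a single "neuron" with one weight, trained by gradient descent to learn the rule y = 2x. The data and learning rate are assumptions for the demo:

```python
# A minimal sketch of "adjusting the weights": one neuron with a single
# weight, trained by gradient descent to learn y = 2 * x from examples.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # (input, expected output) pairs

weight = 0.0          # start with a "wrong" factor of importance
learning_rate = 0.05

for epoch in range(200):
    for x, target in data:
        prediction = weight * x
        error = prediction - target          # how far off the network is
        weight -= learning_rate * error * x  # nudge the weight toward the target

print(round(weight, 2))  # converges close to 2.0
```

Each pass over the data nudges the weight in the direction that shrinks the error, which is exactly the "polishing" described above; real networks do the same thing across millions of weights at once.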

Just think:
 

  • image recognition
  • voice recognition
     
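As a toy illustration of "recognition", here's a nearest-neighbour classifier in Python. The 2-D feature vectors and labels are invented stand-ins for the high-dimensional features a real image or voice recognizer would extract:

```python
import math

# A toy "recognition" sketch: classify a new point by its nearest labelled
# neighbour. The 2-D vectors stand in for image/audio features; all values
# here are made up for illustration.
labelled = [((0.1, 0.2), "cat"), ((0.2, 0.1), "cat"),
            ((0.9, 0.8), "dog"), ((0.8, 0.9), "dog")]

def classify(point):
    """Return the label of the closest known example."""
    return min(labelled, key=lambda item: math.dist(point, item[0]))[1]

print(classify((0.85, 0.85)))  # -> dog
```

The principle is the same as in large-scale recognition systems: new inputs are sorted by their similarity to patterns the system has already seen.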

Now, in an AI vs machine learning debate, one where we're trying to identify the differences between the two concepts, we can say that:

Artificial intelligence is the broad concept, whereas machine learning is the technology powering much of the development in the AI field. That machine learning is a type of AI that learns — while getting fed huge amounts of data  — and improves all by itself. 

With no human intervention to keep “telling” it which is the matching rule between new inputs and the most probable outputs.
 

4. In Conclusion...

What better way of ending this comparative analysis of the two terms/techs than by pinpointing the main differences between AI and machine learning in a shortlist?

Therefore, here it goes:
 

  • while AI can exist without machine learning, the latter cannot exist without AI, since ML is a subset of AI (the main reason behind the confusion when using these terms and why their definitions are often interchanged)
  • once a machine can make a choice or a decision on its own, once it can tell the difference between two items, it qualifies as AI; at that point, there's more than machine learning technology being leveraged
     

The END! 

Is it clearer for you now what the key difference between the two concepts is? And where precisely you should draw the demarcation line between these two technologies?
