Artificial Intelligence, Deep Learning, and Neural Networks, Explained


Artificial Intelligence (AI), deep learning, and neural networks represent incredibly exciting and powerful machine-learning-based techniques used to solve many real-world problems. For a primer on machine learning, you may want to read this five-part series that I wrote.

While human-like deductive reasoning, inference, and decision-making by a computer are still a long way off, there have been remarkable gains in the application of AI techniques and their associated algorithms.

The concepts discussed here are extremely technical, complex, and based on mathematics, statistics, probability theory, physics, signal processing, machine learning, computer science, psychology, linguistics, and neuroscience.

That said, this article is not meant to provide such a technical treatment; rather, it explains these concepts at a level that most non-practitioners can understand, while also serving as a reference or review for technical readers.

The primary motivation and driving force for these areas of study, and for developing these techniques further, is that the solutions required for certain problems are incredibly complicated, not well understood, and difficult to determine manually.

Increasingly, we rely on these techniques and machine learning to solve these problems for us, without requiring explicit programming instructions. This is critical for two reasons. First, we likely wouldn't be able to, or at least wouldn't know how to, write the programs required to model and solve many of the problems that AI techniques can solve. Second, even if we did know how to write such programs, they would be inordinately complex and nearly impossible to get right.

Luckily, machine learning and AI algorithms, together with properly selected and prepared training data, can do this for us.
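
To make the contrast with explicit programming concrete, here is a minimal sketch of learning a rule from data. It assumes Python with scikit-learn installed; the tiny spam-filter dataset is made up purely for illustration:

```python
# A toy illustration: instead of hand-writing rules such as
# "if a message contains many links, it is spam", we give the
# algorithm labeled examples and let it infer a rule on its own.
from sklearn.tree import DecisionTreeClassifier

# Hypothetical training data: [message_length, num_links] -> 1 = spam, 0 = not spam
X_train = [[120, 0], [950, 7], [300, 1], [40, 5], [800, 0], [60, 9]]
y_train = [0, 1, 0, 1, 0, 1]

model = DecisionTreeClassifier()
model.fit(X_train, y_train)       # the "learning" step: fit a rule to the examples

print(model.predict([[500, 6]]))  # classify an unseen message, e.g. [1] -> spam
```

The point is not the particular algorithm, but that the decision logic is inferred from the training data rather than spelled out by a programmer.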

So with that, let’s get started!

Artificial Intelligence

In order to define AI, we must first define the concept of intelligence in general. A paraphrased definition based on Wikipedia is:

While there are many different definitions of intelligence, they all essentially involve learning, understanding, and the application of the knowledge learned to achieve one or more goals.

It’s therefore a natural extension to say that AI can be described as intelligence exhibited by machines. So what does that mean exactly, when is it useful, and how does it work?

A familiar example of an AI solution is IBM's Watson, which was made famous by beating the two greatest Jeopardy! champions in history, and which is now used as a question-answering system for commercial applications. Apple's Siri and Amazon's Alexa are similar examples.

In addition to speech recognition and natural language applications (processing, generation, and understanding), AI is used for other recognition tasks (pattern, text, audio, image, video, facial, …), autonomous vehicles, medical diagnosis, gaming, search engines, spam filtering, crime fighting, marketing, robotics, remote sensing, computer vision, transportation, music recognition, classification, and so on.

Something worth mentioning is a concept known as the AI effect. This describes the case where, once an AI application has become somewhat mainstream, many people no longer consider it AI. It happens because people tend to stop thinking of the solution as involving real intelligence, and to see it as just an application of normal computing.

This is despite the fact that these applications still fit the definition of AI, regardless of how widespread their usage becomes. The key takeaway is that today's AI is not necessarily tomorrow's AI, at least not in some people's minds.

As mentioned, AI has many different goals, and different techniques are used for each.

 
