What is Machine Learning?

The term “Machine Learning” was coined by Arthur Samuel (1959), a pioneer of artificial intelligence, who defined it as the “field of study that gives computers the ability to learn without being explicitly programmed.” Machine Learning is the study of computer algorithms that improve automatically through the use of data. Machine Learning and Data Science are sometimes, inaccurately, used interchangeably, but they are distinct fields of study despite overlapping in some areas. Data Science conducts operations over various data sources to prove or disprove a hypothesis, while Machine Learning uses various data sources to develop software that learns as it extracts meaning from the data. Machine Learning is a branch of the field of Artificial Intelligence: it uses statistical methods to train algorithms to make predictions or classifications, often as part of data mining projects. These insights drive business decisions, and the field continues to grow. Machine Learning has many real-world applications, such as medical diagnosis, traffic prediction, and product recommendations.

Machine Learning vs Deep Learning
Machine Learning and Deep Learning are both subfields of Artificial Intelligence, but the terms are often used interchangeably, so let us briefly point out some of the differences between them. They differ in how the algorithm learns. Deep Learning automates much of the extraction of features (the individual measurable properties of the data), eliminating some of the human intervention from the process and therefore allowing the use of larger data sets. Classical Machine Learning is more dependent on human intervention to determine the set of features needed to understand differences in data inputs. Because Deep Learning needs far less human interaction to process the information, it allows learning to scale in interesting ways.
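To make the contrast concrete, here is a minimal sketch in Python, assuming scikit-learn is installed. The two hand-crafted features (average brightness and pixel variance) are my own illustrative choices, not a standard feature set; the point is only that the classical model sees what a human chose to measure, while the neural network works from the raw pixels.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Classical ML: a human decides which features matter. Here we hand-craft
# two crude features per image: average brightness and pixel variance.
def hand_features(images):
    return np.column_stack([images.mean(axis=1), images.var(axis=1)])

clf = LogisticRegression(max_iter=1000).fit(hand_features(X_train), y_train)
print("hand-crafted features:", clf.score(hand_features(X_test), y_test))

# Deep-learning style: feed the raw pixels and let the network extract
# its own features in its hidden layer.
net = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
net.fit(X_train, y_train)
print("raw pixels + neural net:", net.score(X_test, y_test))
```

On a toy task like this you would typically see the network's learned features outperform the two crude hand-picked ones, which hints at why automating feature extraction lets these models benefit from larger data sets.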

How Machines Learn
UC Berkeley divides the Machine Learning process into three main parts:

  1. A Decision Process: Machine Learning algorithms are used to make a prediction or classification based on input data, identifying patterns in that data.
  2. An Error Function: An error function evaluates the model's predictions to assess the accuracy of the model.
  3. A Model Optimization Process: If the model can fit the data points in the training set better, the weights are adjusted to reduce the discrepancy between the known examples and the model's estimates. This evaluate-and-optimize loop is repeated until a certain level of accuracy is met, as shown in the sketch below.
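As a minimal sketch of those three parts (my own illustration, using NumPy), here is a tiny gradient-descent loop that fits a straight line to noisy data; the decision process, the error function, and the weight adjustment are marked in the comments.

```python
import numpy as np

# Toy training data: y = 2x + 1 plus a little noise.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 2 * x + 1 + rng.normal(0, 0.5, size=50)

w, b = 0.0, 0.0      # model parameters (the "weights")
learning_rate = 0.01

for step in range(1000):
    y_pred = w * x + b                    # 1. decision process: make predictions
    error = np.mean((y_pred - y) ** 2)    # 2. error function: mean squared error
    # 3. model optimization: adjust the weights along the gradient of the error
    grad_w = 2 * np.mean((y_pred - y) * x)
    grad_b = 2 * np.mean(y_pred - y)
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print(f"learned w={w:.2f}, b={b:.2f}, final MSE={error:.4f}")
```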

Machine Learning Methods

Supervised Learning
In supervised learning, you feed in input-output pairs and the algorithm determines a pattern between the two. As you feed in new inputs, the algorithm predicts which output each input belongs to. The key to accuracy, as is the case with all machine learning, is that the larger the training data set, the more accurate the model will generally be. Real-world examples of supervised learning are product recommendation (when a streaming service suggests a new movie or song you might like based on previous choices) and image classification (when someone is tagged in photos on social media and the app suggests other photos they might be in).
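Here is a minimal supervised-learning sketch, assuming scikit-learn is installed, that uses the classic labeled iris data set in place of a real recommendation or photo-tagging pipeline: the model is trained on input-output pairs and then predicts outputs for inputs it has never seen.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Labeled input-output pairs: flower measurements (inputs), species (outputs).
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# The model learns the pattern between inputs and their known outputs...
model = KNeighborsClassifier(n_neighbors=5)
model.fit(X_train, y_train)

# ...then predicts outputs for inputs it was never shown during training.
print("accuracy on unseen data:", model.score(X_test, y_test))
```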

Unsupervised Learning
In unsupervised learning, you feed in inputs without the associated outputs, and as you repeat the process the algorithm groups, or clusters, like values together to determine a pattern. As noted previously, the key to accurate modeling is the size of the initial data set. A real-world example would be bank fraud detection (since credit-card charges do not come with a “fraud” or “not fraud” tag fixed to them).
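A minimal unsupervised sketch, again assuming scikit-learn: k-means receives points with no labels at all (the two blobs loosely stand in for "normal" versus unusual transactions, my own illustrative framing) and clusters like values together on its own.

```python
import numpy as np
from sklearn.cluster import KMeans

# Inputs only -- no output labels anywhere.
rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal(loc=(0, 0), scale=0.5, size=(100, 2)),  # e.g. typical activity
    rng.normal(loc=(5, 5), scale=0.5, size=(100, 2)),  # e.g. unusual activity
])

# k-means groups similar inputs together without ever seeing a label.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print("cluster assignments:", kmeans.labels_[:5], "...", kmeans.labels_[-5:])
print("cluster centers:\n", kmeans.cluster_centers_)
```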

Reinforcement Learning
In reinforcement learning, an input is fed to the algorithm and, based on previous experience, the algorithm determines an output. The process is refined using a rewards/punishment system, and the program learns from trial and error.
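As a minimal sketch of the trial-and-error idea, here is an epsilon-greedy multi-armed bandit in plain Python (one simple reinforcement-learning setting, chosen here for illustration): the agent tries actions, receives a reward or nothing, and refines its value estimates until it settles on the best action.

```python
import random

# Three slot machines ("arms") with hidden payout probabilities.
# The agent must discover the best one by trial and error.
true_payout = [0.2, 0.5, 0.8]   # hidden from the agent
estimates = [0.0, 0.0, 0.0]     # the agent's learned value of each arm
counts = [0, 0, 0]
epsilon = 0.1                   # how often to explore a random arm

for step in range(5000):
    if random.random() < epsilon:
        arm = random.randrange(3)                        # explore
    else:
        arm = max(range(3), key=lambda a: estimates[a])  # exploit best-known arm

    # Reward/punishment signal: 1 on a payout, 0 otherwise.
    reward = 1 if random.random() < true_payout[arm] else 0
    counts[arm] += 1
    # Update the running-average value estimate for the chosen arm.
    estimates[arm] += (reward - estimates[arm]) / counts[arm]

print("learned value estimates:", [round(e, 2) for e in estimates])
print("best arm found:", estimates.index(max(estimates)))
```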

Machine Learning is an ever-growing field of research and continues to influence our lives as we push the boundaries of artificial intelligence. I, for one, am excited to keep learning and gaining more insight into it.
