Artificial Intelligence, Cognitive Computing and Machine Learning – a primer

This article was first published on our sister site, The Internet Of All Things.

It is confusing when Artificial Intelligence (AI) and Cognitive Computing are used interchangeably, and that happens all too often these days. Add to that the argument that Machine Learning (ML), robotics and more are all, in broader terms, part of AI. So, here's a quick primer.

Artificial Intelligence

AI is usually defined as the science of making computers do things that require intelligence when done by humans, as Jack Copeland puts it on AlanTuring.net. Here, computer systems or machines simulate human intelligence processes: learning, reasoning and self-correction. Particular applications of AI include expert systems, speech recognition and machine vision.

John McCarthy, the American computer scientist and cognitive scientist, coined the term "artificial intelligence" in 1955.

AI can be categorised in various ways. As explained by Margaret Rouse, writer at WhatIs.com (TechTarget), one of the simplest classifications is to categorise AI systems as either "weak AI" or "strong AI".

* Weak AI, also known as narrow AI, is an AI system that is designed and trained for a specific task. Virtual personal assistants, such as Apple's Siri, are a form of weak AI.
* Strong AI, also known as artificial general intelligence, is an AI system with generalised human cognitive abilities, so that when presented with an unfamiliar task it has enough intelligence to find a solution.

Moreover, the dramatic rise of AI has already started to affect how people behave as consumers, given how widely usable it has become.

Uses
Applications based on AI range from the Netflix recommendation engine and Natural Language Processing (NLP) to the AI-powered Alexa on Amazon's Echo devices, face recognition technology, the computer vision systems in driverless cars and even the spam filters in email.

Machine Learning

Machine Learning (ML) is a subset of AI. It enables software applications to automatically learn and improve from experience without being explicitly programmed, explains Rouse. ML aims at building computer programs (algorithms) that can access data and use it to learn for themselves. These algorithms receive input data and use statistical analysis to predict an output, updating their outputs as new data becomes available.

The process:
* Identifying data
* Choosing the right algorithm
* Building an analytical model
* Training the model, then reviewing it on test data sets
* Running the model to generate scores
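The steps above can be sketched in a few lines of Python. This is a toy illustration, not taken from the article: it builds a one-variable linear model by least squares, and the data and function names are invented for the example.

```python
# A minimal sketch of the ML workflow above, using simple linear
# regression fitted by least squares (pure Python, no ML library).

def fit_linear(xs, ys):
    """Build the analytical model y = a*x + b via least squares."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return a, b

# 1. Identify data: e.g. hours studied vs. exam score (toy numbers).
train_x = [1, 2, 3, 4]
train_y = [2, 4, 6, 8]

# 2-3. Choose the algorithm and build the analytical model.
a, b = fit_linear(train_x, train_y)

# 4. Review the model on a held-out test set.
test_x, test_y = [5, 6], [10, 12]
errors = [abs((a * x + b) - y) for x, y in zip(test_x, test_y)]

# 5. Run the model to generate a score for new input.
prediction = a * 7 + b
print(round(prediction, 2))  # 14.0 for this toy data
```

A real system would swap the hand-rolled least squares for a library algorithm, but the five stages stay the same.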

Uses
Remember shopping or searching for a product on the Internet and then being bombarded by related adverts? That is recommendation engines using ML to personalise online advertisements in near real time. Fraud detection, spam filtering, network security threat detection, predictive maintenance and news-feed curation also use ML.
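To make the spam-filtering use case concrete, here is a toy word-frequency scorer. It is an invented illustration of the general idea (learning word statistics from labelled examples), not how any production filter actually works.

```python
# Toy spam scorer: learn word counts from labelled examples,
# then score new messages by spam-vs-ham word frequency.
from collections import Counter

spam_messages = ["win money now", "free money offer"]
ham_messages = ["meeting at noon", "project update attached"]

# "Training": count how often each word appears in each class.
spam_counts = Counter(w for msg in spam_messages for w in msg.split())
ham_counts = Counter(w for msg in ham_messages for w in msg.split())

def spam_score(message):
    """Positive score: more spam-associated words than ham-associated ones."""
    return sum(spam_counts[w] - ham_counts[w] for w in message.split())

print(spam_score("free money"))       # 3: looks spammy
print(spam_score("project meeting"))  # -2: looks legitimate
```

A real filter would use probabilities (e.g. naive Bayes) and far more data, but the principle of learning from labelled examples is the same.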

Cognitive Computing

Cognitive Computing simulates human thought processes in a computerised model. The idea is driven home by the 'Turing Test', created by the English computer scientist and mathematician Alan Mathison Turing in 1950, which asks whether a machine can 'mimic' the way the human brain works. It does so with the help of data mining, pattern recognition and natural language processing. However, despite their speed and processing power, computers are still not able to imitate human behaviour to a T.

As Bernard Marr states in ‘What Everyone Should Know About Cognitive Computing’, Cognitive Computing represents the third era of computing: we went from computers that could tabulate sums (1900s) to programmable systems (1950s), and now to cognitive systems.

Cognitive Computing vs AI
It is often argued that Cognitive Computing and AI overlap: Machine Learning, neural networks, NLP and Deep Learning all come within the AI purview. Analysts opine that, in broader terms, cognitive computing assists humans in their decision-making process.

Uses
Cognitive computing applications may assist doctors in the diagnosis and treatment of disease. The most cited example is IBM Watson for Oncology, states Rouse. It has been used at Memorial Sloan Kettering Cancer Center to provide oncologists with evidence-based treatment options for cancer patients. When medical staff input questions, Watson generates a list of hypotheses and offers treatment options for doctors to consider, she explains.
Hence, AI uses algorithms to solve a problem or identify patterns hidden in data, whereas the objective of Cognitive Computing is to create algorithms that 'mimic' the human reasoning process and thereby solve a whole gamut of problems.


Source:

https://iq.intel.co.uk/cognitive-computing-vs-artificial-intelligence/

https://searchenterpriseai.techtarget.com/definition/AI-Artificial-Intelligence

https://searchenterpriseai.techtarget.com/definition/cognitive-computing
https://searchenterpriseai.techtarget.com/definition/machine-learning-ML
https://www.forbes.com/sites/bernardmarr/2016/03/23/what-everyone-should-know-about-cognitive-computing/#577a6a605088
