Defining “Cognitive Analytics”

I wrote a couple of posts on this subject recently (link, link) and wanted to elaborate a bit more – this is an exciting subject to think about and work on.

Let me start with my own definition of Cognitive Analytics (which is quite different from that used by IBM):

Cognitive Analytics refers to a class of automated, autonomous, self-learning algorithms capable of data collection, analysis, interpretation, pattern discovery, and event forecasting that evolve over time and mimic the way a human would collect, analyze, and interpret data, discover patterns, and forecast future events of interest.

Natural language processing, special infrastructure, or anything else of that sort is a bonus, but it isn’t essential to this subject, in my opinion.

The above definition would work for a child left alone in a large house and allowed to move around, try different things, open doors to many rooms, go into the attic, and learn from his own experience without any parental supervision.  The longer the child does this, the better he understands (and predicts) his environment, figures out what is normal and what is not, what is dangerous and what is not, etc.

To illustrate the above, I will use the example of a “cognitive” learning algorithm developed at Alchemy IoT to detect anomalies in IoT devices and thus efficiently describe and predict their health issues and maintenance needs.  In this particular case, however, the algorithm was “asked” to learn about the anomalies generated by just a couple of sensors (temperature and light) placed in the middle of an office room seating five software-development and data-analytics folks.

The Arduino device collecting this data is shown in the picture below.  The third sensor is a button one can push to transmit “error conditions”; it can be ignored here as irrelevant.

So, data is collected from these two sensors every 5 seconds, 24/7, and sent to our Alchemy IoT cloud for real-time processing, storage, and learning (a minimal sketch of such a collection loop appears after the list below).  Our proprietary learning anomaly detector analyzes the incoming data to detect truly anomalous events in the noisy environment of a room that has things like:

  • natural light from the windows (from dawn till dusk) plus overhead light (from morning till evening)
  • a temperature-control system that kicks in every once in a while
  • random heat and shadows from the people walking and standing around
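For illustration only, here is a minimal sketch in Python of what a 5-second, two-sensor collection loop might look like.  This is not our actual firmware or cloud API; the endpoint URL, field names, and read_sensors() helper are all made up for the example.

import json
import time
import urllib.request

# Hypothetical ingestion endpoint -- not the real Alchemy IoT API.
INGEST_URL = "https://example-iot-cloud.invalid/ingest"

def read_sensors():
    """Placeholder for reading the temperature and light sensors
    (a real gateway would talk to the Arduino over serial/USB)."""
    return {"temperature_c": 22.5, "light_lux": 310.0}

def send_reading(reading):
    """POST one JSON reading to the (hypothetical) cloud endpoint."""
    payload = json.dumps({"ts": time.time(), **reading}).encode("utf-8")
    req = urllib.request.Request(
        INGEST_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=5)

if __name__ == "__main__":
    while True:
        send_reading(read_sensors())
        time.sleep(5)  # one reading every 5 seconds, 24/7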

Below is a trace that compares our “instantaneous health index” (the lower portion of the chart, called “Unschooled”) with the learning health index (the upper portion, called “Learning”).  Two reported states are possible: “yellow” for “anomaly” and “green” for “everything is normal”.  The two red health reports correspond to two clicks of the “error” button attached to the Arduino device (“red” means “errors”).  The horizontal axis is time, and the time window always corresponds to the last 24 hours.

Now, notice that the dominating state is “green”, because most of what happens during the 24 hours is considered “normal” by our anomaly detection engine.  Some “yellow” events shown correspond to “anomalies” (we use complex proprietary ML-based anomaly-detection techniques to identify them).  Notice also that both charts look identical.  This is because the learning time was too short and nothing had been learned yet.

The next figure shows the same plot 4 days later.  Notice the difference between the “unschooled” and “learning” health indexes:
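Our actual detector is proprietary, so the following is only a minimal illustrative sketch of the general idea, assuming a simple statistics-based approach: an “unschooled” detector that compares every reading against one fixed baseline, next to a “learning” detector that accumulates separate statistics for each hour of the day, so daily cycles such as sunlight and the HVAC system gradually stop looking anomalous.

import math
from collections import defaultdict

class UnschooledDetector:
    """Flags any reading far from a fixed, never-updated baseline."""
    def __init__(self, mean, std, threshold=3.0):
        self.mean, self.std, self.threshold = mean, std, threshold

    def is_anomaly(self, value):
        return abs(value - self.mean) > self.threshold * self.std

class LearningDetector:
    """Learns a separate baseline per hour of day (Welford's online
    mean/variance), so recurring daily patterns stop triggering alerts."""
    def __init__(self, threshold=3.0):
        self.stats = defaultdict(lambda: {"n": 0, "mean": 0.0, "m2": 0.0})
        self.threshold = threshold

    def update(self, hour_of_day, value):
        s = self.stats[hour_of_day]
        s["n"] += 1
        delta = value - s["mean"]
        s["mean"] += delta / s["n"]
        s["m2"] += delta * (value - s["mean"])

    def is_anomaly(self, hour_of_day, value):
        s = self.stats[hour_of_day]
        if s["n"] < 10:          # too little history yet: stay quiet
            return False
        std = math.sqrt(s["m2"] / (s["n"] - 1)) or 1e-9
        return abs(value - s["mean"]) > self.threshold * std

With only a few hours of data the two behave the same; after a few days the learning detector has a per-hour baseline and stays “green” through the routine dawn/dusk and HVAC transitions that the unschooled one keeps flagging.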



K-Means Clustering in Action

Clustering is one of the most popular unsupervised advanced-analytics techniques.

Watch this example of a real-time simulation of the K-Means clustering algorithm, run with different values of n (the number of points) and k (the number of clusters).
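For readers who prefer code to video, here is a minimal sketch of Lloyd’s algorithm (the standard k-means iteration) in Python/NumPy.  It is not the code behind the simulation in the video, just an illustration of the technique.

import numpy as np

def kmeans(points, k, iters=100, seed=0):
    """Lloyd's algorithm: assign each point to its nearest centroid,
    then move each centroid to the mean of the points assigned to it."""
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iters):
        # (n, k) matrix of point-to-centroid distances
        dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        new_centroids = np.array([
            points[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
        if np.allclose(new_centroids, centroids):
            break   # converged
        centroids = new_centroids
    return labels, centroids

# Example: n = 500 random 2-D points, k = 4 clusters
pts = np.random.default_rng(1).random((500, 2))
labels, centers = kmeans(pts, k=4)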

Read more on this subject here:  link


What is Edge IoT and Edge Computing

The Cloud is the main focus area for business and technology activities today, and Cloud-centric companies such as Amazon, Google, and Microsoft are doing very well.  Most of us think it is safe to assume that the future of IoT (the Internet of Things), and even of Big Data Analytics, is in the Cloud.

However, some people and companies think that a lot of IoT Analytics will be done locally, closer to where the action is, at the very edge of corporate and public networks, with data sent to the Cloud for processing far less frequently.  Supporting this Edge IoT is exactly what Edge Analytics/Edge Computing/Fog Computing is supposed to do.

HPE, for example, is one of the companies that is active in this area. The picture below (from HPE) shows their thoughts on where the Edge starts and the Cloud ends:

The main focus of Edge Computing is the following:

While the data is moving from the device to the Cloud, Edge Computing moves analytics/computing/decision-making from the Cloud towards the device, closer to the sensor data.

This shift happens because, in many cases, it

  • reduces the cost of transferring lots of data back and forth
  • decreases the reaction time, making it truly “real-time” (imagine streaming real-time data to the Cloud for an FFT vs. doing it on the spot; see the sketch after this list)
  • makes all the local decision-making less dependent on the availability of Wi-Fi and the Cloud, on IT security restrictions, etc. (while Internet coverage of the world is increasing every year, it will not reach 100% for some time)
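To make the FFT example above concrete, here is a minimal sketch (my own illustration, not any vendor’s product code) of the edge-side idea: compute the spectrum of a window of sensor samples locally with NumPy and forward only a tiny summary, the dominant frequency and its amplitude, instead of streaming every raw sample to the Cloud.  The sampling rate and window size are made-up values.

import numpy as np

SAMPLE_RATE_HZ = 1000      # assumed sensor sampling rate
WINDOW = 1024              # samples per analysis window

def summarize_window(samples):
    """Run the FFT locally and keep only the dominant frequency and its
    amplitude -- a few bytes instead of 1024 raw samples."""
    spectrum = np.abs(np.fft.rfft(samples - samples.mean()))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / SAMPLE_RATE_HZ)
    peak = spectrum[1:].argmax() + 1   # skip the DC bin
    return {"peak_hz": float(freqs[peak]), "amplitude": float(spectrum[peak])}

# Example: a noisy 60 Hz vibration signal, summarized at the edge
t = np.arange(WINDOW) / SAMPLE_RATE_HZ
signal = np.sin(2 * np.pi * 60 * t) + 0.1 * np.random.default_rng(0).standard_normal(WINDOW)
print(summarize_window(signal))    # only this summary would go to the Cloud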

Here is another view of the same concept.  Notice that, while the data is still moving to the Cloud, where some “deep analysis and compute” takes place, the decision-making is being moved to the left, closer to the device and its sensor data:



On Cognitive Analytics

Check this Cognitive Analytics timeline from Deloitte.


Deloitte published an article on this subject in 2014, titled Cognitive Analytics. The article lists three key elements of Cognitive Analytics:

In practical terms, cognitive analytics is an extension of cognitive computing, which is made up of three main components: machine learning, natural language processing, and advancements in the enabling infrastructure.

Obviously, my opinion hasn’t changed since my last post on this subject: the definition of Cognitive Analytics used today is not sufficient to define a new, distinct field and is more of a marketing term.

Rather, Cognitive Analytics still looks like a field of research at the intersection of several other well-established fields, one that is still looking for its identity and a new, distinctive name.


New Champion Teaser – League of Legends CGI

This new CGI looks good…  Xayah and Rakan will be joining the game’s lineup this year.

But this teaser is not as great, IMHO, as the older one, which was simply fantastic:


Programmable Self-Assembly in a Thousand-Robot Swarm

This video covers an interesting project done recently at Harvard University:

“Form a sea star shape,” directs a computer scientist, sending the command to 1,024 little bots simultaneously via an infrared light. The robots begin to blink at one another and then gradually arrange themselves into a five-pointed star. “Now form the letter K.”

And now, see how it actually happens:

Read more here: 2014-08-autonomous-robots-self-organizing-thousand-robot-swarm


The Hierarchy of IoT “Thing” Needs

Below is the Hierarchy of IoT “Thing” Needs.  It’s sort of an outlook on IoT from IoT’s point of view.  The absolute necessities (and things that need to be taken care of first) are at the bottom.  The ultimate goals, future directions, and what will “make IoT happy” are at the top:

Obviously, this follows Maslow’s hierarchy of needs for human beings, which looks like this:

