Neural Networks (Part I) – Understanding the Mathematics behind backpropagation

This is an excellent introduction to the mathematics of Artificial Neural Networks.

It’s worth understanding ANNs, as they are at the forefront of many recent advances in Machine Learning and AI.

It’s also important to understand the mathematics to know where problems can arise and what the limitations of this technique are.


Overview

Artificial Neural Networks (ANNs) are inspired by the biological nervous system and model the learning behavior of the human brain. One of the most intriguing challenges for computer scientists is to model the human brain and effectively create a super-human intelligence that aids humanity in its course to achieve the next stage in evolution. Recent advances have shown a compelling shift towards neural networks owing to their increased accuracy. Neural networks have proved useful for modelling many problems, ranging from narrow, domain-specific tasks to generic learning systems.

Surprisingly, most developers who use NNs to solve their daily problems do not go beyond calling a NN library in the language of their choice; understanding the basic mathematics that governs this beautiful model remains out of scope for them. This post is a start, helping you open the NN black box and gain a better…
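As a small taste of the mathematics the original post goes on to develop, here is a minimal sketch (my own illustration, not taken from the post): a single sigmoid neuron trained by gradient descent. Backpropagation is exactly this chain-rule calculation applied layer by layer across a full network.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_step(w, b, x, target, lr=0.5):
    # Forward pass: z = w*x + b, prediction a = sigmoid(z)
    z = w * x + b
    a = sigmoid(z)
    # Squared-error loss: L = (a - target)^2 / 2
    loss = 0.5 * (a - target) ** 2
    # Backward pass via the chain rule:
    #   dL/da = (a - target), da/dz = a*(1 - a),
    #   dz/dw = x, dz/db = 1
    delta = (a - target) * a * (1.0 - a)
    # Gradient-descent update
    w -= lr * delta * x
    b -= lr * delta
    return w, b, loss

w, b = 0.1, 0.0
losses = []
for _ in range(100):
    w, b, loss = train_step(w, b, x=1.0, target=1.0)
    losses.append(loss)
# The loss shrinks as the neuron fits the target.
```

The same derivatives, organized so that each layer reuses the gradients computed by the layer after it, are what make backpropagation efficient in deep networks.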

