Basics of Artificial Neural Network

Deep Learning is a subset of Machine Learning that is built around the concept of neural networks.

Nishant Kumar
3 min readJun 12, 2020

ANN :- Artificial Neural Networks are computing systems inspired by the biological neurons of the human brain.

Credit :- mc.ai

Let’s understand Biological Neurons first :-

Neurons (also known as nerve cells) are electrically excitable cells in the nervous system that process and transmit information. In vertebrate animals, neurons are the core components of the brain, spinal cord and peripheral nerves.

credit:- simple wikipedia

Neurons communicate via chemical and electrical synapses, in a process known as synaptic transmission. The fundamental process underlying synaptic transmission is the action potential, a propagating electrical signal generated by exploiting the electrically excitable membrane of the neuron. Neurons are highly specialised for the fast processing and transmission of cellular signals.

One of the differences between machine learning and deep learning models lies in feature extraction. In classical machine learning, features are engineered by humans, whereas a deep learning model figures them out by itself.

Neural networks are inspired by the mammalian brain. A neural network is a web of artificial neurons. A neuron takes as input a weighted sum of the outputs of the previous layer plus a bias term, and passes it through some activation function. Similar to the way a biological neuron's synapses decide how strongly a signal is passed on, an artificial neuron uses an activation function to shape what it propagates. If there were no activation function f, the output of the entire neural network would be just a linear function of the inputs.

output = activation(weighted sum of inputs)

z = w1x1 + w2x2 + … + wnxn + b
y = f(z) = f(w1x1 + w2x2 + … + wnxn + b)

Here w refers to the weights, b to the bias, xi to the input features, and f to the activation function. Owing to the biological parallel, f is called the neuron activation function (and sometimes the transfer function).
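The formula above can be sketched directly in code. This is a minimal illustration (the weight and input values are made up for the example), using the sigmoid as the activation function f:

```python
import numpy as np

def sigmoid(z):
    # Squashes any real number into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def neuron_output(x, w, b, activation=sigmoid):
    # z = w1*x1 + w2*x2 + ... + wn*xn + b, then y = f(z).
    z = np.dot(w, x) + b
    return activation(z)

x = np.array([1.0, 2.0])    # input features (illustrative values)
w = np.array([0.5, -0.25])  # weights
b = 0.1                     # bias
y = neuron_output(x, w, b)  # z = 0.5 - 0.5 + 0.1 = 0.1, y = sigmoid(0.1)
```

In a real network the weights and bias are not hand-picked like this; they are learned during training.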


Types of Activation Functions :-


ReLU (Rectified Linear Unit): the most popular and widely used activation function for hidden layers, currently preferred for its fast convergence.
Softmax: currently preferred for the output of a classification network; a generalisation of the sigmoid to multiple classes.
Linear: good for modelling an unbounded range in the output of a regression network.
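The three activation functions above are only a few lines each. A sketch in NumPy:

```python
import numpy as np

def relu(z):
    # max(0, z) elementwise: negative inputs become 0, positives pass through.
    return np.maximum(0.0, z)

def softmax(z):
    # Exponentiate and normalise so the outputs sum to 1 (class probabilities).
    e = np.exp(z - np.max(z))  # subtracting the max improves numerical stability
    return e / e.sum()

def linear(z):
    # Identity: the weighted sum passes through unchanged.
    return z

z = np.array([-1.0, 0.0, 2.0])
print(relu(z))     # [0. 0. 2.]
print(softmax(z))  # probabilities summing to 1, largest where z is largest
print(linear(z))   # identity
```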

Layers of Neural Network:-

Layers of NN

Input Layer
This is literally the layer that feeds information into the neural network for processing. Each circle represents one feature (a piece of information).

Hidden Layer
The hidden layers perform nonlinear transformations of the inputs entered into the network. The number of hidden layers varies with the complexity of the problem.

Output Layer
Produces the final calculated result after the activation function is applied.

Flow of signals/data
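Putting the three layers together, data flows forward from inputs through the hidden layer to the outputs. A minimal sketch in NumPy, with illustrative sizes (3 input features, 4 hidden units, 2 output classes) and randomly initialised, untrained weights:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(0.0, z)

def softmax(z):
    e = np.exp(z - np.max(z))
    return e / e.sum()

# Illustrative layer sizes: 3 input features -> 4 hidden units -> 2 classes.
W1 = rng.normal(size=(4, 3)); b1 = np.zeros(4)  # input -> hidden
W2 = rng.normal(size=(2, 4)); b2 = np.zeros(2)  # hidden -> output

def forward(x):
    h = relu(W1 @ x + b1)        # hidden layer: nonlinear transformation
    return softmax(W2 @ h + b2)  # output layer: class probabilities

x = np.array([0.5, -1.2, 3.0])   # one example with 3 features
probs = forward(x)               # two probabilities that sum to 1
```

Training would adjust W1, b1, W2 and b2 via backpropagation; here they are random, so the probabilities are meaningless but the signal flow is the same.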

In DL, you will hear a lot about solving problems with TensorFlow and Keras in Python, and about GPUs.

GPU :- Graphics Processing Unit
It is designed specifically for performing the complex mathematical and geometric calculations that are necessary for graphics/image rendering. It is fast at matrix multiplication, and its many cores make it well suited for parallel computing.
AMD (formerly ATI) and NVIDIA produce the vast majority of GPUs on the market, and both companies have developed their own enhancements for GPU performance. Tensor Cores in NVIDIA GPUs can perform large matrix operations.

Happy Learning!! All the Best….

Facebook | Instagram
