Artificial Neural Networks are a powerful tool for **prediction** and **pattern recognition**. This is just a compact introduction to Neural Networks; a more detailed study is advised after going through the following article.

For starters, you need a good set of programming skills in *C or C++* to design a neural network. If you are designing using **MATLAB**, no such programming skills are required, since a Neural Networks tool is pre-embedded in it.

**When Neural Networks are Used**

The basic purpose of a neural network is to *predict certain output data for a corresponding input data*. For example, consider a case in the manufacturing industry where productivity depends on the efficiency of the workers, the efficiency of the machines, the number of workers and other miscellaneous factors. We have the data for the past 24 months and need to predict the productivity for the next month; in such a situation we take the help of a Neural Network to predict its value. Here productivity is the desired output, while the other factors mentioned above are the inputs. So if we give the values of these inputs to a custom-designed neural network, we obtain an output.

**BASICS**

A neural network has three basic layers: *an input layer*, *a hidden layer* (there can be more than one) and *an output layer*.

Data is processed sequentially through the layers, and the desired output is obtained only after a proper network is designed.

* Each circle in the above picture is called a *node* or *a neuron*, and each layer has many nodes, depending on our choice. All computations are done inside the nodes, and the processed data is carried forward for the next set of computations.

* The computation in each node is done using a function called an *activation function*. There are a number of activation functions, for example the sigmoid function, and the choice depends on the task.

* In the above figure, we can see that each node in a layer is connected to several other nodes in the previous layer. The nodes in the previous layer are connected to the present ones using *weights and biases*. The following picture explains it.

Here f(u) is the activation function.

* Another important parameter is the selection of the *training algorithm*, which is used in the training process of a network and will be discussed later. There are a number of training algorithms, and the most popular one is the *Back Propagation Algorithm*. Each one has its own pros and cons.

* The selection of the *type of network* also has an impact on the efficiency of the network. One example of a type of network is a feed-forward network. The type of network chosen comes into the picture when a network is run after training.

**PROCEDURE FOR DESIGNING A NEURAL NETWORK**

There are three basic steps involved: **Training, Validation and Prediction**. The available data should be divided accordingly. A rule of thumb is to use 80% of the available data for training and the remaining 20% for validation. The steps involved in designing a neural network are:

1. Fixing the structure of the network: choosing the number of hidden layers, number of nodes in each layer and activation function for each layer.

2. Giving initial weights and biases, if designing using a programming language.

3. Giving the required data and training the network.

4. Once training of the network is done, it must be validated with the remaining set of data. This is an important step, since the efficiency of the network is determined here. If desirable results are not obtained, the network must be re-designed, trained again and re-validated. So it is a trial-and-error method.

5. Once the network is validated and the possible range of errors is established, it can be run to predict values. Each prediction should be reported with its possible percentage of error, which in normal cases lies around 10%.

*NOTE*: *Increasing the number of nodes in a layer and the number of layers allows the network to solve more complicated problems, but it also increases the computational resources required, so a balance must be maintained*.