#### Backpropagation in Neural Networks
Before we learn backpropagation, let's understand:

#### What are Artificial Neural Networks?

A neural network is a group of connected I/O units where each connection has a weight associated with it. Neural networks help you build predictive models from large databases. The model is loosely based on the human nervous system and helps you carry out tasks such as image understanding and speech recognition.

#### What is Backpropagation?

Back-propagation is the essence of neural net training. It is the method of fine-tuning the weights of a neural net based on the error rate obtained in the previous epoch (i.e., iteration). Proper tuning of the weights reduces error rates and makes the model more reliable by improving its generalization.

Backpropagation is a short form for "backward propagation of errors." It is a standard method of training artificial neural networks. This method helps calculate the gradient of a loss function with respect to all the weights in the network.

In this tutorial, you will learn:

- What are Artificial Neural Networks?
- What is Backpropagation?
- How Backpropagation Works
- Why We Need Backpropagation
- What is a Feed Forward Network?
- Types of Backpropagation Networks
- History of Backpropagation
- Backpropagation Key Points
- Best practice Backpropagation
- Disadvantages of using Backpropagation

#### How Backpropagation Works

Backpropagation works through the following steps:

- Inputs X arrive through the preconnected path.
- Input is modeled using real weights W. The weights are usually randomly selected.
- Calculate the output for every neuron from the input layer, to the hidden layers, to the output layer.
- Calculate the error in the outputs

Error_B = Actual Output − Desired Output

- Travel back from the output layer to the hidden layer to adjust the weights such that the error is decreased.

Keep repeating the process until the desired output is achieved.
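The steps above can be sketched as a minimal training loop. This is an illustrative example, not the article's own code: the tiny 2-2-1 network, sigmoid activations, squared-error loss, and learning rate are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Inputs X arrive through the preconnected path; weights W are randomly selected.
X = np.array([0.5, 0.9])          # input vector (made-up values)
y = np.array([1.0])               # desired output
W1 = rng.normal(size=(2, 2))      # input -> hidden weights
W2 = rng.normal(size=(1, 2))      # hidden -> output weights
lr = 0.5                          # learning rate (assumed)

for epoch in range(1000):
    # Forward pass: calculate the output of every neuron,
    # from the input layer through the hidden layer to the output layer.
    h = sigmoid(W1 @ X)           # hidden-layer activations
    out = sigmoid(W2 @ h)         # network output

    # Error in the output: actual output minus desired output.
    err = out - y

    # Backward pass: travel from the output layer back to the hidden
    # layer, adjusting the weights so that the error decreases.
    delta_out = err * out * (1 - out)            # gradient at the output
    delta_h = (W2.T @ delta_out) * h * (1 - h)   # gradient at the hidden layer
    W2 -= lr * np.outer(delta_out, h)
    W1 -= lr * np.outer(delta_h, X)

print(float(out[0]))  # after repeating, the output approaches the desired 1.0
```

Each pass through the loop is one epoch; the process is repeated until the output is close enough to the desired value.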

#### Why We Need Backpropagation

The most prominent advantages of backpropagation are:

- Backpropagation is fast, simple and easy to program
- It has no parameters to tune apart from the number of inputs
- It is a flexible method as it does not require prior knowledge about the network
- It is a standard method that generally works well
- It does not need any special mention of the features of the function to be learned.

#### What is a Feed Forward Network?

A feedforward neural network is an artificial neural network in which the nodes never form a cycle. This kind of neural network has an input layer, hidden layers, and an output layer. It is the first and simplest type of artificial neural network.

#### Types of Backpropagation Networks

The two types of backpropagation networks are:

- Static Back-propagation
- Recurrent Backpropagation

Static back-propagation is a kind of backpropagation network that produces a mapping of a static input to a static output. It is useful for solving static classification problems such as optical character recognition.

In recurrent backpropagation, activations are fed forward until a fixed value is reached. After that, the error is computed and propagated backward.

The main difference between the two methods is that the mapping is rapid in static back-propagation, while it is non-static in recurrent backpropagation.

#### History of Backpropagation

- In 1961, the basic concepts of continuous backpropagation were derived in the context of control theory by J. Kelly, Henry Arthur, and E. Bryson.
- In 1969, Bryson and Ho gave a multi-stage dynamic system optimization method.
- In 1974, Werbos stated the possibility of applying this principle in an artificial neural network.
- In 1982, Hopfield brought his idea of a neural network.
- In 1986, by the effort of David E. Rumelhart, Geoffrey E. Hinton, Ronald J. Williams, backpropagation gained recognition.
- In 1993, Wan was the first person to win an international pattern recognition contest with the help of the backpropagation method.

#### Backpropagation Key Points

- Backpropagation simplifies the network structure by removing weighted links that have the least effect on the trained network.
- You need to study a group of input and activation values to develop the relationship between the input and hidden unit layers.
- It helps to assess the impact that a given input variable has on a network output. The knowledge gained from this analysis can be represented in rules.
- Backpropagation is especially useful for deep neural networks working on error-prone projects, such as image or speech recognition.
- Backpropagation takes advantage of the chain and power rules, which allows it to work with any number of outputs.
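The chain-rule claim above can be made concrete with a small worked check. This sketch (all values are arbitrary illustrations) derives the gradient of a squared-error loss through a sigmoid by chaining the power rule and the sigmoid derivative, then confirms it against a finite-difference estimate.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Arbitrary example values: one input x, target y, weight w.
x, y, w = 1.5, 0.2, 0.8

def loss(w):
    # Squared error of a single sigmoid neuron: (sigmoid(w*x) - y)^2
    return (sigmoid(w * x) - y) ** 2

# Chain rule: dL/dw = 2*(s - y) * s*(1 - s) * x, where s = sigmoid(w*x).
# The factor 2*(s - y) comes from the power rule on the squared error,
# s*(1 - s) is the sigmoid derivative, and x is d(w*x)/dw.
s = sigmoid(w * x)
analytic = 2 * (s - y) * s * (1 - s) * x

# Independent numerical check via central finite differences.
eps = 1e-6
numeric = (loss(w + eps) - loss(w - eps)) / (2 * eps)

print(analytic, numeric)  # the two estimates agree closely
```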

#### Best Practice Backpropagation

Backpropagation can be explained with the help of a "shoe lace" analogy:

Too little tension:

- Not enough constraining and very loose

Too much tension:

- Too much constraint (overtraining)
- Taking too much time (relatively slow process)
- Higher likelihood of breaking

Pulling one lace more than the other:

- Discomfort (bias)

#### Disadvantages of using Backpropagation

- The actual performance of backpropagation on a specific problem depends on the input data.
- Backpropagation can be quite sensitive to noisy data.
- You need to use the matrix-based approach for backpropagation instead of mini-batch.
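One reading of the "matrix-based approach" mentioned above is computing the forward and backward pass for the whole training set with a single matrix multiplication, rather than looping over examples. This is a hedged sketch with made-up data and a single-layer sigmoid model, not the article's own code.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy dataset: 8 examples, 2 features each; target is whether x1 + x2 > 0.
X = rng.normal(size=(8, 2))
y = (X[:, :1] + X[:, 1:]) > 0          # shape (8, 1)
W = rng.normal(size=(2, 1))            # single-layer weights (assumed model)
lr = 0.1

for _ in range(2000):
    out = sigmoid(X @ W)               # forward pass for ALL rows at once
    # Backward pass as one matrix product instead of a per-example loop.
    grad = X.T @ ((out - y) * out * (1 - out)) / len(X)
    W -= lr * grad

acc = float(np.mean((out > 0.5) == y))
print(acc)
```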

#### Summary

- A neural network is a group of connected I/O units where each connection has a weight associated with it.
- Backpropagation is a short form for "backward propagation of errors." It is a standard method of training artificial neural networks.
- Backpropagation is fast, simple, and easy to program.
- A feedforward neural network is an artificial neural network whose nodes never form a cycle.
- The two types of backpropagation networks are 1) static back-propagation and 2) recurrent backpropagation.
- In 1961, the basic concepts of continuous backpropagation were derived in the context of control theory by J. Kelly, Henry Arthur, and E. Bryson.
- Backpropagation simplifies the network structure by removing weighted links that have a minimal effect on the trained network.
- It is especially useful for deep neural networks working on error-prone projects, such as image or speech recognition.
- The biggest drawback of backpropagation is that it can be sensitive to noisy data.
