Term Paper: Neural Networks
Such ideas were appealing but very difficult to implement. In addition, the von Neumann architecture was gaining in popularity.

Each weight is adjusted based on the value of the line before it (0 or 1), according to the rule: Weight Change = (Pre-Weight Line Value) * (Error / (Number of Inputs)). This rule is based on the idea that while one active perceptron may have a large error, the weight values can be adjusted to distribute that error across the network, or at least to adjacent perceptrons. Applying the rule still leaves an error whenever the line before the weight is 0, although this will eventually correct itself over further updates.
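The weight-change rule above can be sketched in a few lines of code. This is a minimal illustration of the rule exactly as stated in the text; the function names are illustrative, not from the original:

```python
def weight_change(pre_weight_value, error, num_inputs):
    """Weight Change = (Pre-Weight Line Value) * (Error / (Number of Inputs)).

    pre_weight_value is the value on the line before the weight (0 or 1).
    When it is 0, the weight is left unchanged, so some error remains
    until later updates correct it, as the text notes.
    """
    return pre_weight_value * (error / num_inputs)

def update_weights(weights, inputs, error):
    """Apply the rule to every weight of a single unit,
    distributing the error across all active input lines."""
    n = len(inputs)
    return [w + weight_change(x, error, n) for w, x in zip(weights, inputs)]
```

For example, with two inputs of which only the first is active, only the first weight moves: `update_weights([0.5, 0.5], [1, 0], 0.4)` shifts the first weight by 0.4 / 2 and leaves the second untouched.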
This was compounded by the fact that the early successes of some neural networks led people to exaggerate their potential, especially given the practical limits of the electronics available at the time.
In 1943, neurophysiologist Warren McCulloch and mathematician Walter Pitts wrote a paper on how neurons might work.
In order to describe how neurons in the brain might work, they modeled a simple neural network using electrical circuits.
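The units in the McCulloch-Pitts model are simple threshold devices: a neuron fires (outputs 1) when the number of active inputs reaches its threshold, and stays silent (outputs 0) otherwise. A minimal sketch, assuming binary inputs and ignoring the model's inhibitory lines for brevity:

```python
def mcp_neuron(inputs, threshold):
    """McCulloch-Pitts threshold unit: fires (1) if and only if
    the number of active binary inputs reaches the threshold."""
    return 1 if sum(inputs) >= threshold else 0

# Basic logic gates fall out of the threshold choice alone:
def and_gate(a, b):
    return mcp_neuron([a, b], threshold=2)  # both inputs must be active

def or_gate(a, b):
    return mcp_neuron([a, b], threshold=1)  # either input suffices
```

This is why the model mattered: networks of such units can compute logical functions, which is what made the electrical-circuit analogy concrete.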
In 1949, Donald Hebb wrote The Organization of Behavior, which pointed out that neural pathways are strengthened each time they are used, a concept fundamental to the way humans learn.
If two nerves fire at the same time, he argued, the connection between them is enhanced.