Optimization techniques are used to solve many engineering problems today. Especially when a problem cannot be modeled analytically, solutions are obtained by applying iterative methods to the governing mathematical equations. These techniques are now applied to both linear and non-linear equation sets, in parallel with the development of computer architectures.
Artificial neural networks, one of the mathematical models that can be trained using optimization techniques, are frequently applied to data/equation sets that cannot be modeled analytically. These structures, which establish a mathematical relationship between given input and output signals, are actively used to imitate or classify signals. However, a network that fits given input/output data cannot, in general, be found analytically. Finding one therefore requires iterative optimization techniques.
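The iterative schemes referred to here can be as simple as gradient descent on a one-dimensional cost function. The following Python sketch is purely illustrative (the function f(x) = (x - 3)^2, the step size, and the iteration count are hypothetical choices, not values from the text):

```python
# Gradient descent on f(x) = (x - 3)^2, whose minimum is at x = 3.
# Each iteration steps against the first derivative f'(x) = 2 * (x - 3):
#   x_{k+1} = x_k - lr * f'(x_k)
def grad_descent(x0, lr=0.1, steps=100):
    x = x0
    for _ in range(steps):
        x -= lr * 2.0 * (x - 3.0)
    return x

print(grad_descent(0.0))  # converges toward 3.0
```

No closed-form solution is consulted at any point; the minimizer emerges purely from repeated first-order updates, which is the same principle that neural-network training relies on at a much larger scale.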
For an artificial neural network to reach a solution, a rule must be derived that minimizes a defined cost function. Such rules are called learning methods. The backpropagation learning method is obtained by applying the first-order derivative of the error with respect to the network weights in order to minimize that error. Deriving and applying this rule is therefore essential for training an artificial neural network model.
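To make the first-order derivative rule concrete, the core of backpropagation can be sketched as follows: a minimal single-hidden-layer network trained on the XOR problem, written here in Python/NumPy for brevity (the network size, learning rate, iteration count, and the choice of XOR as training data are all illustrative assumptions, not part of the original text):

```python
import numpy as np

# XOR training data: 4 samples, 2 inputs, 1 target output each
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(0.0, 1.0, (2, 4))   # input -> hidden weights
b1 = np.zeros((1, 4))
W2 = rng.normal(0.0, 1.0, (4, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
errs = []
for _ in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Cost function: mean squared error E = 0.5 * mean((out - y)^2)
    errs.append(0.5 * np.mean((out - y) ** 2))

    # Backward pass: first-order derivatives of E, propagated
    # from the output layer back toward the input layer
    d_out = (out - y) * out * (1 - out)      # error term at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)       # error term at the hidden layer

    # Gradient-descent weight updates: w <- w - lr * dE/dw
    W2 -= lr * (h.T @ d_out) / len(X)
    b2 -= lr * d_out.sum(axis=0, keepdims=True) / len(X)
    W1 -= lr * (X.T @ d_h) / len(X)
    b1 -= lr * d_h.sum(axis=0, keepdims=True) / len(X)

# When training succeeds, the outputs approach the XOR targets [0, 1, 1, 0]
print(np.round(out.ravel(), 2))
```

The backward pass computes nothing beyond first-order derivatives of the cost with respect to each weight, applied layer by layer via the chain rule, which is exactly the rule inference described above.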
Specification: Backpropagation Learning Method in Matlab