Fast Training Algorithms for Feed Forward Neural Networks

The aim of this paper is to discuss several high-performance training algorithms that fall into two main categories. The first category uses heuristic techniques, which were developed from an analysis of the performance of the standard gradient descent algorithm. The second category of fast algorithms uses standard numerical optimization techniques such as quasi-Newton methods. A further aim is to address the drawbacks associated with these training algorithms and to propose an efficient training algorithm for FFNN.
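
To make the two categories concrete, the sketch below (a hypothetical toy example, not the algorithms benchmarked in the paper) trains the same one-hidden-layer network first with a heuristic variant of gradient descent (a momentum term) and then with a standard numerical optimizer (quasi-Newton L-BFGS via SciPy).

```python
# Hedged sketch: the two categories of fast training on a toy 1-hidden-layer FFNN.
# Everything here (network size, data, hyper-parameters) is illustrative only.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = np.linspace(-1, 1, 64).reshape(-1, 1)           # toy inputs
Y = np.sin(np.pi * X)                                # toy targets

H = 8                                                # hidden units
def unpack(p):
    W1 = p[:H].reshape(1, H); b1 = p[H:2*H]
    W2 = p[2*H:3*H].reshape(H, 1); b2 = p[3*H:]
    return W1, b1, W2, b2

def loss_and_grad(p):
    W1, b1, W2, b2 = unpack(p)
    A = np.tanh(X @ W1 + b1)                         # hidden activations
    E = (A @ W2 + b2) - Y
    loss = 0.5 * np.mean(E**2)
    # back-propagated gradients of the mean-squared error
    dY = E / len(X)
    gW2 = A.T @ dY; gb2 = dY.sum(0)
    dA = (dY @ W2.T) * (1 - A**2)
    gW1 = X.T @ dA; gb1 = dA.sum(0)
    return loss, np.concatenate([gW1.ravel(), gb1, gW2.ravel(), gb2])

p = 0.5 * rng.standard_normal(3*H + 1)

# Category 1: heuristic variant of gradient descent (momentum term).
v, eta, mu = np.zeros_like(p), 0.1, 0.9
for _ in range(2000):
    _, g = loss_and_grad(p)
    v = mu * v - eta * g
    p = p + v

# Category 2: standard numerical optimization (quasi-Newton / L-BFGS).
res = minimize(loss_and_grad, p, jac=True, method="L-BFGS-B")
print("final loss after L-BFGS:", res.fun)
```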

Publication Date
Wed Mar 10 2021
Journal Name
Baghdad Science Journal
On Training Of Feed Forward Neural Networks

In this paper we describe several different training algorithms for feed-forward neural networks (FFNN). All of these algorithms use the gradient of the performance (energy) function to determine how to adjust the weights so that the performance function is minimized, with the back-propagation algorithm used to increase the speed of training. The algorithms involve different computations, and thus differ in the form of the search direction and in their storage requirements; however, none of them has global properties suited to all problems.
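
The point that these algorithms differ mainly in how the search direction is formed and how much must be stored can be illustrated on a simple quadratic objective (an assumed toy problem, not the FFNN experiments of the paper): steepest descent uses only the current gradient, while Fletcher-Reeves conjugate gradient mixes in the previous direction at the cost of one extra stored vector.

```python
# Hedged sketch: two gradient-based search directions on a toy quadratic
# f(w) = 0.5 w^T A w - b^T w, whose gradient is A w - b.
import numpy as np

A = np.array([[3.0, 0.5], [0.5, 1.0]])   # SPD matrix -> unique minimum
b = np.array([1.0, -2.0])
grad = lambda w: A @ w - b

def line_search(w, d):
    # exact step length along d for a quadratic: alpha = -g.d / (d^T A d)
    return -(grad(w) @ d) / (d @ A @ d)

# Steepest descent: the direction is simply the negative gradient.
w = np.zeros(2)
for _ in range(50):
    d = -grad(w)
    w = w + line_search(w, d) * d

# Fletcher-Reeves conjugate gradient: the direction reuses the previous one,
# so only one extra vector needs to be stored.
w_cg, g = np.zeros(2), grad(np.zeros(2))
d = -g
for _ in range(2):                        # n steps suffice on an n-D quadratic
    alpha = line_search(w_cg, d)
    w_cg = w_cg + alpha * d
    g_new = grad(w_cg)
    beta = (g_new @ g_new) / (g @ g)
    d, g = -g_new + beta * d, g_new

print(w, w_cg, np.linalg.solve(A, b))     # both approach the exact minimizer
```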

Publication Date
Tue Sep 19 2017
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Density and Approximation by Using Feed Forward Artificial Neural Networks

In this paper, we will consider the density questions associated with the single hidden layer feed-forward model. We prove that an FFNN with one hidden layer can uniformly approximate any continuous function in C(K) (where K is a compact set in R^n) to any required accuracy.

However, if the set of basis functions is dense, then the ANN can have at most one hidden layer; but if the set of basis functions is non-dense, then we need more hidden layers. Also, we have shown that there exist localized functions and that there is no t…
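
A small numerical illustration of the density statement (a sketch under assumed settings, not the construction in the paper): a single hidden layer of sigmoidal units, with only the output weights fitted by least squares, approximates a continuous function on a compact interval, and the sup-norm error on the grid typically shrinks as units are added.

```python
# Hedged sketch: one-hidden-layer approximation of a continuous function on a
# compact set (here [0, 2*pi]); the target and unit counts are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 2*np.pi, 400)
f = np.sin(x) * np.exp(-0.3 * x)                  # some continuous target on K

for n_hidden in (2, 8, 32):
    # random hidden-layer weights; only the output layer is fitted (least squares)
    w = rng.uniform(-4, 4, n_hidden)
    c = rng.uniform(-8, 8, n_hidden)
    Phi = np.tanh(np.outer(x, w) + c)             # hidden activations, shape (400, n)
    Phi = np.column_stack([Phi, np.ones_like(x)]) # output bias column
    a, *_ = np.linalg.lstsq(Phi, f, rcond=None)   # output weights
    err = np.max(np.abs(Phi @ a - f))             # sup-norm error on the grid
    print(f"{n_hidden:3d} hidden units -> max |error| ~ {err:.3e}")
```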

Publication Date
Wed May 03 2017
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Designing Feed Forward Neural Network for Solving Linear Volterra Integro-Differential Equations

The aim of this paper is to design a multilayer feed-forward neural network (FFNN) to find the approximate solution of second-order linear Volterra integro-differential equations with boundary conditions. The designed network is used to reduce the computational cost of the solution, is computationally attractive, and its applications are demonstrated through illustrative examples.
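
As a rough sketch of how an FFNN can be used on this class of problems (a generic collocation approach with an assumed first-order test equation, trial form and quadrature rule; it is not the design proposed in the paper), one can parametrize a trial solution with a small network and minimize the equation residual at collocation points.

```python
# Hedged sketch: FFNN trial solution for a *first-order* Volterra
# integro-differential test problem (assumed example, not the paper's method):
#     y'(x) = 1 + int_0^x y(t) dt,   y(0) = 1,   exact solution y = exp(x).
import numpy as np
from scipy.integrate import cumulative_trapezoid
from scipy.optimize import minimize

H = 6                                   # hidden units
x = np.linspace(0.0, 1.0, 41)           # collocation points

def net(p, x):
    w1, b1, w2, b2 = p[:H], p[H:2*H], p[2*H:3*H], p[3*H]
    z = np.outer(x, w1) + b1
    n  = np.tanh(z) @ w2 + b2                        # N(x)
    dn = ((1 - np.tanh(z)**2) * w1) @ w2             # N'(x)
    return n, dn

def loss(p):
    n, dn = net(p, x)
    y  = 1.0 + x * n                                 # trial solution, y(0) = 1
    dy = n + x * dn                                  # its derivative
    integral = cumulative_trapezoid(y, x, initial=0.0)
    residual = dy - 1.0 - integral                   # equation residual
    return np.mean(residual**2)

p0 = 0.1 * np.random.default_rng(2).standard_normal(3*H + 1)
res = minimize(loss, p0, method="BFGS")              # gradient by finite differences
y_approx = 1.0 + x * net(res.x, x)[0]
print("max |y - exp(x)| ~", np.max(np.abs(y_approx - np.exp(x))))
```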

Publication Date
Thu May 04 2017
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Design Feed Forward Neural Network to Determine Doses of the Decongestant for Cold Pills

The aim of this paper is to design a feed-forward neural network to determine the effects of cold pills and their cascades, by simulating the problem as a system of first-order initial value problems. This problem is typical of the many models of the passage of medication throughout the body. The designed model is an important part of the process by which dosage levels are set. A critical factor is the need to keep the levels of medication high enough to be effective, but not so high that they are dangerous.
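
For context, the first-order system the abstract alludes to is usually a compartment model of how the decongestant moves from the gastrointestinal tract into the bloodstream. The sketch below uses assumed rate constants and solves the system directly with an ODE solver; the paper's contribution is to approximate this kind of problem with a feed-forward network instead.

```python
# Hedged sketch: a two-compartment "cold pill" model of decongestant flow,
#   GI tract -> bloodstream -> eliminated,
# with assumed rate constants and dosing; solved directly with an ODE solver.
import numpy as np
from scipy.integrate import solve_ivp

k1, k2 = 1.386, 0.1386        # assumed absorption / elimination rates (1/hour)
dose = 1.0                    # assumed unit dose in the GI tract at t = 0

def rhs(t, y):
    gi, blood = y
    return [-k1 * gi,                  # drug leaving the GI tract
            k1 * gi - k2 * blood]      # entering and leaving the bloodstream

sol = solve_ivp(rhs, (0.0, 24.0), [dose, 0.0], dense_output=True)
t = np.linspace(0.0, 24.0, 7)
print(np.round(sol.sol(t)[1], 3))      # blood level over a day: rises, then decays
```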

Publication Date
Sun Dec 02 2012
Journal Name
Baghdad Science Journal
Stability of Back Propagation Training Algorithm for Neural Networks

In this paper, we derive and prove the stability bounds of the momentum coefficient µ and the learning rate of the back-propagation updating rule in artificial neural networks. The theoretical upper bound of the learning rate is derived and its practical approximation is obtained.
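
For reference, the momentum form of the back-propagation update that the two constants belong to is sketched below (a generic illustration on a toy error function; the specific bounds derived in the paper are not reproduced).

```python
# Hedged sketch: back-propagation weight update with a momentum term.
# grad_E(w) stands for the gradient of the network's error function at w;
# eta is the learning rate and mu the momentum coefficient whose stability
# ranges are what the paper analyses.
import numpy as np

def momentum_step(w, delta_w_prev, grad_E, eta, mu):
    delta_w = -eta * grad_E(w) + mu * delta_w_prev   # Delta w(t)
    return w + delta_w, delta_w

# toy quadratic error E(w) = 0.5 * ||w||^2, so grad_E(w) = w
w, dw = np.array([1.0, -2.0]), np.zeros(2)
for _ in range(100):
    w, dw = momentum_step(w, dw, lambda w: w, eta=0.2, mu=0.5)
print(w)   # converges toward the minimum when eta and mu are in their stable range
```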

Publication Date
Sun Aug 01 2021
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Cascade-Forward Neural Network for Volterra Integral Equation Solution

Solving the Volterra integral equation numerically is a simple operation, but it requires a large amount of memory to compute and store the results. The importance of this equation opens a new direction for solving it with new methods that avoid these obstacles. One of these methods employs a neural network to obtain the solution.

This paper presents a proposed method that uses a cascade-forward neural network to simulate the solutions of Volterra integral equations. The method depends on training the cascade-forward neural network on inputs that represent the mean of the Volterra integral equation solutions; the target of the cascade-forward neural network is the desired output of this network. Cascade-forward neural…
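
For readers unfamiliar with the architecture, a cascade-forward network differs from a plain feed-forward network in that every layer also receives the original input (and earlier layer outputs) directly. A minimal forward pass is sketched below; the layer sizes and random weights are illustrative assumptions, not the trained network of the paper.

```python
# Hedged sketch: forward pass of a cascade-forward network, where every layer
# receives the original input and all earlier layer outputs in addition to
# the previous layer's output.
import numpy as np

rng = np.random.default_rng(3)
sizes = [1, 8, 8, 1]                      # input, two hidden layers, output

weights, fan_in = [], 0
for k, out in enumerate(sizes[1:]):
    fan_in += sizes[k]                    # this layer sees input + all earlier layers
    weights.append((0.5 * rng.standard_normal((fan_in, out)), np.zeros(out)))

def cascade_forward(x):
    sources = [x]                         # original input, then each layer's output
    for i, (W, b) in enumerate(weights):
        z = np.concatenate(sources, axis=1) @ W + b   # cascade connections
        h = z if i == len(weights) - 1 else np.tanh(z)
        sources.append(h)
    return h

x = np.linspace(0.0, 1.0, 5).reshape(-1, 1)
print(cascade_forward(x).ravel())         # untrained network outputs, shape (5,)
```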

Publication Date
Thu Sep 01 2022
Journal Name
Iraqi Journal Of Physics
Development and Assessment of Feed Forward Back Propagation Neural Network Models to Predict Sunshine Duration

The duration of sunshine is one of the important indicators and one of the variables for measuring the amount of solar radiation collected in a particular area. The duration of solar brightness has been used to study atmospheric energy balance, sustainable development, ecosystem evolution and climate change. Predicting the daily average values of sunshine duration (SD) for Duhok city, Iraq, using the artificial neural network (ANN) approach is the focus of this paper. Many different ANN models with different input variables were used in the prediction process. The daily average of the month, average temperature, maximum temperature, minimum temperature, relative humidity, wind direction, cloud level and atmosp…
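
A sketch of the kind of prediction setup described (the file name, column names, network size and split are assumptions; the paper's actual models and Duhok data are not reproduced) could look as follows, mapping daily meteorological inputs to sunshine duration with a feed-forward back-propagation regressor.

```python
# Hedged sketch: feed-forward back-propagation model predicting daily sunshine
# duration (SD) from meteorological inputs. The CSV file and its column names
# are hypothetical placeholders for the data described in the paper.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error

df = pd.read_csv("duhok_daily_weather.csv")        # hypothetical data file
features = ["avg_temp", "max_temp", "min_temp",
            "relative_humidity", "wind_direction", "cloud_level"]
X, y = df[features], df["sunshine_duration"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
scaler = StandardScaler().fit(X_train)

model = MLPRegressor(hidden_layer_sizes=(10,), activation="tanh",
                     solver="adam", max_iter=2000, random_state=0)
model.fit(scaler.transform(X_train), y_train)

pred = model.predict(scaler.transform(X_test))
print("MAE (hours):", mean_absolute_error(y_test, pred))
```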

Publication Date
Mon Oct 17 2011
Journal Name
Journal Of Engineering
MODIFIED TRAINING METHOD FOR FEEDFORWARD NEURAL NETWORKS AND ITS APPLICATION in 4-LINK SCARA ROBOT IDENTIFICATION

This research describes the results of applying artificial neural networks with a modified activation function to perform the online and offline identification of a four-degrees-of-freedom (4-DOF) Selective Compliance Assembly Robot Arm (SCARA) manipulator robot. The proposed identification strategy consists of a feed-forward neural network with a modified activation function that operates in parallel with the SCARA robot model. Feed-forward neural networks (FFNN) trained online and offline have been used, without requiring any previous knowledge about the system to be identified. The activation function used in the hidden layer of the FFNN is a modified version of the wavelet function. This approach ha…
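
A minimal sketch of the identification idea (the toy plant, the "Mexican hat" style activation and the training setup are assumptions, not the SCARA dynamics or the exact modified wavelet used in the paper): a one-hidden-layer FFNN with a wavelet-like activation is trained offline to predict the plant's next state from the current state and input, so that it runs in parallel with the plant.

```python
# Hedged sketch: offline identification of a toy discrete-time plant with a
# one-hidden-layer FFNN whose hidden activation is a wavelet-like function.
import numpy as np
from scipy.optimize import minimize

def wavelet(z):                        # "Mexican hat" style activation
    return (1.0 - z**2) * np.exp(-0.5 * z**2)

rng = np.random.default_rng(4)
u = rng.uniform(-1, 1, 300)            # excitation input
x = np.zeros(301)
for k in range(300):                   # toy nonlinear plant to be identified
    x[k+1] = 0.6 * np.sin(x[k]) + 0.4 * u[k]

inputs  = np.column_stack([x[:-1], u])         # (x_k, u_k)
targets = x[1:]                                # x_{k+1}

H = 10
def predict(p):
    W1 = p[:2*H].reshape(2, H); b1 = p[2*H:3*H]
    W2 = p[3*H:4*H]; b2 = p[4*H]
    return wavelet(inputs @ W1 + b1) @ W2 + b2

def loss(p):
    return np.mean((predict(p) - targets)**2)

p0 = 0.3 * rng.standard_normal(4*H + 1)
res = minimize(loss, p0, method="BFGS")        # offline (batch) training
print("identification MSE ~", res.fun)
```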

Publication Date
Mon Jun 17 2019
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Dynamic Channel Assignment Using Neural Networks

This paper presents a proposed neural network algorithm to solve the shortest path problem (SPP) for communication routing. The solution extends the traditional recurrent Hopfield architecture, introducing optimal routing for any request by choosing single- and multi-link node-to-node paths so as to minimize the loss. The suggested neural network algorithm was implemented on a 20-node network example. The results show that clear convergence can be achieved, with 95% valid convergence (about 361 optimal routes out of 380 pairs). Additionally, computation performance is also discussed, at the expense of slightly worse results.
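
The mechanism the routing algorithm relies on, energy descent in a recurrent Hopfield network, can be sketched generically as below. The small random weight matrix is an illustrative assumption; encoding the 20-node link costs and path-validity constraints into the weights and biases, as the paper does, is not shown.

```python
# Hedged sketch: asynchronous updates in a discrete Hopfield network, which
# monotonically decrease the energy E(v) = -0.5 * v^T W v - b^T v.
# For shortest-path routing the weights/biases would encode link costs and
# path-validity constraints; here they are just a small random example.
import numpy as np

rng = np.random.default_rng(5)
n = 6
W = rng.standard_normal((n, n))
W = 0.5 * (W + W.T)                        # symmetric weights
np.fill_diagonal(W, 0.0)                   # no self-connections
b = rng.standard_normal(n)

def energy(v):
    return -0.5 * v @ W @ v - b @ v

v = rng.integers(0, 2, n).astype(float)    # random initial 0/1 state
for sweep in range(20):
    changed = False
    for i in rng.permutation(n):           # asynchronous, random order
        new_vi = 1.0 if W[i] @ v + b[i] > 0 else 0.0
        if new_vi != v[i]:
            v[i], changed = new_vi, True
    if not changed:                        # stable state = local energy minimum
        break
print("stable state:", v, " energy:", round(energy(v), 3))
```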
