Influence Activation Function in Approximate Periodic Functions Using Neural Networks

The aim of this paper is to design fast neural networks for approximating periodic functions, that is, fully connected networks with links between all nodes in adjacent layers, which speed up approximation, reduce approximation failures, and increase the likelihood of reaching the globally optimal approximation. The suggested network is trained with the Levenberg-Marquardt algorithm and then accelerated by choosing the activation function (transfer function) that gives the fastest convergence rate for networks of reasonable size. In all algorithms, the gradient of the performance function (energy function) determines how the weights are adjusted so that the performance function is minimized, and the backpropagation algorithm is used to increase the training speed.
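
As a rough illustration (not the authors' implementation), the sketch below fits a one-hidden-layer network with a tanh transfer function to a periodic target using Levenberg-Marquardt, here through SciPy's least-squares solver; the target function, sampling grid, and network size are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

# Training data: samples of a periodic target on one period (assumed example).
x = np.linspace(0.0, 2.0 * np.pi, 200)
t = np.sin(x)

H = 10  # number of hidden units (assumed)

def unpack(p):
    # p = [w1 (H), b1 (H), w2 (H), b2 (1)]
    w1, b1, w2, b2 = np.split(p, [H, 2 * H, 3 * H])
    return w1, b1, w2, b2[0]

def net(p, x):
    w1, b1, w2, b2 = unpack(p)
    hidden = np.tanh(np.outer(x, w1) + b1)   # tanh transfer function
    return hidden @ w2 + b2

def residuals(p):
    return net(p, x) - t

rng = np.random.default_rng(0)
p0 = rng.normal(scale=0.5, size=3 * H + 1)

# Levenberg-Marquardt minimizes the sum of squared residuals.
sol = least_squares(residuals, p0, method="lm")
print("final SSE:", np.sum(sol.fun ** 2))
```

Swapping the tanh for another smooth transfer function in `net` is the kind of comparison the abstract describes, with everything else held fixed.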

Publication Date
Mon Jun 30 2008
Journal Name
Iraqi Journal Of Science
On the Greedy Ridge Function Neural Networks for Approximation Multidimensional Functions

The aim of this paper is to approximate multidimensional functions f ∈ C(R^s) by developing a new type of feedforward neural network (FFNN), which we call greedy ridge function neural networks (GRGFNNs). We also introduce a modification to the greedy algorithm used to train these networks. An error bound is derived in Sobolev space. Finally, a comparison is made between the three algorithms (the modified greedy algorithm, the backpropagation algorithm, and the result in [1]).
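
A minimal sketch of the greedy idea, not the paper's modified algorithm: each step adds one ridge unit g(a·x + b), chosen from random candidates so as to best reduce the current residual. The candidate pool, unit form, and sample data below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Target f in C(R^s) sampled at scattered points (assumed example, s = 2).
X = rng.uniform(-1, 1, size=(400, 2))
y = np.sin(np.pi * X[:, 0]) * X[:, 1]

def ridge_unit(X, a, b):
    return np.tanh(X @ a + b)               # one ridge function g(a.x + b)

units, coefs = [], []
residual = y.copy()

for step in range(10):                      # greedily add 10 units
    best = None
    for _ in range(200):                    # random candidate directions/offsets
        a = rng.normal(size=2)
        b = rng.normal()
        phi = ridge_unit(X, a, b)
        c = phi @ residual / (phi @ phi)    # least-squares coefficient
        err = np.sum((residual - c * phi) ** 2)
        if best is None or err < best[0]:
            best = (err, a, b, c, phi)
    err, a, b, c, phi = best
    units.append((a, b)); coefs.append(c)
    residual = residual - c * phi

print("remaining squared error:", np.sum(residual ** 2))
```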

Publication Date
Fri Jun 01 2007
Journal Name
Journal Of Al-nahrain University Science
ON THE GREEDY RADIAL BASIS FUNCTION NEURAL NETWORKS FOR APPROXIMATION MULTIDIMENSIONAL FUNCTIONS

The aim of this paper is to approximate multidimensional functions using the type of feedforward neural network (FFNN) called greedy radial basis function neural networks (GRBFNNs). We also introduce a modification to the greedy algorithm used to train these networks. An error bound is derived in Sobolev space. Finally, a comparison is made between the three algorithms (the modified greedy algorithm, the backpropagation algorithm, and the result published in [16]).
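
For contrast with the ridge version above, here is a minimal greedy RBF sketch (again, not the paper's modified algorithm): each step centres a Gaussian unit at the sample with the largest residual and fits its coefficient by least squares. The width, data, and number of units are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

X = rng.uniform(-1, 1, size=(400, 2))            # assumed sample points
y = np.exp(-np.sum(X ** 2, axis=1)) + 0.3 * X[:, 0]

def gaussian(X, c, width=0.5):
    return np.exp(-np.sum((X - c) ** 2, axis=1) / (2 * width ** 2))

centres, coefs = [], []
residual = y.copy()

for step in range(15):
    c = X[np.argmax(np.abs(residual))]           # greedy choice of centre
    phi = gaussian(X, c)
    w = phi @ residual / (phi @ phi)             # least-squares coefficient
    centres.append(c); coefs.append(w)
    residual = residual - w * phi

print("remaining squared error:", np.sum(residual ** 2))
```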

Publication Date
Wed May 24 2017
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
On Comparison between Radial Basis Function and Wavelet Basis Functions Neural Networks

In this paper we study and design two feedforward neural networks. The first approach uses a radial basis function network and the second uses a wavelet basis function network to approximate the mapping from the input space to the output space. The trained networks are then used in a conjugate gradient algorithm to estimate the output. These neural networks are then applied to solve differential equations. Results of applying these algorithms to several examples are presented.
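
To illustrate the two basis choices and the conjugate-gradient step, the sketch below fits the same one-dimensional target with a Gaussian RBF expansion and a Mexican-hat wavelet expansion, solving the normal equations with SciPy's conjugate-gradient routine; the target, centres, and widths are assumptions rather than the paper's setup.

```python
import numpy as np
from scipy.sparse.linalg import cg

x = np.linspace(-3, 3, 200)
y = np.sin(2 * x) * np.exp(-0.2 * x ** 2)        # assumed target

centres = np.linspace(-3, 3, 25)

def rbf_basis(x, c, w=0.4):
    r = (x[:, None] - c[None, :]) / w
    return np.exp(-0.5 * r ** 2)                 # Gaussian RBF units

def mexican_hat_basis(x, c, w=0.4):
    r = (x[:, None] - c[None, :]) / w
    return (1 - r ** 2) * np.exp(-0.5 * r ** 2)  # Mexican-hat wavelet units

for name, Phi in [("RBF", rbf_basis(x, centres)),
                  ("wavelet", mexican_hat_basis(x, centres))]:
    # Solve the normal equations Phi^T Phi w = Phi^T y by conjugate gradient.
    A = Phi.T @ Phi + 1e-8 * np.eye(Phi.shape[1])   # small ridge for conditioning
    w, info = cg(A, Phi.T @ y)
    err = np.linalg.norm(Phi @ w - y)
    print(f"{name:8s} residual norm: {err:.4f}")
```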

Publication Date
Mon Jun 17 2019
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Dynamic Channel Assignment Using Neural Networks

This paper presents a neural network algorithm for solving the shortest path problem (SPP) in communication routing. The solution extends the traditional recurrent Hopfield architecture, providing the optimal route for any request by choosing single- and multi-link node-to-node paths that minimize the loss. The suggested algorithm is implemented on a 20-node network example. The results show that clear convergence can be achieved, with 95% valid convergence (about 361 optimal routes out of 380 node pairs). Computation performance is also discussed, at the expense of slightly worse results.
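
A rough stand-in for the Hopfield formulation, not the paper's network: edge neurons are binary, the energy adds the path cost to a penalty on flow-conservation violations, and asynchronous updates descend the energy. The graph, costs, and penalty weight are assumptions; as with the paper's network, such descent can stall in a local minimum instead of the optimal route.

```python
import numpy as np

# Directed edges (u, v, cost) of a small assumed example graph.
edges = [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 4.0), (2, 3, 1.0), (1, 3, 3.0)]
n_nodes, n_edges = 4, len(edges)
source, dest = 0, 3

cost = np.array([c for _, _, c in edges])
inc = np.zeros((n_nodes, n_edges))               # node-edge incidence matrix
for k, (u, v, _) in enumerate(edges):
    inc[u, k] += 1.0                             # edge leaves u
    inc[v, k] -= 1.0                             # edge enters v

s = np.zeros(n_nodes)
s[source], s[dest] = 1.0, -1.0                   # required net out-flow per node

def energy(x, mu=10.0):
    # Path cost plus penalty for violating flow conservation.
    return cost @ x + mu * np.sum((inc @ x - s) ** 2)

rng = np.random.default_rng(3)
x = rng.integers(0, 2, n_edges).astype(float)    # random initial neuron states

# Asynchronous Hopfield-style updates: flip a neuron only if the energy drops.
for sweep in range(100):
    changed = False
    for k in rng.permutation(n_edges):
        flipped = x.copy(); flipped[k] = 1.0 - flipped[k]
        if energy(flipped) < energy(x):
            x, changed = flipped, True
    if not changed:
        break

print("selected edges:", [edges[k][:2] for k in range(n_edges) if x[k] > 0.5])
```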

Publication Date
Mon Mar 08 2021
Journal Name
Baghdad Science Journal
On Periodic Points and Chaotic Functions

We study the nature of periodic points of chaotic functions and of functions associated with chaotic functions, and we give sufficient conditions for a function to be chaotic.

Publication Date
Sat Dec 02 2017
Journal Name
Al-khwarizmi Engineering Journal
Direction Finding Using GHA Neural Networks

This paper adapts a neural network for estimating the direction of arrival (DOA). It uses an unsupervised adaptive neural network with the GHA algorithm to extract the principal components, which are in turn used by the Capon method to estimate the DOA; with the PCA neural network only the signal subspace is taken and used in the Capon method (i.e., the noise subspace is ignored).
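
A minimal sketch of the GHA (Sanger's rule) update used to extract principal components from array snapshots; the real-valued synthetic data, learning rate, and subspace dimension are assumptions, and the subsequent Capon step is only indicated in a comment.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic zero-mean "snapshots" with a dominant 2-D signal subspace (assumed).
n_sensors, n_snapshots, n_components = 8, 2000, 2
mixing = rng.normal(size=(n_sensors, n_components))
snapshots = mixing @ rng.normal(size=(n_components, n_snapshots))
snapshots += 0.1 * rng.normal(size=(n_sensors, n_snapshots))      # sensor noise

W = 0.01 * rng.normal(size=(n_components, n_sensors))             # GHA weights
lr = 1e-3

for epoch in range(5):
    for x in snapshots.T:
        y = W @ x
        # Sanger's rule: dW = lr * (y x^T - lower_triangular(y y^T) W)
        W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)

# Rows of W approximately span the signal subspace, which is what the
# Capon spectrum would then be evaluated on (noise subspace discarded).
print("subspace basis shape:", W.shape)
```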

Publication Date
Tue Jan 01 2019
Journal Name
Energy Procedia
The effect of the activation functions on the classification accuracy of satellite image by artificial neural network

Publication Date
Sat Jun 27 2020
Journal Name
Iraqi Journal Of Science
The Performance Differences between Using Recurrent Neural Networks and Feedforward Neural Network in Sentiment Analysis Problem

With the widespread use of the internet, especially social media, an unusual quantity of information is available, covering a number of study fields such as psychology, entertainment, sociology, business, news, politics, and other cultural fields of nations. Data mining methodologies that deal with social media allow producing an interesting picture of human behaviour and interaction. This paper demonstrates the application and precision of sentiment analysis using a traditional feedforward network and two recurrent neural networks (the gated recurrent unit (GRU) and long short-term memory (LSTM)) to find the differences between them. In order to test the system's performance, a set of tests is applied to two public datasets. The first …
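
As a rough sketch of the three model families compared, not the paper's exact architectures or hyperparameters, the Keras snippet below builds a feedforward baseline, a GRU model, and an LSTM model for binary sentiment classification; vocabulary size, sequence length, and layer widths are assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers

VOCAB, SEQ_LEN, EMB = 10_000, 100, 64          # assumed preprocessing choices

def feedforward():
    # Bag-of-embeddings baseline: no recurrence, word order is ignored.
    return tf.keras.Sequential([
        layers.Embedding(VOCAB, EMB),
        layers.GlobalAveragePooling1D(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),
    ])

def recurrent(rnn_layer):
    # rnn_layer is layers.GRU or layers.LSTM; both read the sequence in order.
    return tf.keras.Sequential([
        layers.Embedding(VOCAB, EMB),
        rnn_layer(64),
        layers.Dense(1, activation="sigmoid"),
    ])

for name, model in [("feedforward", feedforward()),
                    ("GRU", recurrent(layers.GRU)),
                    ("LSTM", recurrent(layers.LSTM))]:
    model.build(input_shape=(None, SEQ_LEN))
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    print(f"{name}: {model.count_params()} parameters")
```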

Publication Date
Tue Jan 01 2019
Journal Name
International Journal Of Machine Learning And Computing
Facial Emotion Recognition from Videos Using Deep Convolutional Neural Networks

It is well known that understanding human facial expressions is a key component of understanding emotions; it finds broad applications in the field of human-computer interaction (HCI) and has been a long-standing issue. In this paper, we shed light on the use of a deep convolutional neural network (DCNN) for facial emotion recognition from videos using the TensorFlow machine-learning library from Google. This work was applied to ten emotions from the Amsterdam Dynamic Facial Expression Set-Bath Intensity Variations (ADFES-BIV) dataset and tested using two datasets.
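
A minimal Keras sketch of a DCNN of this kind, classifying individual video frames into ten emotion classes; the input resolution, layer sizes, and preprocessing are assumptions, not the authors' architecture.

```python
import tensorflow as tf
from tensorflow.keras import layers

NUM_CLASSES = 10                      # ten emotions in ADFES-BIV
FRAME_SHAPE = (48, 48, 1)             # assumed per-frame input (grayscale crop)

model = tf.keras.Sequential([
    layers.Input(shape=FRAME_SHAPE),
    layers.Conv2D(32, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation="relu", padding="same"),
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.3),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# Video-level prediction can then, for example, average per-frame probabilities.
```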

Publication Date
Sun Apr 30 2017
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Fast Training Algorithms for Feed Forward Neural Networks

The aim of this paper is to discuss several high-performance training algorithms that fall into two main categories. The first category uses heuristic techniques, which were developed from an analysis of the performance of the standard gradient descent algorithm. The second category of fast algorithms uses standard numerical optimization techniques such as quasi-Newton methods. A further aim is to address the drawbacks of these training algorithms and to propose an efficient training algorithm for FFNNs.
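
A small sketch contrasting the two categories on a toy FFNN least-squares problem: gradient descent with momentum (a heuristic refinement of steepest descent) versus a quasi-Newton optimizer from SciPy; the network, data, and step sizes are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
x = np.linspace(-1, 1, 100)
t = x ** 3 - 0.5 * x                          # assumed toy target
H = 8                                         # hidden units

def loss(p):
    w1, b1, w2, b2 = np.split(p, [H, 2 * H, 3 * H])
    out = np.tanh(np.outer(x, w1) + b1) @ w2 + b2[0]
    return 0.5 * np.mean((out - t) ** 2)

def num_grad(p, eps=1e-6):                    # finite-difference gradient
    g = np.zeros_like(p)
    for i in range(p.size):
        d = np.zeros_like(p); d[i] = eps
        g[i] = (loss(p + d) - loss(p - d)) / (2 * eps)
    return g

p0 = 0.5 * rng.normal(size=3 * H + 1)

# Category 1: heuristic variant of gradient descent (here, with momentum).
p, v = p0.copy(), np.zeros_like(p0)
for _ in range(2000):
    v = 0.9 * v - 0.05 * num_grad(p)
    p += v
print("gradient descent + momentum loss:", loss(p))

# Category 2: standard numerical optimization (quasi-Newton BFGS).
res = minimize(loss, p0, method="BFGS")
print("BFGS loss:", res.fun)
```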
