Discriminating between different types of human behavior is an increasingly important problem, and artificial intelligence techniques play a large part in solving it. The proposed system combines the feedforward artificial neural network (FANN) with the genetic algorithm to produce a working mechanism that serves this field and can be applied to essential tasks such as analysis, automation, control, and recognition. In the proposed system, the genetic algorithm's two primary operators, crossover and mutation, replace the backpropagation process of the ANN. While the feedforward network itself remains focused on input processing, the proposed approach is based on breaking the FANN apart: the network is decomposed into multiple sub-networks, one per layer of the original network, and the output of each sub-network is computed during this decomposition so that every layer of the original FANN can be assessed. After the decomposition, the best-performing layers are selected for the crossover phase, while the remaining layers undergo mutation. The sub-networks are then recombined into a single ANN to produce the output of the current generation, and that output is checked to decide whether a new generation must be created. When applied to data taken from the Vicon Robot system, which was primarily designed to record human behaviors as three coordinates and classify them as either normal or aggressive, the system performed well and produced accurate results.
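The evolutionary loop described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the population size, tanh activations, mean-squared-error fitness, and elitist selection scheme are all assumptions, and the layer-wise crossover only approximates the paper's split of "best layers to crossover, the rest to mutation".

```python
import numpy as np

rng = np.random.default_rng(0)

def init_net(sizes):
    # one weight matrix per layer, e.g. sizes = [3, 5, 1]
    return [rng.standard_normal((a, b)) * 0.5
            for a, b in zip(sizes[:-1], sizes[1:])]

def forward(net, x):
    for W in net:
        x = np.tanh(x @ W)
    return x

def fitness(net, X, y):
    # negative mean squared error: higher is better
    return -np.mean((forward(net, X) - y) ** 2)

def crossover(a, b):
    # layer-wise crossover: each layer is inherited whole from one parent
    return [la.copy() if rng.random() < 0.5 else lb.copy()
            for la, lb in zip(a, b)]

def mutate(net, rate=0.1, scale=0.2):
    # perturb a random subset of weights in every layer
    return [W + (rng.random(W.shape) < rate) * rng.standard_normal(W.shape) * scale
            for W in net]

def evolve(X, y, sizes, pop_size=20, generations=60):
    population = [init_net(sizes) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=lambda n: fitness(n, X, y), reverse=True)
        elite = population[: pop_size // 2]      # best nets survive unchanged
        children = []
        while len(elite) + len(children) < pop_size:
            i, j = rng.integers(0, len(elite), size=2)
            children.append(mutate(crossover(elite[i], elite[j])))
        population = elite + children
    return max(population, key=lambda n: fitness(n, X, y))

# toy demo: classify points by the sign of their coordinate sum
X = rng.standard_normal((50, 3))
y = np.sign(X.sum(axis=1, keepdims=True))
best = evolve(X, y, sizes=[3, 5, 1])
```

Because the elite nets are carried over unchanged, the best fitness in the population can never decrease between generations, which stands in for the convergence check on each generation's output.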
XML is being incorporated into the foundation of e-business data applications. This paper addresses the problem of the freeform information stored in any organization and shows how XML, used with this new approach, makes the search operation efficient and less time-consuming. The paper introduces a new solution and methodology, developed to capture and manage such unstructured freeform information (multi-information), that relies on XML schema technologies, the neural network concept, and an object-oriented relational database, in order to provide a practical solution for efficiently managing a multi-freeform information system.
Phenomena often suffer from disturbances in their data as well as from difficulty of formulation, especially when the response is unclear or when a large number of essential differences plague the experimental units from which the data were taken. Hence the need arose for an estimation method that implicitly rates these experimental units, either through discrimination or by creating blocks for each category of experimental units, in the hope of controlling their responses and making them more homogeneous. With the development of the computing field, and following the principle of integrating the sciences, it has been found that modern algorithms used in computer science, such as the genetic algorithm or the ant colony
A Multiple System Biometric System Based on ECG Data
A multivariate, multisite hydrological data forecasting model was derived and checked using a case study. The philosophy is to use the cross-variable correlations, the cross-site correlations, and the time-lag correlations simultaneously. The case study involves two variables, monthly rainfall and evaporation, at three sites: Sulaimania, Dokan, and Darbandikhan. The model form is similar to the first-order autoregressive model, but in matrix form. A matrix of the different relative correlations mentioned above and another of their relative residuals were derived and used as the model parameters, and a mathematical filter was applied to both matrices to obtain their elements. The application of this model indicates i
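A matrix-form lag-one model of this kind can be sketched as below. The dimensions, parameter values, and least-squares recovery step are illustrative assumptions only; they do not reproduce the correlation and residual matrices derived in the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative dimensions: 2 variables x 3 sites = 6 components per month.
p, n_months = 6, 240

# Hypothetical parameter matrices; in the study these come from the
# cross-variable, cross-site and lag-one correlations and the residuals.
A = 0.4 * np.eye(p) + 0.05 * rng.standard_normal((p, p))
B = np.eye(p)  # residual (noise-shaping) matrix

def simulate(A, B, n_steps):
    # matrix-form first-order autoregression: z_t = A z_{t-1} + B e_t
    z = np.zeros((n_steps, A.shape[0]))
    for t in range(1, n_steps):
        z[t] = A @ z[t - 1] + B @ rng.standard_normal(A.shape[0])
    return z

series = simulate(A, B, n_months)

# A lag-one least-squares fit recovers the persistence structure in A
A_hat = np.linalg.lstsq(series[:-1], series[1:], rcond=None)[0].T
```

The single parameter matrix couples every site and variable to every other at lag one, which is exactly the "simultaneous correlations" idea the abstract describes.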
Abstract:
Interest in the topic of prediction has increased in recent years, and modern methods such as artificial neural network models have appeared. These methods are able to learn and adapt themselves to any model and do not require assumptions about the nature of the time series. On the other hand, the classical methods currently used for prediction, such as Box-Jenkins, may make the series difficult to diagnose and model because they assume strict conditions.
Abstract
Zigbee is considered to be one of the wireless sensor network (WSN) technologies designed for short-range communications applications. It follows the IEEE 802.15.4 specification, which aims at networks with the lowest possible cost and power consumption in addition to the minimum practical data rate. In this paper, a Zigbee transmitter system is designed based on the PHY-layer specifications of this standard. The modulation technique applied in this design is offset quadrature phase shift keying (OQPSK) with half-sine pulse shaping, chosen to achieve the minimum possible amount of phase transition. In addition, the applied spreading technique is the direct sequence spread spectrum (DSSS) technique, which has
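The half-sine-shaped OQPSK step can be illustrated with the short sketch below. The samples-per-chip value and the omission of the DSSS spreading stage are simplifications for illustration, not the paper's transmitter design.

```python
import numpy as np

def oqpsk_halfsine(bits, sps=8):
    """OQPSK baseband waveform with half-sine pulse shaping (802.15.4-style).

    Even-indexed chips drive the I arm and odd-indexed chips the Q arm; the
    Q arm is delayed by one chip period, so only one arm changes at a time
    and phase transitions never exceed 90 degrees.
    """
    bits = np.asarray(bits)
    if bits.size % 2:
        bits = np.append(bits, 0)          # pad to an even chip count
    chips = 2 * bits - 1                   # 0/1 -> -1/+1
    i_chips, q_chips = chips[0::2], chips[1::2]

    # half-sine pulse spanning two chip periods
    pulse = np.sin(np.pi * np.arange(2 * sps) / (2 * sps))

    def shape(sym):
        up = np.zeros(sym.size * 2 * sps)  # one chip every 2*sps samples per arm
        up[:: 2 * sps] = sym
        return np.convolve(up, pulse)

    i_wave = np.concatenate([shape(i_chips), np.zeros(sps)])
    q_wave = np.concatenate([np.zeros(sps), shape(q_chips)])  # one-chip offset
    return i_wave + 1j * q_wave
```

After the initial transient, the magnitude of the complex envelope is constant, which is the near-constant-envelope property that motivates half-sine shaping in low-power transmitters.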
This study was conducted with the aim of extracting and purifying the polyphenolic compound resveratrol from the skin of black grapes (Vitis vinifera) cultivated in Iraq. The purified resveratrol was obtained by ethanolic extraction of fresh grape skin with an 80% v/v solution, followed by acid hydrolysis with a 10% HCl solution; the aglycone moiety was then taken up in an organic solvent (chloroform). Column chromatography on silica gel G60 packed in a glass column was carried out with a mobile phase of benzene : methanol : acetic acid (20:4:1) a
The aim of this research is to determine the influence of Daniel's model on twenty-first-century skills among fifth-grade scientific-stream students at government morning secondary and preparatory schools for the academic year 2022-2023. Two groups were chosen out of the five fifth-grade scientific-stream groups: one represents the experimental group, taught with the Daniel model, and the other the control group, taught with the traditional method. The equivalence of the two research groups was verified over a set of variables. As for the research tool, the researchers developed a scale for twenty-first-century skills, in which they adopted the framework of the Partnership Organizat
This research aims to study dimension-reduction methods that overcome the curse of dimensionality when traditional methods fail to provide good parameter estimates, so that the problem is dealt with directly. Two approaches were used to handle the high-dimensional data. The first is the non-classical sliced inverse regression (SIR) method together with the proposed weighted standard SIR (WSIR) method; the second is principal component analysis (PCA), the general method used for reducing dimensions. Both SIR and PCA are based on forming linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear
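As an illustration of how SIR, unlike PCA, uses the response to choose directions, here is a minimal SIR sketch. The slicing scheme, whitening step, and toy single-index data are assumptions for demonstration and do not reproduce the proposed WSIR weighting.

```python
import numpy as np

rng = np.random.default_rng(2)

def sir_directions(X, y, n_slices=10, n_dirs=1):
    """Sliced inverse regression: directions from the inverse curve E[X | y]."""
    n, p = X.shape
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    W = evecs @ np.diag(evals ** -0.5) @ evecs.T   # whitening matrix
    Z = (X - mu) @ W
    # slice the standardized predictors by the sorted response
    slices = np.array_split(np.argsort(y), n_slices)
    M = np.zeros((p, p))
    for s in slices:
        m = Z[s].mean(axis=0)
        M += (len(s) / n) * np.outer(m, m)          # weighted slice means
    _, vecs = np.linalg.eigh(M)
    dirs = W @ vecs[:, ::-1][:, :n_dirs]            # leading eigenvectors, back-transformed
    return dirs / np.linalg.norm(dirs, axis=0)

# toy check: a single-index model whose true direction is known
X = rng.standard_normal((500, 5))
beta = np.array([1.0, 2.0, 0.0, 0.0, 0.0])
beta /= np.linalg.norm(beta)
y = X @ beta + 0.1 * rng.standard_normal(500)
d = sir_directions(X, y)[:, 0]
```

PCA would ignore `y` entirely and return the directions of largest predictor variance; SIR instead averages the predictors within response slices, so its leading direction tracks the index that actually drives the response.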