Discerning between different types of human behavior is increasingly important, and artificial intelligence techniques play a large part in it. The proposed system combines the feedforward artificial neural network (FANN) algorithm with the genetic algorithm, yielding a working mechanism applicable to essential tasks such as analysis, automation, control, and recognition. In the proposed system, the genetic algorithm's two primary operators, crossover and mutation, replace the backpropagation process in the ANN. While the feedforward network itself is focused on input processing, training is based on breaking the network apart: the FANN is split into multiple sub-networks, one per layer, and an output is computed from each, so that every layer of the original network is assessed individually. After this splitting step, the best layers are selected for the crossover phase, while the remaining layers undergo mutation. The sub-networks are then recombined into a single ANN to produce the generation's output, which is checked to determine whether a new generation is needed. Applied to data taken from the Vicon Robot system, which records human behaviors as three-coordinate trajectories classified as either normal or aggressive, the system performed well and produced accurate results.
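The layer-wise evolution described above can be sketched in a few lines. This is a minimal, hedged illustration, not the paper's implementation: the network sizes, toy data, and mutation rate are all assumed for demonstration, and fitness is classification accuracy, with whole layers swapped in crossover and perturbed in mutation.

```python
# Sketch: training a tiny feedforward net with a GA instead of backpropagation.
# Each layer's weight matrix is a gene; crossover swaps layers, mutation perturbs them.
import numpy as np

rng = np.random.default_rng(0)

def init_net(sizes):
    # One weight matrix per layer; the GA treats each layer as a separate unit.
    return [rng.normal(0, 1, (a, b)) for a, b in zip(sizes[:-1], sizes[1:])]

def forward(net, X):
    h = X
    for W in net:
        h = np.tanh(h @ W)
    return h

def fitness(net, X, y):
    pred = (forward(net, X).ravel() > 0).astype(int)
    return (pred == y).mean()

def crossover(a, b):
    # Swap whole layers between two parent networks.
    return [la if rng.random() < 0.5 else lb for la, lb in zip(a, b)]

def mutate(net, rate=0.1):
    return [W + rate * rng.normal(0, 1, W.shape) for W in net]

# Toy linearly separable classification data (illustrative only).
X = rng.normal(0, 1, (200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

pop = [init_net([2, 4, 1]) for _ in range(20)]
for gen in range(30):
    pop.sort(key=lambda n: fitness(n, X, y), reverse=True)
    elite = pop[:5]                      # best networks survive unchanged
    children = []
    while len(children) < len(pop) - len(elite):
        i, j = rng.integers(0, len(elite), 2)
        children.append(mutate(crossover(elite[i], elite[j])))
    pop = elite + children

best = max(pop, key=lambda n: fitness(n, X, y))
print(round(fitness(best, X, y), 2))
```

Elitism guarantees the best network is never lost between generations, which substitutes for the monotone improvement that gradient descent would otherwise provide.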
Polyaniline-formaldehyde/chitosan composite (PAFC) was prepared by the in situ polymerization method and characterized by FTIR spectroscopy in addition to SEM, EDS, and TGA techniques. The adsorption kinetics of malachite green (MG) dye on PAFC were studied for initial concentrations of (20, 30 and 40) mg/L at three temperatures (308, 313 and 318) K. The factors influencing adsorption (adsorbent dose, contact time, initial concentration, and temperature) were investigated. The kinetic studies confirmed that the adsorption of MG obeys the pseudo-second-order model and is controlled by external mass transfer followed by intraparticle diffusion mass transfer. A study of th
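The pseudo-second-order model mentioned above is usually fitted through its linearized form, t/q_t = 1/(k2·qe²) + t/qe, so that a plot of t/q_t against t yields qe from the slope and k2 from the intercept. A minimal sketch of that fit on synthetic uptake data (the qe and k2 values are illustrative, not the study's results):

```python
# Fit the pseudo-second-order kinetic model via its linearized form.
import numpy as np

qe_true, k2_true = 25.0, 0.01            # mg/g and g/(mg.min) -- assumed values
t = np.array([5, 10, 20, 40, 60, 90], float)   # contact time, min
q = (k2_true * qe_true**2 * t) / (1 + k2_true * qe_true * t)  # exact PSO curve

# Linear regression of t/q against t: slope = 1/qe, intercept = 1/(k2*qe^2).
slope, intercept = np.polyfit(t, t / q, 1)
qe_fit = 1 / slope
k2_fit = 1 / (intercept * qe_fit**2)
print(round(qe_fit, 2), round(k2_fit, 4))   # recovers 25.0 and 0.01
```

With real uptake measurements the recovered qe and k2 would carry fitting error; the linearity of the t/q_t plot is itself the usual evidence that the pseudo-second-order model applies.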
The study aims to determine the effectiveness of using the Google Classroom educational platform in teaching mathematics curricula from the viewpoint of teachers in the Governorate of Al Dhahirah, Sultanate of Oman. The researcher adopted the descriptive-analytical approach. To collect the needed data, a questionnaire of two dimensions was used: (13) items measure the effectiveness of using Google Classroom in teaching mathematics curricula from the teachers' point of view, and (10) items measure the difficulties of using it. The questionnaire was administered to a study sample of (32) male and (31) female teachers. They represent mathematics
This paper presents a fast and robust approach to English text encryption and decryption based on the Pascal matrix. The technique encrypts Arabic or English text, or both, and shows the result of applying the method to plain text (the original message): the intelligible plaintext is transformed into unintelligible ciphertext in order to secure information from unauthorized access and theft. The encryption scheme uses a pseudo-random encryption key generated by an algorithm, all built on the Pascal matrix. Encryption and decryption are implemented in MATLAB as the programming language, with Notepad++ used to write the input text.
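The appeal of the Pascal matrix here is that the lower-triangular binomial matrix is invertible over the integers (its inverse is the same matrix with alternating signs), so decryption is exact. A hedged sketch of this core idea, in Python rather than the paper's MATLAB, with block handling and key generation omitted:

```python
# Sketch: encode characters as integers, multiply by a lower-triangular Pascal
# matrix to encrypt, and multiply by its exact integer inverse to decrypt.
from math import comb
import numpy as np

def pascal(n):
    # Lower-triangular Pascal matrix: entry (i, j) is C(i, j), zero above the diagonal.
    return np.array([[comb(i, j) for j in range(n)] for i in range(n)])

def pascal_inv(n):
    # Known closed form: the inverse flips the sign of every other entry.
    return np.array([[(-1) ** (i + j) * comb(i, j) for j in range(n)] for i in range(n)])

def encrypt(text):
    v = np.array([ord(c) for c in text])
    return pascal(len(v)) @ v            # ciphertext is an integer vector

def decrypt(cipher):
    n = len(cipher)
    return "".join(chr(x) for x in pascal_inv(n) @ cipher)

c = encrypt("HELLO")
print(decrypt(c))  # → HELLO
```

Because all arithmetic stays in the integers, there is no rounding error on decryption, which is what makes the Pascal matrix a convenient invertible transform for text.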
Reliable data transfer and energy efficiency are essential considerations for network performance in resource-constrained underwater environments. One efficient approach to data routing in underwater wireless sensor networks (UWSNs) is clustering, in which data packets are transferred from sensor nodes to the cluster head (CH). Data packets are then forwarded to a sink node in a single hop or over multiple hops, which can increase the energy depletion of the CH compared with other nodes. While several mechanisms have been proposed for cluster formation and CH selection to ensure efficient delivery of data packets, less attention has been given to massive data co
Most medical datasets suffer from missing data, owing to the expense of some tests or to human error while recording them. This issue degrades the performance of machine learning models because the values of some features are missing. Therefore, specific methods are needed for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used for generating and imputing the missing values in the Pima Indian Diabetes Disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, which are support vector machine (SVM), K-nearest neighbour (KNN), and Naïve B
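The SSA core is simple: one "leader" salp explores around the best solution found so far (the food source) with a step size that decays over iterations, and every other salp follows the one ahead of it. A minimal sketch of those update rules on a stand-in objective; in the imputation setting of the abstract, the position vector would hold the candidate missing values and the fitness would score a downstream classifier, but here a simple sphere function is assumed for illustration:

```python
# Sketch of the salp swarm algorithm (SSA) update rules on a toy objective.
import numpy as np

rng = np.random.default_rng(1)

def sphere(x):
    # Stand-in fitness: in ISSA-style imputation this would be classifier error.
    return np.sum(x ** 2)

dim, n_salps, iters = 4, 20, 200
lb, ub = -5.0, 5.0
pos = rng.uniform(lb, ub, (n_salps, dim))
food = pos[np.argmin([sphere(p) for p in pos])].copy()  # best position so far

for t in range(1, iters + 1):
    c1 = 2 * np.exp(-(4 * t / iters) ** 2)   # decaying exploration coefficient
    for i in range(n_salps):
        if i == 0:
            # Leader salp moves randomly around the food source.
            c2, c3 = rng.random(dim), rng.random(dim)
            step = c1 * ((ub - lb) * c2 + lb)
            pos[i] = np.where(c3 < 0.5, food + step, food - step)
        else:
            # Follower salps move toward the salp directly ahead of them.
            pos[i] = (pos[i] + pos[i - 1]) / 2
        pos[i] = np.clip(pos[i], lb, ub)
        if sphere(pos[i]) < sphere(food):
            food = pos[i].copy()

print(round(sphere(food), 3))
```

The decaying c1 coefficient shifts the swarm from exploration to exploitation, which is why the chain of followers eventually collapses onto the best candidate found.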
In data transmission, a change in a single bit of the received data may lead to misunderstanding or to a disaster. Every bit of the sent information has high priority, especially information such as the receiver's address. Detecting the error caused by every single change is therefore a key issue in the data transmission field.
The ordinary single-parity detection method can detect an odd number of errors efficiently but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, showed better results but still failed to cope with an increasing number of errors.
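The single-parity limitation described above can be shown in a few lines. This is a minimal sketch with illustrative bit values: a frame with one flipped bit violates even parity and is caught, while two flips cancel out and pass the check undetected.

```python
# Sketch: even parity catches odd numbers of bit flips but misses even numbers.
def parity(bits):
    return sum(bits) % 2

data = [1, 0, 1, 1, 0, 1, 0, 0]
sent = data + [parity(data)]           # append parity bit: total parity is even

def check(frame):
    return parity(frame) == 0          # a valid frame has even total parity

one_error = sent.copy(); one_error[2] ^= 1                      # single bit flip
two_errors = sent.copy(); two_errors[2] ^= 1; two_errors[5] ^= 1  # two flips cancel

print(check(sent), check(one_error), check(two_errors))  # → True False True
```

Two-dimensional parity improves on this by adding a parity bit per row and per column, so many even-count error patterns still disturb at least one row or column check.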
Two novel methods were suggested to detect binary bit-change errors when transmitting data over a noisy medium. Those methods were: 2D-Checksum me
Machine learning offers significant advantages for many difficulties in the oil and gas industry, especially when it comes to resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. This study presents a clarified workflow methodology alongside comprehensive models. Its purpose is to provide a more robust technique for predicting permeability; previous studies on the Bazirgan field have attempted to do so, but their estimates were vague, and the methods they present are obsolete, making no concession to real or rigid conditions in solving the permeability computation. To