A strong sign language recognition system can help break down the barriers that separate hearing members of society from those who cannot speak. This research introduces a novel, fast, low-computational-cost recognition system for American Sign Language (ASL) digits. Different image processing techniques are used to optimize and extract the shape of the hand and fingers in each sign. The feature extraction stage determines an optimal threshold on a statistical basis, then recognizes the gap area in the zero sign and calculates the height of each finger for the other digits. The classification stage relies on the gap area for the zero sign and, for the other signs, on the number of open fingers, together with the sequence in which the open fingers appear for signs that share the same count. The conducted tests showed the system's high capability to classify all the digits: both the precision and F-score of the proposed model reached the optimal value (100%).
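The classification stage described above can be sketched as a small lookup: a large enclosed gap area indicates the zero sign, and the remaining digits are distinguished by which fingers are open. The finger patterns and threshold below are illustrative assumptions, not the paper's actual tables.

```python
# Hypothetical sketch of the classification stage: zero is detected by
# the enclosed gap area; digits 1-9 by the pattern of open fingers
# (tuple ordered thumb..pinky). Patterns and threshold are illustrative.
FINGER_PATTERNS = {
    (0, 1, 0, 0, 0): 1,
    (0, 1, 1, 0, 0): 2,
    (0, 1, 1, 1, 0): 3,
    (0, 1, 1, 1, 1): 4,
    (1, 1, 1, 1, 1): 5,
    (1, 1, 1, 1, 0): 6,   # same count as 4, different finger sequence
    (1, 1, 1, 0, 0): 7,   # same count as 3, different finger sequence
    (1, 1, 0, 0, 0): 8,   # same count as 2, different finger sequence
    (1, 0, 0, 0, 1): 9,   # illustrative pattern
}

GAP_AREA_THRESHOLD = 500  # pixels; illustrative value

def classify_digit(open_flags, gap_area):
    """Classify a hand sign into a digit 0-9, or None if unrecognized."""
    if gap_area > GAP_AREA_THRESHOLD:
        return 0  # a large enclosed gap indicates the zero sign
    return FINGER_PATTERNS.get(tuple(open_flags), None)
```

Note how the sequence of open fingers, not just their count, disambiguates pairs such as 3 and 7.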
This work studies the adsorption of lead ions from wastewater by a native agricultural waste, specifically tea waste. After activation and carbonization of the tea waste, there was a substantial improvement in surface area and other physical characteristics, including density, bulk density, and porosity. FTIR analysis indicates that the functional groups in the tea-waste adsorbent are aromatic and carboxylic, suggesting that tea waste can be a good sorbent for the removal of lead ions from wastewater. Different dosages of the adsorbent were used in the batch studies. A series of experiments indicated a lead removal efficiency reaching 95% at an optimum concentration of 5 ppm, with a correlation coefficient R² = 97.75% for the tea adsorbent. Three mo
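For context, the removal efficiency quoted in batch adsorption studies of this kind is the relative drop from initial to equilibrium concentration; the concentrations below are illustrative, not the paper's measurements.

```python
def removal_efficiency(c0, ce):
    """Percent removal in a batch adsorption test:
    (C0 - Ce) / C0 * 100, where C0 is the initial and Ce the
    equilibrium (residual) concentration."""
    return (c0 - ce) / c0 * 100.0

# e.g. an initial 5 ppm solution reduced to 0.25 ppm at equilibrium
# corresponds to the ~95% removal reported above
efficiency = removal_efficiency(5.0, 0.25)
```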
The degradation of Toluidine Blue dye in aqueous solution under UV irradiation is investigated using photo-Fenton oxidation (UV/H2O2/Fe²⁺). The effects of initial dye concentration, initial ferrous ion concentration, pH, initial hydrogen peroxide dosage, and irradiation time are studied. It is found that the removal rate increases as the initial concentrations of H2O2 and ferrous ion increase toward their optimum values, at which more than 99% dye removal efficiency is obtained at pH = 4 with [H2O2] = 500 mg/L and [Fe²⁺] = 150 mg/L. Complete degradation was achieved in the relatively short time of 75 minutes. Faster decolorization is achieved at low pH, with the optimal value at pH 4. The concentrations of the degraded dye are detected by spectr
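Dye degradation of this kind is commonly modelled with pseudo-first-order kinetics, ln(C0/Ct) = k·t. As a hedged sketch (the data below are synthetic, not the paper's measurements), the rate constant k can be fitted as a least-squares slope through the origin:

```python
import math

def fit_first_order_k(times, concentrations):
    """Fit the pseudo-first-order rate constant k in ln(C0/Ct) = k*t
    as a least-squares slope through the origin."""
    c0 = concentrations[0]
    ts, ys = [], []
    for t, c in zip(times, concentrations):
        if t > 0 and c > 0:
            ts.append(t)
            ys.append(math.log(c0 / c))
    return sum(t * y for t, y in zip(ts, ys)) / sum(t * t for t in ts)

# synthetic decay with k = 0.06 min^-1 (illustrative only)
times = [0, 15, 30, 45, 60, 75]
conc = [100 * math.exp(-0.06 * t) for t in times]
k = fit_first_order_k(times, conc)
```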
Diabetes is one of the increasingly common chronic diseases, affecting millions of people around the world. Diabetes diagnosis, prediction, proper treatment, and management are essential. Machine learning-based prediction techniques for diabetes data analysis can help in the early detection and prediction of the disease and its consequences, such as hypo-/hyperglycemia. In this paper, we explored a diabetes dataset collected from the medical records of one thousand Iraqi patients. We applied three classifiers: the multilayer perceptron, k-nearest neighbors (KNN), and Random Forest. We conducted two experiments: the first used all 12 features of the dataset, where Random Forest outperformed the others with 98.8% accuracy. The second experiment used only five att
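As a minimal sketch of one of the classifiers mentioned above, the KNN rule assigns a patient the majority label among the k most similar training records. The two toy features and labels below are illustrative stand-ins for the study's 12 clinical features, not its data.

```python
import math
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote among its k nearest training
    points under Euclidean distance (the KNN classifier)."""
    dists = sorted(
        (math.dist(x, xi), yi) for xi, yi in zip(train_X, train_y)
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# toy 2-feature records (e.g. HbA1c, glucose) -- illustrative only
train_X = [(4.5, 90), (5.0, 95), (5.2, 100),
           (7.8, 180), (8.5, 200), (9.1, 210)]
train_y = ["non-diabetic"] * 3 + ["diabetic"] * 3
label = knn_predict(train_X, train_y, (8.0, 190))
```

In practice, features would be scaled before computing distances, since raw clinical measurements have very different ranges.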
Shadow detection and removal is an important task when dealing with color outdoor images. Shadows are generated by a local and relative absence of light: first, a local decrease in the amount of light that reaches a surface; second, a local change in the amount of light reflected by a surface toward the observer. Most shadow detection and segmentation methods are based on image analysis; however, some factors can affect the detection result due to the complexity of the circumstances. In this paper, a segmentation-test method is presented to detect shadows in an image, and a function concept is used to remove the shadow from the image.
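A crude illustration of this detect-then-relight pipeline, assuming a grayscale image and a simple global-mean intensity test (real methods also exploit color invariants and spatial context; all values here are illustrative):

```python
def detect_shadow_mask(gray, ratio=0.5):
    """Mark pixels whose intensity falls below a fraction of the
    global mean as shadow -- a minimal stand-in for a
    segmentation-based shadow detector."""
    flat = [v for row in gray for v in row]
    thresh = ratio * (sum(flat) / len(flat))
    return [[1 if v < thresh else 0 for v in row] for row in gray]

def remove_shadow(gray, mask, gain=2.0):
    """Brighten detected shadow pixels by a constant gain --
    a crude relighting function, illustrative only."""
    return [
        [min(255, int(v * gain)) if m else v for v, m in zip(row, mrow)]
        for row, mrow in zip(gray, mask)
    ]

image = [
    [200, 200, 60, 60],
    [200, 200, 60, 60],
    [200, 200, 200, 200],
]
mask = detect_shadow_mask(image)
lit = remove_shadow(image, mask)
```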
The penalized least squares method is a popular approach for high-dimensional data, where the number of explanatory variables is larger than the sample size. Its attractive properties include high prediction accuracy and the ability to perform estimation and variable selection at once. The penalized least squares method yields a sparse model, that is, a model with few variables, which can be interpreted easily. However, penalized least squares is not robust: it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used to obtain a robust penalized least squares method and a robust penalized estimator.
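A one-variable sketch of the idea, assuming the Huber loss as the robust replacement for the squared loss and an L2 penalty (the abstract does not specify which loss or penalty is used; all data and constants below are illustrative):

```python
def huber_grad(r, delta=1.0):
    """Derivative of the Huber loss: quadratic near zero, linear
    (bounded) for large residuals, capping an outlier's influence."""
    return r if abs(r) <= delta else delta * (1 if r > 0 else -1)

def robust_ridge(xs, ys, lam=0.1, lr=0.01, steps=5000):
    """Fit y ~ a*x + b by gradient descent on
    sum_i Huber(y_i - a*x_i - b) + lam * a**2 --
    a robust penalized least-squares estimator in one variable."""
    a = b = 0.0
    n = len(xs)
    for _ in range(steps):
        ga = gb = 0.0
        for x, y in zip(xs, ys):
            g = huber_grad(y - (a * x + b))
            ga += -g * x
            gb += -g
        a -= lr * (ga / n + 2 * lam * a)
        b -= lr * (gb / n)
    return a, b

# data on the line y = 2x + 1 with one gross outlier
xs = [0, 1, 2, 3, 4, 5]
ys = [1, 3, 5, 7, 9, 60]   # last observation is an outlier
a, b = robust_ridge(xs, ys)
```

Ordinary penalized least squares would be dragged far from the true slope by the outlier, whereas the bounded Huber gradient keeps the fit near a = 2, b = 1.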
Compressing speech reduces data storage requirements and thereby the time needed to transmit digitized speech over long-haul links such as the Internet. To obtain the best performance in speech compression, wavelet transforms require filters that combine a number of desirable properties, such as orthogonality and symmetry. The MCT basis functions are derived from the GHM basis functions using 2D linear convolution. The fast computation algorithms introduced here add desirable features to the current transform. We further assess the performance of the MCT in a speech compression application. This paper discusses the effect of using the DWT and the MCT (one- and two-dimensional) on speech compression. DWT and MCT performances in terms of comp
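The wavelet compression principle underlying both transforms can be shown with the simplest orthogonal wavelet, the Haar DWT (the paper's actual GHM/MCT filters are multiwavelets and more involved; this toy frame is illustrative):

```python
def haar_dwt(signal):
    """One level of the orthonormal Haar DWT: scaled pairwise sums
    (approximation) and differences (detail)."""
    s = 2 ** 0.5
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of haar_dwt (perfect reconstruction)."""
    s = 2 ** 0.5
    out = []
    for a, d in zip(approx, detail):
        out.append((a + d) / s)
        out.append((a - d) / s)
    return out

def compress(signal, threshold):
    """Zero out small detail coefficients -- the thresholding step
    that gives wavelet speech coders their compression."""
    approx, detail = haar_dwt(signal)
    detail = [0.0 if abs(d) < threshold else d for d in detail]
    return approx, detail

# a smooth toy "speech" frame of 8 samples
frame = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7]
approx, detail = compress(frame, threshold=0.2)
rec = haar_idwt(approx, detail)
```

With the detail coefficients zeroed, only half the coefficients need storing, at the cost of a small reconstruction error in each sample pair.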
As we live in the era of the fourth technological revolution, it has become necessary to use artificial intelligence to generate electric power from sustainable solar energy, especially in Iraq, which suffers from a severe shortage of electric power because of the crises and wars it has endured. The impact of that period is still evident in all aspects of the daily life of Iraqis: the remnants of wars, siege, terrorism, misguided ruling policies before and since, and regional interventions and their consequences, such as the destruction of electric power stations, together with population growth, which must be matched by an increase in electric power stations,
Neural cryptography deals with the problem of key exchange between two neural networks using the mutual learning concept. The two networks exchange their outputs (in bits), and the key between the two communicating parties is eventually represented in the final learned weights once the two networks are synchronized. The security of neural synchronization is put at risk if an attacker is capable of synchronizing with either of the two parties during the training process.
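The classic realization of this mutual-learning key exchange is the tree parity machine (TPM). A minimal sketch, assuming a Hebbian update rule and small illustrative network sizes (K hidden units, N inputs each, weights bounded by L):

```python
import math
import random

L, K, N = 3, 3, 8  # weight bound, hidden units, inputs per unit (illustrative)

def sign(x):
    return 1 if x > 0 else -1

def tpm_output(w, x):
    """TPM output: the product of the signs of the hidden-unit fields."""
    sigma = [sign(sum(wi * xi for wi, xi in zip(w[k], x[k])))
             for k in range(K)]
    return sigma, math.prod(sigma)

def clip(v):
    return max(-L, min(L, v))

def update(w, x, sigma, tau):
    """Hebbian rule: adjust only hidden units that agree with the
    common output, keeping weights in [-L, L]."""
    for k in range(K):
        if sigma[k] == tau:
            w[k] = [clip(wi + xi * tau) for wi, xi in zip(w[k], x[k])]

rng = random.Random(0)
wA = [[rng.randint(-L, L) for _ in range(N)] for _ in range(K)]
wB = [[rng.randint(-L, L) for _ in range(N)] for _ in range(K)]

rounds = 0
while wA != wB and rounds < 20000:
    # both parties receive the same public random input
    x = [[rng.choice((-1, 1)) for _ in range(N)] for _ in range(K)]
    sA, tA = tpm_output(wA, x)
    sB, tB = tpm_output(wB, x)
    if tA == tB:             # only matching output bits trigger learning
        update(wA, x, sA, tA)
        update(wB, x, sB, tB)
    rounds += 1
# after synchronization, wA == wB and can serve as the shared key
```

Only the single output bit crosses the channel each round; the attack mentioned above consists of a third TPM trying to ride these exchanges into the same synchronized state.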
Free Space Optics (FSO) plays a vital role in modern wireless communications due to its advantages over fiber optics and RF techniques, making the transmission of huge bandwidth and access to remote places possible. The specific aim of this research is to analyze the Bit Error Rate (BER) of an FSO communication system when the signal is sent over a turbulent channel, where the fading is described by the Gamma-Gamma model. The signal quality is improved by using Optical Space-Time Block Coding (OSTBC), thereby reducing the BER. The optical 2×2 Alamouti scheme required a bit-energy-to-noise ratio (Eb/N0) of 14 dB at a bit error rate of 10⁻⁵, giving a 3.5 dB gain compared to the no-diversity scheme. Th
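To illustrate why diversity helps over a Gamma-Gamma channel, a semi-analytic Monte Carlo sketch can average the conditional BER over fading samples, with and without branch averaging. This is not the paper's OSTBC system model; BPSK signalling, equal-gain branch averaging, and the turbulence parameters below are simplifying assumptions.

```python
import math
import random

def q_func(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def gamma_gamma_sample(rng, alpha=4.0, beta=1.9):
    """Unit-mean Gamma-Gamma irradiance sample: product of two
    independent Gamma variates modelling large- and small-scale
    turbulence (alpha, beta values are illustrative)."""
    return rng.gammavariate(alpha, 1 / alpha) * rng.gammavariate(beta, 1 / beta)

def avg_ber(snr_db, branches, trials=20000, seed=1):
    """Semi-analytic average BER of BPSK over Gamma-Gamma fading:
    conditional BER Q(sqrt(2*snr*h)) averaged over fading samples,
    with equal-gain averaging across diversity branches."""
    rng = random.Random(seed)
    snr = 10 ** (snr_db / 10)
    total = 0.0
    for _ in range(trials):
        h = sum(gamma_gamma_sample(rng) for _ in range(branches)) / branches
        total += q_func(math.sqrt(2 * snr * h))
    return total / trials

ber1 = avg_ber(10, branches=1)   # no diversity
ber2 = avg_ber(10, branches=2)   # two-branch diversity
```

Averaging independent branches thins out the deep fades that dominate the error rate, which is the same mechanism behind the Alamouti gain reported above.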
Most of the known cases of strong gravitational lensing involve multiple imaging of an active galactic nucleus. The properties of lensed active galactic nuclei make them promising systems for astrophysical applications of gravitational lensing. We present a simple model for strong lensing in gravitationally lensed systems to calculate the ages of four lensed galaxies. In the present work, we adopt the Friedmann model with curvature index k = 0 (the Euclidean case), and the results show good agreement with other models.
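For the flat (k = 0), matter-dominated Friedmann model adopted above, the age of the universe takes the closed form t0 = 2/(3·H0). A short numerical check, assuming an illustrative Hubble constant of 70 km/s/Mpc (the paper's adopted value is not quoted here):

```python
# Age of a flat (k = 0), matter-dominated Friedmann universe:
# t0 = 2 / (3 * H0). The H0 value is illustrative.
H0_KM_S_MPC = 70.0                    # Hubble constant, km/s/Mpc
MPC_IN_KM = 3.0857e19                 # kilometres per megaparsec
SEC_PER_GYR = 3.156e16                # seconds per gigayear

h0_per_sec = H0_KM_S_MPC / MPC_IN_KM  # H0 in s^-1
age_gyr = (2 / (3 * h0_per_sec)) / SEC_PER_GYR
```

For H0 = 70 km/s/Mpc this gives roughly 9.3 Gyr, the well-known Einstein-de Sitter age, against which lensed-galaxy ages can be compared.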