Feature extraction is a crucial step in image recognition, as it distills image content into a compact, discriminative representation. A Gaussian blur filter can be used to eliminate noise and smooth images. Principal component analysis (PCA) is a straightforward and effective method for extracting feature vectors and reducing the dimensionality of a data set. This paper proposes using a Gaussian blur filter to remove image noise and an improved PCA for feature extraction. The traditional PCA achieved average recall and precision of (93%, 97%), while the improved PCA achieved (98%, 100%), showing that the improved PCA is more effective in both recall and precision.
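The pipeline the abstract describes (denoise with a Gaussian blur, then extract features with PCA) can be sketched in a few lines. This is a minimal numpy-only illustration on synthetic images, not the paper's improved PCA; kernel size, sigma, and component count are assumed values.

```python
import numpy as np

def gaussian_kernel(size=5, sigma=1.0):
    # 1-D Gaussian kernel, normalized to sum to 1
    ax = np.arange(size) - size // 2
    k = np.exp(-ax**2 / (2 * sigma**2))
    return k / k.sum()

def gaussian_blur(img, size=5, sigma=1.0):
    # Separable convolution: blur along rows, then along columns
    k = gaussian_kernel(size, sigma)
    blurred = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, blurred)

def pca_features(X, n_components=2):
    # Project samples (rows) onto the top principal components via SVD
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

rng = np.random.default_rng(0)
images = rng.random((10, 8, 8))                     # 10 small synthetic "images"
denoised = np.array([gaussian_blur(im) for im in images])
features = pca_features(denoised.reshape(10, -1), n_components=3)
```

Blurring before PCA suppresses pixel-level noise so the leading components capture structure rather than noise variance.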
In this paper, a computer simulation is implemented to generate optical aberrations by means of Zernike polynomials. Defocus, astigmatism, coma, and spherical Zernike aberrations were simulated in a MATLAB subroutine and applied as phase errors in the aperture function of an imaging system. The study demonstrated that the Point Spread Function (PSF) and Modulation Transfer Function (MTF) are affected by these optical aberrations. Areas under the MTF for different aperture radii of the imaging system were computed to assess the quality and efficiency of optical imaging systems. Phase conjugation of these types of aberration was utilized to correct a distorted wavefront. The results showed that
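The simulation described above can be reproduced in outline: build a circular pupil, add a Zernike phase term, and take Fourier transforms to get the PSF and MTF. This numpy sketch uses only the defocus term and an assumed aberration strength of 0.5 waves; it is an illustration, not the paper's MATLAB subroutine.

```python
import numpy as np

N = 128
x = np.linspace(-1, 1, N)
X, Y = np.meshgrid(x, x)
rho = np.sqrt(X**2 + Y**2)
aperture = (rho <= 1.0).astype(float)           # unit circular pupil

# Defocus Zernike term Z4 = sqrt(3)*(2*rho^2 - 1), scaled by an
# assumed coefficient in waves
coeff_waves = 0.5
phase = 2 * np.pi * coeff_waves * np.sqrt(3) * (2 * rho**2 - 1)

# Aberrated pupil function; PSF is the squared magnitude of its FFT
pupil = aperture * np.exp(1j * phase * aperture)
psf = np.abs(np.fft.fftshift(np.fft.fft2(pupil)))**2
psf /= psf.sum()                                # normalize PSF energy to 1

# MTF is the magnitude of the PSF's Fourier transform, normalized so MTF(0)=1
mtf = np.abs(np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(psf))))
mtf /= mtf.max()
area = float(mtf.sum())                         # proxy for area under the MTF
```

Repeating this for several aperture radii and comparing the resulting MTF areas gives the quality measure the abstract mentions; phase conjugation corresponds to applying the negated phase.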
The subject of the Internet of Things is very important, especially at present, and it has attracted the attention of researchers and scientists due to its importance in human life. Through it, a person can do many things easily, accurately, and in an organized manner. The research addresses important topics, most notably the concept of the Internet of Things, the history of its emergence and development, the reasons for interest in it and its importance, and its most prominent advantages and characteristics. The research sheds light on the structure of the Internet of Things and its main structural components. The research also covers the most important search engines in the Internet
Shadow detection and removal is an important task when dealing with color outdoor images. Shadows are generated by a local and relative absence of light: first, a local decrease in the amount of light that reaches a surface; second, a local change in the amount of light reflected by a surface toward the observer. Most shadow detection and segmentation methods are based on image analysis; however, some factors can affect the detection result due to the complexity of the scene. In this paper, a segmentation-based test is presented to detect shadows in an image, and a function concept is used to remove the shadow from the image.
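One common baseline for this task, shown here only as a hedged sketch (the abstract does not specify its exact segmentation test or removal function), is to segment shadow pixels by thresholding against the global mean intensity and then rescale them to match the lit-region statistics. The threshold factor `k` and the synthetic image are assumptions.

```python
import numpy as np

def shadow_mask(gray, k=0.6):
    # Flag pixels well below the global mean intensity as shadow candidates
    return gray < k * gray.mean()

def remove_shadow(gray, mask):
    # Rescale shadow pixels so their mean matches the lit-region mean
    out = gray.astype(float).copy()
    if mask.any() and (~mask).any():
        lit_mean = out[~mask].mean()
        shadow_mean = out[mask].mean()
        if shadow_mean > 0:
            out[mask] *= lit_mean / shadow_mean
    return np.clip(out, 0, 255)

rng = np.random.default_rng(1)
img = np.full((32, 32), 180.0) + rng.normal(0, 5, (32, 32))
img[8:16, 8:16] *= 0.4                 # paint a synthetic shadow patch
mask = shadow_mask(img)
corrected = remove_shadow(img, mask)
```

Real outdoor scenes need chromaticity cues as well, since dark objects and shadows are easily confused by intensity alone.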
Steganography is a useful technique that helps secure data in communication using different data carriers such as audio, video, image, and text. The most popular type of steganography is image steganography. It mostly uses the least significant bit (LSB) technique to hide the data, but the probability of detecting data hidden with this technique is high. RGB is a color model in which each pixel is represented by three bytes indicating the intensity of red, green, and blue in that pixel, and LSB hiding can use all three color channels. In this paper, an RGB image steganography method is proposed which depends on a genetic algorithm (GA). The GA is used to generate a random key that represents the best ordering of the secret (image/text) blocks to be
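The LSB mechanism underlying the proposal can be sketched directly: each message bit overwrites the least significant bit of one channel byte, visited in a key-defined order. Here a random permutation stands in for the GA-chosen key (the GA itself is not reproduced), and the cover image is synthetic.

```python
import numpy as np

def embed_lsb(pixels, bits, order):
    # Write one message bit into the LSB of each channel byte,
    # visiting bytes in the key's order; returns a stego copy
    flat = pixels.flatten()
    for bit, idx in zip(bits, order):
        flat[idx] = (flat[idx] & 0xFE) | bit
    return flat.reshape(pixels.shape)

def extract_lsb(pixels, n_bits, order):
    # Read the LSBs back in the same key order
    flat = pixels.flatten()
    return [int(flat[idx] & 1) for idx in order[:n_bits]]

rng = np.random.default_rng(2)
cover = rng.integers(0, 256, (4, 4, 3), dtype=np.uint8)   # RGB cover image
message = [1, 0, 1, 1, 0, 0, 1, 0]
key = rng.permutation(cover.size)[:len(message)]          # stand-in for a GA key
stego = embed_lsb(cover, message, key)
recovered = extract_lsb(stego, len(message), key)
```

Because only LSBs change, no channel byte differs from the cover by more than 1, which is what makes plain LSB hiding visually imperceptible yet statistically detectable.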
Compressing speech reduces data storage requirements, which in turn reduces the time needed to transmit digitized speech over long-haul links such as the internet. To obtain the best performance in speech compression, wavelet transforms require filters that combine a number of desirable properties, such as orthogonality and symmetry. The MCT basis functions are derived from the GHM basis functions using 2D linear convolution. The fast computation algorithms introduced here add desirable features to the current transform. We further assess the performance of the MCT in a speech compression application. This paper discusses the effect of using the DWT and the MCT (one- and two-dimensional) on speech compression. DWT and MCT performances in terms of comp
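The DWT-based compression idea can be illustrated with the simplest orthonormal wavelet, the Haar transform: transform, discard small detail coefficients, and reconstruct. This is a hedged numpy sketch on a synthetic tone, not the paper's MCT/GHM construction; the threshold value is an assumption.

```python
import numpy as np

def haar_dwt(signal):
    # One level of the orthonormal Haar wavelet transform
    s = signal.reshape(-1, 2)
    approx = (s[:, 0] + s[:, 1]) / np.sqrt(2)
    detail = (s[:, 0] - s[:, 1]) / np.sqrt(2)
    return approx, detail

def haar_idwt(approx, detail):
    # Exact inverse: re-interleave the even and odd samples
    even = (approx + detail) / np.sqrt(2)
    odd = (approx - detail) / np.sqrt(2)
    return np.stack([even, odd], axis=1).reshape(-1)

t = np.linspace(0, 1, 1024, endpoint=False)
speech = np.sin(2 * np.pi * 50 * t) + 0.1 * np.sin(2 * np.pi * 300 * t)

a, d = haar_dwt(speech)
d[np.abs(d) < 0.05] = 0.0                   # discard small detail coefficients
kept = int(np.count_nonzero(d)) + a.size    # coefficients actually stored
reconstructed = haar_idwt(a, d)
snr = 10 * np.log10(np.sum(speech**2) / np.sum((speech - reconstructed)**2))
```

The compression ratio is the original sample count over `kept`; orthogonality is what lets the thresholding error in the coefficient domain translate directly into reconstruction error.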
Neural cryptography deals with the problem of key exchange between two neural networks using the mutual learning concept. The two networks exchange their outputs (in bits), and the key between the two communicating parties is eventually represented in the final learned weights, once the two networks are said to be synchronized. The security of neural synchronization is put at risk if an attacker is capable of synchronizing with either of the two parties during the training process.
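Mutual learning for key exchange is usually formulated with tree parity machines (TPMs); the sketch below assumes that standard construction with assumed sizes (K=3 hidden units, N=4 inputs, weight bound L=3) and a Hebbian update applied only when the two outputs agree.

```python
import numpy as np

class TreeParityMachine:
    # K hidden units, N inputs each, integer weights bounded in [-L, L]
    def __init__(self, K=3, N=4, L=3, seed=0):
        self.K, self.N, self.L = K, N, L
        self.w = np.random.default_rng(seed).integers(-L, L + 1, (K, N))

    def output(self, x):
        # sigma: sign of each hidden unit's local field; tau: their product
        self.sigma = np.sign(np.sum(self.w * x, axis=1))
        self.sigma[self.sigma == 0] = -1
        return int(np.prod(self.sigma))

    def update(self, x, tau_own, tau_other):
        # Hebbian rule: learn only when the outputs agree, and only in
        # hidden units that agree with the common output
        if tau_own == tau_other:
            for k in range(self.K):
                if self.sigma[k] == tau_own:
                    self.w[k] = np.clip(self.w[k] + tau_own * x[k],
                                        -self.L, self.L)

rng = np.random.default_rng(42)
A, B = TreeParityMachine(seed=1), TreeParityMachine(seed=2)
for _ in range(2000):
    x = rng.choice([-1, 1], (3, 4))         # public random input
    tA, tB = A.output(x), B.output(x)
    A.update(x, tA, tB)
    B.update(x, tB, tA)
synced = np.array_equal(A.w, B.w)           # identical weights = shared key
```

Once the weight matrices coincide, both parties update identically forever, so the weights serve as the key; an eavesdropper who synchronizes during training obtains the same key, which is exactly the risk the abstract notes.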
Diabetes is one of the increasingly common chronic diseases, affecting millions of people around the world. Diagnosis, prediction, proper treatment, and management of diabetes are essential. Machine learning-based prediction techniques for diabetes data analysis can help in the early detection and prediction of the disease and its consequences, such as hypo/hyperglycemia. In this paper, we explored a diabetes dataset collected from the medical records of one thousand Iraqi patients. We applied three classifiers: the multilayer perceptron, KNN, and Random Forest. We conducted two experiments: the first used all 12 features of the dataset, where Random Forest outperformed the others with 98.8% accuracy. The second experiment used only five attributes
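To make the classification setup concrete, here is a minimal from-scratch KNN (one of the abstract's three classifiers) on synthetic data standing in for the patient records; the five attributes, class distributions, and k value are all assumptions, and the real study's accuracy figures are not reproduced here.

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    # Classify each test point by majority vote among its k nearest
    # training points (Euclidean distance)
    preds = []
    for x in X_test:
        dist = np.linalg.norm(X_train - x, axis=1)
        nearest = y_train[np.argsort(dist)[:k]]
        preds.append(int(np.bincount(nearest).argmax()))
    return np.array(preds)

rng = np.random.default_rng(0)
# Synthetic stand-in for the diabetes records: 5 numeric attributes,
# class 1 drawn from a shifted distribution
X = np.vstack([rng.normal(0.0, 1.0, (100, 5)),
               rng.normal(3.0, 1.0, (100, 5))])
y = np.array([0] * 100 + [1] * 100)

idx = rng.permutation(200)
train, test = idx[:150], idx[150:]
pred = knn_predict(X[train], y[train], X[test], k=5)
accuracy = float((pred == y[test]).mean())
```

The same train/test split would be reused for the multilayer perceptron and Random Forest to make the accuracy comparison across classifiers fair.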
As we live in the era of the fourth technological revolution, it has become necessary to use artificial intelligence to generate electric power from sustainable solar energy, especially in Iraq, given the crises it has gone through and the severe shortage of electric power it suffers because of wars and calamities. The impact of that period is still evident in all aspects of daily life experienced by Iraqis, due to the remnants of wars, siege, terrorism, the wrong policies of earlier and later governments, and regional interventions and their consequences, such as the destruction of electric power stations and the population increase, which must be matched by an increase in electric power stations,
The penalized least squares method is a popular approach for dealing with high-dimensional data, where the number of explanatory variables is larger than the sample size. Penalized least squares offers high prediction accuracy while performing estimation and variable selection at once. It yields a sparse model, that is, a model with few variables, which can be interpreted easily. However, penalized least squares is not robust: it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used to obtain a robust penalized least squares method and a robust penalized estimator, and
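The robust-loss idea can be illustrated with one standard combination, shown here only as a hedged sketch (the abstract does not name its specific loss or penalty): a Huber loss, whose gradient caps the influence of outliers, with an L1 penalty handled by proximal gradient (soft-thresholding). The delta, lambda, data, and outlier pattern are all assumptions.

```python
import numpy as np

def huber_grad(r, delta=1.345):
    # Huber loss gradient: quadratic near zero, linear in the tails,
    # so gross outliers have bounded influence on the fit
    return np.where(np.abs(r) <= delta, r, delta * np.sign(r))

def soft_threshold(v, t):
    # Proximal map of the L1 penalty: shrink toward zero, producing sparsity
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def robust_lasso(X, y, lam=0.1, iters=2000):
    # Proximal gradient: a Huber-loss gradient step, then the L1 prox
    step = 1.0 / (np.linalg.norm(X, ord=2) ** 2)
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        r = y - X @ beta
        beta = soft_threshold(beta + step * (X.T @ huber_grad(r)), step * lam)
    return beta

rng = np.random.default_rng(3)
n, p = 100, 10
X = rng.normal(size=(n, p))
true_beta = np.zeros(p)
true_beta[:3] = [2.0, -1.5, 1.0]                 # sparse ground truth
y = X @ true_beta + rng.normal(scale=0.1, size=n)
y[:5] += 20.0                                    # inject gross outliers
beta_hat = robust_lasso(X, y)
```

With an ordinary squared loss the five outliers would pull the estimate badly off target; the bounded Huber gradient keeps the recovered coefficients near the sparse truth while the L1 prox keeps the model interpretable.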