Accurately localizing the basic components of human faces (eyebrows, eyes, nose, mouth, etc.) in images is an important step in face-processing techniques such as face tracking, facial expression recognition, and face recognition. It is a challenging task, however, due to variations in scale, orientation, pose, facial expression, partial occlusion, and lighting conditions. In this paper, a three-stage hierarchical scheme for facial component extraction is presented; it works regardless of illumination variation. Adaptive contrast enhancement methods such as gamma correction and contrast stretching are used to simulate the variation in lighting conditions among images. As test material, a subset of 1150 images belonging to 91 different subjects, holding different facial expressions, was taken from the Cohn-Kanade AU-coded dataset (CK). The test results show the effectiveness of the proposed automated localization scheme under different illumination conditions; it achieved an accuracy of about 95.7%.
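The abstract does not give the enhancement formulas themselves; below is a minimal NumPy sketch of the two named operations, gamma correction and contrast stretching, as they might be used to simulate lighting variation (the gamma value and percentile cutoffs are illustrative assumptions, not the paper's settings):

```python
import numpy as np

def gamma_correction(img, gamma=1.5):
    """Nonlinear remapping: out = 255 * (in/255)^gamma.
    gamma > 1 darkens, gamma < 1 brightens (illustrative value)."""
    normalized = img.astype(np.float64) / 255.0
    return np.clip(255.0 * normalized ** gamma, 0, 255).astype(np.uint8)

def contrast_stretching(img, low_pct=2, high_pct=98):
    """Linear stretch mapping [p_low, p_high] onto [0, 255].
    Percentile cutoffs are illustrative assumptions."""
    p_low, p_high = np.percentile(img, (low_pct, high_pct))
    stretched = (img.astype(np.float64) - p_low) * 255.0 / (p_high - p_low + 1e-9)
    return np.clip(stretched, 0, 255).astype(np.uint8)

# Simulate illumination variance on a stand-in image, as the paper does
# before localizing facial components.
rng = np.random.default_rng(0)
face = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)  # placeholder image
dark_variant = gamma_correction(face, gamma=2.0)
bright_variant = contrast_stretching(face)
```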
This work implements a face recognition system based on two stages: a feature extraction stage and a classification stage. The feature extraction stage consists of Self-Organizing Maps (SOMs) in a hierarchical format, in conjunction with Gabor filters and local image sampling. Different types of SOMs were used, and a comparison of their results is given.
The classification stage also consists of a self-organizing map neural network; its goal is to find the stored image most similar to the input image. The proposed algorithm was implemented using C++ packages, and this work is a successful classifier for a face database consisting of 20 …
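The abstracts do not list the SOM topology or training schedule; the following is a minimal self-organizing map sketch in NumPy (grid size, learning rate, and neighborhood decay are illustrative assumptions), showing how the best-matching unit of a query can be compared against those of a 20-face gallery:

```python
import numpy as np

class SOM:
    """Minimal 2-D self-organizing map; all hyperparameters are illustrative."""
    def __init__(self, rows, cols, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.weights = rng.random((rows, cols, dim))
        self.rows, self.cols = rows, cols

    def bmu(self, x):
        """Grid index of the best-matching unit for feature vector x."""
        d = np.linalg.norm(self.weights - x, axis=2)
        return np.unravel_index(np.argmin(d), d.shape)

    def train(self, data, epochs=50, lr0=0.5, sigma0=2.0):
        grid = np.stack(np.meshgrid(np.arange(self.rows),
                                    np.arange(self.cols),
                                    indexing="ij"), axis=2)
        for t in range(epochs):
            lr = lr0 * (1 - t / epochs)              # decaying learning rate
            sigma = sigma0 * (1 - t / epochs) + 0.5  # shrinking neighborhood
            for x in data:
                b = np.array(self.bmu(x))
                h = np.exp(-np.sum((grid - b) ** 2, axis=2) / (2 * sigma ** 2))
                self.weights += lr * h[..., None] * (x - self.weights)

# In the paper the feature vectors come from Gabor filtering and local
# image sampling; random vectors stand in for them here.
rng = np.random.default_rng(1)
features = rng.random((20, 64))          # 20 gallery faces, 64-D features
som = SOM(5, 5, 64)
som.train(features)
query = features[7] + 0.01 * rng.standard_normal(64)
print("query BMU:", som.bmu(query), "gallery-7 BMU:", som.bmu(features[7]))
```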
Given the significance of image compression in reducing the volume of data, the need for compression is permanent: compressed data can be transferred more quickly over communication channels and kept in less memory space. In this study, an efficient compression system is suggested; it depends on transform coding (the Discrete Cosine Transform or the bi-orthogonal (tap-9/7) wavelet transform) together with the LZW compression technique. The suggested scheme was applied to color and gray models, with the transform coding applied to decompose each color and gray sub-band individually. A quantization process is then performed, followed by LZW coding, to compress the images. The suggested system was applied on a set of seven stand…
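The paper's quantization tables and coder parameters are not given; here is a minimal sketch of the DCT branch of the pipeline, an 8x8 block transform, uniform quantization, then a toy LZW encoder (the step size and single-block handling are illustrative assumptions):

```python
import numpy as np
from scipy.fft import dctn, idctn

def lzw_encode(data: bytes):
    """Toy LZW: returns a list of integer codes."""
    table = {bytes([i]): i for i in range(256)}
    w, codes = b"", []
    for byte in data:
        wc = w + bytes([byte])
        if wc in table:
            w = wc
        else:
            codes.append(table[w])
            table[wc] = len(table)   # grow the dictionary
            w = bytes([byte])
    if w:
        codes.append(table[w])
    return codes

# One 8x8 gray block; real images are tiled into such blocks per sub-band.
rng = np.random.default_rng(0)
block = rng.integers(0, 256, (8, 8)).astype(np.float64)

coeffs = dctn(block - 128, norm="ortho")        # transform coding (DCT branch)
q = 16                                           # illustrative uniform step
quantized = np.round(coeffs / q).astype(np.int16)

stream = quantized.clip(-128, 127).astype(np.int8).tobytes()
print(f"{len(stream)} bytes -> {len(lzw_encode(stream))} LZW codes")

# Decoder side: dequantize and invert the transform.
restored = idctn(quantized * q, norm="ortho") + 128
```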
In this work, the electron number density was calculated using a MATLAB program written for this purpose. The electron density was calculated with the Anisimov model in a vacuum environment, and the effect of spatial coordinates on the electron density was investigated. It was found that the distance along the Z axis affects the electron number density (ne); many processes within the plasma, such as excitation, ionization, and recombination, can affect the density of electrons. The results show that as the Z-axis distance increases, the electron number density decreases, because of the recombination of electrons and ions at large distances from the target and the loss of the electrons' thermal energy at high distance…
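The Anisimov model itself is not reproduced in the abstract, so the sketch below only illustrates the reported trend, ne falling with Z-axis distance, using a simple exponential decay as a stand-in (the density scale and decay length are invented for illustration):

```python
import numpy as np

# Illustrative only: the paper uses the Anisimov plume-expansion model;
# a simple exponential decay stands in here to show the reported trend
# that ne falls as the axial distance Z grows (recombination + energy loss).
ne0 = 1e18          # hypothetical electron density near the target (cm^-3)
decay_length = 2.0  # hypothetical e-folding length (mm)

z = np.linspace(0.0, 10.0, 6)        # axial distances from the target (mm)
ne = ne0 * np.exp(-z / decay_length)

for zi, ni in zip(z, ne):
    print(f"Z = {zi:5.1f} mm  ->  ne ~ {ni:.2e} cm^-3")
```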
Bendable concrete, also known as Engineered Cementitious Composite (ECC), is a type of ultra-ductile cementitious composite reinforced with fibres to control the width of cracks. It has the ability to enhance concrete flexibility, withstanding strains of 3% and higher. The properties of bendable concrete mixes (compressive strength, flexural strength, and drying shrinkage) are assessed here after the incorporation of supplementary cementitious materials, silica fume, and polymer fibres, and with the use of ordinary Portland cement (O.P.C) and Portland limestone cement (IL). Mixes with Portland limestone cement show lower drying shrinkage and lower compressive and flexural strength than mixes with ordinary Portland cement, due to the ratio o…
This paper compares the tree regression model with negative binomial regression. These models represent two types of statistical method: the first is nonparametric, tree regression, which aims to divide the data set into subgroups; the second is parametric, negative binomial regression, which is usually used with medical data, especially with large sample sizes. The methods are compared according to the mean squared error (MSE), using a simulation experiment and taking different sample…
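The simulation design is not fully specified in the abstract; below is a minimal sketch of such a comparison, simulating NB2 counts via the gamma-Poisson mixture and comparing the fitted models' MSE with statsmodels and scikit-learn (sample size, dispersion, and tree depth are illustrative assumptions):

```python
import numpy as np
import statsmodels.api as sm
from sklearn.tree import DecisionTreeRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n, alpha = 500, 0.5                 # sample size and NB dispersion (illustrative)

x = rng.uniform(0, 2, size=(n, 1))
mu = np.exp(0.3 + 0.8 * x[:, 0])    # true mean under a log link
lam = rng.gamma(shape=1 / alpha, scale=mu * alpha)
y = rng.poisson(lam)                # NB2 counts via the gamma-Poisson mixture

# Parametric: negative binomial regression (statsmodels NB2).
X = sm.add_constant(x)
nb_fit = sm.NegativeBinomial(y, X).fit(disp=0)
mse_nb = mean_squared_error(y, nb_fit.predict(X))

# Nonparametric: tree regression splits the data into subgroups.
tree = DecisionTreeRegressor(max_depth=4, random_state=0).fit(x, y)
mse_tree = mean_squared_error(y, tree.predict(x))

print(f"MSE  negative binomial: {mse_nb:.3f}   tree regression: {mse_tree:.3f}")
```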
The problem of solid waste from domestic, industrial, commercial, and medical sources is one of the most important problems facing local administration in all Iraqi cities. The danger of this problem increases with the rapid growth of the population, changing lifestyles, consumption patterns, the limited land suitable for landfill, and the high costs of collection and disposal. This research aims to address these problems by evaluating the locations of the current landfills on the outskirts of Baghdad Governorate. The ArcGIS program was used: the landfill sites were marked on the map, and from the available data about the areas it was concluded that the existing landfill sites do not meet environmental conditions and…
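The paper's siting criteria are not listed in the abstract; as a minimal illustration of the kind of screening such GIS work involves, the Shapely sketch below checks hypothetical candidate sites against an invented minimum-distance buffer from residential areas (all coordinates and thresholds are placeholders, not the paper's data):

```python
from shapely.geometry import Point

# Hypothetical projected coordinates in metres; real work would use ArcGIS layers.
residential_areas = [Point(1000, 1200), Point(4800, 900)]
candidate_sites = {"site_A": Point(1500, 1300), "site_B": Point(9000, 7000)}

MIN_BUFFER_M = 3000  # illustrative minimum-distance criterion

for name, site in candidate_sites.items():
    nearest = min(site.distance(area) for area in residential_areas)
    verdict = "meets" if nearest >= MIN_BUFFER_M else "violates"
    print(f"{name}: nearest residential area {nearest:.0f} m -> {verdict} the buffer criterion")
```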
Kinematics is the branch of mechanics that deals with the movement of bodies without taking forces into account. In robots, forward kinematics and inverse kinematics are important in determining the position and orientation of the end-effector so that it can perform multiple tasks. This paper presents an inverse kinematics analysis for a 5-DOF robotic arm using the MATLAB Robotics Toolbox, with the Denavit-Hartenberg (D-H) parameters used to represent the links and joints of the robotic arm. A geometric approach was used in the inverse kinematics solution to determine the joint angles, and the path of the robotic arm was divided into successive lines to accomplish its required tasks. Therefore, this…
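The paper's D-H table is not given in the abstract; the sketch below shows the standard D-H homogeneous transform and a forward-kinematics composition for a 5-DOF chain with hypothetical link parameters, the pose that a geometric inverse-kinematics solution would invert:

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform for one joint."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

# Hypothetical D-H rows (d, a, alpha) for a 5-DOF arm; the paper's actual
# link lengths and offsets are not given in the abstract.
dh_rows = [(0.10, 0.00, np.pi / 2),
           (0.00, 0.25, 0.0),
           (0.00, 0.20, 0.0),
           (0.00, 0.00, np.pi / 2),
           (0.08, 0.00, 0.0)]

def forward_kinematics(joint_angles):
    """Compose the five link transforms; returns the end-effector pose."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_rows):
        T = T @ dh_transform(theta, d, a, alpha)
    return T

pose = forward_kinematics(np.deg2rad([10, 30, -20, 15, 0]))
print("End-effector position:", np.round(pose[:3, 3], 4))
```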
Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important part of cryptography with secret-key algorithms, is the key: for a higher level of secure communication, the key plays an essential role. To increase the level of security in any communication, both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard (3DES) algorithm is weak due to its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong; an enhanced encryption key improves the security of the Triple Data Encryption Standard algorithm. This paper proposes a combination of two efficient encryption algorithms to…
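The second algorithm of the proposed combination is not named in the abstract; as one hedged illustration of "reconfiguring" the key, the sketch below derives a 24-byte 3DES key from a shared secret with PBKDF2 from the Python standard library (the secret, salt handling, and iteration count are placeholders, not the paper's scheme):

```python
import hashlib
import os

# Hypothetical shared secret; in practice both parties must already hold it.
shared_secret = b"example-shared-secret"
salt = os.urandom(16)        # random per-session salt
ITERATIONS = 200_000         # illustrative PBKDF2 work factor

# Derive 24 bytes = three 8-byte DES subkeys (3DES keying option 1).
# DES parity bits are ignored here for brevity.
key = hashlib.pbkdf2_hmac("sha256", shared_secret, salt, ITERATIONS, dklen=24)
k1, k2, k3 = key[:8], key[8:16], key[16:24]
print("3DES subkeys:", k1.hex(), k2.hex(), k3.hex())
```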