This study was conducted in the Computer Science Department, College of Science, University of Baghdad to compare automatic sorting with manual sorting, determining which is more efficient and accurate, and to examine the use of artificial intelligence in automated sorting, which included an artificial neural network, image processing, the study of external characteristics (defects and impurities), and physical characteristics (grading and sorting speed, and fruit weight). The results showed the values of impurities and defects: the highest regression value was 0.40, the error-approximation algorithm recorded a value of 1e-06, the highest recorded fruit weight was 138.20 g, and the highest recorded grading and sorting time was 1.38 minutes.
The concept of the active contour model has been extensively utilized in image segmentation and analysis. The technique has been effectively employed to identify contours in object recognition, computer graphics and vision, and biomedical image processing, of both ordinary and medical images such as Magnetic Resonance Imaging (MRI), X-ray, and ultrasound images. Kass, Witkin, and Terzopoulos introduced this energy-minimizing "Active Contour Model" (also known as the snake) in 1987. A snake is a curve defined within an image domain that can be set in motion by external forces derived from the image data and internal forces arising from the curve itself. …
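For reference, the energy functional minimized in the original snake formulation of Kass, Witkin, and Terzopoulos (standard notation, reproduced here for context rather than taken from this abstract) is

```latex
E_{\text{snake}} = \int_{0}^{1} \left[ \tfrac{1}{2}\left( \alpha\,\lvert v'(s)\rvert^{2} + \beta\,\lvert v''(s)\rvert^{2} \right) + E_{\text{ext}}\big(v(s)\big) \right] ds
```

where v(s) = (x(s), y(s)) parameterizes the contour, the α and β terms are the internal forces controlling elasticity and rigidity, and a common choice of external energy is E_ext(v) = -|∇I(v)|², which attracts the curve toward strong image gradients.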
Logistic regression is considered one of the statistical methods used to describe and estimate the relationship between a random response variable (Y) and explanatory variables (X), in which the dependent variable is a binary response taking two values (one when a specific event occurs and zero when it does not), such as (injured and uninjured, married and unmarried). A large number of explanatory variables leads to the problem of multicollinearity, which makes the estimates inaccurate. The maximum likelihood method and the ridge regression method were used to estimate the binary-response logistic regression model by adopting the Jackknife …
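As a rough illustration of the estimation approach described above (a minimal sketch, not the authors' code: scikit-learn, the simulated near-collinear data, and the penalty strength are all assumptions), ridge-penalized logistic regression can be combined with jackknife resampling as follows:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, p = 100, 5
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + 0.01 * rng.normal(size=n)   # deliberately near-collinear pair
y = (X[:, 0] - X[:, 2] + rng.normal(size=n) > 0).astype(int)

# Ridge-penalized (L2) logistic regression; C is the inverse penalty strength.
model = LogisticRegression(penalty="l2", C=1.0, solver="lbfgs")

# Jackknife: refit the model leaving out one observation at a time.
coefs = np.array([
    model.fit(np.delete(X, i, axis=0), np.delete(y, i)).coef_.ravel()
    for i in range(n)
])
beta_bar = coefs.mean(axis=0)
se = np.sqrt((n - 1) / n * ((coefs - beta_bar) ** 2).sum(axis=0))
print("jackknife coefficient means:", beta_bar)
print("jackknife standard errors:  ", se)
```

The jackknife standard errors give a resampling-based measure of how much the penalized estimates vary, which is one way to assess their stability under multicollinearity.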
The purpose of the current study is to analyze the content of the computer textbooks for the intermediate stage in Iraq according to the theory of multiple intelligences, by answering the following question: what is the percentage of availability of multiple intelligences in the content of the computer textbooks of the intermediate stage (grades I and II) for the academic year (2017-2018)? The researcher followed the descriptive analytical research approach (content analysis) and adopted the explicit idea as the unit of registration. The research tool was prepared according to Gardner's classification of multiple intelligences, and its validity and reliability were proven. The study found the percentage of multiple intelligences in the content of the computer textbooks …
A non-stationary series is a persistent problem in statistical analysis; as some theoretical work has explained, regression analysis loses its statistical properties when non-stationary series are used, producing a spurious slope for the relationship under consideration. A non-stationary series can be made stationary by adding a time variable to the multivariate analysis to remove the general trend, by adding seasonal dummy variables to remove the seasonal effect, by converting the data to exponential or logarithmic form, or by applying the difference operator repeatedly, d times; in that case the series is said to be integrated of order d. The theoretical side of the research comprises several parts; in the first part, the research methodology …
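A minimal sketch of the repeated-differencing idea (assuming numpy and statsmodels; the ADF unit-root test and the simulated random walk are illustrative choices, not taken from the paper):

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(1)
# A random walk with drift: non-stationary, integrated of order 1.
y = np.cumsum(0.5 + rng.normal(size=200))

series, d = y, 0
while adfuller(series)[1] > 0.05:   # [1] is the p-value of the ADF test
    series = np.diff(series)        # apply one more difference
    d += 1
print(f"series appears integrated of order d = {d}")
```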
The nucleon momentum distributions (NMD) and elastic electron scattering form factors of the ground state for some 1f-2p-shell nuclei, such as the ⁵⁸Ni, ⁶⁰Ni, ⁶²Ni, and ⁶⁴Ni isotopes, have been calculated in the framework of the coherent fluctuation model (CFM) and expressed in terms of the weight function |f(x)|². The weight function (fluctuation function) has been related to the nucleon density distribution (NDD) of the nuclei and determined from theory and experiment. The NDD is derived from a simple method based on the use of the single-particle wave functions of the harmonic oscillator potential and the occupation numbers of the states. …
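For context, the standard CFM relations connecting the weight function to the density (as found in the CFM literature, e.g. Antonov and co-workers; quoted here from general knowledge rather than from this abstract, so the conventions may differ slightly from the paper's) are

```latex
\rho(r) = \int_{0}^{\infty} \lvert f(x)\rvert^{2}\, \rho_{x}(r)\, dx,
\qquad
\rho_{x}(r) = \rho_{0}(x)\,\Theta(x - r),
\qquad
\rho_{0}(x) = \frac{3A}{4\pi x^{3}},
```

so that a known nucleon density distribution determines the weight function through

```latex
\lvert f(x)\rvert^{2} = -\,\frac{1}{\rho_{0}(x)} \left. \frac{d\rho(r)}{dr} \right|_{r=x}.
```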
In this research, an analysis of the behaviour of the standard Hueckel edge detection algorithm is presented using three-dimensional representations of the edge goodness criterion, after applying the algorithm to a real high-texture satellite image; the edge goodness criterion is analyzed statistically. The Hueckel edge detection algorithm showed an exponential relationship between execution time and the disk radius used. The restrictions Hueckel mentioned in his papers are adopted in this research. A discussion of the resultant edge shape and malformation is presented, since this is the first practical study applying the Hueckel edge detection algorithm to a real high-texture image containing ramp edges (a satellite image).
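To illustrate how such an exponential time-radius relationship could be checked, the sketch below fits t = a·exp(b·r) to timing measurements in log-space (the numbers are hypothetical placeholders, not results from the paper):

```python
import numpy as np

# Hypothetical timings: execution time (s) for each Hueckel disk radius (pixels).
radius = np.array([4, 6, 8, 10, 12])
time_s = np.array([0.8, 1.9, 4.6, 11.0, 26.5])

# An exponential law t = a * exp(b * r) becomes linear after taking logs.
b, log_a = np.polyfit(radius, np.log(time_s), 1)
print(f"fit: t ~ {np.exp(log_a):.3f} * exp({b:.3f} * r)")
```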
In this study, dynamic encryption techniques are explored as an image cipher method that generates S-boxes similar to the AES S-box with the help of a private key belonging to the user, enabling images to be encrypted and decrypted using these S-boxes. The study consists of two stages: the dynamic S-box generation method and the encryption-decryption method. S-boxes should have a non-linear structure, and for this reason the Knuth-Durstenfeld shuffle algorithm (K/DSA), one of the pseudo-random techniques, is used to generate S-boxes dynamically. The biggest advantage of this approach is that the inverse S-box is produced together with the S-box; compared to the methods in the literature, the need to store the S-box is eliminated. …
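A minimal sketch of key-driven S-box generation with the Knuth-Durstenfeld (Fisher-Yates) shuffle; the SHA-256-based keystream and every name below are illustrative assumptions, not the paper's exact construction:

```python
import hashlib

def generate_sbox(key: bytes):
    # Keystream derived from the private key (SHA-256 in counter mode is an
    # assumption here, not necessarily the paper's key-expansion choice).
    stream, counter = b"", 0

    def next_byte():
        nonlocal stream, counter
        if not stream:
            stream = hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
            counter += 1
        value, stream = stream[0], stream[1:]
        return value

    # Knuth-Durstenfeld (Fisher-Yates) shuffle of 0..255, driven by the key.
    sbox = list(range(256))
    for i in range(255, 0, -1):
        j = next_byte() % (i + 1)   # key-dependent swap index (modulo bias ignored)
        sbox[i], sbox[j] = sbox[j], sbox[i]

    # The inverse S-box comes for free by inverting the permutation.
    inv_sbox = [0] * 256
    for i, v in enumerate(sbox):
        inv_sbox[v] = i
    return sbox, inv_sbox

sbox, inv_sbox = generate_sbox(b"user-private-key")
assert all(inv_sbox[sbox[i]] == i for i in range(256))   # round-trips correctly
```

Since the same key regenerates the same permutation, neither the S-box nor its inverse needs to be stored, which matches the storage advantage the abstract describes.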