This study was conducted in the College of Science / Computer Science Department / University of Baghdad to compare automatic sorting with manual sorting in terms of efficiency and accuracy, and to examine the use of artificial intelligence in automated sorting, including an artificial neural network, image processing, the study of external characteristics (defects and impurities), and physical characteristics (grading and sorting speed, and fruit weight). The results showed the values of impurities and defects: the highest regression value was 0.40, the error-approximation algorithm recorded a value of 1E-06, fruit weight recorded a highest value of 138.20 g, and grading and sorting speed recorded a highest value of 1.38 minutes.
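The abstract does not specify the network architecture or features, so the following is only a minimal sketch of the kind of pipeline it describes: hand-crafted image and physical features fed to a small neural-network classifier. The feature names, values, and labels are illustrative assumptions.

```python
# Hedged sketch (not the study's actual pipeline): grading fruit from
# image-derived and physical features with a small neural network.
import numpy as np
from sklearn.neural_network import MLPClassifier

# Hypothetical features per fruit: [mean_hue, defect_area_ratio, weight_g]
X = np.array([[0.12, 0.01, 138.2],
              [0.10, 0.09, 95.4],
              [0.13, 0.02, 120.0],
              [0.09, 0.15, 80.1]])
y = np.array([1, 0, 1, 0])  # 1 = accept, 0 = reject (toy labels)

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X, y)
print(clf.predict([[0.11, 0.03, 130.0]]))
```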
A multivariate, multisite hydrological forecasting model was derived and verified using a case study. The idea is to exploit simultaneously the cross-variable correlations, the cross-site correlations, and the time-lag correlations. The case study covers two variables, monthly rainfall and evaporation, at three sites: Sulaimania, Dokan, and Darbandikhan. The model form is similar to the first-order autoregressive model, but in matrix form. A matrix for the different relative correlations mentioned above and another for their relative residuals were derived and used as the model parameters; a mathematical filter was applied to both matrices to obtain their elements. The application of this model indicates …
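A minimal sketch of what such a model could look like, assuming the classic multivariate lag-one form X_t = A·X_{t-1} + B·e_t (Matalas-type), where A comes from the lag-0/lag-1 cross-covariance matrices and B from the residual covariance. The abstract's exact "mathematical filter" is not given, so the estimation below is a standard stand-in.

```python
# Hedged sketch of a matrix first-order autoregressive (AR(1)) model for
# several correlated series (2 variables x 3 sites -> k = 6 columns).
import numpy as np

def fit_matrix_ar1(X):
    """X: (T, k) array of k standardized monthly series."""
    Xc = X - X.mean(axis=0)
    T = len(Xc)
    M0 = Xc.T @ Xc / T                  # lag-0 covariance (cross-variable/site)
    M1 = Xc[1:].T @ Xc[:-1] / (T - 1)   # lag-1 covariance (time-lag structure)
    A = M1 @ np.linalg.inv(M0)          # AR coefficient matrix
    BBt = M0 - A @ M1.T                 # residual covariance
    B = np.linalg.cholesky(BBt)         # one valid square root of BBt
    return A, B

rng = np.random.default_rng(0)
A, B = fit_matrix_ar1(rng.standard_normal((240, 6)))  # 20 years of monthly data
```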
Extractive multi-document text summarization, which aims to remove redundant information in a document collection while preserving its salient sentences, has recently attracted wide interest in automatic summarization models. This paper proposes an extractive multi-document text summarization model based on a genetic algorithm (GA). First, the problem is modeled as a discrete optimization problem and a specific fitness function is designed to effectively cope with the proposed model. Then, a binary-encoded representation together with a heuristic mutation operator and a local repair operator are proposed to characterize the adopted GA. Experiments are applied to ten topics from the Document Understanding Conference (DUC2002) dataset …
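A minimal sketch of the GA ingredients the abstract names: a binary chromosome over sentences, a fitness function, heuristic mutation, and a repair operator enforcing a summary-length budget. The fitness used here (coverage minus redundancy over a toy similarity matrix) is an illustrative assumption, not the paper's exact objective.

```python
import numpy as np

rng = np.random.default_rng(1)
n, budget = 12, 4                                  # sentences, max selected
sim = rng.random((n, n)); sim = (sim + sim.T) / 2  # toy sentence similarity

def fitness(c):
    idx = np.flatnonzero(c)
    if idx.size == 0:
        return -np.inf
    coverage = sim[idx].max(axis=0).sum()          # how well summary covers all
    redundancy = sim[np.ix_(idx, idx)].sum()       # overlap inside the summary
    return coverage - 0.1 * redundancy

def repair(c):
    while c.sum() > budget:                        # drop random bits to fit budget
        c[rng.choice(np.flatnonzero(c))] = 0
    return c

def mutate(c, p=0.1):
    flips = rng.random(n) < p
    return repair(np.where(flips, 1 - c, c))       # heuristic flip + local repair

pop = [repair(rng.integers(0, 2, n)) for _ in range(20)]
for _ in range(100):                               # simple (mu + lambda) loop
    pop += [mutate(c.copy()) for c in pop]
    pop = sorted(pop, key=fitness, reverse=True)[:20]
print(np.flatnonzero(pop[0]))                      # indices of chosen sentences
```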
Identifying people by their ears has recently received considerable attention in the literature. Accurate segmentation of the ear region is vital for making successful person-identification decisions. This paper presents an effective approach for ear-region segmentation from color ear images. First, the RGB color model was converted to the HSV color model. Second, thresholding was utilized to segment the ear region. Finally, morphological operations were applied to remove small islands and fill the gaps. The proposed method was tested on a database of 105 ear images taken from the right sides of 105 subjects. The experimental results on a variety of ear images revealed that this approach …
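A minimal sketch of the described pipeline using OpenCV: RGB to HSV conversion, thresholding to a binary mask, then morphological opening/closing to remove small islands and fill gaps. The HSV bounds and the file name are placeholder assumptions; real skin-tone ranges would have to be tuned to the dataset.

```python
import cv2
import numpy as np

img = cv2.imread("ear.png")                         # placeholder path; BGR in OpenCV
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

lower = np.array([0, 30, 60], dtype=np.uint8)       # assumed skin-tone range
upper = np.array([25, 180, 255], dtype=np.uint8)
mask = cv2.inRange(hsv, lower, upper)               # thresholding step

kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # remove small islands
mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)  # fill the gaps

ear = cv2.bitwise_and(img, img, mask=mask)          # segmented ear region
```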
A Cox regression model was used to estimate a proportional hazards model for patients with hepatitis recorded in the Gastrointestinal and Hepatic Diseases Hospital in Iraq (2002-2005). The data consist of age, gender, survival time, and terminal state. The Kaplan-Meier method was applied to estimate the survival function and the hazard function.
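A minimal sketch of this workflow with the lifelines package: a Cox proportional hazards fit on the (age, gender) covariates, plus a Kaplan-Meier survival curve. The column names mirror the variables the abstract lists; the values are toy data, not the hospital records.

```python
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter

df = pd.DataFrame({
    "age":    [34, 50, 46, 61, 29, 55],
    "gender": [0, 1, 0, 1, 1, 0],        # 0 = female, 1 = male
    "time":   [12, 7, 30, 4, 25, 18],    # survival time (e.g., months)
    "event":  [1, 1, 0, 1, 0, 1],        # terminal state (1 = death observed)
})

cph = CoxPHFitter()                      # proportional hazards model
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()

km = KaplanMeierFitter()                 # nonparametric survival estimate
km.fit(df["time"], event_observed=df["event"])
print(km.survival_function_)
```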
Simulation Study
Abstract:
Robust statistics are known for their resistance to errors arising from deviations from the assumed statistical properties (near-unbiasedness and efficiency) when data are drawn from a wide range of probability distributions, whether a normal distribution or a mixture of other distributions with different standard deviations.
The power spectrum function plays a principal role in the analysis of stationary random processes ordered in time, whether the random variables are discrete or continuous, by measuring their total power as a function of frequency.
Estimation methods share with …
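The abstract is cut off before the estimation methods are named, so the following is only a generic sketch of power spectrum estimation for a stationary series with scipy: the raw periodogram measures power as a function of frequency, with Welch averaging shown as a lower-variance alternative. The parameters are illustrative.

```python
import numpy as np
from scipy.signal import periodogram, welch

rng = np.random.default_rng(2)
t = np.arange(2048)
x = np.sin(2 * np.pi * 0.05 * t) + rng.standard_normal(t.size)  # signal + noise

f_raw, p_raw = periodogram(x, fs=1.0)       # raw power spectrum estimate
f_w, p_w = welch(x, fs=1.0, nperseg=256)    # segment-averaged, lower variance
print(f_w[np.argmax(p_w)])                  # dominant frequency, near 0.05
```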
Information systems and data exchange between government institutions are growing rapidly around the world, and with them the threats to information within government departments. In recent years, research into the development and construction of secure information systems in government institutions appears to be very active. Based on information-system principles, this study proposes a model for providing and evaluating security for all departments of government institutions. The requirements of any information system begin with the organization's surroundings and objectives. Most prior techniques did not take into account the organizational component on which the information system runs, despite the relevance of …
A modified time series model is derived, and the methodology is given in detail. The model is constructed depending on some measurement criteria, the Akaike and Bayesian information criteria. For the new time series model, a new algorithm has been generated. The forecasting process, one and two steps ahead, is discussed in detail, and some exploratory data analysis is given at the beginning. The best model is selected based on these criteria and compared with some naïve models. The modified model is applied to a monthly chemical sales dataset (January 1992 to December 2019) downloaded from the United States census website (www.census.gov). Ultimately, the forecasted sales …
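A minimal sketch of the selection-and-forecast workflow the abstract describes, using statsmodels ARIMA as a stand-in for the paper's (unspecified) model class: rank candidate orders by AIC (BIC would work the same way), then produce one- and two-step-ahead forecasts. The data here are synthetic, not the census sales series.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(3)
y = np.cumsum(rng.standard_normal(336)) + 100   # stand-in for monthly sales

best = None
for p in range(3):
    for q in range(3):
        fit = ARIMA(y, order=(p, 1, q)).fit()
        if best is None or fit.aic < best[0]:   # could equally rank by fit.bic
            best = (fit.aic, (p, 1, q), fit)

aic, order, fit = best
print(order, fit.forecast(steps=2))             # one- and two-step-ahead forecasts
```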
The Zubair reservoir in the Abu-Amood field, in the south of Iraq, is considered a shaly sand reservoir. A geological model is created to identify the facies, distribute the petrophysical properties, and estimate the volume of hydrocarbon in place. After data processing with Interactive Petrophysics (IP) software is completed and reservoir permeability is estimated using the hydraulic unit method, three main steps are applied to build the geological model, beginning with creating the structural, facies, and property models. The reservoir was divided into five zones (three reservoir units and two cap rocks) depending on the variation of petrophysical properties (porosity and permeability) resulting from the IP software interpretation …
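A minimal sketch of the hydraulic (flow) unit calculations commonly used for this permeability step, assuming the standard rock quality index (RQI) and flow zone indicator (FZI) formulation of Amaefule et al. (permeability in mD, porosity as a fraction); the abstract does not spell out its exact variant.

```python
import numpy as np

def rqi(k_md, phi):
    """Rock quality index from core permeability (mD) and porosity (fraction)."""
    return 0.0314 * np.sqrt(k_md / phi)

def fzi(k_md, phi):
    """Flow zone indicator; samples with similar FZI form one hydraulic unit."""
    phi_z = phi / (1.0 - phi)            # normalized (pore-to-grain) porosity
    return rqi(k_md, phi) / phi_z

def k_from_fzi(fzi_val, phi):
    """Invert FZI to predict permeability in uncored intervals (1/0.0314**2 ~ 1014.24)."""
    return 1014.24 * fzi_val**2 * phi**3 / (1.0 - phi)**2

print(fzi(np.array([120.0, 5.0]), np.array([0.22, 0.12])))
```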
The act of plagiarism has grown rapidly with the explosive growth of the Internet, where a massive volume of information offered with effortless access makes plagiarism, the process of taking someone else's work (ideas, or even words) and representing it as one's own, easy to perform. To ensure originality, plagiarism detection has become massively necessary in various areas, so that those who aim to plagiarize are forced to expend considerable effort to produce works centered on their own research.
In this paper, an approach is proposed for improving the detection of textual plagiarism through proposing a model that can …
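The abstract is cut off before the model details, so the following is only a generic baseline for textual-plagiarism scoring rather than the paper's method: TF-IDF vectors and cosine similarity between a suspicious sentence and candidate source sentences.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

sources = ["Plagiarism is taking someone else's work and presenting it as one's own.",
           "Genetic algorithms optimize a population of candidate solutions."]
suspect = ["Taking another person's work and presenting it as your own is plagiarism."]

vec = TfidfVectorizer().fit(sources + suspect)      # shared vocabulary
scores = cosine_similarity(vec.transform(suspect), vec.transform(sources))
print(scores)   # high score against the paraphrased first source
```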