This study was conducted in the College of Science / Computer Science Department / University of Baghdad to compare automatic sorting with manual sorting in terms of efficiency and accuracy, and to examine the use of artificial intelligence in automated sorting, including artificial neural networks, image processing, the study of external characteristics, defects and impurities, and physical characteristics (grading and sorting speed, and fruit weight). The results report values for impurities and defects: the highest regression value was 0.40, the error-approximation algorithm recorded the value 06-1, the highest recorded fruit weight was 138.20 g, and the highest grading and sorting speed was 1.38 minutes.
The purpose of this research is to determine the extent to which independent auditors can audit the requirements of e-commerce related to infrastructure requirements, legislation and regulations, tax laws, and, finally, human cadres. To achieve this, a questionnaire was designed for auditors. Statistical methods, namely the arithmetic mean and standard deviation, were applied using the Statistical Package for the Social Sciences (SPSS).
The research reached several results, the most important of which are: there are no obstacles to enabling the auditor to audit the application of the e-commerce requirements, as well as the respective (infrastructure requirements, legislation and regulations, t
This research deals with a shrinkage method for principal components similar to the one used in multiple regression, the Least Absolute Shrinkage and Selection Operator (LASSO). The goal is to form uncorrelated linear combinations from only a subset of the explanatory variables, which may suffer from a multicollinearity problem, instead of taking the whole number, say (K), of them. The shrinkage forces some coefficients to equal exactly zero by restricting them with a "tuning parameter", say (t), which balances the amounts of bias and variance on one side and does not let the percentage of variance explained by these components fall below an acceptable level. This is shown via the MSE criterion in the regression case and the percent explained v
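The combination of principal components and a LASSO-style penalty described above can be illustrated with a minimal NumPy sketch. It relies on a standard fact: because principal-component scores are mutually orthogonal, the LASSO solution decouples into coordinate-wise soft-thresholding of each component's least-squares coefficient, so some coefficients are forced exactly to zero. The function name and the penalty form are illustrative, not the paper's exact method.

```python
import numpy as np

def pc_lasso(X, y, lam):
    """LASSO fit on principal-component scores.

    Since the score columns of Z are orthogonal, minimising
    0.5*||y - Z b||^2 + lam * sum(|b_j|) reduces to soft-thresholding
    each component's OLS coefficient (a standard orthogonal-design result).
    """
    Xc = X - X.mean(axis=0)                     # centre the predictors
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = Xc @ Vt.T                               # PC scores: orthogonal columns
    d2 = (Z ** 2).sum(axis=0)                   # squared column norms
    b_ols = (Z.T @ y) / d2                      # per-component OLS coefficients
    # soft-threshold: coefficients below lam/d2 in magnitude become exactly 0
    return np.sign(b_ols) * np.maximum(np.abs(b_ols) - lam / d2, 0.0)
```

With `lam = 0` the function reproduces ordinary least squares on the components; as `lam` grows, weak components are dropped one by one, which is exactly the variable-subset behaviour the abstract describes.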
In this paper, generalized autoregressive conditional heteroscedasticity models with a seasonal component are studied, for application to high-frequency daily financial data characterized by seasonal conditional heteroscedasticity. The study relies on multiplicative seasonal generalized autoregressive conditional heteroscedastic models, abbreviated SGARCH, which have proven effective at expressing the seasonal phenomenon, in contrast to the usual GARCH models. In summary, the research studies daily data for the dinar exchange rate against the dollar; the autocorrelation function was first used to detect seasonality, then was diagnosed wi
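The multiplicative seasonal idea can be sketched as a conditional-variance recursion: an ordinary GARCH(1,1) core scaled by a periodic seasonal factor. The exact SGARCH specification in the paper is not given in the abstract, so the form below (h_t = (ω + α·ε²_{t−1} + β·h_{t−1}) · s_{t mod m}) is an illustrative assumption.

```python
import numpy as np

def sgarch_variance(eps, omega, alpha, beta, season):
    """Conditional variances of an illustrative multiplicative seasonal
    GARCH(1,1): h_t = (omega + alpha*eps_{t-1}**2 + beta*h_{t-1}) * s_{t mod m}.

    `season` is a length-m array of positive seasonal factors; the
    recursion starts from the unconditional GARCH variance.
    """
    m = len(season)
    h = np.empty(len(eps))
    h[0] = omega / (1.0 - alpha - beta)          # unconditional start-up value
    for t in range(1, len(eps)):
        core = omega + alpha * eps[t - 1] ** 2 + beta * h[t - 1]
        h[t] = core * season[t % m]              # multiplicative seasonality
    return h
```

The seasonal factor lets the variance level repeat with period m, which a plain GARCH recursion cannot capture; that is the contrast the abstract draws with the usual GARCH models.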
Companies compete intensely with each other today, so they need to focus on innovation to develop their products and keep them competitive. Lean product development is the ideal way to develop products, foster innovation, maximize value, and reduce time. Set-Based Concurrent Engineering (SBCE) is a proven lean product-development mechanism built on creating a number of alternative designs at the subsystem level. These designs are improved and tested simultaneously, and the weaker choices are removed gradually until the optimum solution is finally reached. SBCE has been implemented extensively in the automotive industry, and there are a few case studies in the aerospace industry. This research describes the use o
To obtain good estimates with more accurate results, we must choose the appropriate method of estimation. Most of the equations in classical methods are linear, and finding analytical solutions to such equations is very difficult; some estimators are inefficient because of problems in solving these equations. In this paper, we estimate the survival function of censored data using one of the most important artificial-intelligence algorithms, the genetic algorithm, to obtain optimal estimates of the parameters of the two-parameter Weibull distribution. This in turn leads to optimal estimates of the survival function. The genetic algorithm is employed within the method of moments, the least squares method, and the weighted
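A minimal sketch of the least-squares variant mentioned above: a simple genetic algorithm searches for the Weibull shape and scale (k, λ) that minimise the squared distance between the empirical survival function and the Weibull survival curve S(t) = exp(−(t/λ)^k). For brevity the sketch uses complete (uncensored) data and a deliberately simple selection/crossover/mutation scheme; it is an illustration of the approach, not the authors' exact algorithm.

```python
import numpy as np

def weibull_surv(t, k, lam):
    """Weibull survival function S(t) = exp(-(t/lam)**k)."""
    return np.exp(-(t / lam) ** k)

def ga_fit_weibull(x, pop=60, gens=120, seed=0):
    """Fit (shape k, scale lam) with a simple genetic algorithm that
    minimises the least-squares distance between the empirical survival
    function and the Weibull survival curve."""
    rng = np.random.default_rng(seed)
    xs = np.sort(np.asarray(x, dtype=float))
    n = len(xs)
    s_emp = 1.0 - (np.arange(1, n + 1) - 0.5) / n     # empirical survival

    def fitness(params):
        k, lam = params
        return np.sum((s_emp - weibull_surv(xs, k, lam)) ** 2)

    P = rng.uniform(0.1, 5.0, size=(pop, 2))          # initial population
    for _ in range(gens):
        f = np.array([fitness(p) for p in P])
        elite = P[np.argsort(f)[: pop // 2]]          # selection: keep top half
        pairs = elite[rng.integers(0, len(elite), size=(pop, 2))]
        children = pairs.mean(axis=1)                 # arithmetic crossover
        children *= rng.lognormal(0.0, 0.05, size=children.shape)  # mutation
        P = np.clip(children, 1e-3, None)             # keep parameters positive
        P[0] = elite[0]                               # elitism: keep the best
    f = np.array([fitness(p) for p in P])
    return P[np.argmin(f)]
```

Elitism guarantees the best candidate's fitness never worsens across generations, while the multiplicative log-normal mutation keeps enough diversity to refine the solution locally.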
According to the European Union Water Framework Directive requirements, diatom metrics were used to assess the ecological status of surface waters in the Gaziantep central catchment (Turkey). A total of 42 diatom taxa were identified. A few environmental factors (especially lead, copper, orthophosphate, and chromium) played significant roles in the distribution of diatom assemblages among the sampling stations. The first two axes of the canonical correspondence analysis explained 91.6% of the species–environment correlations with 13.9% of the cumulative variance of species. The applied diatom indices (TIT – Trophic Index Turkey, TI – Trophic Index, and EPI-D – Eutrophication and/or Pollution Index-Diatom) showed different results
The Weibull distribution is considered one of the most widely applied distributions in real life. It is similar to the normal distribution in the breadth of its applications, and it can be applied in many fields, such as industrial engineering (to represent replacement and manufacturing times), weather forecasting, and other scientific uses in reliability studies and survival analysis in the medical and communication-engineering fields.
In this paper, the scale parameter of the Weibull distribution has been estimated using a Bayesian method based on Jeffreys prior information as a first method, then enhanced by improving the Jeffreys prior information and then used as a se
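The baseline Jeffreys-prior estimator admits a closed form under a common parameterisation, sketched below. Assuming the density f(x) = (c/λ)·x^(c−1)·exp(−x^c/λ) with the shape c known, the likelihood in λ is proportional to λ^(−n)·exp(−T/λ) with T = Σ x_i^c; combined with the Jeffreys prior π(λ) ∝ 1/λ, the posterior is inverse-gamma(n, T), whose mean T/(n−1) is the Bayes estimator under squared-error loss. The "improved" prior mentioned in the abstract is not specified there, so only the baseline case is shown.

```python
import numpy as np

def bayes_weibull_scale(x, c):
    """Bayes estimator (squared-error loss) of the scale lam in
    f(x) = (c/lam) * x**(c-1) * exp(-x**c / lam), with shape c known,
    under the Jeffreys prior pi(lam) ∝ 1/lam.

    The posterior is inverse-gamma(n, T) with T = sum(x**c),
    so the posterior mean is T / (n - 1)  (requires n > 1).
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    T = np.sum(x ** c)
    return T / (n - 1)
```

Note that x^c is exponential with mean λ under this parameterisation, which makes the estimator easy to sanity-check by simulation.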
This paper deals with the estimation of the reliability function and one shape parameter of the two-parameter Burr–XII distribution, when the other shape parameter is known (taking the values 0.5, 1, 1.5) and the initial value of the estimated parameter is 1, while different sample sizes (n = 10, 20, 30, 50) are used. The results depend on an empirical study: simulation experiments are applied to compare four methods of estimation, as well as to compute the reliability function. The mean-square-error results indicate that the jackknife estimator is better than the other three estimators, for all sample sizes and parameter values.
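The jackknife estimator found best above can be sketched for the Burr–XII case. Assuming the usual parameterisation with survival R(t) = (1 + t^c)^(−k) and the shape c known, the MLE of k is n / Σ log(1 + x_i^c); the jackknife then bias-corrects it by combining the full-sample MLE with the leave-one-out MLEs. The notation (c, k) is the standard Burr–XII notation, an assumption since the abstract's symbols were lost.

```python
import numpy as np

def burr_reliability(t, c, k):
    """Burr-XII reliability (survival) function R(t) = (1 + t**c)**(-k)."""
    return (1.0 + t ** c) ** (-k)

def jackknife_k(x, c):
    """Jackknife bias-corrected estimate of the Burr-XII shape k, with the
    other shape c known.  The MLE with c known is k_hat = n / sum(log(1+x**c));
    the jackknife combines it with the n leave-one-out MLEs:
        k_J = n*k_hat - (n-1) * mean(k_hat_(-i))."""
    x = np.asarray(x, dtype=float)
    n = x.size
    w = np.log1p(x ** c)                      # log(1 + x**c), computed stably
    k_full = n / w.sum()
    k_loo = (n - 1) / (w.sum() - w)           # all leave-one-out MLEs at once
    return n * k_full - (n - 1) * k_loo.mean()
```

A convenient check: if U is uniform on (0, 1), then x = ((1 − U)^(−1/k) − 1)^(1/c) follows Burr–XII(c, k), so the estimator can be validated by inverse-transform simulation.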