This research deals with a shrinkage method for principal components similar to the one used in multiple regression, the Least Absolute Shrinkage and Selection Operator (LASSO). The goal is to construct uncorrelated linear combinations from only a subset of the explanatory variables, which may suffer from multicollinearity, instead of using all K of them. The shrinkage forces some coefficients to equal zero by constraining them with a "tuning parameter" t, which balances the amounts of bias and variance on one side while keeping the percentage of variance explained by the components at an acceptable level. This was demonstrated by the MSE criterion in the regression case and by the percentage of explained variance in the principal component case.
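A minimal sketch of the idea, assuming soft-thresholding of PCA loadings as the shrinkage rule; the threshold `t`, the toy data, and the function name `shrink_loadings` are illustrative assumptions, not the paper's exact procedure:

```python
# Sketch: LASSO-style shrinkage of principal component loadings.
# Coefficients whose magnitude falls below the tuning parameter t
# are forced to exactly zero (assumed soft-threshold rule).
import numpy as np

def shrink_loadings(X, n_components, t):
    Xc = X - X.mean(axis=0)                      # center the data
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    loadings = Vt[:n_components].T               # K x n_components
    shrunk = np.sign(loadings) * np.maximum(np.abs(loadings) - t, 0.0)
    norms = np.linalg.norm(shrunk, axis=0)       # re-normalize nonzero columns
    shrunk = shrunk / np.where(norms > 0, norms, 1.0)
    scores = Xc @ shrunk
    explained = scores.var(axis=0).sum() / Xc.var(axis=0).sum()
    return shrunk, explained

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))
X[:, 1] = X[:, 0] + 0.05 * rng.normal(size=100)  # induce multicollinearity
loadings, pct = shrink_loadings(X, n_components=2, t=0.15)
print(f"zeroed coefficients: {(loadings == 0).sum()}, explained: {pct:.2%}")
```

Raising t zeroes more coefficients and simplifies the components at the cost of explained variance; lowering it does the reverse, which is the bias-variance balance the abstract describes.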
Reverse osmosis membrane desalination is one of the most significant water treatment processes used to provide freshwater. The aim of this research is to study the effect of controlling the zeta potential of the suspended particles in the water and near the membrane surfaces in the colloidal solution, in order to keep the water electrically stable and keep the colloidal particles dispersed. To achieve this aim, an experimental study was conducted in the Sanitary Engineering Laboratory of the College of Engineering, University of Baghdad. Two systems were set up: one operated normally, and the other operated with a zeta rod placed before the reverse osmosis membrane. The results showed that the effect of the zeta rod increased …
This research studies dimension reduction methods that overcome the dimensionality problem when traditional methods fail to find good parameter estimates, so the problem must be dealt with directly. To that end, two approaches were used to handle high-dimensional data: the first is the Sliced Inverse Regression (SIR) method, which is considered a non-classical method, together with the proposed (WSIR) method; the second approach is the …
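A minimal sketch of the SIR step, assuming the standard slicing-and-eigendecomposition formulation; the slice count and the toy data are illustrative assumptions:

```python
# Sketch: Sliced Inverse Regression (SIR) for dimension reduction.
import numpy as np

def sir_directions(X, y, n_slices=10, n_directions=2):
    n, p = X.shape
    cov = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - X.mean(axis=0)) @ inv_sqrt          # whitened predictors
    order = np.argsort(y)                        # slice by sorted response
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(axis=0)                  # slice mean of Z
        M += (len(idx) / n) * np.outer(m, m)     # weighted between-slice cov
    w, v = np.linalg.eigh(M)                     # leading eigenvectors of M
    return inv_sqrt @ v[:, ::-1][:, :n_directions]  # back to original scale

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X @ np.array([1.0, -1.0, 0, 0, 0]) + 0.1 * rng.normal(size=200)
print(sir_directions(X, y).round(2))             # recovers the active direction
```

A weighted variant such as WSIR would modify the slice weights `len(idx) / n`; the proposal in the abstract is not specified, so only plain SIR is sketched here.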
Fossil fuel consumption is increasing globally, as it represents the main source of energy around the world, and heavy crude oil sources exceed light ones; different techniques have been used to reduce the viscosity and increase the mobility of heavy crude oil. This study focuses on experimental tests and modeling, with a Back Feed-Forward Artificial Neural Network (BFF-ANN), of the dilution technique for reducing the viscosity of heavy oil collected from the southern Iraq oil fields using organic solvents: organic diluents at different weight percentages (5, 10 and 20 wt.%) of n-heptane, toluene, and a mixture of different ratio…
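A minimal sketch of the modeling step, assuming a small feed-forward network that maps diluent concentration and type to viscosity; the feature encoding, the toy viscosity values, and the network size are illustrative assumptions, not the paper's measurements:

```python
# Sketch: feed-forward ANN for heavy-oil viscosity vs. diluent dose.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Toy training set: [wt.% diluent, is_heptane, is_toluene] -> viscosity (cP)
X = np.array([[5, 1, 0], [10, 1, 0], [20, 1, 0],
              [5, 0, 1], [10, 0, 1], [20, 0, 1]], dtype=float)
y = np.array([900.0, 420.0, 150.0, 800.0, 350.0, 120.0])  # assumed values

scaler = StandardScaler().fit(X)
model = MLPRegressor(hidden_layer_sizes=(8, 8), max_iter=5000,
                     random_state=0).fit(scaler.transform(X), y)
# Interpolate at an untested dose, e.g. 15 wt.% n-heptane
print(model.predict(scaler.transform([[15, 1, 0]])))
```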
The current research sought to demonstrate the effect of material flow cost accounting on reducing product costs through the application of the material flow cost accounting technique, which promotes the optimal utilization of materials and energy and the reduction of environmental impacts. The research aims to clarify the knowledge foundations of material flow cost accounting, in addition to studying the material flow cost accounting technique that helps reduce the cost of products and make them environmentally friendly. To achieve this, the research relied on the descriptive approach with regard to the theoretical aspect of the resea…
Gold nanoparticles (GNPs) were synthesized by the reduction of HAuCl4·3H2O with aluminum metal in aqueous solution, using gum arabic as a stabilizing agent. The GNPs were characterized by TEM, AFM and zeta potential measurements. The reduction process was monitored over time by measuring ultraviolet spectra in the range λ = 520–525 nm, along with the color change from yellow to ruby red. The shape and size of the GNPs were studied by TEM: the shape was spherical and the particle size was 12–17.5 nm. The best results were obtained at pH 6.
Today, large amounts of geospatial data are available on the web, such as Google Map (GM), OpenStreetMap (OSM), the Flickr service, Wikimapia and others; all of these services are called open-source geospatial data. Geospatial data from different sources often have variable accuracy due to different data collection methods, so the data accuracy may not meet user requirements in various organizations. This paper aims to develop a tool to assess the quality of GM data by comparing it with formal data, such as spatial data from the Mayoralty of Baghdad (MB). The tool was developed in the Visual Basic language and validated on two different study areas in Baghdad, Iraq (Al-Karada and Al-Kadhumiyah). The positional accuracy was assessed …
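A minimal sketch of the positional-accuracy comparison, written in Python rather than the paper's Visual Basic for brevity; the matched coordinate pairs are illustrative assumptions, not GM or MB data:

```python
# Sketch: planimetric RMSE between matched points from an open-source
# map layer and a reference (formal) layer, in projected meters.
import math

gm_points  = [(445120.3, 3685210.7), (445980.1, 3684995.2)]  # assumed E, N
ref_points = [(445118.9, 3685213.1), (445982.6, 3684991.8)]  # assumed E, N

sq_err = [(gx - rx) ** 2 + (gy - ry) ** 2
          for (gx, gy), (rx, ry) in zip(gm_points, ref_points)]
rmse = math.sqrt(sum(sq_err) / len(sq_err))
print(f"planimetric RMSE: {rmse:.2f} m")
```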
The computer vision branch of the artificial intelligence field is concerned with developing algorithms for analyzing video image content. Extracting edge information is the essential process in most pictorial pattern recognition problems. A new edge detection technique for detecting boundaries is introduced in this research.
The selection of typical lossy techniques for encoding edge video images is also discussed, with the focus on the Block Truncation Coding (BTC) technique and the Discrete Cosine Transform (DCT) coding technique, in order to reduce the volume of pictorial data that one may need to store or transmit.
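A minimal sketch of BTC on a single 4x4 grayscale block, assuming the classic two-level, moment-preserving formulation; the sample block values are illustrative:

```python
# Sketch: Block Truncation Coding of one image block. Each block is
# stored as a bitmap plus two levels (a, b) that preserve its mean
# and standard deviation.
import numpy as np

def btc_block(block):
    mean, std = block.mean(), block.std()
    bitmap = block >= mean                 # 1 bit per pixel
    q, m = bitmap.sum(), block.size        # pixels above the mean
    if q in (0, m):                        # flat block: single level
        return np.full_like(block, mean)
    a = mean - std * np.sqrt(q / (m - q))  # level for the 0-bits
    b = mean + std * np.sqrt((m - q) / q)  # level for the 1-bits
    return np.where(bitmap, b, a)

block = np.array([[121, 114, 56, 47], [37, 200, 247, 255],
                  [16, 0, 12, 169], [43, 5, 7, 251]], dtype=float)
print(btc_block(block).round(1))           # two-level reconstruction
```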
Estimating an individual's age from a photograph of their face is critical in many applications, including intelligence and defense, border security, human-machine interaction and soft biometric recognition. Recent progress in this discipline has focused on deep learning. These solutions require the creation and training of deep neural networks dedicated to this problem; in addition, pre-trained deep neural networks are used for facial analysis and fine-tuned for accurate outcomes. The purpose of this study was to offer a method for estimating human age from the frontal view of the face in a manner that is as accurate as possible and takes …
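A minimal sketch of the fine-tuning idea, assuming a pre-trained ResNet-18 backbone regressing a single age value; the backbone choice, loss and hyper-parameters are illustrative assumptions, not the paper's architecture:

```python
# Sketch: fine-tune a pre-trained CNN for age estimation as regression.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 1)   # single output: age

# Freeze the backbone; train only the new regression head at first
for name, p in model.named_parameters():
    p.requires_grad = name.startswith("fc")

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.L1Loss()                            # MAE is common for age

faces = torch.randn(8, 3, 224, 224)              # stand-in face batch
ages = torch.tensor([[23.], [31.], [45.], [19.], [60.], [27.], [35.], [52.]])
loss = loss_fn(model(faces), ages)
loss.backward()
optimizer.step()
```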
In this paper we study the Bayesian method using the modified exponential growth model, which is widely used to represent growth phenomena. We focus on three prior functions (informative, natural conjugate, and a prior that depends on previous experiments) for use in the Bayesian method. Most observations of growth phenomena depend on one another, which in turn leads to correlation between those observations; this problem is called autocorrelation, and the Bayesian method has been used to address it.
The goal of this study is to determine the effect of autocorrelation on estimation using the Bayesian method. F…
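A minimal sketch of Bayesian fitting of a modified exponential growth curve y_t = a + b·c^t with AR(1)-autocorrelated errors, using a random-walk Metropolis sampler; the priors, the fixed rho and sigma, and the simulated data are illustrative assumptions, not the paper's specification:

```python
# Sketch: Bayesian estimation of y_t = a + b * c**t under AR(1) errors.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(20)
true_a, true_b, true_c, rho, sigma = 10.0, -8.0, 0.8, 0.5, 0.3
e = np.zeros(20)
for i in range(1, 20):                       # autocorrelated errors
    e[i] = rho * e[i - 1] + rng.normal(0, sigma)
y = true_a + true_b * true_c ** t + e

def log_post(theta):
    a, b, c = theta
    if not (0 < c < 1):
        return -np.inf                       # flat prior on 0 < c < 1
    r = y - (a + b * c ** t)
    u = r[1:] - rho * r[:-1]                 # whiten AR(1) residuals
    ll = -0.5 * (r[0] ** 2 * (1 - rho ** 2) + (u ** 2).sum()) / sigma ** 2
    lp = -0.5 * ((a - 10) ** 2 + (b + 8) ** 2) / 25.0  # weak normal priors
    return ll + lp

theta, samples = np.array([9.0, -7.0, 0.7]), []
for _ in range(20000):                       # Metropolis random walk
    prop = theta + rng.normal(0, [0.1, 0.1, 0.01])
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta.copy())
print(np.mean(samples[5000:], axis=0))       # posterior means after burn-in
```

Re-running with `rho = 0` gives the uncorrelated case, so the effect of autocorrelation on the posterior estimates can be compared directly.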
XML is being incorporated into the foundation of e-business data applications. This paper addresses the problem of the freeform information stored in any organization and shows how XML, with this new approach, makes searching very efficient and time-saving. The paper introduces a new solution and methodology developed to capture and manage such unstructured freeform information (multi-information), based on XML schema technologies, the neural network idea and an object-oriented relational database, in order to provide a practical solution for efficiently managing a multi freeform information system.
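A minimal sketch of the storage-and-search idea only, assuming freeform records are kept as XML elements and queried by attribute; the element names (freeform, record, topic) are illustrative assumptions, not the paper's schema, and the neural network and database components are not sketched:

```python
# Sketch: store freeform text records as XML and search them by topic.
import xml.etree.ElementTree as ET

doc = ET.Element("freeform")
for topic, body in [("maintenance", "Pump 3 seal replaced, minor leak."),
                    ("safety", "Blocked exit reported in warehouse B.")]:
    rec = ET.SubElement(doc, "record", attrib={"topic": topic})
    rec.text = body

# XPath-style search: all records whose topic attribute matches a query
for rec in doc.findall(".//record[@topic='safety']"):
    print(rec.get("topic"), "->", rec.text)
```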