This paper addresses the estimation of values at unmeasured points of spatial data when the sample size is small, a setting that is unfavorable for estimation: the larger the data set, the better the estimates at unmeasured points and the smaller the estimation variance. The idea of this paper is to exploit secondary (auxiliary) data that are strongly correlated with the primary (basic) data in order to estimate the values at unmeasured points and to quantify the estimation variance. The Co-kriging technique is used in this setting to build spatial predictors for the estimation process. The idea is then applied to real data on wheat cultivation in Iraq, where the production quantity is taken as the primary variable to be estimated at unmeasured points and the cultivated area as the secondary variable. All calculations were programmed in Matlab.
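To make the idea concrete, here is a minimal sketch of simple co-kriging in one dimension (not the paper's Matlab implementation): a primary variable with few observations is estimated at an unmeasured point with help from a correlated secondary variable. The exponential covariance models, their sills and ranges, and the known means are illustrative assumptions, not values from the wheat data.

```python
import numpy as np

def expcov(h, sill, rng):
    """Isotropic exponential covariance model (assumed, for illustration)."""
    return sill * np.exp(-np.abs(h) / rng)

def simple_cokriging(x1, z1, x2, z2, x0, m1, m2,
                     c11=(1.0, 10.0), c22=(1.0, 10.0), c12=(0.7, 10.0)):
    """Simple co-kriging of the primary variable at location x0 (1-D sketch).

    x1, z1 : locations and values of the primary variable (e.g. production)
    x2, z2 : locations and values of the secondary variable (e.g. area)
    m1, m2 : known means; c11/c22/c12 are (sill, range) of the direct and
             cross covariance models -- all illustrative assumptions.
    Returns the estimate and the co-kriging variance.
    """
    n1 = len(x1)
    xs = np.concatenate([x1, x2])
    # Block covariance matrix [[C11, C12], [C12.T, C22]] between all data
    H = np.abs(xs[:, None] - xs[None, :])
    C = np.empty_like(H)
    C[:n1, :n1] = expcov(H[:n1, :n1], *c11)
    C[n1:, n1:] = expcov(H[n1:, n1:], *c22)
    C[:n1, n1:] = expcov(H[:n1, n1:], *c12)
    C[n1:, :n1] = C[:n1, n1:].T
    # Covariances between the data and the unmeasured target location
    c0 = np.concatenate([expcov(np.abs(x1 - x0), *c11),
                         expcov(np.abs(x2 - x0), *c12)])
    w = np.linalg.solve(C, c0)          # co-kriging weights
    est = m1 + w[:n1] @ (z1 - m1) + w[n1:] @ (z2 - m2)
    var = c11[0] - w @ c0               # estimation variance
    return est, var
```

Because the secondary data enter through the cross-covariance block, strongly correlated auxiliary observations reduce the estimation variance exactly as the abstract describes; at an observed primary location the estimator reproduces the datum with zero variance.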
The amount of educational data has increased rapidly in recent years. Educational Data Mining (EDM) techniques are used to detect valuable patterns that improve the educational process and achieve high performance from all educational elements. The proposed work comprises three stages: preprocessing, feature selection, and an active classification stage. The dataset, collected using EDM, lacked label data; it contained 2050 records gathered through questionnaires and students' academic records. Twenty-five features were combined from the following five factors: curriculum, teacher, student, educational environment, and family. Active learning ha…
This review explores the Knowledge Discovery in Databases (KDD) approach, which helps the bioinformatics domain progress efficiently, and illustrates its relationship with data mining. It is therefore important to exploit the advantages of Data Mining (DM) strategy management, such as effectively stressing its role in cost control, which is a principle of competitive intelligence, its role in information management, and its ability to discover hidden knowledge. However, there are many challenges, such as inaccurate, hand-written data and the analysis of large amounts of varied information to extract useful knowledge using DM strategies. These strategies have been applied successfully in several applications, such as data wa…
The Tor (The Onion Routing) network was designed to enable users to browse the Internet anonymously. It is known for the anonymity and privacy it provides against agents who wish to observe users' locations or track their browsing habits. This anonymity stems from the encryption and decryption of Tor traffic: the client's traffic must be encrypted and decrypted before sending and receiving, which introduces delay and even interruption in data flow. The exchange of cryptographic keys between network devices plays a pivotal and critical role in facilitating secure communication and ensuring the integrity of cryptographic procedures. This essential process is time-consuming, which causes del…
The distribution of the intensity of comet Ison C/2013 is studied by taking its histogram. This distribution reveals four distinct regions corresponding to the background, tail, coma, and nucleus. A one-dimensional temperature-distribution fit is achieved using two mathematical equations related to the coordinates of the comet's center. The quiver plot of the comet's intensity gradient shows very clearly that the arrows are directed towards the maximum intensity of the comet.
Least Squares (LS) analysis is often unsuccessful when outliers are present in the studied phenomena: the OLS estimators lose their properties, including that of Best Linear Unbiased Estimator (BLUE), because outliers have a harmful effect on the phenomenon. To address this problem, new statistical methods have been developed that are not easily affected by outliers; these methods are characterized by robustness (resistance). The Least Trimmed Squares (LTS) method is therefore a good alternative for achieving more feasible and optimal results. Moreover, it is possible to assign weights that take into consideration the location of the outliers in the data and det…
A dispersive liquid-liquid microextraction (DLLME) combined with UV-Vis spectrophotometry for the preconcentration and determination of mefenamic acid in pharmaceutical preparations was developed and introduced. The proposed method is based on the formation of a charge-transfer complex between mefenamic acid as an n-electron donor and chloranil as a π-acceptor, yielding a violet chromogen measured at 542 nm. The important parameters affecting the efficiency of DLLME were evaluated and optimized. Under the optimum conditions, the calibration graphs of the standard and the drug were linear over the range 0.03-10 µg mL-1. The limits of detection and quantification and Sandell's sensitivity were calculated. Good recoveries of MAF Std. and drug at 0.05,…
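The detection and quantification limits mentioned above are conventionally obtained from the calibration regression, e.g. via the ICH formulas LOD = 3.3·s/slope and LOQ = 10·s/slope, where s is the residual standard deviation of the fit. The sketch below illustrates that computation; the absorbance values are made up for demonstration and are not the paper's measurements.

```python
import numpy as np

# Hypothetical calibration data: concentration (ug mL-1) vs. absorbance at 542 nm
conc = np.array([0.03, 0.1, 0.5, 1.0, 2.5, 5.0, 10.0])
absb = np.array([0.004, 0.013, 0.061, 0.118, 0.301, 0.597, 1.205])

# Linear least-squares calibration line: absorbance = slope * conc + intercept
slope, intercept = np.polyfit(conc, absb, 1)
pred = slope * conc + intercept

# Residual standard deviation of the regression (n - 2 degrees of freedom)
s = np.sqrt(np.sum((absb - pred) ** 2) / (len(conc) - 2))

lod = 3.3 * s / slope     # limit of detection (ICH convention)
loq = 10.0 * s / slope    # limit of quantification (ICH convention)
```

Sandell's sensitivity would follow from the slope and the molar mass of the analyte; only the regression-based limits are shown here.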
Kufa, considered one of the important cities in Iraq, is facing a rapid increase in population and in urban development in building and industry. Therefore, the concentrations of several hazardous heavy metals are the main focus of this study, which presents the distribution and estimation of heavy metals in urban lands of the Kufa area as an environmental geochemical study. Twenty samples of urban surface soils were collected at many sites to determine the concentrations, distribution, and contamination of the elements Cu, Zn, Co, Ni, Th, U, Pb, Hf, Nb, and Fe. The mean concentrations of the heavy metals were compared with local studies, UCC guidelines, and world reference values. To distinguish ant…
The estimation of the initial oil in place is a crucial topic during the exploration, appraisal, and development of a reservoir. In the current work, two conventional methods were used to determine the initial oil in place: a volumetric method and a reservoir-simulation method. Each method requires its own type of data: the volumetric method depends on geological, core, well-log, and petrophysical-property data, while the reservoir-simulation method additionally needs capillary pressure versus water saturation, fluid production, and static pressure data for all active wells in the Mishrif reservoir. The petrophysical properties of the studied reservoir were calculated using a neural-network technique…
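The volumetric method referred to above reduces to the standard oilfield formula N = 7758 · A · h · φ · (1 − Sw) / Bo (stock-tank barrels, with area in acres and thickness in feet; 7758 converts acre-ft to barrels). The sketch below applies it with purely illustrative inputs, not the Mishrif reservoir's actual data.

```python
def ooip_volumetric(area_acres, thickness_ft, porosity, sw, bo):
    """Volumetric estimate of initial oil in place, in stock-tank barrels.

    Standard formula: N = 7758 * A * h * phi * (1 - Sw) / Bo, where
    7758 bbl/acre-ft is the unit-conversion factor, phi is porosity
    (fraction), Sw is water saturation (fraction), and Bo is the oil
    formation volume factor (rb/stb).
    """
    return 7758.0 * area_acres * thickness_ft * porosity * (1.0 - sw) / bo

# Illustrative inputs only (hypothetical reservoir properties)
n = ooip_volumetric(area_acres=2000, thickness_ft=50, porosity=0.18,
                    sw=0.25, bo=1.3)
```

In practice A, h, φ, and Sw come from the geological, log, and core data the abstract lists, which is why the quality of the petrophysical evaluation controls the quality of the volumetric estimate.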
This paper is concerned with estimating the reliability R of a K-component parallel system in the stress-strength model with non-identical components subjected to a common stress, when the stress and strength follow the Generalized Exponential Distribution (GED) with unknown shape parameter α and a known common scale parameter θ (θ = 1). Different shrinkage estimation methods are considered for estimating R, based on the maximum likelihood estimator and prior estimates, and compared by simulation using the mean squared error (MSE) criterion. The study found that shrinkage estimation using a shrinkage weight function was the best.
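The comparison described above can be illustrated with a small Monte Carlo sketch: for GED with known scale θ = 1, the MLE of the shape parameter has the closed form α̂ = −n / Σ ln(1 − e^(−xᵢ)), and a shrinkage estimator pulls it toward a prior guess α₀. The parameter values, the fixed shrinkage weight w, and the sample size here are illustrative assumptions, not the paper's settings (which use a shrinkage weight function and target the system reliability R rather than α).

```python
import numpy as np

def ged_sample(alpha, n, rng):
    """Sample from GED with shape alpha, scale 1: F(x) = (1 - e^{-x})^alpha."""
    u = rng.uniform(size=n)
    return -np.log1p(-u ** (1.0 / alpha))

def mse_compare(alpha=1.5, alpha0=1.2, w=0.6, n=15, reps=5000, seed=0):
    """Monte Carlo MSE of the MLE vs. the shrinkage estimator
    alpha_sh = w * alpha_mle + (1 - w) * alpha0, for a small sample."""
    rng = np.random.default_rng(seed)
    mle = np.empty(reps)
    sh = np.empty(reps)
    for r in range(reps):
        x = ged_sample(alpha, n, rng)
        # Closed-form MLE for the shape when the scale is known (theta = 1)
        a_hat = -n / np.sum(np.log(-np.expm1(-x)))
        mle[r] = a_hat
        sh[r] = w * a_hat + (1 - w) * alpha0
    return np.mean((mle - alpha) ** 2), np.mean((sh - alpha) ** 2)
```

When the prior guess is reasonably close to the true value, the shrinkage estimator trades a small bias for a large variance reduction and beats the MLE in MSE for small samples, which is the effect the abstract's simulation study exploits.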
Rheumatoid arthritis (RA) is a chronic disease resulting in many complications, such as cardiovascular disease (CVD). Any change in lipid profiles and myocardial markers indicates cardiovascular disease risk, so this study was designed to monitor the pattern of lipid profiles and myocardial markers in newly diagnosed RA patients. Blood samples were collected from 70 Iraqi patients newly diagnosed with rheumatoid arthritis (male and female) and 30 healthy individuals serving as controls, all aged 35-65 years. Serum samples were obtained to determine myocardial markers, including troponin, creatine kinase (CK), lactate dehydrogenase (LDH), and glutamic oxaloacetic transaminase (GOT), and lipid profiles, such as choleste…