In this paper, Monte Carlo simulation was used to compare the robust circular S estimator with the circular least squares method, both for data without outliers and for data containing outliers. Two contamination scenarios were considered: contamination at high leverage points, which represents contamination in the circular independent variable, and vertical contamination, which represents contamination in the circular dependent variable. Three comparison criteria were used: the median standard error (Median SE), the median of the mean squared errors (Median MSE), and the median of the mean cosines of the circular residuals (Median A(k)). It was concluded that, when the data contain no outliers, the circular least squares method is better than the robust circular S method, since it recorded the lowest Median MSE, the lowest Median SE, and the largest A(k) for all proposed sample sizes (n = 20, 50, 100). Under vertical contamination, the circular least squares method is not preferred at any contamination rate or sample size; the higher the percentage of vertical contamination, the greater the advantage of the robust estimation method, whose Median MSE and Median SE decrease while its A(k) increases for all proposed sample sizes. Under contamination at high leverage points, the circular least squares method is strongly not preferred at any contamination level or sample size; the higher the percentage of contamination at the leverage points, the greater the advantage of the robust estimation method, whose Median MSE and Median SE decrease while its A(k) increases for all sample sizes.
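As a rough illustration of the three comparison criteria, the sketch below computes a squared-error, a standard-error, and a mean-cosine measure from circular residuals and takes their medians over Monte Carlo replications. The exact definitions used in the paper are not reproduced here; the residual wrapping and the standard-error formula are assumptions.

```python
import numpy as np

def circular_criteria(theta_obs, theta_fit):
    """Criteria for one simulated sample (assumed definitions, angles in radians)."""
    # Wrap circular residuals into (-pi, pi].
    resid = np.angle(np.exp(1j * (np.asarray(theta_obs) - np.asarray(theta_fit))))
    mse = np.mean(resid ** 2)          # mean squared error of the residuals
    se = np.sqrt(mse / resid.size)     # an assumed standard-error definition
    a_k = np.mean(np.cos(resid))       # mean cosine of the circular residuals, A(k)
    return mse, se, a_k

def median_criteria(replications):
    """Median of each criterion over Monte Carlo replications.

    replications: iterable of (theta_obs, theta_fit) pairs, one per replication.
    """
    values = np.array([circular_criteria(obs, fit) for obs, fit in replications])
    med_mse, med_se, med_ak = np.median(values, axis=0)
    return {"Median MSE": med_mse, "Median SE": med_se, "Median A(k)": med_ak}
```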
Optimizing system performance in dynamic, heterogeneous environments and managing computational tasks efficiently are crucial. This paper therefore examines task scheduling and resource allocation algorithms in depth. The work evaluates five algorithms: Genetic Algorithms (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), the Firefly Algorithm (FA), and Simulated Annealing (SA), across various workloads obtained by varying the task-to-node ratio. The paper identifies Finish Time and Deadline as two key performance metrics for gauging the efficacy of an algorithm, and a comprehensive investigation of the behaviors of these algorithms across different workloads was carried out. Results from the experiment
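The abstract names Finish Time and Deadline as the two key metrics. The sketch below shows one simple way to compute them for a candidate task-to-node assignment; the task model, the back-to-back execution assumption, and all names are illustrative, not the paper's simulator.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Task:
    duration: float   # execution time on a node (unit-speed nodes assumed)
    deadline: float   # latest acceptable completion time

def evaluate_schedule(tasks: List[Task], assignment: List[int], num_nodes: int):
    """Finish Time (makespan) and number of deadline misses for one assignment.

    assignment[i] is the node that task i runs on; tasks assigned to the same
    node are assumed to run back-to-back in list order.
    """
    node_clock = [0.0] * num_nodes
    misses = 0
    for task, node in zip(tasks, assignment):
        node_clock[node] += task.duration      # task completes when its node becomes free
        if node_clock[node] > task.deadline:
            misses += 1
    finish_time = max(node_clock)              # overall Finish Time (makespan)
    return finish_time, misses

# Example: 4 tasks scheduled on 2 nodes.
tasks = [Task(3, 5), Task(2, 4), Task(4, 9), Task(1, 3)]
print(evaluate_schedule(tasks, [0, 1, 0, 1], num_nodes=2))   # (7.0, 0)
```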
The present work aims to study the efficiency of using locally available aluminum refuse (after dissolving it in sodium hydroxide) together with different coagulants, such as alum [Al2(SO4)3·18H2O], ferric chloride (FeCl3), and polyaluminum chloride (PACl), to improve water quality. The results showed that using this coagulant in the flocculation process gave high turbidity removal and also improved water quality by precipitating a great deal of the hardness-causing ions. From the experimental jar-test results, the optimum alum dosages were (25, 50 and 70 ppm), the ferric chloride dosages were (15, 40 and 60 ppm), and the polyaluminum chloride dosages were (10, 35 and 55 ppm) for initial water turbidity (100, 500 an
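For reference, jar-test performance in such studies is usually reported as the percentage turbidity removal; a minimal sketch of that calculation follows (the paper's exact evaluation is not reproduced here, and the example values are illustrative only).

```python
def removal_efficiency(initial_ntu: float, final_ntu: float) -> float:
    """Percentage turbidity removal, a common jar-test performance measure."""
    return 100.0 * (initial_ntu - final_ntu) / initial_ntu

# Illustrative values only: raw water at 100 NTU settling to 4 NTU after
# coagulation/flocculation corresponds to 96 % removal.
print(removal_efficiency(100.0, 4.0))
```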
Wind energy is one of the most common natural resources and plays a major role in the energy sector. Owing to the increasing demand to improve the efficiency of wind turbines and to develop the energy field, improvements have been made to design a suitable wind turbine and extract as much energy as possible from the wind. In this paper, a horizontal-axis wind turbine blade operating under low wind speed was designed using Blade Element Momentum (BEM) theory; designing the turbine rotor blade is a difficult task because of the calculations involved in the design process. To understand the behavior of the turbine blade, the QBlade program was used to design and simulate the turbine rotor blade under working conditions. The design variables suc
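To illustrate the kind of calculation BEM theory involves, the sketch below iterates the axial and tangential induction factors at a single blade element using the textbook momentum/blade-element balance. It assumes constant lift and drag coefficients and omits tip-loss and high-induction corrections, so it is a simplified illustration only, not the QBlade model used in the paper; all numerical values are assumptions.

```python
import math

def bem_element(r, B, chord, V, omega, Cl=1.0, Cd=0.01, tol=1e-6, max_iter=200):
    """Axial (a) and tangential (a') induction factors at one blade element.

    Textbook BEM fixed-point iteration with constant Cl, Cd and no tip-loss
    or high-induction correction -- an illustration only.
    """
    sigma = B * chord / (2.0 * math.pi * r)   # local solidity
    a, a_prime = 0.0, 0.0
    for _ in range(max_iter):
        # Inflow angle from the local velocity triangle.
        phi = math.atan2((1.0 - a) * V, (1.0 + a_prime) * omega * r)
        cn = Cl * math.cos(phi) + Cd * math.sin(phi)   # normal force coefficient
        ct = Cl * math.sin(phi) - Cd * math.cos(phi)   # tangential force coefficient
        a_new = 1.0 / (4.0 * math.sin(phi) ** 2 / (sigma * cn) + 1.0)
        ap_new = 1.0 / (4.0 * math.sin(phi) * math.cos(phi) / (sigma * ct) - 1.0)
        if abs(a_new - a) < tol and abs(ap_new - a_prime) < tol:
            return a_new, ap_new, phi
        a, a_prime = a_new, ap_new
    return a, a_prime, phi

# Example element (illustrative values): r = 2 m, 3 blades, 0.25 m chord,
# 7 m/s wind speed, rotor speed 6 rad/s.
print(bem_element(r=2.0, B=3, chord=0.25, V=7.0, omega=6.0))
```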
The increased use of hybrid PET/CT scanners, which combine detailed anatomical information with functional data, has benefits for both diagnostic and therapeutic purposes. The present study compares the cross sections for producing 18F, 82Sr, and 68Ge via different reactions with incident particle energies up to 60 MeV, as part of systematic studies of particle-induced activation on natural and enriched targets (natNe, natRb, natGa, 18O, 85Rb, and 69Ga), together with theoretical calculation of the production yield, calculation of the required target, and a suggestion of the optimum reaction to produce Fluorine-18, Strontium-82, and Germanium-68 for use in hybrid PET/CT scanners.
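The theoretical production yield mentioned above is commonly obtained by integrating the excitation function over the energy the beam loses in the target; one standard form of the thick-target yield expression is sketched below. This form and its symbols are the usual ones in activation studies and are not taken from the paper itself.

```latex
% Thick-target production yield for a beam of particles of charge z and
% current I on a target of molar mass M and isotopic abundance H, irradiated
% for a time t (lambda: decay constant of the product, sigma(E): excitation
% function, dE/d(rho x): mass stopping power, N_A: Avogadro's number):
\begin{equation}
  Y \;=\; \frac{N_A\,H}{M}\,\frac{I}{z\,e}\,
          \left(1 - e^{-\lambda t}\right)
          \int_{E_\mathrm{out}}^{E_\mathrm{in}}
          \frac{\sigma(E)}{\mathrm{d}E/\mathrm{d}(\rho x)}\,\mathrm{d}E .
\end{equation}
```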
Text categorization refers to the process of grouping texts or documents into classes or categories according to their content. The text categorization process consists of three phases: preprocessing, feature extraction, and classification. In comparison to English, only a few studies have been done to categorize and classify the Arabic language. For a variety of applications, such as text classification and clustering, Arabic text representation is a difficult task because the Arabic language is noted for its richness, diversity, and complicated morphology. This paper presents a comprehensive analysis and comparison of the research of the last five years based on the dataset, year, algorithms, and the accuracy th
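The three phases named above map directly onto a standard pipeline. The sketch below is a minimal, hypothetical example using scikit-learn with a placeholder corpus of English tokens, since the preprocessing details (Arabic normalization, stemming, stop-word removal) and the datasets vary across the surveyed papers.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import Pipeline
from sklearn.svm import LinearSVC

# Placeholder corpus; a real study would use a labelled Arabic dataset and
# apply Arabic-specific preprocessing before vectorization.
documents = ["team scored a late goal in the match",
             "central bank raised interest rates again"]
labels = ["sport", "economy"]

pipeline = Pipeline([
    ("tfidf", TfidfVectorizer()),   # feature extraction: TF-IDF bag of words
    ("clf", LinearSVC()),           # classification: linear SVM baseline
])

pipeline.fit(documents, labels)
print(pipeline.predict(["the goal decided the match"]))   # -> ['sport'] for this toy corpus
```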