In this paper, Monte Carlo simulation was used to compare the robust circular S estimator with the circular least squares method, both when the data contain no outliers and when outliers are present. Contamination was introduced in two directions: first, contamination at high-leverage points, representing outliers in the circular independent variable; and second, vertical contamination, representing outliers in the circular dependent variable. Three comparison criteria were used: the median standard error (Median SE), the median of the mean squared errors (Median MSE), and the median of the mean cosines of the circular residuals (Median A(k)). It was concluded that the method of least squares is better than the …
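As an illustrative sketch only (not code from the paper), the Median A(k) criterion can be computed as the median, across Monte Carlo replications, of the mean cosine of the circular residuals; values near 1 indicate a tight fit. All data below are simulated placeholders:

```python
import numpy as np

def mean_cosine_residuals(theta_obs, theta_fit):
    """Mean cosine of circular residuals A(k): near 1 for a tight fit,
    near 0 for a poor one."""
    return np.mean(np.cos(theta_obs - theta_fit))

rng = np.random.default_rng(0)
a_k = []
for _ in range(1000):                        # hypothetical replications
    theta = rng.uniform(0, 2 * np.pi, 50)    # simulated observed angles
    fitted = theta + rng.normal(0, 0.1, 50)  # stand-in for a model fit
    a_k.append(mean_cosine_residuals(theta, fitted))
print("Median A(k):", np.median(a_k))
```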
The aim of this study is to estimate the parameters and the reliability function of the Kumaraswamy distribution, a continuous probability distribution with two positive parameters (a, b > 0) that shares many characteristics with the beta distribution while offering extra advantages. The shape of the density function and the most important characteristics of this distribution are explained, and the two parameters (a, b) and the reliability function are estimated using the maximum likelihood (MLE) and Bayes methods. Simulation experiments are conducted to examine the behaviour of the estimation methods for different sample sizes, based on the mean squared error criterion; the results show that the Bayes method is better …
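For concreteness, a minimal sketch (not taken from the paper) of the Kumaraswamy density f(x) = a·b·x^(a-1)·(1-x^a)^(b-1) on (0, 1), its reliability function R(t) = (1-t^a)^b, and a numerical MLE; the sample is generated by inverse-CDF sampling under assumed true parameters a = 2, b = 3:

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(params, x):
    """Negative log-likelihood of Kumaraswamy(a, b):
    f(x) = a*b*x**(a-1)*(1-x**a)**(b-1) on 0 < x < 1."""
    a, b = params
    if a <= 0 or b <= 0:
        return np.inf
    return -np.sum(np.log(a) + np.log(b) + (a - 1) * np.log(x)
                   + (b - 1) * np.log1p(-(x ** a)))

def reliability(t, a, b):
    """Reliability function R(t) = 1 - F(t) = (1 - t**a)**b."""
    return (1.0 - t ** a) ** b

rng = np.random.default_rng(1)
# Inverse-CDF sampling from Kumaraswamy(a=2, b=3):
u = rng.uniform(size=500)
x = (1 - (1 - u) ** (1 / 3)) ** (1 / 2)

res = minimize(neg_log_lik, x0=[1.0, 1.0], args=(x,), method="Nelder-Mead")
a_hat, b_hat = res.x
print("MLE (a, b):", a_hat, b_hat, " R(0.5):", reliability(0.5, a_hat, b_hat))
```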
This paper proposes two hybrid feature subset selection approaches based on the combination (union or intersection) of both supervised and unsupervised filter approaches before using a wrapper, aiming to obtain low-dimensional features with high accuracy, high interpretability, and low time consumption. Experiments with the proposed hybrid approaches were conducted on seven high-dimensional feature datasets. The classifiers adopted are support vector machine (SVM), linear discriminant analysis (LDA), and k-nearest neighbour (KNN). Experimental results demonstrate the advantages and usefulness of the proposed methods for feature subset selection in high-dimensional space, in terms of both the number of selected features and the time spent …
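A minimal sketch of the filter-combination idea, assuming mutual information as the supervised filter and variance as the unsupervised filter (the paper's actual filters are not specified here); the wrapper stage is collapsed to a single cross-validated evaluation with KNN rather than a full wrapper search:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=200, n_features=500,
                           n_informative=20, random_state=0)
k = 30
# Supervised filter: top-k features by mutual information with the label.
sup = set(np.argsort(mutual_info_classif(X, y, random_state=0))[-k:])
# Unsupervised filter: top-k features by variance (label-free).
unsup = set(np.argsort(X.var(axis=0))[-k:])

for name, idx in [("union", sup | unsup), ("intersection", sup & unsup)]:
    idx = sorted(idx)
    if not idx:
        continue  # the intersection can be empty
    # Wrapper stage (simplified to one evaluation with the target classifier):
    score = cross_val_score(KNeighborsClassifier(), X[:, idx], y, cv=5).mean()
    print(f"{name}: {len(idx)} features, CV accuracy = {score:.3f}")
```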
The gravity method measures relatively small variations in the Earth's gravitational field caused by lateral variations in rock density. In the current research, a new technique is applied to the previous Bouguer map from gravity surveys conducted during 1940–1950, over selected areas of the south-western desert of Iraqi territory within the administrative boundaries of the Najaf and Anbar provinces. The approach depends on the theory of gravity inversion, whereby gravity values can be related to density-contrast variations with depth; gravity data inversion is therefore used to calculate density and velocity models from four selected depth slices: 9.63 km, 1.1 km, 0.682 km, and 0.407 km.
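As a hedged illustration of the inversion idea only (not the paper's algorithm), the infinite Bouguer slab approximation Δg = 2πGΔρh relates a gravity anomaly to a density contrast for an assumed slab thickness; the anomaly value below is hypothetical:

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def density_contrast(delta_g_mgal, thickness_m):
    """Infinite Bouguer slab: delta_g = 2*pi*G*rho*h, inverted for the
    density contrast rho in kg/m^3 (1 mGal = 1e-5 m/s^2)."""
    return (delta_g_mgal * 1e-5) / (2 * np.pi * G * thickness_m)

# Hypothetical example: a 5 mGal anomaly attributed to the 407 m slice.
print(density_contrast(5.0, 407.0))  # roughly 3e2 kg/m^3
```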
Conditional logistic regression is often used to study the relationship between event outcomes and specific prognostic factors, and its predictive capabilities can be carried over into environmental studies. This research demonstrates a novel approach to implementing conditional logistic regression in environmental research through inference methods based on longitudinal data. Statistical analysis of longitudinal data requires methods that properly account for the within-subject correlation of the response measurements; if this correlation is ignored, inferences such as statistical tests and confidence intervals can be seriously invalid.
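A minimal sketch of fitting a conditional (stratified-by-subject) logistic regression on synthetic longitudinal data, assuming the statsmodels ConditionalLogit implementation; the exposure, coefficient, and subject effects are all hypothetical:

```python
import numpy as np
from statsmodels.discrete.conditional_models import ConditionalLogit

rng = np.random.default_rng(2)
n_subjects, n_obs = 50, 6
groups = np.repeat(np.arange(n_subjects), n_obs)   # subject identifiers
x = rng.normal(size=n_subjects * n_obs)            # hypothetical exposure
subject_effect = np.repeat(rng.normal(size=n_subjects), n_obs)
p = 1 / (1 + np.exp(-(0.8 * x + subject_effect)))  # within-subject dependence
y = rng.binomial(1, p)

# Conditioning on each subject's stratum eliminates the subject-level
# nuisance effects instead of estimating them (no intercept is fitted).
model = ConditionalLogit(y, x[:, None], groups=groups)
print(model.fit().summary())
```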
Samarium ions (Sm3+), a rare-earth element, show significant optical emission within the visible spectrum. PMMA samples mixed with different ratios of SmCl3·6H2O were prepared via the casting method. The composite was characterized using UV-visible spectroscopy, photoluminescence, and thermogravimetric analysis (TGA). The FTIR spectra of the PMMA samples showed several changes, including variations in band intensity, position, and width. Mixing with samarium decreases the intensity of the C=O and CH2 stretching bands and shifts their positions. A new band appeared, corresponding to ionic bonds between samarium cations and negative branches in the polymer. These variations indicate complex links between the Sm3+ ion and the oxygen of the ether group. The optical absorption …
This research presents theoretical aspects of one of the most important statistical distributions, the Lomax distribution, which has many applications in several areas. A set of estimation methods (MLE, LSE, GWPM) was used and compared with the (RRE) estimation method. To identify the best estimation method, a set of 36 simulation experiments with many replications was run to obtain the mean squared error used for the comparison. The simulation experiments varied the estimation method, the sample size, and the values of the location and shape parameters. The results show that the estimation methods are affected by the simulation experiment factors, and point to the possibility of using other estimation methods such as shrinkage and jackknife …
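As an illustration only (not the paper's experiment), scipy's lomax distribution can be used to sketch the MLE step and a small Monte Carlo mean-squared-error computation; the parameter values and replication counts are placeholders:

```python
import numpy as np
from scipy.stats import lomax

rng = np.random.default_rng(3)
true_shape = 2.5
data = lomax.rvs(true_shape, size=300, random_state=rng)

# MLE: fix the location at 0 so only shape and scale are estimated.
shape_hat, loc_hat, scale_hat = lomax.fit(data, floc=0)
print("shape:", shape_hat, "scale:", scale_hat)

# Monte Carlo MSE of the shape MLE over replications:
est = np.array([lomax.fit(lomax.rvs(true_shape, size=100, random_state=rng),
                          floc=0)[0] for _ in range(200)])
print("MSE of shape MLE:", np.mean((est - true_shape) ** 2))
```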
Free-radical copolymers of styrene/methyl methacrylate were prepared chemically under nitrogen, in the presence of benzoyl peroxide as initiator at a concentration of 2 × 10-3 molar at 70 °C, with benzene as the solvent, to a certain low conversion. FT-IR spectra were used to determine the monomer reactivity ratios, obtained by employing the conventional linearization methods of Fineman-Ross (F-R) and Kelen-Tüdős (K-T). The experimental results gave average values for the styrene (r1) / methyl methacrylate (r2) system of r1 = 0.45 and r2 = 0.38 by the (F-R) method, and r1 = 0.49 and r2 = 0.35 by the (K-T) method. These results indicate a random distribution …
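A minimal sketch of the Fineman-Ross linearization, using hypothetical feed/copolymer composition ratios generated to be consistent with the reported r1 ≈ 0.45, r2 ≈ 0.38 (they are not the paper's measured data):

```python
import numpy as np

# Fineman-Ross linearization: with x = [M1]/[M2] in the feed and
# y = d[M1]/d[M2] in the copolymer, G = x*(y - 1)/y and H = x**2/y
# satisfy G = r1*H - r2, so a straight-line fit of G on H gives
# r1 (slope) and r2 (minus the intercept).
x = np.array([0.25, 0.50, 1.00, 2.00, 4.00])   # hypothetical feed ratios
y = np.array([0.44, 0.70, 1.05, 1.60, 2.56])   # hypothetical copolymer ratios

G = x * (y - 1) / y
H = x ** 2 / y
slope, intercept = np.polyfit(H, G, 1)
print("r1 =", round(slope, 2), " r2 =", round(-intercept, 2))
```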
Within the framework of big data, energy issues are highly significant. Despite the significance of energy, theoretical studies focusing primarily on the issue of energy within big data analytics in relation to computational intelligence algorithms are scarce. The purpose of this study is to explore the theoretical aspects of energy issues in big data analytics in relation to computational intelligence algorithms, since this is critical to exploring the empirical aspects of big data. In this chapter, we present a theoretical study of energy issues related to applications of computational intelligence algorithms in big data analytics. This work highlights that big data analytics using computational intelligence algorithms generates a very high amount …
In this paper, we present a multiple-bit error correction coding scheme based on an extended Hamming product code combined with type-II HARQ, using shared resources, for on-chip interconnects. The shared resources reduce the hardware complexity of the encoder and decoder compared with the existing three-stage iterative decoding method for on-chip interconnects. The proposed decoding method achieves 20% and 28% reductions in area and power consumption, respectively, with only a small increase in decoder delay compared to the existing three-stage iterative decoding scheme for multiple-bit error correction. The proposed code also achieves an excellent improvement in residual flit error rate, and a reduction of up to 58% in total power consumption compared to the other error …
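As a hedged sketch of one building block only (the full scheme combines row/column codes into a product code with HARQ, which is not reproduced here), an extended Hamming (8,4) SEC-DED encoder with an assumed parity matrix:

```python
import numpy as np

# Extended Hamming (8,4) SEC-DED code word: 4 data bits, 3 Hamming check
# bits, plus one overall parity bit that upgrades the code from single
# error correction to single correction / double detection.
P = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1],
              [1, 1, 1]])

def encode(data4):
    """Encode 4 data bits into an 8-bit extended Hamming code word."""
    data4 = np.asarray(data4)
    checks = data4 @ P % 2            # three Hamming parity bits
    word7 = np.concatenate([data4, checks])
    overall = word7.sum() % 2         # extension bit (double-error detection)
    return np.concatenate([word7, [overall]])

print(encode([1, 0, 1, 1]))
```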