In this paper, Monte-Carlo simulation was used to compare the robust circular S estimator with the circular least squares method, both when the data contain no outliers and when outliers are present. Two contamination scenarios were considered: contamination at high leverage points, representing contamination in the circular independent variable, and contamination in the vertical (response) direction, representing the circular dependent variable. Three comparison criteria were used: the median standard error (Median SE), the median of the mean squared error (Median MSE), and the median of the mean cosine of the circular residuals (Median A(k)). It was concluded that the least squares method is better than the robust circular S method when the data contain no outliers, since it recorded the lowest Median MSE, the lowest Median SE, and the largest Median A(k) for all proposed sample sizes (n = 20, 50, 100). In the case of contamination in the vertical direction, the circular least squares method was not preferred at any contamination rate or sample size; the higher the contamination rate in the vertical data, the greater the superiority of the robust estimation method, whose Median MSE and Median SE decrease while its Median A(k) increases, for all proposed sample sizes.
In the case of contamination at high leverage points, the circular least squares method was strongly not preferred at all contamination levels and for all sample sizes; the higher the contamination rate at the leverage points, the greater the superiority of the robust estimation method, whose Median MSE and Median SE decrease while its Median A(k) increases, for all sample sizes.
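As a sketch of how the three criteria could be computed, the following minimal implementation evaluates one Monte-Carlo replicate and then takes medians across replicates; the function and variable names are illustrative, not the paper's code, and the standard-error formula is a simple proxy.

```python
import numpy as np

def circular_criteria(theta_obs, theta_fit):
    """Criteria for one Monte-Carlo replicate (illustrative names).

    Residuals are taken on the circle, i.e. wrapped to (-pi, pi]."""
    diff = np.asarray(theta_obs, float) - np.asarray(theta_fit, float)
    resid = np.angle(np.exp(1j * diff))           # wrapped circular residuals
    mse = float(np.mean(resid ** 2))              # mean squared circular error
    se = float(np.sqrt(mse / resid.size))         # simple standard-error proxy
    a_k = float(np.mean(np.cos(resid)))           # A(k): 1 means a perfect fit
    return se, mse, a_k

def median_criteria(replicates):
    """Median of each criterion over all Monte-Carlo replicates."""
    se, mse, a_k = (float(np.median(v)) for v in zip(*replicates))
    return se, mse, a_k
```

Wrapping the residuals matters: an observed angle just below +pi fitted by one just above -pi is a small circular error, not one of nearly 2*pi.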
Red, green, and blue (RGB) codes extracted from a lab-fabricated colorimeter device were used to build a proposed classifier with the objective of classifying the colors of objects into defined categories of fundamental colors. Primary, secondary, and tertiary colors, namely red, green, orange, yellow, pink, purple, blue, brown, grey, white, and black, were employed in machine learning (ML) by applying an artificial neural network (ANN) algorithm using Python. The classifier, based on the ANN algorithm, required a definition of the eleven colors in the form of RGB codes in order to acquire the capability of classification. The software's capacity to forecast the color code that belongs to an object under de…
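A minimal sketch of such a classifier, assuming illustrative reference RGB codes and a small NumPy-only one-hidden-layer network; the abstract does not give the measured codes or the network architecture, so everything below is an assumption.

```python
import numpy as np

# Illustrative reference RGB codes for the eleven classes (not the codes
# measured by the authors' colorimeter).
REFERENCE = {
    "red": (255, 0, 0), "green": (0, 128, 0), "orange": (255, 165, 0),
    "yellow": (255, 255, 0), "pink": (255, 192, 203), "purple": (128, 0, 128),
    "blue": (0, 0, 255), "brown": (139, 69, 19), "grey": (128, 128, 128),
    "white": (255, 255, 255), "black": (0, 0, 0),
}
NAMES = list(REFERENCE)

rng = np.random.default_rng(0)
proto = np.array([REFERENCE[n] for n in NAMES], float) / 255.0
# Training set: 200 jittered copies of each reference code.
X = np.clip(np.repeat(proto, 200, axis=0)
            + rng.normal(0, 0.05, (len(NAMES) * 200, 3)), 0, 1)
y = np.repeat(np.arange(len(NAMES)), 200)
Y = np.eye(len(NAMES))[y]

# One hidden tanh layer + softmax output, full-batch gradient descent.
H, lr = 32, 1.0
W1 = rng.normal(0, 0.5, (3, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.5, (H, len(NAMES))); b2 = np.zeros(len(NAMES))
for _ in range(1500):
    Z = np.tanh(X @ W1 + b1)
    L = Z @ W2 + b2
    P = np.exp(L - L.max(axis=1, keepdims=True))
    P /= P.sum(axis=1, keepdims=True)
    G = (P - Y) / len(X)                    # d(cross-entropy)/d(logits)
    GH = (G @ W2.T) * (1 - Z ** 2)          # backprop through tanh
    W2 -= lr * (Z.T @ G); b2 -= lr * G.sum(0)
    W1 -= lr * (X.T @ GH); b1 -= lr * GH.sum(0)

def classify(rgb):
    """Predict a color name from an (R, G, B) code in 0..255."""
    z = np.tanh(np.asarray(rgb, float) / 255.0 @ W1 + b1)
    return NAMES[int(np.argmax(z @ W2 + b2))]
```

Because the eleven reference codes are well separated relative to the jitter, even this tiny network reaches near-perfect training accuracy.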
The electric quadrupole moments of some scandium isotopes (41,43,44,45,46,47Sc) have been calculated using the shell model in the proton-neutron formalism. Excitations out of the major shell-model space were taken into account through a microscopic theory known as core polarization effects. The set of effective charges adopted in the theoretical calculations emerges from the core polarization effect. The NushellX@MSU code was used to calculate the one-body density matrix (OBDM). The simple harmonic oscillator potential was used to generate the single-particle matrix elements. Our theoretical calculations of the quadrupole moments used two types of effective interactions to obtain the best interaction compared with the exp…
The issues of liquidity, profitability, money employment, and capital adequacy are among the most important issues that have received high consideration from authors and researchers in their attempts to find out the real relationship between them and how balance can be achieved, which is the main goal of every bank.
To bring the study variables together, the research formulated the problem of the study, which concerns the bank's ability to enlarge profits without dissipating its liquidity, something that would reflect negatively on the bank's reputation as well as the customers' trust. For all these matters, the researcher proposed a set of aims, the most important of which is the estimation of the bank's profitability and liquid…
Abstract
The grey system model GM(1,1) is a time-series prediction model and the basis of grey theory. This research presents four methods for estimating the parameters of the grey model GM(1,1): the accumulative method (ACC), the exponential method (EXP), the modified exponential method (Mod EXP), and the particle swarm optimization method (PSO). These methods were compared on the basis of the mean squared error (MSE) and the mean absolute percentage error (MAPE), and simulation was used to select the best of the four methods, which was then applied to real data. These data represent the consumption rate of two types of oils a he…
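For reference, the standard GM(1,1) construction (AGO accumulation, background values, and the whitened exponential response) can be sketched as follows; ordinary least squares stands in for the ACC, EXP, Mod EXP, and PSO estimators compared in the paper.

```python
import numpy as np

def gm11(x0, horizon=0):
    """Fit GM(1,1) and return (a, b, fitted-plus-forecast series).

    Ordinary least squares estimates the development coefficient a and the
    grey input b; the paper's ACC/EXP/Mod EXP/PSO estimators would replace
    that single step."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                          # accumulated generating operation
    z = 0.5 * (x1[1:] + x1[:-1])                # background (mean-generated) values
    B = np.column_stack([-z, np.ones(len(x0) - 1)])
    (a, b), *_ = np.linalg.lstsq(B, x0[1:], rcond=None)
    k = np.arange(len(x0) + horizon)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a   # whitened-equation response
    x0_hat = np.concatenate([[x0[0]], np.diff(x1_hat)]) # inverse AGO
    return a, b, x0_hat
```

On a near-geometric series the grey differential equation is satisfied almost exactly, so the fitted values track the data closely and the tail of the returned series gives the forecast.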
Vehicular ad hoc network (VANET) is a distinctive form of Mobile Ad hoc Network (MANET) that has attracted increasing research attention recently. The purpose of this study is to comprehensively investigate the elements constituting a VANET system and to address several challenges that have to be overcome to enable reliable wireless communication within a vehicular environment. Furthermore, the study undertakes a survey of the taxonomy of existing VANET routing protocols, with particular emphasis on the strengths and limitations of these protocols, in order to help solve VANET routing issues. Moreover, as mobile users demand constant network access regardless of their location, this study seeks to evaluate various mobility models for vehi…
This paper deals with the modeling of a preventive maintenance strategy applied to a single-unit system subject to random failures. According to this policy, the system is subjected to imperfect periodic preventive maintenance that restores it to 'as good as new' with probability p and leaves it 'as bad as old' with probability q. Imperfect repairs are performed following failures occurring between consecutive preventive maintenance actions, i.e. the times between failures follow a decreasing quasi-renewal process with parameter a. Considering the average durations of the preventive and corrective maintenance actions a…
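One way to make the policy concrete is a small simulation. The sketch below assumes exponential inter-failure times whose mean shrinks by the quasi-renewal factor after each failure; the distributional details and parameter names are assumptions, not the paper's model.

```python
import random

def simulate_failures(T, n_periods, p, alpha, mean_ttf, rng, cap=10_000):
    """Count failures over n_periods PM intervals of length T (illustrative).

    Inter-failure times are exponential with mean mean_ttf * alpha**k, where
    k counts failures since the last perfect renewal -- a decreasing
    quasi-renewal process with parameter alpha <= 1.  Each PM action is
    perfect ('as good as new') with probability p and minimal ('as bad as
    old') with probability q = 1 - p.  `cap` and the floor on the mean guard
    against the degenerate regime where gaps shrink towards zero."""
    failures, k = 0, 0
    for _ in range(n_periods):
        t = 0.0
        for _ in range(cap):
            mean = max(mean_ttf * alpha ** k, 1e-9)
            gap = rng.expovariate(1.0 / mean)
            if t + gap > T:
                break
            t += gap
            failures += 1
            k += 1
        if rng.random() < p:      # perfect preventive maintenance
            k = 0
    return failures
```

With alpha = 1 and p = 1 the model collapses to a homogeneous Poisson process, which gives a convenient sanity check; with p = 0 the degradation accumulates across periods and the failure count grows sharply.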
Abstract:
This study is concerned with the estimation of constant and time-varying parameters in non-linear ordinary differential equations that do not have analytical solutions. The estimation is carried out in a multi-stage method in which the constant and time-varying parameters are estimated sequentially over several stages. In the first stage, the model of the differential equations is converted to a regression model that includes the state variables and their derivatives; the state variables and their derivatives are then estimated by a penalized-splines method, and these estimates are substituted into the regression model. In the second stage, the pseudo-least-squares method was used to es…
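A toy illustration of the two-stage idea for the scalar equation x'(t) = -theta * x(t): stage one smooths the noisy state and differentiates the smoother, stage two applies least squares to the resulting regression form. An ordinary polynomial fit stands in for the penalized-splines smoother of the paper.

```python
import numpy as np

def two_stage_theta(t, x_obs, degree=7):
    """Estimate theta in x'(t) = -theta * x(t) by gradient matching.

    Stage 1: fit a smoother to the noisy observations and differentiate it
    (polynomial fit here; the paper uses penalized splines).
    Stage 2: least squares for dx/dt = -theta * x, which has the closed
    form theta = -<x, dx> / <x, x>."""
    coef = np.polyfit(t, x_obs, degree)        # stage-1 smoother
    x_hat = np.polyval(coef, t)                # smoothed state
    dx_hat = np.polyval(np.polyder(coef), t)   # its derivative
    return -np.dot(x_hat, dx_hat) / np.dot(x_hat, x_hat)
```

Because the derivative is taken from the smoother rather than from noisy finite differences, the stage-two regression stays well behaved even with observation noise.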
This work addressed the assignment problem (AP) based on fuzzy costs, where the objective of this study is to minimize the cost. A triangular or trapezoidal fuzzy number was assigned to each fuzzy cost. In addition, the assignment models were applied to linguistic variables, which were first converted to quantitative fuzzy data by using Yager's ranking method. The results showed that the quantitative data have a considerable effect when considered in fuzzy mathematical models.
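For a small instance, the Yager ranking step and the resulting crisp assignment can be sketched as follows. The defuzzification shown is the standard alpha-cut-midpoint integral, which reduces to (a + 2b + c)/4 for a triangular number and (a + b + c + d)/4 for a trapezoidal one; the enumeration solver is adequate only for tiny problems.

```python
import numpy as np
from itertools import permutations

def yager_rank(fn):
    """Yager ranking index: the integral over alpha of the alpha-cut midpoint."""
    if len(fn) == 3:                     # triangular (a, b, c)
        a, b, c = fn
        return (a + 2 * b + c) / 4.0
    a, b, c, d = fn                      # trapezoidal (a, b, c, d)
    return (a + b + c + d) / 4.0

def fuzzy_assignment(fuzzy_costs):
    """Defuzzify each cost with Yager's index, then minimize total cost by
    enumerating all assignments (fine for small n; use the Hungarian
    algorithm for larger instances)."""
    C = np.array([[yager_rank(c) for c in row] for row in fuzzy_costs])
    n = len(C)
    best = min(permutations(range(n)),
               key=lambda perm: sum(C[i, j] for i, j in enumerate(perm)))
    return list(best), float(sum(C[i, j] for i, j in enumerate(best)))
```

Once the fuzzy costs are ranked, the problem is an ordinary crisp assignment problem, which is where the ranking method's effect on the solution shows up.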