The estimation of the parameters of a linear regression model is usually based on the ordinary least squares method, which rests on several basic assumptions; the accuracy of the parameter estimates therefore depends on the validity of these assumptions. Among them are homogeneity of the error variance and normality of the errors, and when they do not hold the model becomes unrealistic. The most successful remedy has been robust estimation, in particular the MM-estimator, which has proved its efficiency for this purpose. These assumptions are also unattainable when the problem under study involves complex data arising from more than one model; to deal with this type of problem, a mixture of linear regressions is used to model such data. In this article we propose a genetic algorithm-based method combined with the MM-estimator, called RobGA, to improve the accuracy of estimation in the final stage. We compare the proposed method with the robust bisquare mixture method (MixBi) on real data representing blood samples. The results show that RobGA estimates the model parameters more efficiently than MixBi with respect to mean square error (MSE) and classification error (CE).
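The abstract does not give implementation details, so the following is only a minimal sketch of the general idea of pairing a genetic algorithm with a robust (Tukey bisquare, MM-style) fit criterion for a two-component mixture of linear regressions. The synthetic data, GA operators and tuning constants are illustrative assumptions; this is not the RobGA algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-component mixture of linear regressions with gross outliers
# (hypothetical data standing in for the blood-sample data set).
n = 200
x = rng.uniform(0, 10, n)
z = rng.random(n) < 0.5                       # latent component labels
y = np.where(z, 1.0 + 2.0 * x, 8.0 - 1.5 * x) + rng.normal(0, 0.5, n)
y[rng.choice(n, 10, replace=False)] += 20.0   # contamination

def tukey_rho(r, c=4.685):
    """Tukey bisquare rho function, the loss used by bisquare/MM-type fits."""
    u = np.clip(np.abs(r) / c, 0.0, 1.0)
    return (c ** 2 / 6.0) * (1.0 - (1.0 - u ** 2) ** 3)

def fitness(theta):
    """Robust criterion: each point is charged the smaller of its two
    component losses, scaled by a MAD-based robust residual scale."""
    a1, b1, a2, b2 = theta
    r = np.minimum(np.abs(y - (a1 + b1 * x)), np.abs(y - (a2 + b2 * x)))
    s = np.median(np.abs(r - np.median(r))) / 0.6745 + 1e-9
    return -tukey_rho(r / s).sum()

# Plain generational GA over the four regression coefficients.
pop = rng.uniform(-10, 10, size=(60, 4))
for gen in range(200):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[::-1][:20]]        # truncation selection
    children = []
    while len(children) < 40:
        i, j = rng.choice(20, 2, replace=False)
        w = rng.random(4)
        child = w * parents[i] + (1 - w) * parents[j]   # blend crossover
        children.append(child + rng.normal(0, 0.3, 4))  # Gaussian mutation
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(p) for p in pop])]
print("estimated (intercept, slope) per component:\n", best.reshape(2, 2))
```

A final refinement step, such as a weighted robust refit given the implied point-to-component assignments, would normally follow; it is omitted here for brevity.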
This research studies the features and characteristics of virtual museums and contributes to recognizing the diversity of visual presentation methods. Virtual museums offer participation and visual communication with their programs at any time, which encourages reflection, thinking and note-taking and develops practical and innovative skills through viewing the environments. The study is divided into two sections, the first of which covers virtual museum techniques. These techniques were studied as a means of reaching the public; they are used remotely through personal computers or smartphones, acting as virtual libraries that store images and information formed and built digitally, and how …
This paper includes a comparison of denoising techniques using a statistical approach, principal component analysis with local pixel grouping (PCA-LPG); the procedure is iterated a second time to further improve the denoising performance. Other enhancement filters were also used: an adaptive Wiener low-pass filter applied to a grayscale image degraded by constant-power additive noise, based on statistics estimated from a local neighborhood of each pixel; a median filter of the input noisy image, in which each output pixel contains the median value of the M-by-N neighborhood around the corresponding pixel in the input image; a Gaussian low-pass filter; and an order-statistic filter. Experimental results show that the LPG-PCA method …
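As a rough illustration of the comparison filters mentioned above (adaptive Wiener, median, Gaussian low-pass and order-statistic), the sketch below applies them with SciPy to a synthetic noisy image and reports PSNR; the PCA-LPG stage itself is not reproduced, and the test image and noise level are assumptions.

```python
import numpy as np
from scipy.signal import wiener
from scipy.ndimage import median_filter, gaussian_filter, percentile_filter

rng = np.random.default_rng(1)

# Synthetic grayscale image degraded by constant-power additive noise
# (a stand-in for the real test images).
clean = np.zeros((128, 128))
clean[32:96, 32:96] = 1.0
noisy = clean + rng.normal(0, 0.15, clean.shape)

def psnr(ref, img):
    """Peak signal-to-noise ratio in dB for images on a [0, 1] scale."""
    return 10 * np.log10(1.0 / np.mean((ref - img) ** 2))

candidates = {
    "adaptive Wiener (5x5)":           wiener(noisy, mysize=5),
    "median (3x3)":                    median_filter(noisy, size=3),
    "Gaussian low-pass (sigma=1)":     gaussian_filter(noisy, sigma=1.0),
    "order-statistic (25th pct, 3x3)": percentile_filter(noisy, percentile=25, size=3),
}
for name, out in candidates.items():
    print(f"{name:34s} PSNR = {psnr(clean, out):.2f} dB")
```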
This research aims primarily to highlight personal tax exemptions through a comparative study with some Arab and European regulations, carried out by means of both theoretical and comparative analyses. The most important finding of the study is the need to grant personal and family exemptions that differ according to the civil status of the taxpayer (single or married); in other words, the exemption should increase with the number of family members who depend on the taxpayer, in keeping with its social purpose. The study also recommends taking into account incomes that require a particular effort and reviewing the tax rates, since it is unreasonable for wages to be subject to the same rates applied to commercial profits.
Scheduling is considered one of the most fundamental and essential bases of project management. Several methods are used for project scheduling, such as CPM, PERT and GERT. Since too many uncertainties are involved in estimating the duration and cost of activities, these methods lack the capability to model practical projects. Although schedules can be developed for construction projects at an early stage, there is always a possibility of unexpected material or technical shortages during the construction stage. The objective of this research is to build a fuzzy mathematical model that includes time-cost trade-off analysis and resource-constraint analysis applied concurrently. The proposed model has been formulated using fuzzy …
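As a toy illustration of how fuzzy activity estimates can be propagated through a schedule, the sketch below adds triangular fuzzy durations along a serial path and defuzzifies the result; the activities, numbers and centroid defuzzification rule are assumptions and do not represent the paper's full model with time-cost trade-off and resource constraints.

```python
import numpy as np

# Triangular fuzzy durations (optimistic, most likely, pessimistic) in days
# for a toy serial path A -> B -> C (not the paper's case study).
durations = {
    "A": np.array([2.0, 3.0, 5.0]),
    "B": np.array([4.0, 6.0, 9.0]),
    "C": np.array([1.0, 2.0, 4.0]),
}

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number: (l + m + u) / 3."""
    return tfn.sum() / 3.0

# Addition of triangular fuzzy numbers is component-wise, so the fuzzy
# project duration of a serial path is simply the sum of its activities.
total = sum(durations.values())
print("fuzzy project duration (l, m, u):", total)
print("defuzzified (crisp) duration    :", round(defuzzify(total), 2))
```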
In this paper, fatigue damage accumulation was studied using several methods: Corton-Dalon (CD), Corton-Dalon-Marsh (CDM), a new non-linear model and an experimental method. Fatigue-life predictions based on the two classical methods, CD and CDM, are uneconomic and non-conservative, respectively, whereas satisfactory predictions were obtained by applying the proposed non-linear model (the present model) to medium carbon steel, in comparison with the experimental work. Many shortcomings of the two classical methods are related to their inability to take into account surface-treatment effects such as shot peening. It is clear that the new model gives a much better and cons…
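For orientation, the sketch below evaluates the linear (Palmgren-Miner) damage sum and the power-law weighted accumulation rule commonly attributed to Corten and Dolan for a two-level block-loading case. The stress levels, S-N lives and exponent are placeholder values, and neither the CDM variant nor the paper's proposed non-linear model is reproduced here.

```python
import numpy as np

# Placeholder two-level block-loading data (not the medium carbon steel
# data of the paper); sigma_1 is the highest stress level in the block.
stress = np.array([300.0, 220.0])   # stress amplitudes, MPa
alpha  = np.array([0.3, 0.7])       # fraction of cycles at each level
N_f    = np.array([5.0e4, 4.0e5])   # cycles to failure from an assumed S-N curve
d      = 4.8                        # Corten-Dolan exponent (typical literature value)

# Palmgren-Miner linear rule: failure when sum(n_i / N_i) = 1, so the
# block-loading life is 1 / sum(alpha_i / N_i).
life_linear = 1.0 / np.sum(alpha / N_f)

# Corten-Dolan form: damage at each level is weighted by (sigma_i / sigma_1)^d
# relative to the life N_1 at the highest stress level.
life_cd = N_f[0] / np.sum(alpha * (stress / stress[0]) ** d)

print(f"linear (Miner) life : {life_linear:,.0f} cycles")
print(f"Corten-Dolan life   : {life_cd:,.0f} cycles")
```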
This paper compares double informative priors assumed for the reliability function of the Pareto type I distribution. To estimate the reliability function of the Pareto type I distribution by Bayes estimation, two different kinds of information are used; that is, two different priors are selected for the parameter of the Pareto type I distribution. Three double priors are assumed: chi-squared-gamma, gamma-Erlang and Erlang-exponential. The estimators are derived under the squared-error loss function with the different double priors, and a simulation study is used to compare the performance for …
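The paper's double priors are not reproduced here; the sketch below only shows the basic Bayes calculation for the Pareto type I reliability function under squared-error loss with a single conjugate gamma prior on the shape parameter (scale assumed known). The data and hyperparameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Pareto type I with known scale k and shape theta:
#   f(x) = theta * k**theta / x**(theta + 1),  x >= k,   R(t) = (k / t)**theta
k, theta_true, n = 1.0, 2.0, 50
x = k * (1.0 - rng.random(n)) ** (-1.0 / theta_true)   # inverse-CDF sampling

# Conjugate gamma prior on theta (shape a, rate b); the posterior is
# Gamma(a + n, b + sum(log(x_i / k))).
a, b = 2.0, 1.0
a_post = a + n
b_post = b + np.sum(np.log(x / k))

t = 2.0  # mission time at which the reliability is evaluated

# Under squared-error loss the Bayes estimator is the posterior mean of
# R(t) = exp(-theta * log(t / k)), i.e. the gamma MGF at -log(t / k).
R_bayes = (b_post / (b_post + np.log(t / k))) ** a_post
R_mle   = (k / t) ** (n / np.sum(np.log(x / k)))   # plug-in MLE, for comparison
R_true  = (k / t) ** theta_true

print(f"true R(t) = {R_true:.4f}, Bayes = {R_bayes:.4f}, MLE plug-in = {R_mle:.4f}")
```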
Background: The morphology of the root canal system is divergent and unpredictable and is linked to clinical complications that directly affect the treatment outcome. This necessitates a continuously updated review of the effective clinical and laboratory methods for identifying this anatomy, and of the classification systems suitable for communication and interpretation in different situations. Data: Only electronically published papers were searched for this review. Sources: The PubMed website was the only source used to search for data, using the keywords "root", "canal", "morphology" and "classification". Study selection: The 153 papers most relevant to the topic were selected, especially original articles and review papers …
Support vector machines (SVMs) are supervised learning models that analyze data for classification or regression. For classification, the SVM is widely used; it selects an optimal hyperplane that separates the two classes. The SVM has very good accuracy and is extremely robust compared with some other classification methods such as logistic regression, random forest, k-nearest neighbor and the naïve model. However, working with large datasets can cause problems such as long computation times and inefficient results. In this paper, the SVM is modified by using a stochastic gradient descent process. The modified method, stochastic gradient descent SVM (SGD-SVM), is checked using two simulated datasets. Since the classification of different ca…
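A minimal sketch of a linear SVM trained by stochastic gradient descent on the regularized hinge loss (Pegasos-style step sizes) is given below; the simulated data and tuning constants are assumptions and this is not the paper's SGD-SVM implementation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two simulated Gaussian classes (stand-ins for the paper's simulation datasets).
n = 500
X = np.vstack([rng.normal(-1.0, 1.0, (n, 2)), rng.normal(1.5, 1.0, (n, 2))])
y = np.hstack([-np.ones(n), np.ones(n)])

def sgd_svm(X, y, lam=1e-3, epochs=20):
    """Linear SVM fitted by stochastic gradient descent on the
    L2-regularized hinge loss, with step size 1 / (lam * t)."""
    w, b, t = np.zeros(X.shape[1]), 0.0, 0
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            t += 1
            eta = 1.0 / (lam * t)
            if y[i] * (X[i] @ w + b) < 1.0:       # point violates the margin
                w = (1.0 - eta * lam) * w + eta * y[i] * X[i]
                b += eta * y[i]
            else:                                 # only the regularizer acts
                w = (1.0 - eta * lam) * w
    return w, b

w, b = sgd_svm(X, y)
print(f"training accuracy: {np.mean(np.sign(X @ w + b) == y):.3f}")
```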
The objective of this research paper is two-fold. The first is a precise reading of the theoretical underpinnings of each of the two strategic approaches: the "market approach" of M. Porter and the alternative resource-based view (RBV); the paper advocates the idea that the two approaches are complementary. Secondly, we discuss the possibility of combining the two competitive strategies, cost leadership and differentiation. Finally, we propose a consensual approach that we call "dual domination".