A mixture model is used to model data that come from more than one component. In recent years it has become an effective tool for drawing inferences about the complex data we encounter in real life, and it can also serve as a powerful confirmatory tool for classifying observations based on the similarities among them. In this paper, several mixture regression-based methods were applied under the assumption that the data come from a finite number of components. These methods were compared according to their results in estimating the component parameters, and observation membership was inferred and assessed for each of them. The results showed that the flexible mixture model outperformed the others in most simulation scenarios according to the integrated mean square error and the integrated classification error.
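The abstract gives no implementation details; as a minimal illustration of fitting a finite mixture, the sketch below runs expectation-maximization (EM) on a two-component univariate Gaussian mixture. The component count, initialization, iteration budget, and synthetic data are all assumptions made for the example, not taken from the paper.

```python
import math
import random

random.seed(0)

def em_two_gaussians(data, iters=200):
    """Fit a two-component 1-D Gaussian mixture by EM.

    Returns (weights, means, standard deviations). The crude initialization
    and fixed iteration count are simplifications for illustration.
    """
    mu = [min(data), max(data)]          # crude initial means
    sigma = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior membership probability of each point per component
        resp = []
        for x in data:
            p = [pi[k] / (sigma[k] * math.sqrt(2 * math.pi))
                 * math.exp(-((x - mu[k]) ** 2) / (2 * sigma[k] ** 2))
                 for k in range(2)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: re-estimate weights, means, and spreads from responsibilities
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            sigma[k] = math.sqrt(max(var, 1e-6))
    return pi, mu, sigma

# synthetic data: two well-separated components around 0 and 5
data = ([random.gauss(0.0, 1.0) for _ in range(300)] +
        [random.gauss(5.0, 1.0) for _ in range(300)])
weights, means, sds = em_two_gaussians(data)
```

The responsibilities computed in the E-step are exactly the inferred observation memberships the abstract refers to: assigning each point to its highest-responsibility component yields the classification whose error the paper assesses.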
The study of statistical distributions aims to obtain the best description of sets of variable phenomena, each of which follows the behavior of one of those distributions. The study of estimation procedures for such distributions is an essential part of analyzing variable behavior that cannot be neglected. Accordingly, this research is an attempt to find the best method for estimating the distribution of interest, the generalized linear failure rate distribution, by studying its theoretical aspects under several estimation methods: maximum likelihood, least squares, and a proposed mixing method.
The research
The industrial banks in most countries of the world play an essential and vital role in developing the industrial sector, owing to their importance in economic and social development. The research aims to study the ability of the Industrial Bank of Iraq to support the growth and development of the mixed industrial sector companies through the loans granted to them. In addition, the research examines the credit policies pursued by the Industrial Bank of Iraq to finance the mixed industrial sector companies. The data of the research were collected from the financial statements of the Industrial Bank of Iraq for the period 2007-2011, and by means of a questionnaire distributed to a sample of the mixed industrial sector companies in
This paper shows how to estimate the parameters of the generalized exponential Rayleigh (GER) distribution by three estimation methods: the maximum likelihood estimator method, the moment employing estimation method (MEM), and the rank set sampling estimator method (RSSEM). A simulation technique is used with all of these methods to find the parameters of the generalized exponential Rayleigh distribution. Finally, the mean squared error criterion is used to compare the estimation methods and determine which of them is best.
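The GER likelihood itself is involved; as a hedged illustration of how such simulation comparisons work, the sketch below compares maximum likelihood against a method-of-moments estimator for the scale of a plain Rayleigh distribution, scoring both by mean squared error over repeated samples. The distribution choice, true parameter, sample size, and replication count are assumptions for the example.

```python
import math
import random

random.seed(1)

def rayleigh_sample(sigma, n):
    # inverse-CDF sampling: X = sigma * sqrt(-2 ln U), U ~ Uniform(0, 1]
    return [sigma * math.sqrt(-2.0 * math.log(1.0 - random.random()))
            for _ in range(n)]

def mle(xs):
    # maximum likelihood for the Rayleigh scale: sigma^2 = sum(x^2) / (2n)
    return math.sqrt(sum(x * x for x in xs) / (2 * len(xs)))

def mom(xs):
    # method of moments: E[X] = sigma * sqrt(pi / 2)
    return (sum(xs) / len(xs)) * math.sqrt(2.0 / math.pi)

sigma_true, reps, n = 2.0, 500, 50
mse = {"MLE": 0.0, "MOM": 0.0}
for _ in range(reps):
    xs = rayleigh_sample(sigma_true, n)
    mse["MLE"] += (mle(xs) - sigma_true) ** 2 / reps
    mse["MOM"] += (mom(xs) - sigma_true) ** 2 / reps
```

Asymptotic theory gives the Rayleigh MLE a slightly smaller variance than the moment estimator (sigma^2/(4n) versus roughly 0.273 sigma^2/n), which is exactly the kind of ordering the paper's mean squared error criterion is designed to detect.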
Linear programming currently occupies a prominent position in various fields and has wide applications, since it provides a means of studying the behavior of a large number of systems. It is also the simplest and easiest type of model that can be built to address industrial, commercial, military and other problems, and through it an optimal quantitative value can be obtained. In this research we dealt with the post-optimality solution, also known as sensitivity analysis, using the principle of shadow prices. The scientific solution to a problem is not complete once the optimal solution is reached: any change in the values of the model constants, known as the inputs of the model, will change
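Shadow prices can be demonstrated on a toy problem: the shadow price of a binding constraint is the change in the optimal objective value when that constraint's right-hand side is relaxed by one unit. The sketch below solves a small two-variable LP by enumerating basic feasible vertices and measures the shadow price by perturbation; the objective and constraint numbers are made up for illustration, not taken from the research.

```python
from itertools import combinations

def solve_lp(c, A, b):
    """Maximize c.x for a 2-variable LP with A x <= b and x >= 0,
    by enumerating vertex candidates (fine for tiny examples only)."""
    # append the non-negativity constraints -x <= 0 and -y <= 0
    rows = [list(r) for r in A] + [[-1.0, 0.0], [0.0, -1.0]]
    rhs = list(b) + [0.0, 0.0]
    best, best_x = None, None
    for i, j in combinations(range(len(rows)), 2):
        a1, a2 = rows[i], rows[j]
        det = a1[0] * a2[1] - a1[1] * a2[0]
        if abs(det) < 1e-12:
            continue                      # parallel constraints, no vertex
        # Cramer's rule for the intersection of the two constraint lines
        x = (rhs[i] * a2[1] - a1[1] * rhs[j]) / det
        y = (a1[0] * rhs[j] - rhs[i] * a2[0]) / det
        if all(r[0] * x + r[1] * y <= s + 1e-9 for r, s in zip(rows, rhs)):
            val = c[0] * x + c[1] * y
            if best is None or val > best:
                best, best_x = val, (x, y)
    return best, best_x

c, A = [3.0, 2.0], [[1.0, 1.0], [1.0, 3.0]]
z0, _ = solve_lp(c, A, [4.0, 6.0])
z1, _ = solve_lp(c, A, [5.0, 6.0])   # relax the first constraint by one unit
shadow_price_1 = z1 - z0             # marginal value of that resource
```

Here the optimum moves from 12 to 15 when the first resource grows from 4 to 5, so its shadow price is 3: each extra unit of that resource is worth 3 in the objective, while a non-binding constraint would have a shadow price of zero.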
This study concerns the estimation of a simultaneous equations system for the Tobit model, in which the dependent variables are limited; this affects the choice of a good estimator. Therefore, we use estimation methods different from the classical methods, which in such a case would produce biased and inconsistent estimators: the Nelson-Olson method and the two-stage limited dependent variable (2SLDV) method, in order to obtain estimators that possess the properties of a good estimator.
That is, parameters will be estimated
A modified version of the generalized standard addition method (GSAM) was developed. This modified version was used for the quantitative determination of arginine (Arg) and glycine (Gly) in the arginine acetylsalicylate-glycine complex. According to this method, two linear equations are solved to obtain the amounts of Arg and Gly. The first equation was obtained by spectrophotometric measurement of the total absorbance of the Arg and Gly colored complexes with ninhydrin. The second equation was obtained by measuring the total acid consumed by the total amino groups of Arg and Gly. The titration was carried out in non-aqueous media using perchloric acid in glacial acetic acid as the titrant. The developed method
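The two-equation structure described here (one absorbance balance, one acid-consumption balance, two unknown amounts) is an ordinary 2x2 linear system. The sketch below solves it by Cramer's rule; all sensitivity coefficients and measured totals are hypothetical placeholder numbers, since the abstract does not report the actual calibration values.

```python
def solve_two_component(k1a, k1g, m1, k2a, k2g, m2):
    """Solve the 2x2 system
         k1a*Ca + k1g*Cg = m1   (total absorbance measurement)
         k2a*Ca + k2g*Cg = m2   (total acid consumed by amino groups)
    for the amounts Ca (Arg) and Cg (Gly) by Cramer's rule."""
    det = k1a * k2g - k1g * k2a
    if abs(det) < 1e-12:
        raise ValueError("sensitivity coefficients are not independent")
    ca = (m1 * k2g - k1g * m2) / det
    cg = (k1a * m2 - m1 * k2a) / det
    return ca, cg

# hypothetical sensitivities (absorbance per mg; acid equivalents per mg)
arg_mg, gly_mg = solve_two_component(0.10, 0.15, 3.5,   # absorbance equation
                                     1.0, 1.0, 30.0)    # acid equation
```

With these placeholder inputs the system yields 20 mg of Arg and 10 mg of Gly; the method only works when the two responses weight the analytes differently, otherwise the determinant vanishes and the amounts cannot be separated.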
HCl is separated from an HCl-H2SO4 solution by a membrane distillation (MD) process. The flat-sheet membranes were made from polyvinylidene fluoride (PVDF) and polypropylene (PP), and plate-and-frame modules of these membranes were used in the process. The feed is a mixture of HCl and H2SO4 whose composition depends on the metals to be treated. The HCl concentration increased in the permeate during the process, while the sulfuric acid concentration increased gradually in the feed. At the start of the concentration run, the acid concentrations in the feed were 50 g/dm3 of sulfuric acid and 50 g/dm3 of hydrochloric acid; at a feed temperature of 333 K the permeate flux was 71 dm
In this paper, some commonly used hierarchical clustering techniques are compared. A comparison was made between the agglomerative hierarchical clustering technique and the k-means family of techniques, which includes the standard k-means technique, a variant k-means technique, and bisecting k-means. Although hierarchical clustering is considered one of the best clustering approaches, its usage is limited by its time complexity. The results, calculated from an analysis of the characteristics of the clustering algorithms and the nature of the data, showed that the bisecting k-means technique is the best compared with the other methods used.
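Bisecting k-means works by repeatedly splitting one existing cluster in two with plain 2-means until the desired number of clusters is reached. The sketch below is a minimal one-dimensional version; the split-the-largest-cluster rule, the synthetic data, and the initialization are simplifying assumptions for illustration (implementations often split the cluster with the largest within-cluster error instead).

```python
import random

random.seed(2)

def kmeans2(points, iters=50):
    """Plain 2-means on 1-D data: returns the two resulting clusters."""
    c = [min(points), max(points)]            # spread-out initial centers
    for _ in range(iters):
        groups = [[], []]
        for p in points:
            groups[0 if abs(p - c[0]) <= abs(p - c[1]) else 1].append(p)
        # move each center to the mean of its group (keep it if group is empty)
        c = [sum(g) / len(g) if g else c[i] for i, g in enumerate(groups)]
    return groups

def bisecting_kmeans(points, k):
    """Repeatedly split the largest cluster with 2-means until k clusters exist."""
    clusters = [list(points)]
    while len(clusters) < k:
        clusters.sort(key=len, reverse=True)
        clusters.extend(kmeans2(clusters.pop(0)))
    return clusters

# three tight, well-separated groups around 0, 2 and 10
data = ([random.gauss(0.0, 0.3) for _ in range(50)] +
        [random.gauss(2.0, 0.3) for _ in range(50)] +
        [random.gauss(10.0, 0.3) for _ in range(50)])
clusters = bisecting_kmeans(data, 3)
```

Each bisection only clusters the points of one cluster rather than the whole data set, which is why bisecting k-means scales better than agglomerative hierarchical clustering while still producing a divisive hierarchy of splits.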
Canonical correlation analysis is one of the common methods for analyzing data and understanding the relationship between two sets of variables under study, as it depends on analyzing the variance matrix or the correlation matrix. Researchers use many methods to estimate the canonical correlation (CC); some are sensitive to outliers, while others are resistant to such values, and there are criteria for checking the efficiency of the estimation methods.
In our research, we dealt with robust estimation methods that depend on the correlation matrix in the analysis process to obtain a robust canonical correlation coefficient, which is the method of Biwe
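Since the canonical correlations depend only on the (partitioned) correlation matrix, a robust version is obtained simply by feeding a robust correlation estimate into the same eigen-computation. The sketch below shows that computation for the classical sample correlation matrix; the simulated two-variable sets sharing one common factor are an assumption for the example, and a robust method would only replace how R is estimated.

```python
import numpy as np

def canonical_correlations(R, p):
    """Canonical correlations from a joint correlation matrix R, where the
    first p variables form set X and the remaining variables form set Y."""
    Rxx, Ryy = R[:p, :p], R[p:, p:]
    Rxy, Ryx = R[:p, p:], R[p:, :p]
    # eigenvalues of Rxx^-1 Rxy Ryy^-1 Ryx are the squared canonical correlations
    M = np.linalg.solve(Rxx, Rxy) @ np.linalg.solve(Ryy, Ryx)
    eig = np.sort(np.linalg.eigvals(M).real)[::-1]
    return np.sqrt(np.clip(eig, 0.0, 1.0))

rng = np.random.default_rng(0)
z = rng.standard_normal((500, 1))             # shared latent factor
X = z + 0.5 * rng.standard_normal((500, 2))   # set X: two noisy copies of z
Y = z + 0.5 * rng.standard_normal((500, 2))   # set Y: two noisy copies of z
R = np.corrcoef(np.hstack([X, Y]).T)
cc = canonical_correlations(R, 2)
```

With this construction the first canonical correlation recovers the shared factor (theoretically about 0.89 for this noise level) while the second is near zero, which is the pattern an efficiency criterion would check an estimator against.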
This paper discusses the study of computer neologisms in the Russian language. The problems of studying computer terminology are constantly aggravated by the processes by which computer technology is introduced into all walks of life. The study identifies ways of word formation, the origin of computer terms, and the possibilities of their usage in the Russian language. The Internet is considered a worldwide communication tool used extensively by students, housewives and professionals alike. It is a heterogeneous environment consisting of various hardware and software configurations that need to be configured to support the languages used. The development of Internet content and services is essential for expanding Internet usage. Some of the