Provisions of Economic Concentration of Enterprises: A Comparative Study
This paper discusses the process of compounding two distributions using a new compounding procedure that connects a number of lifetime (continuous) distributions, where the number of these distributions is a random variable following one of the discrete distributions. Based on this procedure, the zero-truncated Poisson distribution was compounded with the Weibull distribution to produce a new three-parameter lifetime distribution. Its advantage is that the failure rate function covers many cases (increasing, decreasing, unimodal, bathtub). The properties of the resulting distribution were studied, such as the expectation, variance, cumulative function, reliability function, and fa…
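The compounding step described above can be sketched numerically. Below is a minimal Monte Carlo sketch, assuming (as is common for such compound lifetime constructions) that the lifetime is the minimum of N i.i.d. Weibull variables with N drawn from a zero-truncated Poisson; the parameter values and the closed-form reliability used for the check are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def zt_poisson(lam, size, rng):
    """Sample a zero-truncated Poisson by rejecting zeros."""
    out = np.empty(size, dtype=int)
    filled = 0
    while filled < size:
        draw = rng.poisson(lam, size - filled)
        draw = draw[draw > 0]
        out[filled:filled + draw.size] = draw
        filled += draw.size
    return out

def compound_lifetime(lam, shape, scale, size, rng):
    """Lifetime = min of N i.i.d. Weibull variables, N ~ zero-truncated Poisson."""
    n = zt_poisson(lam, size, rng)
    return np.array([scale * rng.weibull(shape, k).min() for k in n])

t = compound_lifetime(lam=2.0, shape=1.5, scale=1.0, size=10_000, rng=rng)

# Under this construction, R(t) = (exp(lam * S(t)) - 1) / (exp(lam) - 1),
# where S(t) = exp(-(t/scale)**shape) is the Weibull survival function.
t0 = 0.5
emp = (t > t0).mean()
theo = (np.exp(2.0 * np.exp(-(t0 / 1.0) ** 1.5)) - 1) / (np.exp(2.0) - 1)
```

The empirical survival proportion `emp` should agree with the closed form `theo` up to Monte Carlo error, which is one way to sanity-check a derived reliability function.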
The development of computers, on both the hardware and software sides, has made complicated robust estimators computable and given us a new way of dealing with data when classical discriminant methods fail to achieve their optimal properties, especially when the data contain a percentage of outliers, and thus cannot attain the minimum probability of misclassification. The research aims to compare robust estimators that resist outlier influence, such as the robust H estimator, the robust S estimator, and the robust MCD estimator, and to robustify the misclassification probability while showing the influence of outliers on the percentage of misclassification when classical methods are used, the other …
In linear regression, an outlier is an observation with a large residual. In other words, it is an observation whose dependent-variable value is unusual given its values on the predictor variables. An outlier observation may indicate a data entry error or some other problem.
An observation with an extreme value on a predictor variable is a point with high leverage. Leverage is a measure of how far an independent variable deviates from its mean. These leverage points can have an effect on the estimate of regression coefficients.
Robust estimation for regression parameters deals with cases that have very high leverage and cases that are outliers. Robust estimation is essentially a …
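The leverage and residual diagnostics described above can be illustrated with a short sketch. This is a synthetic example, not the paper's data: leverage is read off the diagonal of the hat matrix, and a common rule of thumb (leverage above 2p/n) flags high-leverage points.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
x = rng.normal(size=n)
y = 2.0 + 3.0 * x + rng.normal(size=n)
x[0], y[0] = 8.0, -10.0                    # plant one high-leverage outlier

X = np.column_stack([np.ones(n), x])       # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

H = X @ np.linalg.inv(X.T @ X) @ X.T       # hat matrix; its diagonal is leverage
leverage = np.diag(H)
resid = y - X @ beta                       # outliers have large residuals

p = X.shape[1]
flagged = np.where(leverage > 2 * p / n)[0]  # rule-of-thumb cutoff 2p/n
```

The planted point has an extreme predictor value (high leverage) and an unusual response (large residual), which is exactly the combination that distorts ordinary least squares and motivates the robust estimators compared in the paper.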
This study introduces a fast, accurate, selective, simple, and environment-friendly colorimetric method to determine the iron(II) concentration in different lipstick brands imported or manufactured locally in Baghdad, Iraq. The samples were collected from 500-Iraqi-dinar stores to establish routine tests using the spectrophotometric method, compared with a new microfluidic paper-based analytical device (µPAD) platform as a cost-effective alternative to conventional instrumentation such as atomic absorption spectroscopy (AAS). The method depends on the reaction of iron(II) with the iron(II)-selective chelator 1,10-phenanthroline (phen) in the presence of the reducing agent hydroxylamine (HOA) and sodium acetate (NaOAc) b…
This is a study of epistemology that answers the most essential questions associated with human knowledge: its sources and its pathways. It presents and criticizes the empirical doctrine represented by John Locke, David Hume, and others, as well as the contemporary scientism doctrine, characterized by the glorification of science's abilities and represented by contemporary scientists such as Stephen Hawking and a few others. The points of agreement and disagreement between the two doctrines (scientism and empiricism) are clarified for the reader; ultimately, we briefly spotlight the basics of the Islamic vision of knowle…
Regression models are among the most important models used in modern studies, especially research and health studies, because of the important results they achieve. Two regression models were used, the Poisson regression model and the Conway-Maxwell-Poisson regression model. This study aimed to compare the two models and choose the better of them using simulation, at different sample sizes (n = 25, 50, 100) and with r = 1000 replications. Matlab was adopted to conduct the simulation experiment. The results showed the superiority of the Poisson model through the mean squared error (MSE) criterion and also through the Akaike information criterion (AIC) for the same distribution.
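To make the comparison criteria concrete, here is a minimal sketch of fitting a Poisson regression by Newton-Raphson and computing its AIC on simulated data. This is an illustration of the general technique (log link, maximum likelihood), written in NumPy rather than the Matlab used in the paper; the coefficients and sample size are invented.

```python
import numpy as np
from math import lgamma

rng = np.random.default_rng(2)
n = 100
x = rng.uniform(-1, 1, n)
X = np.column_stack([np.ones(n), x])
beta_true = np.array([0.5, 1.2])
y = rng.poisson(np.exp(X @ beta_true))     # Poisson counts with log link

beta = np.zeros(2)
for _ in range(25):                        # Newton-Raphson for the Poisson log-likelihood
    mu = np.exp(X @ beta)
    grad = X.T @ (y - mu)                  # score vector
    hess = X.T @ (X * mu[:, None])         # observed/expected information
    beta = beta + np.linalg.solve(hess, grad)

mu = np.exp(X @ beta)
loglik = np.sum(y * np.log(mu) - mu - np.array([lgamma(v + 1) for v in y]))
aic = -2 * loglik + 2 * beta.size          # AIC = -2 log L + 2k
```

A simulation study like the paper's repeats this fit over r replications and sample sizes, averaging MSE of the estimates and comparing AIC across the candidate models.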
The research aims to estimate the reliability function of the two-parameter Weibull distribution using parametric methods (NWLSM, RRXM, RRYM, MOM, MLM), as well as nonparametric methods (EM, PLEM, EKMEM, WEKM, MKMM, WMR, MMO, MMT). Simulation was used for the comparison, with different sample sizes (20, 40, 60, 80, 100), to identify the best estimation methods based on the integrated mean squared error (IMSE) statistical indicator. The research concluded that …
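The IMSE criterion used above can be sketched as follows: evaluate an estimated reliability curve and the true Weibull reliability on a time grid and integrate the squared difference. The nonparametric estimator shown here (the empirical survival proportion) and all parameter values are illustrative stand-ins, not the paper's specific methods.

```python
import numpy as np

rng = np.random.default_rng(3)
shape, scale = 1.5, 2.0                    # true two-parameter Weibull
t_grid = np.linspace(0.1, 5.0, 50)

def reliability(t, k, lam):
    """Weibull reliability R(t) = exp(-(t/lam)**k)."""
    return np.exp(-(t / lam) ** k)

def imse(estimates, truth):
    """Integrated squared error over the uniform time grid."""
    dt = t_grid[1] - t_grid[0]
    return ((estimates - truth) ** 2).sum() * dt

# One nonparametric estimate: empirical survival from a sample of size 100
sample = scale * rng.weibull(shape, 100)
emp_R = np.array([(sample > t).mean() for t in t_grid])
true_R = reliability(t_grid, shape, scale)
err = imse(emp_R, true_R)
```

In a simulation study, `err` would be averaged over many replications for each estimator and sample size, and the method with the smallest average IMSE is preferred.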
A mixture model is used to model data that come from more than one component. In recent years, it has become an effective tool for drawing inferences about the complex data we might come across in real life. Moreover, it can serve as a powerful confirmatory tool for classifying observations based on similarities among them. In this paper, several mixture regression-based methods were conducted under the assumption that the data come from a finite number of components. These methods were compared according to their results in estimating the component parameters. Observation membership was also inferred and assessed for these methods. The results showed that the flexible mixture model outperformed the …
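The core machinery behind such mixture methods is the EM algorithm: alternate between computing posterior membership probabilities (E-step) and re-estimating component parameters (M-step). Here is a minimal sketch for a two-component univariate Gaussian mixture; the paper's methods are mixture *regressions*, so this is a simplified illustration of the same idea, with invented data.

```python
import numpy as np

rng = np.random.default_rng(4)
# Synthetic two-component data: 60% from N(0,1), 40% from N(5,1)
z = rng.random(300) < 0.6
x = np.where(z, rng.normal(0.0, 1.0, 300), rng.normal(5.0, 1.0, 300))

def norm_pdf(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

# Initialize weights, means, and standard deviations
w, mu, sd = np.array([0.5, 0.5]), np.array([x.min(), x.max()]), np.array([1.0, 1.0])
for _ in range(100):
    # E-step: posterior membership probabilities (responsibilities)
    dens = np.stack([w[k] * norm_pdf(x, mu[k], sd[k]) for k in range(2)])
    resp = dens / dens.sum(axis=0)
    # M-step: re-estimate parameters from the weighted data
    nk = resp.sum(axis=1)
    w = nk / x.size
    mu = (resp * x).sum(axis=1) / nk
    sd = np.sqrt((resp * (x - mu[:, None]) ** 2).sum(axis=1) / nk)

labels = resp.argmax(axis=0)   # hard membership assignment from the posterior
```

The `resp` matrix is exactly the "observation membership" the abstract refers to: each column gives the probability that an observation belongs to each component.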
A bank faces many transformations during the course of its work, in particular a transformation from an industrial bank seeking to achieve industrial development, by granting development loans and facilities with state support, into a comprehensive bank seeking profitability through diversifying its activities, services, and credit operations. The research aims to study the transformations that occurred in the Industrial Bank and the effect of this transformation on credit activity. It rests on a main hypothesis, namely: …
The repeated measurement design is called a complete randomized block design for repeated measurement when the subject is given all the different treatments; in this case the subject is considered a block. Many nonparametric methods were considered, such as the Friedman test (1937), the Koch test (1969), and the Kepner & Robinson test (1988), for when the assumption of normally distributed data is not satisfied, as well as the F test for when the assumptions of the analysis of variance are satisfied, where the observations within blocks are assumed to be equally correlated. The purpose of this paper is to summarize the results of the simulation study comparing these methods, as well as to present the suggested Me…
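Of the tests named above, the Friedman test is the easiest to sketch: rank the treatments within each subject (block) and compare the column rank sums against what equal treatments would produce. This is a minimal sketch assuming no ties (continuous responses); the data are invented.

```python
import numpy as np

def friedman_statistic(data):
    """data: (n_subjects, k_treatments). Friedman chi-square statistic,
    assuming no ties within a subject's row."""
    n, k = data.shape
    ranks = data.argsort(axis=1).argsort(axis=1) + 1   # within-row ranks 1..k
    rj = ranks.sum(axis=0)                             # column rank sums
    return 12.0 / (n * k * (k + 1)) * (rj ** 2).sum() - 3 * n * (k + 1)

rng = np.random.default_rng(5)
n, k = 12, 3
base = rng.normal(size=(n, 1))                 # subject (block) effect
effect = np.array([0.0, 0.5, 1.0])             # treatment effects
data = base + effect + rng.normal(scale=0.3, size=(n, k))

stat = friedman_statistic(data)   # compare to a chi-square with k-1 df
```

Under the null hypothesis the statistic is approximately chi-square with k-1 degrees of freedom; a simulation study like the paper's would generate many such data sets and record how often each test rejects.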