Estimation of the parameters of a linear regression model is usually based on the ordinary Least Squares method, which rests on several basic assumptions; the accuracy of the parameter estimates therefore depends on the validity of these assumptions. Among robust alternatives, the robust estimation method known as the MM-estimator has proved efficient for this purpose. When assumptions such as homogeneity of the variance and normality of the errors do not hold, the single-model formulation becomes unrealistic; this is the case when a problem involves complex data generated by more than one model. To deal with this type of problem, a mixture of linear regressions is used to model such data. In this article, we propose a genetic-algorithm-based method combined with the MM-estimator, called RobGA, to improve the accuracy of the estimation in the final stage. We compare the suggested method with the robust bisquare mixture method (MixBi) on real data representing blood samples. The results show that RobGA estimates the model parameters more efficiently than MixBi with respect to mean square error (MSE) and classification error (CE).
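The abstract gives no implementation details of RobGA; purely as an illustrative assumption (not the authors' algorithm), a toy Python sketch of the general idea, a genetic algorithm searching the parameters of a two-component mixture of regression lines under a robust Tukey-bisquare loss of the kind used by MM- and bisquare-type estimators, could look like this. All names, settings, and the synthetic data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def bisquare_rho(r, c=4.685):
    """Tukey bisquare loss, the rho-function used by bisquare/MM-type estimators."""
    u = np.clip(np.abs(r) / c, 0.0, 1.0)
    return (c**2 / 6.0) * (1.0 - (1.0 - u**2) ** 3)

def fitness(theta, x, y):
    """Robust loss of a 2-component mixture: each point is charged to its
    best-fitting line; theta = [a1, b1, a2, b2]."""
    a1, b1, a2, b2 = theta
    r1 = y - (a1 + b1 * x)
    r2 = y - (a2 + b2 * x)
    s = np.median(np.minimum(np.abs(r1), np.abs(r2))) / 0.6745 + 1e-9
    return np.sum(np.minimum(bisquare_rho(r1 / s), bisquare_rho(r2 / s)))

def genetic_search(x, y, pop_size=60, n_gen=200, sigma=0.5):
    """Plain GA: elitism, tournament selection, blend crossover, Gaussian mutation."""
    pop = rng.normal(0.0, 3.0, size=(pop_size, 4))
    for _ in range(n_gen):
        scores = np.array([fitness(ind, x, y) for ind in pop])
        new_pop = [pop[np.argmin(scores)]]              # keep the current best
        while len(new_pop) < pop_size:
            i, j = rng.integers(0, pop_size, 2)
            p1 = pop[i] if scores[i] < scores[j] else pop[j]
            i, j = rng.integers(0, pop_size, 2)
            p2 = pop[i] if scores[i] < scores[j] else pop[j]
            w = rng.uniform(size=4)                     # blend crossover
            child = w * p1 + (1.0 - w) * p2
            child += rng.normal(0.0, sigma, size=4)     # mutation
            new_pop.append(child)
        pop = np.array(new_pop)
    scores = np.array([fitness(ind, x, y) for ind in pop])
    return pop[np.argmin(scores)]

# Synthetic two-line data with a few gross outliers (placeholder, not real data)
x = rng.uniform(0, 10, 200)
comp = rng.integers(0, 2, 200)
y = np.where(comp == 0, 1.0 + 2.0 * x, 8.0 - 1.5 * x) + rng.normal(0, 0.5, 200)
y[:10] += 30.0                                          # contamination
print(genetic_search(x, y))   # should approach (1, 2) and (8, -1.5); components may swap
```

In the actual method, the GA output would presumably be refined by a full MM-estimation step for each component; only the GA search itself is shown here.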
Atmospheric-air cold plasma has been used to manufacture gold nanomaterials for treating parasitic leishmaniasis. This study experimentally assessed the treatment of Leishmania parasites (L. donovani and L. tropica) with gold nanoparticles. Specifically, atmospheric-pressure nonthermal plasma was generated using high-voltage electrodes of different diameters (1.0, 2.8, 3.8 and 4.3 mm). Aqueous gold tetrachloride salt (HAuCl4·4H2O) was used as the precursor to produce the gold nanoparticles, which were characterized by UV-vis spectroscopy and X-ray diffraction. The optimum condition (an electrode diameter of 1 mm) was chosen to prepare the gold nanoparticles, for which the grain size was found to be 17 nm. Accordingly, the nanoparticle …
Natural bitumen (NB) is a valuable material that has drawn increasing attention due to its unique properties, especially since it is available in large quantities and has so far been used in only limited fields. In this research, the exploitation of NB from sulfur springs as an alternative energy resource in the production of asphalt pavement is evaluated. It can be concluded from the experimental results that the chemical composition and surface morphology of the NB samples differ from those of base asphalt. Besides, the rheological properties of virgin NB are not sufficient for paving work. To overcome this obstacle, NB from five different springs is modified with limestone filler (LSF) to enhance its properties. LSF is a natural material an …
Environmental factors that damage plant cells by dehydrating them, such as cold, drought, and high salinity, are the most common environmental stresses affecting plant growth, development, and productivity in cultivated regions around the world. Several plant species possess drought-, salinity-, and cold-inducible genes that make them tolerant to such environmental challenges. The purpose of this study was to investigate several species in …
A population is a set of items that share a common characteristic or characteristics and form the subject of a study or piece of research. Statistically, this set is called the study population (or, for short, the population): for example, a set of people, trees bearing a particular kind of fruit, animals, or any country's output of a given commodity over an unbounded period of time, etc.
A population may be finite, if the number of its members can be counted, such as the students of a particular school grade, or infinite, if its members cannot be counted, such as the stars or the aquatic creatures in the sea. When we study some characteristic of a population, the statistical data are collected by two metho …
This research studies dimension-reduction methods that overcome the curse of dimensionality, which arises when traditional methods fail to provide good parameter estimates, so the problem must be dealt with directly. Two methods were used to handle the high-dimensional data: the non-classical sliced inverse regression (SIR) method, together with a proposed weighted standard SIR (WSIR), and principal component analysis (PCA), the general method used for reducing dimensions. Both SIR and PCA are based on forming linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear …
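As a hedged illustration (the weighting scheme of the proposed WSIR is not described in the abstract and is therefore omitted), the two baseline reductions, classical sliced inverse regression and PCA, can be sketched in Python as follows; the function names and the toy data are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

def sliced_inverse_regression(X, y, n_slices=10, n_directions=2):
    """Classical SIR: slice the ordered response, average the whitened
    predictors within each slice, and take the leading eigenvectors of the
    covariance of those slice means, mapped back to the original scale."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T   # Sigma^(-1/2)
    Z = Xc @ inv_sqrt                                     # whitened predictors
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)              # weighted slice means
    _, vecs = np.linalg.eigh(M)
    directions = inv_sqrt @ vecs[:, ::-1][:, :n_directions]
    return Xc @ directions, directions

# Toy data: the response depends on only one linear combination of 20 predictors
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 20))
y = np.sinh(X[:, 0] + 2 * X[:, 1]) + 0.1 * rng.normal(size=300)

X_sir, B = sliced_inverse_regression(X, y, n_slices=10, n_directions=2)
X_pca = PCA(n_components=2).fit_transform(X)              # unsupervised baseline
print(X_sir.shape, X_pca.shape)                           # (300, 2) (300, 2)
```

The contrast the sketch makes visible is the one the abstract relies on: SIR uses the response to choose its directions, while PCA ignores it.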
In this paper, we describe the cases of marriage and divorce in the city of Baghdad on its two sides, Rusafa and Karkh. The data were collected from the Supreme Judicial Council, and the cubic spline interpolation method was used to estimate the function passing through the given points; the extrapolation method was then applied to estimate the cases of marriage and divorce for the next year, and a comparison between Rusafa and Karkh was made using the MATLAB program.
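The paper itself uses MATLAB; as a hedged sketch of the same idea in Python, cubic-spline interpolation through yearly counts and extrapolation to the following year might look like the code below. The figures are invented placeholders, not the Supreme Judicial Council data.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Illustrative placeholder counts only -- NOT the paper's data.
# Yearly marriage cases for one district, say Rusafa.
years = np.array([2015, 2016, 2017, 2018, 2019, 2020])
cases = np.array([31200, 32950, 30100, 28700, 29900, 31050], dtype=float)

# Natural cubic spline through the given points; extrapolate=True lets the
# end polynomial pieces be evaluated outside the observed range.
spline = CubicSpline(years, cases, bc_type="natural", extrapolate=True)

print(spline(2017.5))   # interpolated value between observed years
print(spline(2021))     # extrapolated estimate for the next year
```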
This paper is concerned with combining two different transforms to present a new joint transform, FHET, and its inverse transform, IFHET. The most important property of FHET, the finite Hankel–Elzaki transform of the Bessel differential operator, was also stated and proved; this property was discussed for two different boundary conditions, Dirichlet and Robin. The importance of this property is that it allows axisymmetric partial differential equations to pass directly to an algebraic equation. The joint finite Hankel–Elzaki transform method was also applied to a mathematical-physical problem, the Hotdog Problem. A steady state, which does not depend on time, was discussed f…
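The property itself is not written out in the abstract. As orientation only, and assuming the standard zero-order finite Hankel transform on 0 ≤ r ≤ a with a Dirichlet condition (the Elzaki part and the Robin case are omitted here), the classical identity that such a property builds on is

\[
  \tilde f(\lambda_i) \;=\; \int_0^a r\, f(r)\, J_0(\lambda_i r)\, dr,
  \qquad J_0(\lambda_i a) = 0,
\]
\[
  \int_0^a r \left( f''(r) + \tfrac{1}{r}\, f'(r) \right) J_0(\lambda_i r)\, dr
  \;=\; -\lambda_i^{2}\, \tilde f(\lambda_i) \;+\; a\,\lambda_i\, J_1(\lambda_i a)\, f(a),
\]

so the radial Bessel part of an axisymmetric PDE is replaced by an algebraic term in the transform variable, which is the effect the joint FHET property extends.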
Acceptance sampling plans for the generalized exponential distribution, when the life-test experiment is truncated at a pre-determined time, are provided in this article. The two parameters (α, λ) (the scale and shape parameters) are estimated by LSE, WLSE, and the best estimator; for various sample sizes, these estimates are used to find the ratio of the true mean time to the pre-determined time and to find the smallest possible sample size required to ensure the producer's risk with a pre-fixed probability (1 − P*). The results of the estimations and of the sampling plans are provided in tables.
Key words: Generalized Exponential Distribution, Acceptance Sampling Plan, Consumer's and Producer's Risks
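As a hedged numerical sketch (not the paper's plans or tables), the smallest sample size of a single-sampling plan for a truncated life test can be searched for under the usual binomial acceptance criterion, assuming the generalized exponential CDF F(x) = (1 − e^(−λx))^α with mean (ψ(α+1) − ψ(1))/λ. The acceptance number c, the test-time ratio t/μ0, and the confidence level P* below are illustrative inputs.

```python
import numpy as np
from scipy.stats import binom
from scipy.special import digamma

def failure_prob(t_ratio, alpha):
    """P(failure before the truncation time t) when the true mean equals the
    specified mean mu0, for F(x) = (1 - exp(-lam*x))**alpha with
    mean (digamma(alpha+1) - digamma(1)) / lam; here mu0 is normalized to 1."""
    lam_t = (digamma(alpha + 1) - digamma(1)) * t_ratio   # lam * t
    return (1.0 - np.exp(-lam_t)) ** alpha

def min_sample_size(p_star, c, t_ratio, alpha, n_max=500):
    """Smallest n with P(at most c failures) <= 1 - P*, i.e. a lot whose mean
    is only mu0 is accepted with probability at most 1 - P*."""
    p = failure_prob(t_ratio, alpha)
    for n in range(c + 1, n_max + 1):
        if binom.cdf(c, n, p) <= 1.0 - p_star:
            return n
    return None

# Example: shape alpha = 2, accept if at most c = 2 failures occur in a test
# truncated at 0.75 * mu0, with confidence level P* = 0.95
print(min_sample_size(p_star=0.95, c=2, t_ratio=0.75, alpha=2.0))
```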
She noted that most of the results of research on the use of computers and the Internet in administrative, economic, agricultural, and educational fields point to a significant improvement in the learning outcomes of these groups and in the development of working mechanisms. Rates of computer use and of its applications in various spheres of life have increased at a very fast pace, so that the computer and the Internet have become a vital part of any activity, whether administrative, research-related, or media-related. Hence, it can be said that using computers and the Internet in managing the public relations activities of any organization can accelerate the achievement of the positive goals and purposes of public relations. This study comes to looking at us …