Optimization is the task of minimizing or maximizing an objective function f(x) parameterized by x. A number of effective numerical optimization methods, characterized by high-quality solutions and fast convergence, have become popular for improving the performance and efficiency of other methods. In recent years there has been considerable interest in hybrid metaheuristics, in which two or more methods are combined into a new method able to solve many problems rapidly and efficiently. The basic idea of the proposed method is to add the acceleration component of the Gravitational Search Algorithm (GSA) to the Firefly Algorithm (FA) update when creating new individuals. Several standard objective functions are used to compare the hybrid method (FAGSA) with FA and the traditional GSA in finding the optimal solution. Simulation results obtained with MATLAB R2015a indicate that the hybrid algorithm can escape local optima with faster convergence than either the Firefly Algorithm or the ordinary Gravitational Search Algorithm. This paper therefore proposes a new numerical optimization method that integrates the properties of the two methods (firefly and gravitational search). In most cases, the proposed method gives better results than either original method applied individually.
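The general idea described above, adding a GSA-style, mass-weighted acceleration to the firefly attraction step, can be sketched as follows. This is a minimal illustrative implementation, not the paper's exact algorithm: the parameter names (`alpha`, `beta0`, `gamma`, `g0`), the mass and gravity schedules, and the sphere benchmark are all conventional choices assumed here.

```python
import numpy as np

def sphere(x):
    """Benchmark objective: f(x) = sum(x_i^2), minimum 0 at the origin."""
    return np.sum(x ** 2)

def fagsa(f, dim=2, n=20, iters=200, alpha=0.2, beta0=1.0, gamma=1.0,
          g0=1.0, seed=0):
    """Hedged sketch of a firefly update augmented with a GSA-style
    acceleration term (illustrative parameters, not the paper's)."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5, 5, (n, dim))
    V = np.zeros((n, dim))
    for t in range(iters):
        fit = np.array([f(x) for x in X])
        # GSA-style masses: better (lower) fitness -> larger mass
        worst, best = fit.max(), fit.min()
        m = (worst - fit) / (worst - best + 1e-12)
        M = m / (m.sum() + 1e-12)
        G = g0 * np.exp(-20 * t / iters)          # decaying gravity constant
        for i in range(n):
            acc = np.zeros(dim)
            for j in range(n):
                if fit[j] < fit[i]:
                    r = X[j] - X[i]
                    dist = np.linalg.norm(r) + 1e-12
                    # firefly attraction toward the brighter individual j
                    acc += beta0 * np.exp(-gamma * dist ** 2) * r
                    # GSA gravitational pull weighted by the mass of j
                    acc += rng.random() * G * M[j] * r / dist
            V[i] = rng.random() * V[i] + acc
            X[i] = X[i] + V[i] + alpha * (rng.random(dim) - 0.5)
        alpha *= 0.98                              # cool the random walk
    fit = np.array([f(x) for x in X])
    return X[fit.argmin()], fit.min()

best_x, best_f = fagsa(sphere)
```

The GSA term lets a stalled firefly still be pulled by the population's heavy (fit) masses even when the exponential attraction has decayed, which is one plausible reading of how the hybrid crosses local optima.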
Many fuzzy clustering methods are based only on within-cluster scatter with a compactness measure. This paper describes a fuzzy clustering method that relies on both within-cluster scatter with a compactness measure and between-cluster scatter with a separation measure, called fuzzy compactness and separation (FCS). Fuzzy linear discriminant analysis (FLDA) is based on the within-cluster and between-cluster scatter matrices; the two fuzzy scatter matrices in the objective function ensure compactness between data elements and cluster centers while keeping the cluster centers separated. A cluster-validity method for choosing the optimal number of clusters is discussed, and an illustrative example is then presented.
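For reference, the FCS objective in the literature is usually written as the difference of the two scatter terms the abstract mentions; the notation below is the conventional one and is assumed here, not taken from the paper itself:

```latex
J_{\mathrm{FCS}}
  = \sum_{i=1}^{c}\sum_{k=1}^{n} u_{ik}^{m}\,\lVert x_k - a_i\rVert^{2}
  \;-\; \sum_{i=1}^{c} \eta_i \sum_{k=1}^{n} u_{ik}^{m}\,\lVert a_i - \bar{x}\rVert^{2}
```

Here $u_{ik}$ is the fuzzy membership of point $x_k$ in cluster $i$, $a_i$ the cluster centers, $\bar{x}$ the grand mean, and $\eta_i \ge 0$ weights that keep the objective bounded: the first sum measures compactness, and subtracting the second rewards centers that lie far from the grand mean, i.e. separation.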
In this article, an inverse source problem is determined for a hyperbolic partial differential equation under a flux condition at the left end of the string, where an extra measurement is considered. The approximate solution is obtained by splitting the problem and applying the finite difference method (FDM). Moreover, this problem is ill-posed: the recovered force becomes unstable after noise is added to the additional condition. To stabilize the solution, a regularization matrix is considered. Error estimates between the regularized solution and the exact solution are then proved. The numerical results show that the method is efficient and stable.
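The instability-and-regularization step can be illustrated on a generic ill-posed linear system. This is a stand-in sketch, not the paper's discretization: the forward operator `A` (a smoothing kernel), the noise level, and the Tikhonov-style parameter `lam` are all assumed here purely to show why a regularization matrix stabilizes the inversion.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
t = np.linspace(0, 1, n)
# Severely ill-conditioned forward operator: a narrow Gaussian smoothing kernel
A = np.exp(-(t[:, None] - t[None, :]) ** 2 / 0.01) / n
f_true = np.sin(2 * np.pi * t)                 # exact source term
g_noisy = A @ f_true + 1e-3 * rng.standard_normal(n)  # perturbed measurement

# Naive inversion amplifies the noise; the regularized normal equations
# (A^T A + lam I) f = A^T g damp the unstable components.
lam = 1e-4
f_naive = np.linalg.solve(A.T @ A + 1e-15 * np.eye(n), A.T @ g_noisy)
f_reg = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ g_noisy)

err_naive = np.linalg.norm(f_naive - f_true) / np.linalg.norm(f_true)
err_reg = np.linalg.norm(f_reg - f_true) / np.linalg.norm(f_true)
```

With the identity as regularization matrix the small singular values of `A` are shifted away from zero, trading a small bias for a large gain in stability, which mirrors the error estimates described in the abstract.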
Calcium montmorillonite (bentonite) [Ca-MMT] was prepared via a cation exchange reaction using benzalkonium chloride [a quaternary ammonium salt] as a surfactant to produce an organoclay, which is used to prepare polymer composites. Functionalization of the filler surface is a very important factor in achieving good interaction between the filler and the polymer matrix. The basal spacing and functional groups of this organoclay were characterized using X-ray diffraction (XRD) and Fourier transform infrared (FTIR) spectroscopy, respectively. The XRD results showed that the basal spacing of the clay treated with benzalkonium chloride (the organoclay) increased to 15.172 Å, which represents an increment of about 77.9% in the…
A seemingly unrelated regression (SUR) model is a special case of multivariate models in which the error terms of the equations are contemporaneously correlated. The generalized least squares (GLS) estimator is efficient because it takes the covariance structure of the errors into account, but it is also very sensitive to outliers. Robust SUR estimators can deal with outliers. We propose two robust methods for calculating the estimator: S-estimation and FastSUR. We find that they significantly improve the quality of the SUR model estimates. In addition, the results show that the FastSUR method outperforms the S method in dealing with outliers in the data set, as it has lower MSE and RMSE and higher R-squared and adjusted R-squared…
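The classical (non-robust) SUR estimator that the robust variants build on can be sketched as a two-step feasible GLS. The data below are synthetic and illustrative; the paper's S and FastSUR methods would replace the covariance step with an outlier-resistant estimate, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
x1 = rng.standard_normal((n, 1)); x2 = rng.standard_normal((n, 1))
X1 = np.hstack([np.ones((n, 1)), x1]); X2 = np.hstack([np.ones((n, 1)), x2])
# Contemporaneously correlated errors across the two equations
e = rng.multivariate_normal([0, 0], [[1.0, 0.7], [0.7, 1.0]], size=n)
y1 = X1 @ np.array([1.0, 2.0]) + e[:, 0]
y2 = X2 @ np.array([-1.0, 0.5]) + e[:, 1]

# Step 1: equation-by-equation OLS residuals
b1 = np.linalg.lstsq(X1, y1, rcond=None)[0]
b2 = np.linalg.lstsq(X2, y2, rcond=None)[0]
R = np.column_stack([y1 - X1 @ b1, y2 - X2 @ b2])
Sigma = R.T @ R / n                      # estimated cross-equation covariance

# Step 2: stacked GLS with Omega = Sigma (Kronecker) I_n
X = np.block([[X1, np.zeros_like(X2)], [np.zeros_like(X1), X2]])
y = np.concatenate([y1, y2])
Omega_inv = np.kron(np.linalg.inv(Sigma), np.eye(n))
beta_sur = np.linalg.solve(X.T @ Omega_inv @ X, X.T @ Omega_inv @ y)
```

Because `Sigma` is built from raw residual cross-products, a few outliers can distort it badly, which is exactly the sensitivity the robust S and FastSUR estimators address.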
The exponentiated Lomax distribution is one of the most commonly used continuous distributions and plays a major role in analysing and modelling lifetime data. A family was therefore formed for the exponentiated Lomax distribution by introducing two new distributions as special cases: the modified exponentiated Lomax distribution (MELD) and the restricted exponentiated Lomax distribution (RELD). To assess their usefulness and flexibility, the two distributions were applied in a simulation study as well as to a real data set. The simulation results clearly showed the flexible performance of the maximum likelihood estimators of the parameters. Also, the real application…
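For orientation, the exponentiated Lomax distribution referred to above is usually defined as follows; the parameterization below is the standard one and is assumed here, the abstract itself does not state which form the paper uses:

```latex
F(x) = \bigl[1 - (1 + \lambda x)^{-\alpha}\bigr]^{\theta}, \qquad
f(x) = \theta \alpha \lambda \,(1 + \lambda x)^{-\alpha - 1}
       \bigl[1 - (1 + \lambda x)^{-\alpha}\bigr]^{\theta - 1}, \qquad x > 0,
```

with shape parameters $\alpha, \theta > 0$ and scale parameter $\lambda > 0$; the MELD and RELD special cases would then arise from modifying or restricting these parameters.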
Background: Chronic suppurative otitis media (CSOM) results from an initial episode of acute otitis media and is characterized by persistent discharge from the middle ear through a tympanic perforation of at least 2 weeks' duration. It is an important cause of preventable hearing loss, particularly in the developing world. Methods: The aims were (1) to obtain an overview of the bacterial ear infection profile in general, and (2) to assess the antibiotic resistance of Pseudomonas infection in particular, since it is usually the commonest infection causing otitis media and the most difficult to treat owing to multidrug resistance. A cross-sectional study was done that included 405 CSOM patients; 196 (48%) were males and 209 (52%)…
Average per capita GDP is an important economic indicator. Economists use it to gauge the progress or decline of a country's economy; it is also used to rank countries and compare them with each other. Average per capita GDP was modelled first using time series analysis (the Box-Jenkins method) and second using linear and non-linear regression; these are among the most important and most widely used statistical methods for forecasting because they are flexible and accurate in practice. A comparison was made to determine the better of the two methods using specific statistical criteria. The research found that the best approach is to build a model for prediction…
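The comparison described above, a Box-Jenkins-style time-series fit versus a regression fit judged by statistical criteria, can be sketched on a synthetic series. The data, the AR(1)-on-differences model (a minimal ARIMA(1,1,0) stand-in), and the RMSE criterion are all assumptions for illustration, not the study's GDP figures or chosen models.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 120
t = np.arange(n)
# Synthetic trending per-capita-like series: trend plus random-walk noise
y = 50 + 0.4 * t + np.cumsum(0.5 * rng.standard_normal(n))
train, test = y[:100], y[100:]

# Candidate 1 -- linear trend regression: y_t = a + b t
A = np.column_stack([np.ones(100), t[:100]])
a, b = np.linalg.lstsq(A, train, rcond=None)[0]
pred_reg = a + b * t[100:]

# Candidate 2 -- AR(1) on first differences (Box-Jenkins style)
d = np.diff(train)
phi = np.dot(d[:-1], d[1:]) / np.dot(d[:-1], d[:-1])  # Yule-Walker-type fit
c = d.mean() * (1 - phi)
preds, last, last_d = [], train[-1], d[-1]
for _ in range(len(test)):
    last_d = c + phi * last_d         # forecast the next difference
    last = last + last_d              # integrate back to the level
    preds.append(last)
pred_ar = np.array(preds)

# Out-of-sample RMSE as the comparison criterion
rmse_reg = np.sqrt(np.mean((pred_reg - test) ** 2))
rmse_ar = np.sqrt(np.mean((pred_ar - test) ** 2))
```

Which model wins depends on the series; the point is only the procedure: fit both on a training window, forecast a hold-out window, and pick the model with the smaller error criterion.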
Magnesium oxide nanoparticles (MgO NPs) were synthesized by a green method using persimmon peel extract as the reducing agent, together with magnesium nitrate and NaOH. This method is eco-friendly and non-toxic. In this study an ultrasound device was used to reduce the particle size; the energy gap was initially 5.39 eV and shifted to 4.10 eV. Morphological analysis using atomic force microscopy (AFM) showed that the grain size of the MgO NPs was 67.70 nm, which became 42.33 nm after the use of ultrasound, while the particle shape changed from almost spherical to cylindrical. In addition, the field-emission scanning electron microscopy (FESEM) analysis showed…
The theoretical analysis relies on Classical Laminated Plate Theory (CLPT), which is based on the von Kármán theory and the Kirchhoff hypothesis for deflection analysis within the elastic limit, together with Hooke's law for calculating the stresses. A new boundary-condition function is used to solve the fourth-order differential equations, drawing on various sources in advanced engineering mathematics. The behavior of symmetric and anti-symmetric cross-ply composite laminated plates under out-of-plane loads (uniformly distributed loads) with two different boundary conditions is investigated to obtain the central deflection of the mid-plane using the Ritz method. The computer programs were built using Ma…
Analysis of variance (ANOVA) is one of the most widely used statistical methods for analyzing the behavior of one variable relative to another. Data were collected from a sample of 65 adult males who were nonsmokers, light smokers, or heavy smokers. The aim of this study is to analyze the effect of cigarette smoking on the high-density lipoprotein cholesterol (HDL-C) level and to determine whether smoking reduces this level, using a completely randomized design (CRD) and the Kruskal-Wallis method. The results showed that the assumptions of one-way ANOVA are not satisfied for the original data but are satisfied after a log transformation. From the results, a significantly…
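The CRD analysis described above amounts to a one-way ANOVA F statistic, computed below by hand on synthetic HDL-C-like data for the three smoking groups; the group means, spreads, and sizes are illustrative assumptions, not the study's 65-subject sample.

```python
import numpy as np

rng = np.random.default_rng(4)
groups = [rng.normal(55, 8, 25),   # nonsmokers (synthetic HDL-C values)
          rng.normal(50, 8, 20),   # light smokers
          rng.normal(44, 8, 20)]   # heavy smokers
k = len(groups)
n = sum(len(g) for g in groups)
grand = np.concatenate(groups).mean()

# One-way CRD ANOVA decomposition: between-group vs within-group scatter
ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
F = (ss_between / (k - 1)) / (ss_within / (n - k))

# When normality/equal-variance assumptions fail, the abstract's remedies
# apply: transform the response (e.g. log) and re-check, or fall back on
# the rank-based Kruskal-Wallis test.
log_groups = [np.log(g) for g in groups]
```

A large F relative to the F(k-1, n-k) reference distribution indicates that at least one group mean differs, which is the kind of conclusion the abstract's comparison of smoking groups is drawing.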