Fuzzy C-means (FCM) is a clustering method that groups similar data elements according to a chosen distance measure. Tabu Search is a metaheuristic algorithm. In this paper, a Probabilistic Tabu Search for FCM is implemented to find a global clustering based on the minimum value of the fuzzy objective function. Experiments were designed for different networks and numbers of clusters; the results show the best performance based on a comparison of the objective-function values obtained with standard FCM and with Tabu-FCM, averaged over ten runs.
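The fuzzy objective function minimized by both standard FCM and the Tabu variant is, in the usual formulation, J_m = Σ_i Σ_k u_ik^m ||x_k − c_i||². A minimal sketch of this objective and the standard membership update (variable names are illustrative, not taken from the paper):

```python
import numpy as np

def fcm_objective(X, centers, U, m=2.0):
    """Fuzzy objective J_m = sum_i sum_k u_ik^m * ||x_k - c_i||^2.

    X: (n, d) data; centers: (c, d); U: (c, n) membership matrix
    whose columns sum to 1.  m > 1 is the fuzzifier.
    """
    # squared Euclidean distance from every point to every centre: (c, n)
    d2 = ((X[None, :, :] - centers[:, None, :]) ** 2).sum(axis=2)
    return float(((U ** m) * d2).sum())

def update_memberships(X, centers, m=2.0):
    """Standard FCM membership update: u_ik proportional to d_ik^(-2/(m-1))."""
    d2 = ((X[None, :, :] - centers[:, None, :]) ** 2).sum(axis=2)
    d2 = np.maximum(d2, 1e-12)              # avoid division by zero
    inv = d2 ** (-1.0 / (m - 1.0))
    return inv / inv.sum(axis=0, keepdims=True)
```

Tabu-FCM differs from standard FCM only in how candidate cluster centres are explored; the objective being compared across the ten runs is this same J_m.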
The end of the twentieth century witnessed a technological evolution that brought convergence between the aesthetic value of the visual arts and the objective representation of the image in fabric design, opening new insights and unconventional potential in atypical employment. Modern fabric designs employ the film still, an image comprising several scenes taken from a film. This research focuses on an analytical study to identify the elements of the film still, the rules of its organization, and how it functions in the design of fabrics. The problem has thus been identified by asking the following: what are the elements of the film still, and how does the struct
Over the last two decades, the environment has witnessed tremendous changes in many fields, with huge competition, varied technological development, and growing attention to customer satisfaction; these changes are reflected in economic units' adoption of the lean production system.
Lean Accounting appeared as a response to the changes brought about by economic units' adoption of the lean production system in place of mass production. Through it, the management of economic units has shifted from management by top departments to management by value streams, providing a new method for accounting for costs according to the value stream
Recently, the development of the field of biomedical engineering has led to renewed interest in the detection of several events. In this paper, a new approach is used to detect specific parameters and relations between three biomedical signals used in clinical diagnosis: the phonocardiogram (PCG), the electrocardiogram (ECG), and the photoplethysmogram (PPG), sometimes called the carotid pulse depending on the position of the electrode.
Comparisons between three cases (two normal and one abnormal) are used to indicate the delay that may occur due to a deficiency of the cardiac muscle or a valve in the abnormal case.
The results show that S1 and S2, the first and second heart sounds of the
The nature of the dark sector of the Universe remains one of the outstanding problems in modern cosmology, with the search for new observational probes guiding the development of the next generation of observational facilities. Clues come from tension between the predictions of Λ cold dark matter (ΛCDM) and observations of gravitationally lensed galaxies. Previous studies showed that galaxy clusters in ΛCDM are not strong enough lenses to reproduce the observed number of lensed arcs. This work aims to constrain warm dark matter (WDM) cosmologies by means of the lensing efficiency of galaxy clusters drawn from these alternative models. The lensing characteristics of two samples of simulated clusters in Λ warm dark matter and ΛCDM
Abstract
This research aims to identify the role of Psychological Capital (PsyCap) in Spirituality at the Workplace (SAW) for a sample of the teaching staff of four colleges of the University of Kufa, comprising (200) out of (470) teaching staff. To achieve this objective, and drawing on foreign research and studies, the researchers adopted established scales for the research variables, relying on the model of (Luthans, Youssef, et al., 2007) to represent the components of Psychological Capital (self-efficacy, hope, optimism, and resilience). Given the attention organizations pay to the human element because of it
Nonlinear regression models are important tools for solving optimization problems, as traditional techniques may fail to reach satisfactory solutions for the parameter-estimation problem. Hence, in this paper, the BAT algorithm is used to estimate the parameters of nonlinear regression models. A simulation study is conducted to investigate the performance of the proposed algorithm against the maximum likelihood (MLE) and least squares (LS) methods. The results show that the Bat algorithm provides accurate estimates and outperforms the MLE and LS methods for the parameter estimation of nonlinear regression models, based on mean square error.
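A minimal sketch of the canonical Bat algorithm minimizing mean square error for a nonlinear regression fit. The model form, bounds, and tuning constants below are illustrative assumptions, not the paper's actual experimental setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x, theta):
    """Illustrative nonlinear regression model: y = a * exp(b * x)."""
    a, b = theta
    return a * np.exp(b * x)

def mse(theta, x, y):
    """Mean square error of the fit, the quantity the search minimizes."""
    return float(np.mean((y - model(x, theta)) ** 2))

def bat_fit(x, y, bounds, n_bats=20, iters=200, fmin=0.0, fmax=2.0,
            loudness=0.9, pulse_rate=0.5):
    """Minimal Bat algorithm: positions, velocities, random frequencies,
    loudness-gated acceptance, and a local walk around the current best."""
    lo, hi = bounds
    pos = rng.uniform(lo, hi, size=(n_bats, len(lo)))
    vel = np.zeros_like(pos)
    fit = np.array([mse(p, x, y) for p in pos])
    best = pos[fit.argmin()].copy()
    for _ in range(iters):
        for i in range(n_bats):
            freq = fmin + (fmax - fmin) * rng.random()
            vel[i] += (pos[i] - best) * freq
            cand = pos[i] + vel[i]
            if rng.random() > pulse_rate:        # local walk near the best bat
                cand = best + 0.01 * rng.normal(size=best.shape)
            cand = np.clip(cand, lo, hi)
            f = mse(cand, x, y)
            if f < fit[i] and rng.random() < loudness:
                pos[i], fit[i] = cand, f         # accept the move
            if f < mse(best, x, y):
                best = cand.copy()               # track the global best
    return best
```

In practice the paper's simulation study would repeat such a fit over many generated datasets and compare the resulting MSE against MLE and LS estimates.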
A multivariate multisite hydrological data forecasting model was derived and checked using a case study. The philosophy is to use simultaneously the cross-variable correlations, the cross-site correlations, and the time-lag correlations. The case study involves two variables at three sites: the variables are the monthly rainfall and evaporation, and the sites are Sulaimania, Dokan, and Darbandikhan. The model form is similar to the first-order autoregressive model, but in matrix form. A matrix of the different relative correlations mentioned above and another of their relative residuals were derived and used as the model parameters. A mathematical filter was applied to both matrices to obtain their elements. The application of this model indicates i
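The matrix form of a first-order autoregressive model described above can be sketched as X_t = A X_{t−1} + B e_t, where the state vector stacks all variable-site combinations, A carries the lag-1 cross-variable and cross-site correlations, and B scales independent residuals. The dimensions and parameter values below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical dimensions: 2 variables (rainfall, evaporation) at 3 sites
# gives a 6-element state vector of standardized monthly values.
n = 6

def simulate_matrix_ar1(A, B, steps, x0=None):
    """Multisite multivariate AR(1): X_t = A @ X_{t-1} + B @ e_t.

    A: (n, n) lag-1 correlation parameter matrix.
    B: (n, n) residual parameter matrix; e_t ~ N(0, I) independent noise.
    Returns the (steps, n) simulated standardized series.
    """
    x = np.zeros(n) if x0 is None else np.asarray(x0, float)
    out = []
    for _ in range(steps):
        e = rng.standard_normal(n)
        x = A @ x + B @ e
        out.append(x.copy())
    return np.array(out)
```

In the paper, A and B are not free choices: they are derived from the observed correlation and residual matrices via the mathematical filter the abstract mentions.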
For a given loading, the stiffness of a plate or shell structure can be increased significantly by the addition of ribs or stiffeners. Hitherto, optimization techniques have focused mainly on the sizing of the ribs; the more important issue of identifying the optimum locations of the ribs has received little attention. In this investigation, finite element analysis is used to determine the optimum locations of the ribs for a given set of design constraints. In conclusion, the author identifies the optimum positions of the ribs or stiffeners that give the best results.
Extractive multi-document text summarization – summarization that aims to remove redundant information from a document collection while preserving its salient sentences – has recently attracted large interest in proposing automatic models. This paper proposes an extractive multi-document text summarization model based on a genetic algorithm (GA). First, the problem is modeled as a discrete optimization problem, and a specific fitness function is designed to effectively cope with the proposed model. Then, a binary-encoded representation, together with a heuristic mutation and a local repair operator, is proposed to characterize the adopted GA. Experiments are applied to ten topics from the Document Understanding Conference DUC2002 datas
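The binary encoding and local repair operator described above can be sketched as follows. The fitness function here (relevance minus pairwise redundancy) and the variable names are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def fitness(chrom, sim_to_topic, sim_between, max_sents):
    """Illustrative fitness: reward topical relevance of the selected
    sentences and penalize pairwise redundancy; summaries violating
    the length constraint score zero."""
    sel = np.flatnonzero(chrom)
    if len(sel) == 0 or len(sel) > max_sents:
        return 0.0
    relevance = sim_to_topic[sel].sum()
    redundancy = sum(sim_between[i, j] for k, i in enumerate(sel)
                     for j in sel[k + 1:])
    return float(relevance - redundancy)

def repair(chrom, sim_to_topic, max_sents):
    """Local repair operator: drop the least relevant selected
    sentences until the summary fits the length constraint."""
    chrom = chrom.copy()
    sel = np.flatnonzero(chrom)
    while len(sel) > max_sents:
        worst = sel[np.argmin(sim_to_topic[sel])]
        chrom[worst] = 0
        sel = np.flatnonzero(chrom)
    return chrom
```

A GA then evolves a population of such binary chromosomes with crossover, the heuristic mutation, and this repair step, keeping the fittest selections as the summary.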