Fuzzy C-means (FCM) is a clustering method that groups similar data elements according to a given distance measure. Tabu Search is a metaheuristic algorithm. In this paper, a Probabilistic Tabu Search for FCM is implemented to find a global clustering based on the minimum value of the fuzzy objective function. Experiments were designed for different networks and numbers of clusters; averaged over ten runs, the comparison of objective function values between standard FCM and Tabu-FCM shows that Tabu-FCM achieves the best performance.
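The abstract gives no pseudocode; the rough Python sketch below shows the standard FCM objective J_m = Σ_i Σ_k u_ik^m ||x_i − c_k||² that such a search minimizes, plus a toy probabilistic-tabu-style loop over cluster centers. The step size, tabu-list length, and move rule are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def fcm_objective(X, centers, m=2.0):
    """Fuzzy C-means objective J_m = sum_i sum_k u_ik^m ||x_i - c_k||^2,
    with memberships u_ik computed in closed form from the centers."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    d2 = np.fmax(d2, 1e-12)  # guard against a point sitting exactly on a center
    # u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1)), written with squared distances
    u = 1.0 / ((d2[:, :, None] / d2[:, None, :]) ** (1.0 / (m - 1.0))).sum(axis=2)
    return ((u ** m) * d2).sum()

def tabu_fcm(X, k, iters=500, step=0.05, tabu_len=20, seed=0):
    """Toy tabu-style search: randomly perturb the best centers and reject
    candidates whose rounded coordinates were visited recently (tabu)."""
    rng = np.random.default_rng(seed)
    best = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    best_j, tabu = fcm_objective(X, best), []
    for _ in range(iters):
        cand = best + step * rng.standard_normal(best.shape)
        key = tuple(np.round(cand, 2).ravel())
        if key in tabu:
            continue                     # forbidden (tabu) move, skip it
        tabu = (tabu + [key])[-tabu_len:]
        j = fcm_objective(X, cand)
        if j < best_j:
            best, best_j = cand, j
    return best, best_j
```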
This study aims to assess the water quality index (WQI) according to the Canadian Council of Ministers of the Environment water quality index method (CCME WQI). Four locations (measurement stations) were selected along the Tigris River in Iraq: two in the north near Mosul city (Mosul Dam and Mosul city) and two in the south near Al-Amarah city (Ali Garbi and Al-Amarah). The water data cover the period 2011 to 2013 and include eleven water quality parameters, among them magnesium (Mg2+), calcium (Ca2+), potassium (K+), sodium (Na+), sulfate (SO42-), chloride (Cl-), and nitrate (NO3-).
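For reference, the CCME WQI combines three factors: scope (F1, percentage of parameters that fail their guideline at least once), frequency (F2, percentage of individual tests that fail), and amplitude (F3, derived from the normalized sum of excursions), with WQI = 100 − sqrt(F1² + F2² + F3²)/1.732. A minimal sketch, assuming every guideline is an upper limit:

```python
import numpy as np

def ccme_wqi(values, objectives):
    """CCME WQI from a tests matrix `values` (tests x variables) and an
    upper-limit vector `objectives` (one guideline per variable)."""
    values = np.asarray(values, float)
    objectives = np.asarray(objectives, float)
    failed = values > objectives                    # tests exceeding the guideline
    f1 = 100.0 * failed.any(axis=0).mean()          # scope: % of failed variables
    f2 = 100.0 * failed.mean()                      # frequency: % of failed tests
    exc = np.where(failed, values / objectives - 1.0, 0.0)
    nse = exc.sum() / values.size                   # normalized sum of excursions
    f3 = nse / (0.01 * nse + 0.01)                  # amplitude
    return 100.0 - np.sqrt(f1**2 + f2**2 + f3**2) / 1.732
```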
This research aims at a number of objectives, including developing the tax examination process and raising its efficiency without relying on the comprehensive examination method, by using statistical methods in tax examination, and discussing the most important concepts related to these statistical methods, showing their importance and how they are applied. The research is an applied study in the General Commission of Taxes. To achieve its objectives, the research used the descriptive (analytical) approach on the theoretical side and, on the practical side, applied some statistical methods to a sample of the final accounts of a contracting company (limited) and a pharmaceutical industry company.
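The abstract does not name the specific statistical methods used. One common sampling-based alternative to comprehensive examination is stratified sampling of account entries by value; the sketch below is purely illustrative, and the strata boundaries and names are assumptions:

```python
import numpy as np

def stratified_sample(amounts, n, seed=0):
    """Illustrative stratified sampling for tax examination: split entries
    into value strata (small/medium/large by quantile) and sample each
    stratum proportionally, so large invoices are always represented."""
    rng = np.random.default_rng(seed)
    amounts = np.asarray(amounts, float)
    strata = np.digitize(amounts, np.quantile(amounts, [0.5, 0.9]))
    picks = []
    for s in np.unique(strata):
        idx = np.flatnonzero(strata == s)
        k = max(1, round(n * len(idx) / len(amounts)))   # proportional allocation
        picks.extend(rng.choice(idx, size=min(k, len(idx)), replace=False))
    return np.array(sorted(picks))
```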
In this paper, a comparison between horizontal and vertical OFETs with poly(3-hexylthiophene) (P3HT) as the active (p-type) semiconductor layer is studied, using two different gate insulators (ZrO2 and PVA). The electrical output (Id-Vd) and transfer (Id-Vg) characteristics were investigated using the gradual-channel approximation model. The device shows a typical field-effect transistor (FET) output curve. The electrical characterization was analyzed in order to investigate the dependence of the current on the source-drain voltage (Vd) and the effects of the gate dielectric on the electrical performance of the OFET. This work also considered the effects of the semiconductor capacitance on the performance of OFETs.
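For context, the gradual-channel approximation gives Id = (W/L)·μ·Ci·[(Vg − Vt)Vd − Vd²/2] in the linear regime and Id = (W/2L)·μ·Ci·(Vg − Vt)² in saturation, where Ci = ε0·εr/d is the gate-insulator capacitance per unit area. A minimal sketch (voltages are written as positive magnitudes for the p-type device; the dielectric constant and film thickness below are assumed typical values, not the paper's):

```python
import numpy as np

def gradual_channel_id(vg, vd, mu, ci, w, l, vt):
    """Drain current from the gradual-channel approximation.
    linear (|Vd| < |Vg - Vt|): Id = (W/L) mu Ci [(Vg - Vt) Vd - Vd^2 / 2]
    saturation:                Id = (W/2L) mu Ci (Vg - Vt)^2"""
    vov = np.maximum(vg - vt, 0.0)       # overdrive voltage
    vd_eff = np.minimum(vd, vov)         # clamp Vd at pinch-off
    return (w / l) * mu * ci * (vov * vd_eff - vd_eff**2 / 2.0)

# Gate-insulator capacitance per unit area, Ci = eps0 * eps_r / d.
eps0 = 8.854e-12                         # F/m
ci_zro2 = eps0 * 25 / 100e-9             # eps_r ~ 25, 100 nm film (assumed values)
```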
The aim of this work is to study color imaging under different intensities of fluorescent light, evaluating the quality of the captured images for the RGB bands and the L component. We also study the means of the RGBL values of the images as a function of the power of the fluorescent light circuit. The results show that the mean μ increases rapidly at low power values and then stabilizes at high power values.
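A minimal sketch of the measurement step, computing the per-band means; the paper's exact definition of the L component is not given in the excerpt, so a standard luma weighting is assumed here:

```python
import numpy as np
from PIL import Image

def channel_means(path):
    """Mean of the R, G, B bands and an L component (approximated here by
    the Rec. 601 luma; the paper's exact L definition is an assumption)."""
    rgb = np.asarray(Image.open(path).convert("RGB"), float)
    r, g, b = rgb[..., 0].mean(), rgb[..., 1].mean(), rgb[..., 2].mean()
    l = np.mean(0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2])
    return r, g, b, l
```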
In this paper, we propose to use the extreme value distribution as the rate of occurrence of the non-homogeneous Poisson process, in order to improve that rate of occurrence; the result has been called the extreme value process. To estimate the parameters of this process, we propose the maximum likelihood method, the method of moments, and a smart method represented by the Artificial Bee Colony (ABC) algorithm, seeking the estimator that best represents the data. The results of the three methods are compared through a simulation of the model, and it is concluded that the ABC estimator is better than the estimators of the maximum likelihood method and the method of moments.
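For an NHPP with intensity λ(t) and events t_1, …, t_n observed on [0, T], the log-likelihood is Σ_i log λ(t_i) − ∫_0^T λ(t) dt. A minimal sketch of the maximum likelihood step, assuming the Gumbel (Type-I extreme value) density as the rate function (the paper's exact parameterization is not given in the excerpt):

```python
import numpy as np
from scipy.optimize import minimize

def gumbel_rate(t, mu, beta):
    """Gumbel (Type-I extreme value) density used here as the NHPP intensity."""
    z = (t - mu) / beta
    return np.exp(-z - np.exp(-z)) / beta

def nhpp_loglik(params, times, horizon):
    """NHPP log-likelihood; the integral of the intensity over [0, T] has a
    closed form via the Gumbel CDF F(t) = exp(-exp(-(t - mu)/beta))."""
    mu, beta = params
    if beta <= 0:
        return -np.inf                      # invalid scale parameter
    cdf = lambda t: np.exp(-np.exp(-(t - mu) / beta))
    return np.sum(np.log(gumbel_rate(times, mu, beta))) - (cdf(horizon) - cdf(0.0))

def fit_mle(times, horizon, start=(1.0, 1.0)):
    """Numerical MLE by minimizing the negative log-likelihood."""
    res = minimize(lambda p: -nhpp_loglik(p, times, horizon), start,
                   method="Nelder-Mead")
    return res.x
```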
The objective of the present study was to measure several oxidative stress and liver function parameters in workers occupationally exposed to cement dust in the Kufa Cement Factory, in order to test the hypothesis that cement dust exposure may perturb these parameters. Oxidative stress and liver function parameters were assessed in 63 workers occupationally exposed to cement dust in different departments of the Kufa Cement Factory (exposure times ranged from 5 to 38 years) and in 36 matched unexposed controls. The results showed an increase in the oxidative stress parameters; moreover, the liver function parameters were abnormal in the exposed workers compared to the unexposed controls.
Watermarking techniques and digital signatures can better solve the problems of digital images transmitted over the Internet, such as forgery, tampering, and alteration. In this paper we propose an invisible fragile watermark and an MD5-based algorithm for authenticating digital images and detecting tampering in the discrete wavelet transform (DWT) domain. The digital image is decomposed using a 2-level DWT, and the middle- and high-frequency sub-bands are used for embedding the watermark and the digital signature. The authentication data are embedded in a number of the coefficients of these sub-bands, according to an adaptive threshold based on the watermark length and the coefficients of each DWT level.
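A rough sketch of the general scheme, assuming a 2-level Haar DWT, an MD5 digest of the approximation band as the signature, and quantization-based embedding into one detail band; the paper's adaptive-threshold rule is not reproduced here, and the quantization step is an assumption:

```python
import hashlib
import numpy as np
import pywt

def embed_signature(image, q=4.0):
    """Sketch of fragile DWT-domain watermarking: hash the low-frequency
    content with MD5, then hide the 128 digest bits in the level-1 horizontal
    detail band by quantization (assumes the band has >= 128 coefficients)."""
    ca2, (h2, v2, d2), (h1, v1, d1) = pywt.wavedec2(
        image.astype(float), "haar", level=2)
    digest = hashlib.md5(np.round(ca2).astype(np.int32).tobytes()).digest()
    bits = np.unpackbits(np.frombuffer(digest, dtype=np.uint8))
    flat = h1.ravel()
    for i, b in enumerate(bits):             # one bit per detail coefficient
        flat[i] = q * np.floor(flat[i] / q) + (q / 2.0) * b
    h1 = flat.reshape(h1.shape)
    return pywt.waverec2([ca2, (h2, v2, d2), (h1, v1, d1)], "haar")
```

Verification would recompute the digest from the received approximation band and compare it with the bits extracted from the detail coefficients; any local tampering breaks the fragile embedding.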
The estimation of the reliability function depends on the accuracy of the data used to estimate the parameters of the probability distribution. Because some data sets are skewed, estimating the parameters and calculating the reliability function in the presence of skewness requires a distribution that is flexible enough to handle such data. In the data of Diyala Company for Electrical Industries, a positive skewness was observed in the data collected from the Power and Machinery Department, which called for a distribution that fits such data and for estimation methods that accommodate this problem and lead to accurate estimates of the reliability function.
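The abstract does not name the distribution finally adopted; as an illustration, the sketch below fits a Weibull model, a common choice for positively skewed lifetime data, and evaluates the reliability function R(t) = 1 − F(t):

```python
import numpy as np
from scipy import stats

def reliability_curve(lifetimes, t_grid):
    """Fit a right-skewed lifetime model and return R(t) = 1 - F(t).
    A Weibull fit with location fixed at zero is assumed here."""
    shape, loc, scale = stats.weibull_min.fit(lifetimes, floc=0.0)
    return stats.weibull_min.sf(t_grid, shape, loc=loc, scale=scale)

# example: simulated positively skewed failure times
times = stats.weibull_min.rvs(1.5, scale=100.0, size=200, random_state=0)
print(reliability_curve(times, np.array([50.0, 100.0, 150.0])))
```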
This paper is concerned with certain subclasses of univalent and bi-univalent functions related to shell-like curves connected with k-Fibonacci numbers, involving the modified sigmoid activation function θ(t) = 2/(1 + e^(-t)), t ≥ 0, in the unit disk |z| < 1. For functions in these classes, estimates of the initial coefficients |c_2| and |c_3|, the Fekete-Szegő inequality, and the second Hankel determinant are investigated.
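For reference, the two functionals named in the abstract can be written as follows for f(z) = z + c_2 z² + c_3 z³ + ⋯ (the exact class definitions are not given in the excerpt):

```latex
% Modified sigmoid activation function used in the class definitions
\theta(t) = \frac{2}{1 + e^{-t}}, \qquad t \ge 0 .

% Fekete-Szego functional and second Hankel determinant for
% f(z) = z + c_2 z^2 + c_3 z^3 + \cdots
\left| c_3 - \mu c_2^2 \right| \quad (\mu \in \mathbb{R}), \qquad
H_2(2) = c_2 c_4 - c_3^2 .
```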