New algorithms for enhancing the quality of auto-focus image fusion are proposed. The first algorithm combines two images on the basis of their standard deviation. The second algorithm uses the contrast at edge points, together with a correlation method, as the criterion for the quality of the fused image. It places three blocks of different sizes in a homogeneous region and shifts each block 10 pixels within that region; the statistical properties of each block are examined and the next step is decided automatically. The fused image has a higher contrast value because edge points from both source images are retained, and the measured enhancement in edge regions reaches up to twice the original contrast. The proposed method is compared against several existing methods.
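A minimal sketch of the block-wise standard-deviation fusion rule described above, assuming grayscale NumPy images; the block size and the keep-the-sharper-block rule are illustrative assumptions, not details given in the abstract:

```python
import numpy as np

def fuse_by_std(img_a: np.ndarray, img_b: np.ndarray, block: int = 16) -> np.ndarray:
    """For each block, keep the source whose block has the larger standard
    deviation (taken here as a proxy for the better-focused block)."""
    fused = np.empty_like(img_a)
    h, w = img_a.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            a = img_a[y:y + block, x:x + block]
            b = img_b[y:y + block, x:x + block]
            fused[y:y + block, x:x + block] = a if a.std() >= b.std() else b
    return fused
```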
The pressures of everyday life expose people to several types of heart disease arising from different factors. Therefore, in order to determine whether or not a case ends in death, the outcome is modeled using a binary logistic regression model.
This research uses the binary logistic regression model, one of the most important nonlinear regression models and one widely used in statistical modeling applications, to study heart disease. The parameters of this model are then estimated using statistical estimation methods; a further problem appears in estimating its parameters, as well as when the numbe…
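For reference, the binary logistic regression model referred to above has the standard textbook form (this statement is not taken from the paper):

```latex
% y_i is the binary outcome (death or not), x_i the covariate vector,
% and \beta the parameter vector to be estimated.
P(y_i = 1 \mid x_i) = \pi_i = \frac{e^{x_i^{\top}\beta}}{1 + e^{x_i^{\top}\beta}},
\qquad
\operatorname{logit}(\pi_i) = \ln\frac{\pi_i}{1 - \pi_i} = x_i^{\top}\beta .
```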
In the present work, an image compression method has been modified by combining the Absolute Moment Block Truncation Coding (AMBTC) algorithm with VQ-based image coding. First, the AMBTC algorithm, based on a Weber's law condition, is used to distinguish low-detail from high-detail blocks in the original image. For a low-detail block (i.e. a uniform block such as background), the coder transmits only the block mean on the channel instead of transmitting the two reconstruction mean values and the bit map for that block. A high-detail block is coded by the proposed fast encoding algorithm for vector quantization based on the Triangular Inequality Theorem (TIE), and the coder then transmits the two reconstruction mean values (i.e. H and L)…
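A hedged sketch of the AMBTC classification step described above. The threshold value and the exact form of the Weber's-law smoothness test are assumptions; the abstract only states that a Weber's law condition separates low- and high-detail blocks:

```python
import numpy as np

def ambtc_block(block: np.ndarray, weber_t: float = 0.03):
    """Return ('low', mean) for a smooth block, or
    ('high', H, L, bitmap) for a detailed block."""
    m = block.mean()
    # Weber's law: perceived contrast is judged relative to background intensity.
    if m > 0 and (block.max() - block.min()) / m < weber_t:
        return ("low", m)                      # transmit the mean only
    bitmap = block >= m
    if bitmap.all() or not bitmap.any():       # perfectly flat block
        return ("low", m)
    H = block[bitmap].mean()                   # high reconstruction mean
    L = block[~bitmap].mean()                  # low reconstruction mean
    return ("high", H, L, bitmap)
```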
This paper investigates exact and local search methods for solving the traveling salesman problem (TSP). The Branch and Bound technique (BABT) is proposed as an exact method, with two models. In addition, the classical Genetic Algorithm (GA) and Simulated Annealing (SA) are discussed and applied as local search methods. To improve the performance of the GA, two kinds of improvement are proposed: the first is called the Improved GA (IGA) and the second the Hybrid GA (HGA).
The IGA gives better results than the GA and SA, while the HGA is the best local search method overall, running within a reasonable time for 5 ≤ n ≤ 2000, where n is the number of visited cities. An effective method of reducing the size of the TSP matrix was also proposed with…
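A minimal Simulated Annealing sketch for the TSP, illustrating one of the local search methods discussed above; the 2-opt move, cooling schedule, and parameter values are common defaults, not taken from the paper:

```python
import math
import random

def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def sa_tsp(dist, t0=100.0, cooling=0.995, steps=20000):
    n = len(dist)
    tour = list(range(n))
    random.shuffle(tour)
    best, best_len = tour[:], tour_length(tour, dist)
    t = t0
    for _ in range(steps):
        i, j = sorted(random.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # 2-opt reversal
        delta = tour_length(cand, dist) - tour_length(tour, dist)
        # Accept improvements always, worse moves with Boltzmann probability.
        if delta < 0 or random.random() < math.exp(-delta / t):
            tour = cand
            if tour_length(tour, dist) < best_len:
                best, best_len = tour[:], tour_length(tour, dist)
        t *= cooling
    return best, best_len
```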
Today the Genetic Algorithm (GA), based on the laws of nature, outperforms the standard algorithms in solving complex nonlinear equations. However, premature convergence is considered one of the most significant drawbacks of the GA, as it increases the number of iterations needed to reach a global optimum. To address this shortcoming, this paper proposes a new GA based on chaotic systems. In the GA processes, the logistic map and a Linear Feedback Shift Register (LFSR) are used to generate chaotic values in place of the random values required at each step. The resulting Chaos Genetic Algorithm (CGA) avoids local convergence more often than the traditional GA because of its greater diversity. The concept is to use chaotic sequences with the LFSR to gene…
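A sketch of replacing a GA's uniform random draws with logistic-map chaos, as described above. The parameter r = 4.0 and the seed are assumptions, and the LFSR seeding detail is simplified to a plain numeric seed:

```python
def logistic_chaos(seed: float = 0.7, r: float = 4.0):
    """Yield values in (0, 1) from the logistic map x <- r * x * (1 - x)."""
    x = seed
    while True:
        x = r * x * (1.0 - x)
        yield x

chaos = logistic_chaos()
# Use next(chaos) wherever the GA would call random.random(),
# e.g. when deciding whether to apply crossover:
crossover_rate = 0.8
do_crossover = next(chaos) < crossover_rate
```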
Several chaotic maps for the chaotic firefly algorithm were selected to perform variable selection on blood and vascular disease data obtained from Nasiriyah General Hospital. The data were tested and found to follow a Gamma distribution, and it was concluded, by the mean square error criterion, that the Chebyshev map method is more efficient than the Sinusoidal map method.
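The two chaotic maps compared above, in their common textbook forms; the map order and coefficient used here are conventional choices, not values from the paper:

```python
import math

def chebyshev_map(x: float, a: float = 4.0) -> float:
    """Chebyshev map on [-1, 1]: x_{n+1} = cos(a * arccos(x_n))."""
    return math.cos(a * math.acos(max(-1.0, min(1.0, x))))

def sinusoidal_map(x: float, a: float = 2.3) -> float:
    """Sinusoidal map on (0, 1): x_{n+1} = a * x_n^2 * sin(pi * x_n)."""
    return a * x * x * math.sin(math.pi * x)
```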
In this paper, the generalized inverted exponential distribution is considered as one of the most important distributions for studying failure times. The shape and scale parameters of the distribution are estimated after removing the fuzziness that characterizes the data, which are triangular fuzzy numbers. To convert the fuzzy data to crisp data, the researcher used the centroid method. Since the two parameters of the studied distribution are difficult to separate and estimate directly by the maximum likelihood (MLE) method, the Newton-Raphson method has been used.
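A short sketch of the centroid defuzzification step described above: a triangular fuzzy number (a, b, c) collapses to its centroid (a + b + c) / 3, yielding crisp failure times. The sample values are illustrative only:

```python
def centroid(tfn: tuple[float, float, float]) -> float:
    a, b, c = tfn          # lower bound, mode, upper bound
    return (a + b + c) / 3.0

fuzzy_failure_times = [(1.0, 1.5, 2.3), (2.1, 2.8, 3.0)]
crisp_times = [centroid(t) for t in fuzzy_failure_times]
```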
Angle of arrival (AOA) estimation for wideband signals is becoming increasingly necessary for modern communication systems such as the Global System for Mobile Communications (GSM), satellite and military applications, and spread spectrum (frequency hopping and direct sequence). Most researchers focus on cancelling the effects of signal bandwidth on AOA estimation performance by using a transversal filter (tap delay line, TDL), and most have used a two-element array antenna to study these effects. In this research, the general case of M array elements is used. A transversal filter (TDL) in a phase adaptive array antenna system is used to calculate the optimum number of taps required to compensate for these effects. The propo…
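For context, a TDL beamformer of the kind described forms its output as a weighted sum over the array elements and taps; this is the standard textbook expression, not a formula taken from the paper:

```latex
% M array elements, K taps per element, tap spacing T, weights w_{m,k},
% x_m(t) the signal received at element m.
y(t) \;=\; \sum_{m=1}^{M} \sum_{k=0}^{K-1} w_{m,k}\, x_m\!\bigl(t - kT\bigr)
```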
... Show MoreMulti-document summarization is an optimization problem demanding optimization of more than one objective function simultaneously. The proposed work regards balancing of the two significant objectives: content coverage and diversity when generating summaries from a collection of text documents.
Any automatic text summarization system faces the challenge of producing a high-quality summary. Despite existing efforts on designing and evaluating the performance of many text summarization techniques, their formulations lack any model that gives an explicit representation of coverage and diversity, the two contradictory semantics of any summary. In this work, the design of…
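A hedged sketch of one way to score a candidate extractive summary on the coverage-versus-diversity trade-off named above; the weighting scheme, centroid-based coverage, and cosine similarity are illustrative assumptions, not the paper's model:

```python
import numpy as np

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

def summary_score(chosen: list[np.ndarray], doc_centroid: np.ndarray,
                  lam: float = 0.7) -> float:
    """Trade off coverage (similarity to the document centroid) against
    diversity (1 minus redundancy among the chosen sentence vectors)."""
    coverage = np.mean([cosine(s, doc_centroid) for s in chosen])
    pairs = [(i, j) for i in range(len(chosen)) for j in range(i + 1, len(chosen))]
    redundancy = np.mean([cosine(chosen[i], chosen[j]) for i, j in pairs]) if pairs else 0.0
    return lam * coverage + (1.0 - lam) * (1.0 - redundancy)
```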
In this paper, we design a fuzzy neural network to solve the fuzzy singularly perturbed Volterra integro-differential equation using a high-performance training algorithm, the Levenberg-Marquardt algorithm (trainlm), with the hyperbolic tangent as the activation function of the hidden units. A fuzzy trial solution to the fuzzy singularly perturbed Volterra integro-differential equation is written as the sum of two components. The first component satisfies the fuzzy conditions but has no fuzzy adjustable parameters; the second component is a feed-forward fuzzy neural network with fuzzy adjustable parameters. The proposed method is compared with the analytical solutions, and we find that the proposed meth…
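The two-component trial solution described above can be written generically as follows; the notation is the standard neural trial-solution construction, and the exact forms of A and F in the paper may differ:

```latex
% A(x) satisfies the fuzzy conditions and has no adjustable parameters;
% N(x, \mathbf{p}) is the feed-forward fuzzy network with adjustable
% parameters \mathbf{p}; F is constructed so the conditions stay satisfied.
\tilde{y}(x) \;=\; A(x) \;+\; F\bigl(x,\, N(x, \mathbf{p})\bigr)
```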
Automatic document summarization technology is evolving and may offer a solution to the problem of information overload. Multi-document summarization is an optimization problem that demands optimizing more than one objective function concurrently. The proposed work balances two significant objectives, content coverage and diversity, while generating a summary from a collection of text documents. Despite the large efforts made by several researchers in designing and evaluating the performance of many text summarization techniques, their formulations lack any model that gives an explicit representation of coverage and diversity, the two contradictory semantics of any summary. The design of gener…