New algorithms for enhancing fusion quality in auto-focus image fusion are proposed. The first algorithm combines two images based on the standard deviation. The second algorithm uses the contrast at edge points together with a correlation method as the quality criteria for the fused image; it considers three blocks of different sizes in a homogeneous region and shifts each block by 10 pixels within that region, examining the statistical properties of each block and deciding the next step automatically. The fused image has better contrast because of the edge points contributed by the two source images under the suggested algorithms. The enhancement in edge regions is measured and reaches up to a twofold improvement in contrast. The suggested method is compared with several existing methods.
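As a rough illustration of the first algorithm, the sketch below fuses two registered grayscale images block by block, keeping the block with the larger standard deviation (a common focus measure); the block size is an assumed value, not taken from the paper.

```python
import numpy as np

def std_fusion(img_a, img_b, block=16):
    """Fuse two same-size grayscale images block by block,
    keeping the source block with the larger standard deviation
    (higher local contrast suggests better focus)."""
    fused = np.empty_like(img_a)
    h, w = img_a.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            a = img_a[y:y+block, x:x+block]
            b = img_b[y:y+block, x:x+block]
            fused[y:y+block, x:x+block] = a if a.std() >= b.std() else b
    return fused
```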
In this paper, the generalized inverted exponential distribution is considered as one of the most important distributions for studying failure times. The shape and scale parameters of the distribution are estimated after removing the fuzziness from the data, which are triangular fuzzy numbers. The centroid method is used to convert the fuzzy data to crisp data. The distribution's two parameters are difficult to separate and estimate directly by the maximum likelihood (MLE) method, so the Newton-Raphson method has been used.
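For a triangular fuzzy number with left endpoint a, mode b, and right endpoint c, the centroid method replaces it with the crisp value (a + b + c) / 3. A minimal sketch of the defuzzification step (the example numbers are invented for illustration):

```python
def centroid_defuzzify(a, b, c):
    """Crisp value of a triangular fuzzy number (a, b, c)
    by the centroid (center-of-gravity) method."""
    return (a + b + c) / 3.0

# Example: fuzzy failure time (1.0, 1.5, 2.6) -> crisp value 1.7
print(centroid_defuzzify(1.0, 1.5, 2.6))
```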
Angle of arrival (AOA) estimation for wideband signals is becoming increasingly necessary for modern communication systems such as the Global System for Mobile Communications (GSM), satellite links, military applications, and spread spectrum (frequency hopping and direct sequence). Most researchers focus on cancelling the effects of signal bandwidth on AOA estimation performance by using a transversal filter (tapped delay line, TDL), and most prior studies used a two-element array antenna to study these effects. In this research, a general case of (M) array elements is proposed. A transversal filter (TDL) in a phase adaptive array antenna system is used to calculate the optimum number of taps required to compensate for these effects. The propo
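A tapped delay line places a short FIR filter behind each antenna element, so the array weights become frequency dependent and can compensate for the signal bandwidth. A minimal sketch of computing the array output for M elements and K taps (the signal model, weights, and names here are assumptions, not the paper's implementation):

```python
import numpy as np

def tdl_array_output(snapshots, weights):
    """Output of an M-element adaptive array with a K-tap
    transversal filter (TDL) behind each element.

    snapshots : (M, N) complex baseband samples per element
    weights   : (M, K) complex tap weights, one FIR filter per element
    """
    M, _ = weights.shape
    N = snapshots.shape[1]
    out = np.zeros(N, dtype=complex)
    for m in range(M):
        # Each element's signal is FIR-filtered by its own taps,
        # then the filtered signals are summed across the array.
        out += np.convolve(snapshots[m], weights[m], mode="full")[:N]
    return out
```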
ABSTRACT:
This study is concerned with the estimation of constant and time-varying parameters in non-linear ordinary differential equations that have no analytical solution. The estimation is done in a multi-stage method, where constant and time-varying parameters are estimated sequentially over several stages. In the first stage, the model of differential equations is converted into a regression model that includes the state variables and their derivatives; the state variables and their derivatives are then estimated by a penalized splines method, and the estimates are substituted into the regression model. In the second stage, the pseudo-least squares method was used to es
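A rough sketch of the two stages under simplifying assumptions: a smoothing spline (standing in for the paper's penalized splines) estimates the state and its derivative, and a least-squares regression then recovers a constant parameter of the toy model dx/dt = θx; all names and values are illustrative.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Noisy observations of a state x(t) governed by dx/dt = theta * x
t = np.linspace(0.0, 2.0, 50)
theta_true = -1.3
x_obs = np.exp(theta_true * t) + np.random.normal(0.0, 0.01, t.size)

# Stage 1: smooth the state and take the derivative of the fit
spline = UnivariateSpline(t, x_obs, s=t.size * 0.01**2)
x_hat = spline(t)
dx_hat = spline.derivative()(t)

# Stage 2: least-squares regression of dx_hat on x_hat recovers theta
theta_hat = np.dot(x_hat, dx_hat) / np.dot(x_hat, x_hat)
print(theta_hat)  # should be close to -1.3
```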
Abstract
The aim of this research is to concentrate on knowledge management activities, both the primary activities (acquisition, selection, generation, assimilation, and emission of knowledge) and the support activities (measurement, control, coordination, and leadership) that drive and control the achievement of knowledge management in organizations; together these lead to the knowledge chain model. The research then determines the degree of membership of these activities in the knowledge chain model for a sample of Iraqi knowledge-driven organizations (universities). The research depends on a checklist for gathering the required data, designed to diagnose the research dimensions and measurem
In this paper, two local search algorithms, the genetic algorithm and particle swarm optimization, are used to schedule a number of products (n jobs) on a single machine so as to minimize a multi-objective function composed of the total completion time, total tardiness, total earliness, and total late work. A branch and bound (BAB) method is used to compare the results for n jobs ranging from 5 to 18. The results show that the two algorithms find optimal and near-optimal solutions in appropriate times.
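A minimal sketch of the multi-objective cost for a given job sequence, with completion time, tardiness, earliness, and late work summed with equal weights (an assumption; the paper may combine the criteria differently):

```python
def multi_objective_cost(sequence, p, d):
    """Total completion time + tardiness + earliness + late work
    for one job sequence on a single machine.

    sequence : job indices in processing order
    p, d     : processing times and due dates per job
    """
    t = 0
    total = 0
    for j in sequence:
        t += p[j]                    # completion time C_j
        tard = max(0, t - d[j])      # tardiness  T_j = max(0, C_j - d_j)
        early = max(0, d[j] - t)     # earliness  E_j = max(0, d_j - C_j)
        late_work = min(p[j], tard)  # late work  V_j = min(p_j, T_j)
        total += t + tard + early + late_work
    return total

# Example with 3 jobs: a GA or PSO would search over such sequences
print(multi_objective_cost([0, 2, 1], p=[3, 2, 4], d=[4, 5, 6]))
```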
Essential approaches involving photons are among the most common bases for parallel optical computation due to their recent development, ease of production, and low cost; as a result, most researchers have concentrated their efforts on them. The Basic Arithmetic Unit (BAU) is built using a three-step approach that uses three-state optical gates to configure the circuitry for addition, subtraction, and multiplication. This is a new optical computing method based on a radix-2 binary signed-digit (BSD) number system that includes the digits -1, 0, and 1. Light with horizontal polarization (LHP) (↔), light with no intensity (LNI) (⥀), and light with vertical polarization (LVP) (↨) is represen
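In a radix-2 signed-digit system each digit position holds -1, 0, or 1, so a value can have several representations. A minimal sketch of decoding a BSD numeral (the polarization-to-digit mapping is an assumption inferred from the abstract, not confirmed by the paper):

```python
# Assumed mapping of light states to BSD digits (illustrative only)
POLARIZATION_TO_DIGIT = {"LVP": 1, "LNI": 0, "LHP": -1}

def bsd_value(digits):
    """Integer value of a binary signed-digit (BSD) numeral,
    most significant digit first; each digit is -1, 0, or 1."""
    value = 0
    for d in digits:
        value = 2 * value + d
    return value

# The system is redundant: 5 = [1, 0, 1] and also [1, 1, -1]
print(bsd_value([1, 0, 1]), bsd_value([1, 1, -1]))  # both print 5
```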
This research summarizes the building of a mathematical model for the integer linear fractional programming (I.L.F.P.) problem and the finding of its best solution, one that maximizes the productivity of the company's revenue by using the largest possible number of production units and maximizes the denominator objective, which represents the proportion of profits to costs, thus maximizing the company's total profit at the lowest cost. The Dinkelbach algorithm and the complementary method are applied to the Light Industries Company data for 2013, and the results are compared with those of goal programming methods.
It is clear that the final results of resolution and Dinkelbac
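Dinkelbach's algorithm solves max N(x)/D(x) by repeatedly solving the parametric problem max N(x) - q·D(x) and updating q to the current ratio. A minimal sketch over a finite feasible set (the enumeration stands in for the integer programming solver a real I.L.F.P. model would need):

```python
def dinkelbach(candidates, N, D, tol=1e-9):
    """Maximize N(x)/D(x) over a finite candidate set with D(x) > 0.

    Repeats: solve x* = argmax N(x) - q*D(x), then set q = N(x*)/D(x*);
    stops when the parametric optimum is (numerically) zero.
    """
    q = 0.0
    while True:
        x_star = max(candidates, key=lambda x: N(x) - q * D(x))
        if abs(N(x_star) - q * D(x_star)) < tol:
            return x_star, q
        q = N(x_star) / D(x_star)

# Example: maximize (3x + 2) / (x + 1) over integers 0..10 -> x = 10
x_opt, ratio = dinkelbach(range(11), N=lambda x: 3 * x + 2,
                          D=lambda x: x + 1)
print(x_opt, ratio)
```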
The discussion dealt with the independent factors, the critical success factors and the risk management process, and with the dependent factor, the generic competitive strategies. The research started from an intellectual dilemma: its problem crystallized in light of organizations' need for a philosophy and a deeper, more comprehensive vision of the concept of risk management, its assessment, and its administration in order to maximize generic competitive strategies. On this basis, the research problem was formulated as the gap between knowledge-based intellectual propositions and interpretations of the relationship between the critical success factors and the risk
Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed for analyzing this type of data.
In this research, the focus was on clustering and analyzing these data, since cluster analysis plays an important role in identifying and grouping co-expressed profiles over time, which are then employed in the nonparametric smoothing cubic B-spline model. This model provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope; it is also more flexible and can pick up more complex patterns and fluctuations in the data.
The balanced longitudinal data profiles were compiled into subgroup
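A minimal sketch of fitting a smoothing cubic B-spline to one simulated longitudinal profile and evaluating its continuous first and second derivatives (illustrative only, not the paper's pipeline):

```python
import numpy as np
from scipy.interpolate import splrep, splev

# One simulated profile observed at 30 time points
t = np.linspace(0.0, 10.0, 30)
y = np.sin(t) + np.random.normal(0.0, 0.1, t.size)

# Cubic B-spline (k=3) with a smoothing condition s
tck = splrep(t, y, k=3, s=t.size * 0.1**2)

grid = np.linspace(0.0, 10.0, 200)
y_hat = splev(grid, tck)           # smoothed curve
dy_hat = splev(grid, tck, der=1)   # continuous first derivative
d2y_hat = splev(grid, tck, der=2)  # continuous second derivative
```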
In this research, kernel estimation methods (nonparametric density estimators) were relied upon in estimating the two-response logistic regression, comparing the Nadaraya-Watson method with the local scoring algorithm. The optimal smoothing parameter (bandwidth) λ was estimated by cross-validation and generalized cross-validation; the optimal bandwidth λ has a clear effect on the estimation process and a key role in smoothing the curve so that it approaches the real curve. The goal of using the kernel estimator is to adjust the observations so as to obtain estimators with characteristics close to those of the real parameters. Based on medical data for patients with chro
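A minimal sketch of the Nadaraya-Watson estimator with a Gaussian kernel, choosing the bandwidth λ by leave-one-out cross-validation (the data, kernel, and bandwidth grid are assumptions for illustration):

```python
import numpy as np

def nadaraya_watson(x0, x, y, lam):
    """Kernel-weighted local average of y at the point x0."""
    w = np.exp(-0.5 * ((x0 - x) / lam) ** 2)
    return np.sum(w * y) / np.sum(w)

def loo_cv_score(x, y, lam):
    """Mean leave-one-out squared error for bandwidth lam."""
    n = len(x)
    err = 0.0
    for i in range(n):
        mask = np.arange(n) != i
        err += (y[i] - nadaraya_watson(x[i], x[mask], y[mask], lam)) ** 2
    return err / n

# Binary response simulated from a logistic curve
x = np.linspace(0.0, 3.0, 60)
y = (np.random.rand(60) < 1.0 / (1.0 + np.exp(-(2.0 * x - 3.0)))).astype(float)
grid = [0.05, 0.1, 0.2, 0.4, 0.8]
lam_best = min(grid, key=lambda lam: loo_cv_score(x, y, lam))
print(lam_best)
```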