Two new algorithms are proposed to enhance the quality of auto-focus image fusion. The first algorithm combines two images based on the local standard deviation. The second concentrates on the contrast at edge points together with a correlation method as the quality criterion for the fused image. This algorithm considers three blocks of different sizes in a homogeneous region and shifts each block 10 pixels within that region; the statistical properties of each block then determine the next step automatically. The fused image has better contrast because edge points from both source images are added according to the proposed algorithms. The measured enhancement in edge regions reaches up to twice the original contrast. The proposed method is compared with several existing methods.
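The first algorithm above can be illustrated with a minimal sketch: split both source images into blocks and, for each block, keep the one with the higher standard deviation (the sharper, better-focused block). The function name, block size, and selection rule are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def fuse_by_std(img_a, img_b, block=8):
    """Fuse two same-size grayscale images block by block: for each block,
    keep the version with the higher standard deviation (sharper focus).
    Illustrative sketch only; block size and tie-breaking are assumptions."""
    h, w = img_a.shape
    out = np.empty_like(img_a)
    for y in range(0, h, block):
        for x in range(0, w, block):
            a = img_a[y:y + block, x:x + block]
            b = img_b[y:y + block, x:x + block]
            # Higher std. dev. indicates more local detail, so keep that block.
            out[y:y + block, x:x + block] = a if a.std() >= b.std() else b
    return out
```

With one flat image and one textured image, every block of the textured image wins, so the fused result equals the textured input.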
Abstract:
This study concerns the estimation of constant and time-varying parameters in non-linear ordinary differential equations that have no analytical solutions. The estimation is carried out with a multi-stage method in which constant and time-varying parameters are estimated sequentially over several stages. In the first stage, the differential-equation model is converted into a regression model that includes the state variables and their derivatives; the state variables and their derivatives are then estimated with a penalized-splines method and substituted into the regression model. In the second stage, the pseudo-least-squares method was used to es
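The first stage described above can be sketched for a toy model. Here a smoothing spline from SciPy stands in for the penalized-splines estimator, and the hypothetical model dx/dt = -θx (θ = 0.5, noiseless data) stands in for the paper's differential equations; the second-stage regression then recovers θ by least squares from the spline's state and derivative estimates.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Hypothetical model dx/dt = -theta * x with theta = 0.5 (illustration only).
t = np.linspace(0.0, 4.0, 80)
x_obs = np.exp(-0.5 * t)  # noiseless observations, for clarity

# Stage 1: smooth the state and take its derivative from the spline
# (a smoothing spline used as a stand-in for penalized splines).
spl = UnivariateSpline(t, x_obs, k=4, s=0.0)  # k=4 so the derivative is cubic
x_hat = spl(t)
dx_hat = spl.derivative()(t)

# Stage 2: least squares on the regression dx/dt = -theta * x.
theta_hat = -np.sum(dx_hat * x_hat) / np.sum(x_hat ** 2)
```

With noiseless data the recovered `theta_hat` is very close to the true value 0.5; with noisy data the spline smoothing parameter would matter.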
Abstract
The aim of this research is to focus on knowledge management activities, both the primary activities (acquisition, selection, generation, assimilation, and emission of knowledge) and the support activities (measurement, control, coordination, and leadership) that are manipulated and controlled to achieve knowledge management in an organization, which leads to the knowledge chain model. The research then determines the degree to which these activities belong to the knowledge chain model in a sample of knowledge-driven Iraqi organizations (universities). The research relies on a checklist for gathering the required data; this checklist was designed to diagnose the research dimensions and measurem
Essential approaches involving photons are among the most common uses of parallel optical computation owing to their recent development, ease of production, and low cost; as a result, most researchers have concentrated their efforts on them. The Basic Arithmetic Unit (BAU) is built with a three-step approach that uses three-state optical gates to configure the circuitry for addition, subtraction, and multiplication. This is a new optical computing method based on a radix-2 binary signed-digit (BSD) number system that includes the digits -1, 0, and 1. Light with horizontal polarization (LHP) (↔), light with no intensity (LNI) (⥀), and light with vertical polarization (LVP) (↨) is represen
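The radix-2 signed-digit system with digits {-1, 0, 1} can be made concrete with a small sketch. The helper names are hypothetical; the digit-wise negation illustrates one well-known property of signed-digit systems that makes them attractive for carry-free arithmetic, not the paper's optical circuitry itself.

```python
def bsd_value(digits):
    """Value of a binary signed-digit (BSD) numeral, most significant digit
    first. Each digit is -1, 0, or 1 and the radix is 2."""
    value = 0
    for d in digits:
        assert d in (-1, 0, 1)
        value = 2 * value + d
    return value

def bsd_negate(digits):
    """Negation is a digit-wise sign flip -- a property of signed-digit
    systems that suits carry-free arithmetic."""
    return [-d for d in digits]
```

For example, the BSD numeral [1, 0, -1] encodes 4 - 1 = 3, and flipping each digit's sign yields -3 without any borrow propagation.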
In this paper, two local search algorithms, a genetic algorithm and particle swarm optimization, are used to schedule a number of products (n jobs) on a single machine to minimize a multi-objective function comprising total completion time, total tardiness, total earliness, and total late work. A branch-and-bound (BAB) method is used to compare the results for n jobs ranging from 5 to 18. The results show that the two algorithms find optimal and near-optimal solutions in reasonable time.
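The four objectives named above can be evaluated for any job sequence with a short sketch. The function and parameter names (`p` for processing times, `d` for due dates) are illustrative assumptions; the sum of the four terms is one plausible way to aggregate them, not necessarily the paper's exact weighting.

```python
def schedule_cost(p, d, seq):
    """Sum of four single-machine objectives for one job sequence:
    total completion time, total tardiness, total earliness, and total
    late work. p: processing times, d: due dates (hypothetical names)."""
    t = 0
    total_c = total_tard = total_early = total_late = 0
    for j in seq:
        t += p[j]                                    # completion time C_j
        total_c += t
        total_tard += max(0, t - d[j])               # tardiness T_j
        total_early += max(0, d[j] - t)              # earliness E_j
        total_late += min(p[j], max(0, t - d[j]))    # late work V_j
    return total_c + total_tard + total_early + total_late
```

For small n, enumerating all sequences with this cost function gives the exact optimum that the BAB method would confirm; the GA and PSO search the same objective heuristically for larger n.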
This research summarizes the construction of a mathematical model for the integer linear fractional programming (I.L.F.P.) problem and finds its best solution, one that maximizes the company's revenue by using the largest possible number of production units and maximizing the denominator objective, which represents the ratio of profits to costs, thereby maximizing the company's total profit at the lowest cost. The Dinkelbach algorithm and the complementary method are applied to data from the Light Industries Company for 2013, and the results are compared with those of goal programming methods.
It is clear that the final results of resolution and Dinkelbac
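Dinkelbach's parametric method for maximizing a ratio f(x)/g(x) can be sketched over a small finite feasible set, where a brute-force argmax stands in for solving the parametric integer subproblem max f(x) - λ·g(x); the function names and the toy instance are illustrative assumptions.

```python
def dinkelbach(feasible, f, g, tol=1e-9):
    """Maximize f(x)/g(x) over a finite feasible set by Dinkelbach's
    parametric method. The inner argmax stands in for the integer
    subproblem solve; g(x) must be positive on the feasible set."""
    lam = 0.0
    while True:
        # Solve the parametric subproblem max_x f(x) - lam * g(x).
        x = max(feasible, key=lambda v: f(v) - lam * g(v))
        val = f(x) - lam * g(x)
        if abs(val) < tol:       # optimal ratio reached
            return x, lam
        lam = f(x) / g(x)        # update the ratio parameter
```

On a toy instance maximizing (2x + 1)/(x + 1) over x in {1, 2, 3, 4}, the iteration converges in two subproblem solves to x = 4 with ratio 1.8.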
Load balancing in computer networks is one of the subjects that has attracted the most research attention in the last decade. Load balancing reduces processing time and memory usage, the two main concerns of network companies today and the two factors that determine whether an approach is worth applying. There are two kinds of load balancing: distributing jobs among servers before processing starts, with each job staying on its server until the process ends, is called static load balancing; moving jobs during processing is called dynamic load balancing. In this research, two algorithms are designed and implemented: the History Usage (HU) algorithm, which statically balances the load of a Loaded
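The static variant defined above can be illustrated with a minimal greedy sketch: each job is assigned, before any processing starts, to the currently least-loaded server and never migrates. This is a generic illustration of static load balancing, not the paper's HU algorithm.

```python
def static_assign(jobs, n_servers):
    """Static load balancing sketch: assign each job (given by its cost)
    to the currently least-loaded server; jobs never migrate afterwards.
    Illustrative greedy rule, not the paper's History Usage algorithm."""
    loads = [0] * n_servers
    assignment = []
    for cost in jobs:
        s = loads.index(min(loads))  # pick the least-loaded server
        loads[s] += cost
        assignment.append(s)
    return assignment, loads
```

A dynamic balancer would instead re-examine `loads` during execution and migrate running jobs, which is what distinguishes the two kinds described in the abstract.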
The discussion dealt with the independent factors, critical success factors and the risk management process, and the dependent factor, generic competitive strategies. The research began from an intellectual dilemma: its problem crystallized in light of organizations' need for a philosophy and a deeper, more comprehensive vision of the concept of risk management, its assessment, and its administration, in order to maximize generic competitive strategies. On this basis, the research problem was formulated as the gap between knowledge-based intellectual propositions and the interpretation of the relationship between the critical success factors and the risk
Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.
In this research, the focus was on grouping and analyzing these data: cluster analysis plays an important role in identifying and grouping co-expressed profiles over time, which are then fitted with a nonparametric smoothing cubic B-spline model. This model provides continuous first and second derivatives, yielding a smoother curve with fewer abrupt changes in slope; it is also more flexible and can capture more complex patterns and fluctuations in the data.
The longitudinal balanced data profile was compiled into subgroup
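The cubic-spline smoothing step can be sketched for one hypothetical longitudinal profile. SciPy's `UnivariateSpline` with `k=3` is used here as a stand-in for the paper's smoothing cubic B-spline; the simulated sine-plus-noise profile and the smoothing parameter `s` are illustrative assumptions. Because the fit is a cubic spline, its first and second derivatives are continuous, as the abstract notes.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# One hypothetical longitudinal profile: noisy observations over time.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 40)
y = np.sin(2 * np.pi * t) + rng.normal(0.0, 0.1, t.size)

# Cubic smoothing spline: C^2-continuous, so the first and second
# derivatives exist everywhere and the curve has no abrupt slope changes.
spl = UnivariateSpline(t, y, k=3, s=0.4)
curve = spl(t)
slope = spl.derivative(1)(t)      # continuous first derivative
curvature = spl.derivative(2)(t)  # continuous second derivative
```

In the paper's setting, each cluster of co-expressed profiles would be smoothed this way before comparing the fitted curves.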
In this research, kernel (nonparametric density) estimators were used to estimate the two-response logistic regression, comparing the Nadaraya-Watson method with the local scoring algorithm. The optimal smoothing parameter λ was estimated by cross-validation and generalized cross-validation; the optimal bandwidth λ has a clear effect on the estimation process and plays a key role in smoothing the curve so that it approaches the real curve. The goal of using the kernel estimator is to modify the observations so as to obtain estimators with properties close to those of the real parameters. Based on medical data for patients with chro
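The Nadaraya-Watson estimator named above is a kernel-weighted local average; a minimal sketch with a Gaussian kernel follows. The function and parameter names are illustrative, and the bandwidth here plays the role of the smoothing parameter λ that the abstract says is chosen by cross-validation.

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_query, bandwidth):
    """Nadaraya-Watson kernel regression with a Gaussian kernel: each
    prediction is a weighted average of y_train, with weights decaying
    with distance from the query point. bandwidth plays the role of the
    smoothing parameter lambda."""
    u = (x_query[:, None] - x_train[None, :]) / bandwidth
    w = np.exp(-0.5 * u ** 2)           # Gaussian kernel weights
    return (w @ y_train) / w.sum(axis=1)
```

A small bandwidth follows the data closely (low bias, high variance); a large one over-smooths, which is why cross-validation over λ matters.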