Registration techniques are still considered challenging tasks for remote sensing users, especially after the enormous increase in the volume of remotely sensed data acquired by an ever-growing number of earth observation sensors. This surge mandates the development of accurate and robust registration procedures that can handle data with varying geometric and radiometric properties. This paper aims to extend the traditional registration scenario to reduce discrepancies between registered datasets in two-dimensional (2D) space for remote sensing images. This is achieved by designing a computer program written in the Visual Basic language, following two main stages. The first stage is a traditional registration process: a set of control point pairs is defined by manual selection, then the parameters of a global affine transformation model are computed to match them and resample the images. The second stage refines the matching process by determining the shift in control point (CP) locations based on a radiometric similarity measure; the shift-map technique is then applied to adjust the process using a 2nd-order polynomial transformation function. This function was chosen after statistical analyses comparing the common transformation functions (similarity, affine, projective, and 2nd-order polynomial). The results showed that the developed approach reduced the root mean square error (RMSE) of the registration process and decreased the discrepancies between registered datasets by 60%, 57%, and 48%, respectively, for the three tested datasets.
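As a minimal sketch of the first stage only (the manual CP selection, the refinement stage, and the Visual Basic implementation are not reproduced here), the following Python snippet estimates the six parameters of a global affine model from hypothetical control point pairs by least squares and reports the resulting RMSE; the coordinate arrays are illustrative, not data from the paper:

```python
import numpy as np

# Hypothetical control point pairs (x, y) in source and destination images.
src = np.array([[10.0, 12.0], [85.0, 20.0], [40.0, 90.0], [70.0, 65.0]])
dst = np.array([[11.2, 13.1], [86.5, 21.4], [41.1, 91.8], [71.3, 66.9]])

# Global affine model: x' = a0 + a1*x + a2*y,  y' = b0 + b1*x + b2*y.
# Build the design matrix and solve each coordinate equation by least squares.
A = np.column_stack([np.ones(len(src)), src[:, 0], src[:, 1]])
ax, *_ = np.linalg.lstsq(A, dst[:, 0], rcond=None)
ay, *_ = np.linalg.lstsq(A, dst[:, 1], rcond=None)

# Residuals of the fitted model at the control points.
pred = np.column_stack([A @ ax, A @ ay])
rmse = np.sqrt(np.mean(np.sum((pred - dst) ** 2, axis=1)))
print("affine parameters (x):", ax)
print("affine parameters (y):", ay)
print("RMSE:", rmse)
```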
The vast majority of EC applications are web-based, deployed in a three-tier client-server environment, and the data within such applications often reside in several heterogeneous data sources. Building a single application that can access each data source can be challenging; this paper concerns the development of a software program that runs transparently against a heterogeneous environment for an EC application.
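As a hedged sketch of the idea of transparent access (the paper's actual design and schema are not given in the abstract), a thin facade can route each query to whichever source owns the requested table, so client code never sees the heterogeneity; the class name, table names, and the use of in-memory SQLite databases as stand-ins for different DBMSs are all illustrative assumptions:

```python
import sqlite3

class DataSourceFacade:
    """Routes queries to whichever back end owns the requested table."""

    def __init__(self, sources):
        # Map table name -> connection. In a real deployment these could be
        # different DBMSs; in-memory SQLite databases stand in for them here.
        self._sources = sources

    def query(self, table, sql, params=()):
        conn = self._sources[table]  # pick the source that owns this table
        return conn.execute(sql, params).fetchall()

# Two separate in-memory databases emulate heterogeneous sources.
orders_db = sqlite3.connect(":memory:")
orders_db.execute("CREATE TABLE orders (id INTEGER, total REAL)")
orders_db.execute("INSERT INTO orders VALUES (1, 150.0), (2, 80.0)")

customers_db = sqlite3.connect(":memory:")
customers_db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")

facade = DataSourceFacade({"orders": orders_db, "customers": customers_db})
# Client code never sees which physical source answers the query.
print(facade.query("orders", "SELECT * FROM orders WHERE total > ?", (100,)))
```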
Financial fraud remains an ever-increasing problem in the financial industry, with numerous consequences. The detection of fraudulent online transactions via credit cards has always been done using data mining (DM) techniques. However, fraud detection on credit card transactions (CCTs), which is itself a DM problem, has become a serious challenge for two major reasons: (i) the frequent changes in the patterns of normal and fraudulent online activities, and (ii) the skewed nature of credit card fraud datasets. The detection of fraudulent CCTs depends mainly on the data sampling approach. This paper proposes a combined SVM-MPSO-MMPSO technique for credit card fraud detection. The dataset of CCTs which co…
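The abstract does not detail the MPSO/MMPSO stages, so the sketch below only illustrates the baseline idea of fitting an SVM to a skewed fraud dataset, with class weighting as a stand-in for the sampling optimisation; the synthetic data and the scikit-learn usage are assumptions, not the paper's experiment:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)

# Synthetic, heavily skewed transaction data: roughly 1% fraud (label 1).
X = rng.normal(size=(5000, 8))
y = (rng.random(5000) < 0.01).astype(int)
X[y == 1] += 1.5  # shift fraud cases so they are learnable

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# 'balanced' class weights counteract the skew, standing in for the
# sampling step that MPSO/MMPSO would optimise in the paper's approach.
clf = SVC(kernel="rbf", class_weight="balanced").fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), digits=3))
```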
Nowadays, the development of internet communication, which provides many facilities to users, has in turn led to growing unauthorized access. As a result, an intrusion detection system (IDS) becomes necessary to provide a high level of security for the huge amount of information transferred over the network and to protect it from threats. One of the main challenges for an IDS is the high dimensionality of the feature space and how to select the features relevant to distinguishing normal network traffic from attack traffic. In this paper, the multi-objective evolutionary algorithm based on decomposition (MOEA/D) and MOEA/D with the injection of a proposed local search operator are adopted to solve the multi-objective optimization (MOO) problem, followed by a Naïve Bayes…
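Since the abstract is cut off, the sketch below only illustrates the general pipeline it describes, evolutionary feature selection followed by Naïve Bayes classification, using a crude random bit-mask search in place of MOEA/D; the synthetic data and the single-objective scoring are simplifying assumptions:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Synthetic "network traffic": 20 features, only the first 5 informative.
X = rng.normal(size=(600, 20))
y = (X[:, :5].sum(axis=1) > 0).astype(int)

def score(mask):
    """Accuracy of Naive Bayes on the selected feature subset."""
    if not mask.any():
        return 0.0
    return cross_val_score(GaussianNB(), X[:, mask], y, cv=3).mean()

# Random search over feature masks, standing in for MOEA/D, which would
# jointly optimise classification quality and subset size.
best_mask, best = None, -1.0
for _ in range(200):
    mask = rng.random(20) < 0.3
    s = score(mask)
    if s > best:
        best_mask, best = mask, s

print("selected features:", np.flatnonzero(best_mask), "accuracy:", round(best, 3))
```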
In this paper, the Hermite interpolation method is used for solving linear and non-linear second-order singular multi-point boundary value problems with nonlocal conditions. The approximate solution is found in the form of a rapidly convergent polynomial. We discuss the behavior of the solution in the neighborhood of the singular point, where the method appears to perform satisfactorily for singular problems. Examples are given to demonstrate the applicability and efficiency of the method.
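For readers unfamiliar with the method (the abstract does not state its equations), classical Hermite interpolation determines the polynomial matching both values and first derivatives at the nodes:

```latex
% Hermite interpolation on nodes x_0, ..., x_n: the unique polynomial p
% of degree at most 2n+1 satisfying the value and derivative conditions
\begin{aligned}
p(x_i)  &= f(x_i),  \qquad i = 0, \dots, n, \\
p'(x_i) &= f'(x_i), \qquad i = 0, \dots, n.
\end{aligned}
```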
The theory of Multi-Criteria Decision Making (MCDM) was introduced in the second half of the twentieth century and aids the decision maker in resolving problems in which interacting criteria are involved and need to be evaluated. In this paper, we apply MCDM to the problem of selecting the best drug for rheumatoid arthritis disease. We then solve the MCDM problem via the λ-Sugeno measure and the Choquet integral to provide realistic values in the process of selecting the most appropriate drug. The approach confirms the proper interpretation of multi-criteria decision making in drug ranking for rheumatoid arthritis.
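For background (this formula is standard in the literature, not quoted from the abstract), the discrete Choquet integral of criterion scores with respect to a fuzzy measure $\mu$, such as a Sugeno $\lambda$-measure, is commonly written as:

```latex
% Scores reordered so that f(x_(1)) <= ... <= f(x_(n)), with f(x_(0)) := 0
C_\mu(f) = \sum_{i=1}^{n} \left( f(x_{(i)}) - f(x_{(i-1)}) \right)
           \mu\!\left( \{ x_{(i)}, \dots, x_{(n)} \} \right)
```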
In the real world, almost all networks evolve over time. For example, in networks of friendships and acquaintances, people continually create and delete friendship connections, thereby adding and removing friends, and some people join new social networks or leave their networks, changing the nodes in the network. Recently, tracking communities that undergo topological shifts has drawn significant attention, and many algorithms have been proposed to model the problem. In general, evolutionary clustering can be defined as clustering data over time while considering two concepts: snapshot quality and temporal smoothness. Snapshot quality means that the clusters should be as precise as possible during the current time step…
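A common formalisation from the evolutionary clustering literature (not quoted from the abstract) combines the two concepts into a single objective:

```latex
% C_t: clustering at time t; D_t: data snapshot at time t.
% alpha trades snapshot quality against temporal smoothness.
\mathrm{Cost}(C_t) = \alpha \cdot \mathrm{SC}(C_t, D_t)
                   + (1 - \alpha) \cdot \mathrm{TC}(C_t, C_{t-1}),
\qquad 0 \le \alpha \le 1
```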
Background: The incorporation of chemical additives has long been a technique used to improve the properties of gypsum products. The purpose of this work was to study the effects of adding a combination of gum Arabic and calcium hydroxide, in different proportions, to a type III dental stone and a type IV improved die stone. The effect on the water/powder ratio and surface hardness was determined. Material and method: Both the stone and the die stone were blended with two proportions of additives, so that each material was mixed twice but with different proportions of gum Arabic (0.1% and 0.2%) and calcium hydroxide (0.5% and 0.3%). Data for hardness were subjected to two-way analysis of variance. Results: The results revealed that the chemical additives…
In this research we study the partial pooling model, one of the multi-level models that is among the most important and most widely applied models in data analysis. This model is characterized by the fact that the treatments take a hierarchical or structural form. Full maximum likelihood (FML) was used to estimate the parameters (fixed and random) of the partial pooling models and to compare the preference among these models. The application was on suspended dust data in Iraq covering four and a half years; eight stations were selected randomly among the stations in Iraq. We use Akaike's Information Criterion…
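As a hedged sketch of the model class (the abstract does not give its equations), a random-intercept partial pooling model for dust measurements at each station, together with the Akaike criterion used for model comparison, can be written as:

```latex
% Random-intercept partial pooling model: measurement i at station j
y_{ij} = \gamma_{00} + u_j + \varepsilon_{ij},
\qquad u_j \sim N(0, \tau^2), \quad \varepsilon_{ij} \sim N(0, \sigma^2)

% Akaike's Information Criterion, with k estimated parameters and
% maximized full likelihood \hat{L}
\mathrm{AIC} = 2k - 2 \ln \hat{L}
```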