New algorithms for auto-focus image fusion were proposed to enhance the quality of the fused image. The first algorithm combines two images based on the standard deviation. The second algorithm uses the contrast at edge points and a correlation method as the criteria for the quality of the resulting image. It considers three blocks of different sizes in a homogeneous region and moves each block 10 pixels within that region, examining the statistical properties of the block and deciding the next step automatically. The resulting combined image has better contrast because of the edge points contributed by the two source images under the suggested algorithms. The enhancement in edge regions was measured and reaches up to double the contrast. The suggested method is compared with several existing methods.
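To illustrate the idea behind the first algorithm, the following Python sketch fuses two registered auto-focus images block by block, keeping the block with the higher standard deviation as a proxy for sharper focus; the block size and the winner-takes-all rule are assumptions for illustration, not the paper's exact procedure.

    import numpy as np

    def std_fusion(img_a, img_b, block=16):
        # Hypothetical block-wise fusion: for each block, keep the source
        # whose pixels have the higher standard deviation (sharper focus).
        fused = np.empty_like(img_a)
        h, w = img_a.shape
        for y in range(0, h, block):
            for x in range(0, w, block):
                a = img_a[y:y + block, x:x + block]
                b = img_b[y:y + block, x:x + block]
                fused[y:y + block, x:x + block] = a if a.std() >= b.std() else b
        return fused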
Cloud computing is a pay-as-you-go model that provides users with on-demand access to services or computing resources. Maximizing the service provider's profit while meeting the Quality of Service (QoS) requirements of users is a challenging problem. Therefore, this paper proposes an admission control heuristic (ACH) approach that accepts or rejects requests based on the budget, deadline, and penalty cost given by the user. A service level agreement (SLA) is then created for each accepted request. The proposed work uses Particle Swarm Optimization (PSO) and the Salp Swarm Algorithm (SSA) to schedule the accepted requests under budget and deadline constraints. Performances of PSO and SSA with and without …
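A minimal sketch of the kind of accept/reject test an admission control heuristic performs is shown below; the request fields and the cost and finish-time estimates are hypothetical, and the paper's actual ACH also weighs the penalty cost in a way not reproduced here.

    def admit(request, est_cost, est_finish):
        # Hypothetical admission test on the user-supplied constraints.
        if est_cost > request["budget"]:
            return False  # would exceed the user's budget
        if est_finish > request["deadline"]:
            return False  # would violate the deadline
        return True       # accept the request and create an SLA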
Evolutionary algorithms (EAs), as global search methods, have proved more robust than their local-heuristic counterparts for detecting protein complexes in protein-protein interaction (PPI) networks. Typically, the robustness of these EAs comes from their components and parameters; the components are solution representation, selection, crossover, and mutation. Unfortunately, almost all EA-based complex detection methods suggested in the literature were designed with only canonical or traditional components, and the topological structure of the protein network is the main information used in the design of almost all such components. The main contribution of this paper is to formulate a more robust EA with …
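For orientation, a canonical EA loop over the components the abstract names (representation, selection, crossover, mutation) can be sketched as follows; every operator and the fitness function are placeholders, not the paper's design.

    import random

    def evolve(pop, fitness, crossover, mutate,
               generations=100, cx_rate=0.9, mut_rate=0.1):
        # Generic canonical EA skeleton with tournament selection.
        def select():
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        for _ in range(generations):
            offspring = []
            while len(offspring) < len(pop):
                child = crossover(select(), select()) if random.random() < cx_rate else select()
                if random.random() < mut_rate:
                    child = mutate(child)
                offspring.append(child)
            pop = offspring
        return max(pop, key=fitness)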
In this paper, we propose using the extreme value distribution as the rate of occurrence of the non-homogeneous Poisson process, in order to improve the rate of occurrence of the non-homogeneous process; the result has been called the extreme value process. To estimate the parameters of this process, the maximum likelihood method, the method of moments, and a smart method represented by the Artificial Bee Colony (ABC) algorithm are proposed, in order to reach the estimator that best represents the data. The results of the three methods are compared through a simulation of the model, and it is concluded that the ABC estimator is better than the estimators of the maximum likelihood method and the method of moments.
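Assuming the type-I (Gumbel) extreme value density is the form used for the rate, the intensity function of such a process could be written as in the sketch below; the parameterization is an assumption for illustration.

    import numpy as np

    def gumbel_intensity(t, mu, beta):
        # Gumbel (type-I extreme value) density used as the NHPP rate
        # lambda(t); mu is location, beta > 0 is scale.
        z = (t - mu) / beta
        return np.exp(-(z + np.exp(-z))) / beta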
Modern systems based on hash functions are more suitable than conventional systems; however, the complicated algorithms for generating invertible functions are highly time-consuming. With the use of genetic algorithms (GAs), the key strength is enhanced, ultimately making the entire algorithm sufficient. Initially, the key is generated from the solution of the n-queens problem, which is solved by the genetic algorithm using a random number generator and the GA operations. Finally, the data is encrypted using the Modified Reverse Encryption Algorithm (MREA). It was noticed that the …
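A GA solving n-queens needs a fitness function; a common choice, shown here as a hypothetical sketch rather than the paper's definition, encodes a board as a permutation (the queen in column i sits in row perm[i]), so only diagonal attacks remain to be counted.

    def nqueens_fitness(perm):
        # Count attacking pairs on the diagonals; 0 clashes means a
        # valid n-queens solution usable as key material.
        clashes = sum(
            1
            for i in range(len(perm))
            for j in range(i + 1, len(perm))
            if abs(perm[i] - perm[j]) == abs(i - j)
        )
        return -clashes  # higher (closer to 0) is fitter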
People's ability to quickly convey their thoughts or opinions on various services or items has improved as Web 2.0 has evolved, making it possible to study the public perceptions expressed in reviews. Aspect-based sentiment analysis (ABSA) receives a set of texts (e.g., product reviews or online reviews) and identifies the opinion target (aspect) within each review. Contemporary aspect-based sentiment analysis systems, such as aspect categorization, rely predominantly on lexicon-based or manually labelled seeds incorporated into topic models, and use either handcrafted rules or pre-labelled clues to perform implicit aspect detection. These constraints are restricted to a particular domain or language, which is …
The current issues in spam email detection systems are directly related to the low accuracy of spam email classification and the high dimensionality of feature selection. In machine learning (ML), feature selection (FS) as a global optimization strategy reduces data redundancy and produces a collection of precise and acceptable outcomes. A black hole algorithm-based FS algorithm is suggested in this paper for reducing the dimensionality of features and improving the accuracy of spam email classification. Each star's features are represented in binary form, with the continuous values transformed to binary using a sigmoid function. The proposed Binary Black Hole Algorithm (BBH) searches the feature space for the best feature subsets, …
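The sigmoid binarization the abstract mentions is commonly implemented as a transfer function over each star's continuous position, as in the sketch below; the stochastic thresholding shown is a standard convention and an assumption here, not necessarily the paper's exact rule.

    import numpy as np

    rng = np.random.default_rng(0)

    def binarize(position):
        # Map each continuous coordinate to a selection probability via
        # the sigmoid, then draw the binary feature mask (1 = selected).
        prob = 1.0 / (1.0 + np.exp(-position))
        return (rng.random(position.shape) < prob).astype(int)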
In the last few years, the Internet of Things (IoT) has gained remarkable attention in both the academic and industrial worlds. The main goal of the IoT is to describe everyday objects with different capabilities in an interconnected fashion over the Internet, so that they can share resources and carry out their assigned tasks. Most IoT objects are heterogeneous in terms of the amount of energy, processing ability, memory storage, etc. However, one of the most important challenges facing IoT networks is energy-efficient task allocation: an efficient task allocation protocol in an IoT network should ensure a fair and efficient distribution of resources so that all objects can collaborate dynamically with limited energy. The canonical de…
... Show MoreThis work presents a comparison between the Convolutional Encoding CE, Parallel Turbo code and Low density Parity Check (LDPC) coding schemes with a MultiUser Single Output MUSO Multi-Carrier Code Division Multiple Access (MC-CDMA) system over multipath fading channels. The decoding technique used in the simulation was iterative decoding since it gives maximum efficiency at higher iterations. Modulation schemes used is Quadrature Amplitude Modulation QAM. An 8 pilot carrier were
used to compensate channel effect with Least Square Estimation method. The channel model used is Long Term Evolution (LTE) channel with Technical Specification TS 25.101v2.10 and 5 MHz bandwidth bandwidth including the channels of indoor to outdoor/ pedestrian
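Least Square channel estimation at the pilot carriers amounts to a per-tone division of the received pilots by the transmitted ones, as in the sketch below; the linear interpolation to the data carriers is one standard choice and is assumed here, not taken from the paper.

    import numpy as np

    def ls_channel_estimate(rx_pilots, tx_pilots, pilot_idx, n_carriers):
        # LS estimate at pilot tones: H_hat = Y / X, then linear
        # interpolation of the complex gains across all subcarriers.
        h_pilot = rx_pilots / tx_pilots
        carriers = np.arange(n_carriers)
        return (np.interp(carriers, pilot_idx, h_pilot.real)
                + 1j * np.interp(carriers, pilot_idx, h_pilot.imag))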
Nonlinear regression models are important tools for solving optimization problems, and traditional techniques often fail to reach satisfactory solutions to the parameter estimation problem. Hence, in this paper, the Bat algorithm is used to estimate the parameters of nonlinear regression models. A simulation study investigates the performance of the proposed algorithm against the maximum likelihood (MLE) and least squares (LS) methods. Based on the mean squared error, the results show that the Bat algorithm provides accurate estimates and is more satisfactory for the parameter estimation of nonlinear regression models than the MLE and LS methods.
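The objective a metaheuristic such as the Bat algorithm would minimize here is the mean squared error of a candidate parameter vector, sketched below; the exponential-decay model is a hypothetical example, not the paper's model.

    import numpy as np

    def mse(params, model, x, y):
        # Fitness for a candidate parameter vector: mean squared error
        # between the observations and the nonlinear model's predictions.
        return np.mean((y - model(x, params)) ** 2)

    # e.g. an assumed model y = a * exp(-b * x) with params = (a, b)
    decay = lambda x, p: p[0] * np.exp(-p[1] * x)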
Association rule mining (ARM) is a fundamental and widely used data mining technique for obtaining useful information about data. Traditional ARM algorithms degrade computational efficiency by mining too many association rules that are not appropriate for a given user. Recent ARM research investigates metaheuristic algorithms that look for only a subset of high-quality rules. In this paper, a modified discrete cuckoo search algorithm for association rule mining (DCS-ARM) is proposed for this purpose. The effectiveness of the algorithm is tested against a set of well-known transactional databases, and the results indicate that the proposed algorithm outperforms the existing metaheuristic methods.
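The rule quality such metaheuristics typically optimize combines support and confidence, as in this sketch; the exact fitness used by DCS-ARM is not specified here, so this is a generic illustration.

    def rule_quality(transactions, antecedent, consequent):
        # Support and confidence of a candidate rule A -> C over a list
        # of transactions, each given as a set of items.
        n = len(transactions)
        both = sum(1 for t in transactions if antecedent <= t and consequent <= t)
        ante = sum(1 for t in transactions if antecedent <= t)
        support = both / n
        confidence = both / ante if ante else 0.0
        return support, confidence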