For several applications, it is very important to have an edge detection technique that matches human visual contour perception and is less sensitive to noise. The edge detection algorithm described in this paper is based on the results obtained by the Maximum a Posteriori (MAP) and Maximum Entropy (ME) deblurring algorithms. The technique makes a trade-off between sharpening and smoothing the noisy image. One advantage of the described algorithm is that it is less sensitive to noise than the Marr and Geuen techniques, which are considered among the best edge detection algorithms in terms of matching human visual contour perception.
This work is divided into two parts. The first part studies the electronic structure and vibrational properties of Iobenguane, a material used in CT scan imaging. Iobenguane, or MIBG, is an aralkylguanidine analog of the adrenergic neurotransmitter norepinephrine and a radiopharmaceutical. It acts as a blocking agent for adrenergic neurons. When radiolabeled, it can be used in nuclear medicine diagnostic techniques as well as in neuroendocrine antineoplastic treatments. The aim of this work is to provide general information about Iobenguane that can be used to obtain results for diagnosing disease. The second part studies image processing techniques: the CT scan image is transformed to the frequency domain using the LWT. Two methods of contrast …
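As an illustration of the wavelet-decomposition step only (the abstract is truncated before the contrast methods are named), the sketch below uses PyWavelets' standard 2-D DWT as a stand-in for the LWT; the image is a random placeholder, not real CT data.

```python
import numpy as np
import pywt  # PyWavelets; its standard 2-D DWT stands in here for the LWT

ct_slice = np.random.rand(256, 256)              # placeholder for a real CT slice
cA, (cH, cV, cD) = pywt.dwt2(ct_slice, 'haar')   # approximation + detail subbands
print(cA.shape, cH.shape)                        # each subband is half-size: (128, 128)
```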
Localization is an essential requirement in wireless sensor networks (WSNs) and relies on several types of measurements. This paper focuses on positioning in 3-D space using time-of-arrival (TOA) based distance measurements between the target node and a number of anchor nodes. Central localization is assumed, and either RF, acoustic, or UWB signals are used for the distance measurements. The problem is treated using iterative gradient descent (GD), and an iterative GD-based algorithm for localizing moving sensors in a WSN is proposed. To localize a node in 3-D space, at least four anchors are needed; in this work, however, five anchors are used for better accuracy. In GD localization of a moving sensor, the algorithm …
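A minimal sketch of the underlying idea, gradient descent on the sum of squared TOA range residuals with five anchors, is given below. The step size, iteration count, and function names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def gd_localize(anchors, ranges, x0, step=0.05, iters=500):
    """Estimate a 3-D position from TOA-derived ranges to known anchors by
    gradient descent on the sum of squared range residuals (illustrative sketch)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        diff = x - anchors                      # (N, 3) vectors from anchors to estimate
        dist = np.linalg.norm(diff, axis=1)     # current distance to each anchor
        resid = dist - ranges                   # range residuals
        grad = 2.0 * (resid / dist) @ diff      # gradient of the squared-residual sum
        x -= step * grad
    return x

# Five anchors (one more than the 3-D minimum of four) for better accuracy
anchors = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10], [10, 10, 10]], float)
target = np.array([3.0, 4.0, 5.0])
ranges = np.linalg.norm(anchors - target, axis=1)   # noiseless ranges for the demo
print(gd_localize(anchors, ranges, x0=[5.0, 5.0, 5.0]))
```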
A mixture model is used to model data that come from more than one component. In recent years, it has become an effective tool for drawing inferences about the complex data we might come across in real life. Moreover, it can serve as a powerful confirmatory tool for classifying observations based on the similarities among them. In this paper, several mixture regression-based methods were applied under the assumption that the data come from a finite number of components. These methods were compared according to their results in estimating the component parameters, and observation membership was inferred and assessed for each method. The results showed that the flexible mixture model outperformed the others.
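The paper compares several mixture-regression methods; as a generic point of reference only, the sketch below fits a K-component mixture of linear regressions with a plain EM loop. The component count, initialization, and variable names are assumptions, not the paper's models.

```python
import numpy as np

def em_mixture_regression(X, y, K=2, iters=100, seed=0):
    """EM for a K-component mixture of linear regressions (generic sketch)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    Xb = np.hstack([np.ones((n, 1)), X])            # add intercept column
    beta = rng.normal(size=(K, p + 1))              # per-component coefficients
    sigma = np.ones(K)                              # per-component noise std
    pi = np.full(K, 1.0 / K)                        # mixing proportions
    for _ in range(iters):
        # E-step: responsibility of each component for each observation
        dens = np.stack([
            pi[k] * np.exp(-0.5 * ((y - Xb @ beta[k]) / sigma[k]) ** 2) / sigma[k]
            for k in range(K)], axis=1)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weighted least squares and updated weights per component
        for k in range(K):
            w = resp[:, k]
            W = Xb * w[:, None]
            beta[k] = np.linalg.solve(Xb.T @ W, W.T @ y)
            sigma[k] = np.sqrt((w * (y - Xb @ beta[k]) ** 2).sum() / w.sum())
            pi[k] = w.mean()
    return beta, sigma, pi, resp
```

The returned responsibilities give the inferred observation membership mentioned in the abstract.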
The combination of wavelet theory and neural networks has led to the development of wavelet networks. Wavelet networks are feed-forward neural networks that use wavelets as activation functions. Wavelet networks have been used in classification and identification problems with some success.
In this work we propose a fuzzy wavenet network (FWN), which learns by the common back-propagation algorithm to classify medical images. The library of medical images is analyzed first. Second, two experimental rule tables provide an excellent opportunity to test the ability of the fuzzy wavenet network, given the high level of information variability often experienced with this type of image.
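To make the "wavelets as activation functions" idea concrete, here is a minimal feed-forward wavelet network using the Mexican-hat (Ricker) wavelet, trained with a simple gradient step on the output weights. It is a generic sketch, not the proposed FWN, which adds fuzzy rules and full back-propagation through all parameters.

```python
import numpy as np

def ricker(t):
    """Mexican-hat (Ricker) mother wavelet used as the hidden activation."""
    return (1.0 - t ** 2) * np.exp(-0.5 * t ** 2)

class WaveletNet:
    """Tiny feed-forward wavelet network: y = sum_j w_j * psi((x.v_j - b_j) / a_j)."""
    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        self.V = rng.normal(size=(n_in, n_hidden))   # input weights
        self.b = rng.normal(size=n_hidden)           # translations
        self.a = np.ones(n_hidden)                   # dilations
        self.w = rng.normal(size=n_hidden)           # output weights

    def forward(self, X):
        self.t = (X @ self.V - self.b) / self.a      # normalized wavelet arguments
        self.h = ricker(self.t)                      # hidden wavelet responses
        return self.h @ self.w

    def train_step(self, X, y, lr=1e-2):
        """One gradient step on squared error, updating output weights only
        (a common simplification; full training also adapts V, b, and a)."""
        err = self.forward(X) - y
        self.w -= lr * self.h.T @ err / len(y)
        return float(np.mean(err ** 2))
```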
Churning of employees from organizations is a serious problem. Employee turnover, or churn, within an organization needs to be addressed since it has a negative impact on the organization. Manual detection of employee churn is quite difficult, so machine learning (ML) algorithms have been widely used for employee churn detection as well as for categorizing employees according to turnover. To date, only one study has looked into the categorization of employees using machine learning. A novel multi-criteria decision-making (MCDM) approach coupled with the DE-PARETO principle is proposed to categorize employees; it is referred to as the SNEC scheme. An AHP-TOPSIS DE-PARETO PRINCIPLE model (AHPTOPDE) has been designed that uses a two-stage MCDM …
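The TOPSIS step of such an AHP-TOPSIS scheme can be sketched as below: AHP-derived criterion weights are assumed to be given, and the employee criteria shown are hypothetical examples rather than the paper's feature set.

```python
import numpy as np

def topsis(decision_matrix, weights, benefit):
    """Rank alternatives (e.g., employees) by closeness to the ideal solution.
    decision_matrix: alternatives x criteria; weights: e.g., from AHP;
    benefit[j] is True if criterion j is better when larger. Generic sketch."""
    D = np.asarray(decision_matrix, float)
    norm = D / np.linalg.norm(D, axis=0)             # vector-normalize each criterion
    V = norm * np.asarray(weights, float)            # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)                   # closeness coefficient in [0, 1]

# Hypothetical employee criteria: performance, tenure, absenteeism (cost criterion)
scores = topsis([[0.9, 5, 2], [0.7, 3, 8], [0.8, 10, 1]],
                weights=[0.5, 0.3, 0.2], benefit=[True, True, False])
print(scores)
```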
Metaheuristics in the swarm intelligence (SI) class have proven to be efficient and have become popular methods for solving different optimization problems. Based on the usage of memory, metaheuristics can be classified into algorithms with memory and algorithms without memory (memory-less). The absence of memory in some metaheuristics leads to the loss of information gained in previous iterations; such metaheuristics tend to drift away from promising areas of the solution search space, which leads to non-optimal solutions. This paper aims to review memory usage and its effect on the performance of the main SI-based metaheuristics. The investigation covers SI metaheuristics, memory usage versus memory-less metaheuristics, memory characteristics …
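As a concrete example of memory in an SI metaheuristic, particle swarm optimization keeps each particle's personal-best position and the swarm's global-best position, which carry information across iterations. The sketch below is generic and not tied to any specific algorithm reviewed in the paper.

```python
import numpy as np

def pso(objective, dim, n_particles=30, iters=200, seed=0):
    """Minimal particle swarm optimizer. The personal-best (pbest) and global-best
    (gbest) positions are the algorithm's explicit memory of past iterations."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))       # positions
    v = np.zeros_like(x)                             # velocities
    pbest = x.copy()                                 # memory: best position per particle
    pbest_val = np.apply_along_axis(objective, 1, x)
    gbest = pbest[pbest_val.argmin()].copy()         # memory: best position of the swarm
    w, c1, c2 = 0.7, 1.5, 1.5
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.apply_along_axis(objective, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

print(pso(lambda z: np.sum(z ** 2), dim=3))          # sphere function demo
```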
This paper presents a hybrid approach for solving the null-values problem; it hybridizes rough set theory with an intelligent swarm algorithm. The proposed approach is a supervised learning model. A large set of complete data, called the learning data, is used to find the decision rule sets that are then used to solve the incomplete-data problem. The intelligent swarm algorithm is used for feature selection: the bees algorithm acts as the heuristic search algorithm, combined with rough set theory as the evaluation function. Another feature selection algorithm, ID3, is also presented; it works as a statistical algorithm rather than an intelligent one. The two approaches are compared in terms of their performance for null-value estimation …
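A simplified sketch of the search-plus-evaluation idea is shown below: feature subsets are scored with the rough-set dependency degree, and a bees-style loop mixes random scout subsets with a local flip around the best site. The data layout, parameter values, and function names are assumptions made for illustration, not the paper's exact algorithm.

```python
import random
from collections import defaultdict

def dependency_degree(rows, features, decision_idx):
    """Rough-set dependency degree: the fraction of objects whose equivalence
    class under `features` maps to a single decision value (is consistent)."""
    decisions, counts = defaultdict(set), defaultdict(int)
    for r in rows:
        key = tuple(r[j] for j in features)
        decisions[key].add(r[decision_idx])
        counts[key] += 1
    consistent = sum(c for k, c in counts.items() if len(decisions[k]) == 1)
    return consistent / len(rows)

def bees_feature_selection(rows, n_features, decision_idx, scouts=8, iters=40, seed=1):
    """Bees-style subset search scored by the dependency degree (simplified sketch)."""
    rng = random.Random(seed)
    best = list(range(n_features))
    best_score = dependency_degree(rows, best, decision_idx)
    for _ in range(iters):
        candidates = [sorted(rng.sample(range(n_features), rng.randint(1, n_features)))
                      for _ in range(scouts)]                    # scout bees: random subsets
        flip = rng.randrange(n_features)                         # local search around best
        candidates.append(sorted(set(best) ^ {flip}) or best)
        for cand in candidates:
            score = dependency_degree(rows, cand, decision_idx)
            if score > best_score or (score == best_score and len(cand) < len(best)):
                best, best_score = list(cand), score
    return best, best_score
```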
Maximum power point tracking (MPPT) is used in photovoltaic (PV) systems to enhance efficiency and maximize the output power of the PV module, regardless of variations in temperature, irradiance, and the electrical characteristics of the load. A new MPPT system is presented in this research, consisting of a synchronous step-down (buck) DC-DC converter controlled by an Arduino-based microcontroller unit. The MPPT process, using the Perturb and Observe method, is performed with the DC-DC converter circuit to overcome the problem of voltage mismatch between the PV modules and the loads. The proposed system has high efficiency and low cost and can be easily modified to handle more energy sources. The test results indicate that the u…
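The Perturb and Observe decision logic can be sketched in a few lines: perturb the operating point, keep the perturbation direction if the measured PV power increased, and reverse it otherwise. This is the textbook P&O rule written as plain Python rather than the paper's Arduino firmware; variable names and the step size are illustrative.

```python
def perturb_and_observe(v, i, prev_v, prev_p, v_ref, step=0.1):
    """One Perturb & Observe step on the PV operating-voltage reference:
    keep the perturbation direction if power increased, reverse it otherwise.
    (Generic P&O decision logic, not the paper's Arduino/buck implementation.)"""
    p = v * i                                 # measured PV power
    dv, dp = v - prev_v, p - prev_p
    if dp >= 0:
        v_ref += step if dv >= 0 else -step   # power rose: keep moving the same way
    else:
        v_ref += -step if dv >= 0 else step   # power fell: reverse the perturbation
    return v_ref, v, p                        # new reference plus state for the next call
```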
The optimization of artificial gas lift techniques plays a crucial role in advancing oil field development. This study investigates the impact of gas lift design and optimization on production outcomes in the Mishrif formation of the Halfaya oil field. A comprehensive production-network nodal analysis model was formulated using a PIPESIM Optimizer-based genetic algorithm and meticulously calibrated with field data collected from a network of seven wells. This well group comprises three directional wells currently on gas lift and four naturally producing vertical wells. To augment productivity and optimize network performance, a novel gas lift design strategy was proposed. The optimization of …
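For orientation only, the toy genetic algorithm below allocates a fixed total lift-gas rate across seven wells to maximize a synthetic production response; the response curve, operators, and parameters are placeholders standing in for the calibrated PIPESIM network model and its optimizer.

```python
import numpy as np

def ga_gas_lift(response, n_wells, total_gas, pop=60, gens=150, seed=0):
    """Toy genetic algorithm: allocate a fixed lift-gas rate across wells to
    maximize total production predicted by `response(rates)` (a surrogate for
    the calibrated network model; not the PIPESIM optimizer itself)."""
    rng = np.random.default_rng(seed)
    P = rng.random((pop, n_wells))
    P = P / P.sum(axis=1, keepdims=True) * total_gas            # feasible allocations
    for _ in range(gens):
        fit = np.array([response(ind) for ind in P])
        parents = P[np.argsort(fit)[-pop // 2:]]                # keep the fitter half
        kids = []
        while len(kids) < pop - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            child = np.where(rng.random(n_wells) < 0.5, a, b)   # uniform crossover
            child += rng.normal(0, 0.05 * total_gas / n_wells, n_wells)  # mutation
            child = np.clip(child, 0, None)
            kids.append(child / child.sum() * total_gas)        # repair gas constraint
        P = np.vstack([parents, kids])
    return P[np.argmax([response(ind) for ind in P])]

# Placeholder diminishing-returns response per well (illustrative only)
resp = lambda q: float(np.sum(200 * (1 - np.exp(-q / 2.0))))
print(ga_gas_lift(resp, n_wells=7, total_gas=10.0))
```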