A novel median filter based on the crow optimization algorithm (OMF) is proposed to reduce random salt-and-pepper noise and improve the quality of RGB-colored and gray images. The fundamental idea of the approach is that the crow optimization algorithm first detects noisy pixels and then replaces them with an optimal median value chosen by maximizing a fitness function. Finally, the standard measures peak signal-to-noise ratio (PSNR), structural similarity (SSIM), absolute error, and mean square error are used to evaluate the performance of the two filters (the original and the improved median filter) in removing noise from images. The simulation is carried out in MATLAB R2019b, and the results show that the median filter improved with the crow optimization algorithm is more effective than the original median filter and several recent methods. They demonstrate that the proposed procedure is robust in reducing error and removing noise thanks to the candidate selection of the median filter: the mean square error is reduced to 1.38 or less, the absolute error to 0.22 or less, SSIM reaches 0.9856, and PSNR exceeds 46 dB. Overall, the improvement achieved is 25%.
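As context for the comparison above, the baseline (unoptimized) median filter and the PSNR measure can be sketched as follows. This is an illustrative sketch only: the noise model, the 3x3 window, and the function names are assumptions, not the paper's implementation.

```python
import numpy as np

def add_salt_pepper(img, density, rng):
    """Corrupt a grayscale image with salt-and-pepper noise (illustrative model)."""
    noisy = img.copy()
    mask = rng.random(img.shape)
    noisy[mask < density / 2] = 0          # pepper
    noisy[mask > 1 - density / 2] = 255    # salt
    return noisy

def median_filter(img, k=3):
    """Classic k-by-k median filter applied to every pixel."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.median(padded[i:i + k, j:j + k])
    return out

def psnr(ref, test):
    """Peak signal-to-noise ratio in dB for 8-bit images."""
    mse = np.mean((ref.astype(float) - test.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)
```

The improved filter differs in that noisy pixels are first located by the optimizer and only those are replaced, which is what drives the reported PSNR gain.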
Text-based image clustering (TBIC) is an insufficient approach for clustering related web images, and abstracting the visual features of images with the support of textual information in a database is a challenging task. In content-based image clustering (CBIC), image data are clustered on the basis of specific features such as texture, color, boundaries, and shape. In this paper, an effective CBIC technique is presented that uses texture and statistical features of the images. The statistical features, or color moments (mean, skewness, standard deviation, kurtosis, and variance), are extracted from the images. These features are collected in a one-dimensional array, and then a genetic algorithm (GA) is applied for image clustering.
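The color-moment feature vector described above can be computed roughly as follows; this is a sketch, and the exact moment definitions and feature ordering used in the paper may differ.

```python
import numpy as np

def color_moments(img):
    """Flattened color-moment feature vector of an RGB image:
    (mean, skewness, standard deviation, kurtosis, variance) per channel."""
    feats = []
    for c in range(img.shape[2]):
        ch = img[:, :, c].astype(float).ravel()
        mean, std, var = ch.mean(), ch.std(), ch.var()
        centred = ch - mean
        # Small epsilon guards against division by zero on flat channels.
        skew = (centred ** 3).mean() / (std ** 3 + 1e-12)
        kurt = (centred ** 4).mean() / (std ** 4 + 1e-12)
        feats.extend([mean, skew, std, kurt, var])
    return np.array(feats)
```

The resulting 15-element vector (5 moments x 3 channels) is the kind of compact representation the GA would then cluster.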
In networking communication systems such as vehicular ad hoc networks, high vehicular mobility leads to rapid shifts in vehicle density, incoherence in inter-vehicle communications, and challenges for routing algorithms. The routing algorithm must avoid transmitting packets via segments where the network density is low and the scale of network disconnection is high, as this can lead to packet loss, interruptions, and increased communication overhead for route recovery. Hence, attention needs to be paid to both segment status and traffic. The aim of this paper is to present an intersection-based, segment-aware algorithm for geographic routing in vehicular ad hoc networks. This algorithm makes available the best route f
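The segment-aware idea above, scoring candidate road segments at an intersection and avoiding sparse ones, can be sketched as follows. The fields, weights, and thresholds are invented for illustration and are not the paper's algorithm.

```python
# Hypothetical intersection-level segment selection for geographic routing.
def segment_score(seg, w_density=0.4, w_conn=0.4, w_progress=0.2):
    """Weighted score of a road segment (all inputs normalised to [0, 1])."""
    return (w_density * seg["density"]        # vehicle density on the segment
            + w_conn * seg["connectivity"]    # 1.0 if fully connected, lower if partitioned
            + w_progress * seg["progress"])   # geographic progress toward the destination

def best_segment(candidates, min_density=0.2):
    """Discard segments too sparse to sustain forwarding, then pick the best;
    fall back to all candidates if every segment is sparse."""
    viable = [s for s in candidates if s["density"] >= min_density]
    return max(viable or candidates, key=segment_score)
```

Filtering before scoring captures the paper's point that low-density segments should be avoided even when they offer good geographic progress.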
The cost of pile foundations is part of the superstructure cost, and it has become necessary to reduce this cost by studying the available pile types and then making a decision on the optimal pile type in terms of production cost, time, and quality. The main objective of this study is therefore to solve the time-cost-quality trade-off (TCQT) problem by finding an optimal pile type with the target of minimizing cost and time while maximizing quality. Many pile types exist, but in this paper the researcher proposed five pile types, one of which is non-traditional, developed a model for the problem, and then employed the particle swarm optimization (PSO) algorithm, as one of the evolutionary algorithms, with t
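A toy version of the PSO search over pile types can be sketched as below. The pile attributes, objective weights, and PSO hyper-parameters are all invented for illustration; the study's actual model, data, and encoding differ.

```python
import numpy as np

# Hypothetical (cost, time, quality) attributes for five pile types.
PILES = np.array([
    [100.0, 30.0, 0.80],
    [120.0, 25.0, 0.85],
    [ 90.0, 40.0, 0.75],
    [150.0, 20.0, 0.95],
    [110.0, 28.0, 0.90],
])

def tcqt_score(idx):
    """Weighted TCQT objective: minimise cost and time, maximise quality."""
    cost, time, quality = PILES[idx]
    return 0.4 * cost / 150 + 0.3 * time / 40 - 0.3 * quality

def pso(n_particles=20, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(0, len(PILES) - 1, n_particles)  # continuous index space
    vel = np.zeros(n_particles)
    pbest = pos.copy()
    pbest_val = np.array([tcqt_score(int(round(p))) for p in pos])
    gbest = pbest[pbest_val.argmin()]
    for _ in range(iters):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        # Standard velocity update: inertia + cognitive + social terms.
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0, len(PILES) - 1)
        vals = np.array([tcqt_score(int(round(p))) for p in pos])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        gbest = pbest[pbest_val.argmin()]
    return int(round(gbest))  # discrete pile type nearest the best position
```

Rounding a continuous position to the nearest pile index is one common way to apply PSO to a discrete choice; the study's encoding may differ.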
Sound forecasts are essential elements of planning, especially for dealing with seasonality, sudden changes in demand levels, strikes, large fluctuations in the economy, and price-cutting manoeuvres by competitors. Forecasting can help decision makers manage these problems by identifying which techniques are appropriate for their needs. The proposed forecasting model extracts the trend and cyclical components individually by developing the Hodrick–Prescott filter technique. The fitted models of these two real components are then estimated to predict the future behaviour of the electricity peak load. Accordingly, the optimal model fitted to the periodic component is estimated using spectrum analysis and Fourier mod
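The standard Hodrick–Prescott decomposition on which the model builds can be sketched directly from its closed form: the trend solves (I + λKᵀK)τ = y, where K is the second-difference operator, and the cycle is the residual y − τ. This is the textbook filter, not the paper's developed variant.

```python
import numpy as np

def hp_filter(y, lamb=1600.0):
    """Split a series into trend and cyclical components with the
    Hodrick-Prescott filter: trend = (I + lamb * K'K)^-1 y."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    K = np.zeros((n - 2, n))            # second-difference operator
    for i in range(n - 2):
        K[i, i], K[i, i + 1], K[i, i + 2] = 1.0, -2.0, 1.0
    trend = np.linalg.solve(np.eye(n) + lamb * K.T @ K, y)
    return trend, y - trend
```

A purely linear series passes through unchanged (its second differences are zero), which is a quick sanity check; in practice `statsmodels` ships an equivalent `hpfilter` routine.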
We introduce some new generalizations of definitions, namely: supra closure convergence to a point, supra closure directed toward a set, almost supra convergence to a set, almost supra cluster point, a set supra H-closed relative, supra closure continuous functions, supra weakly continuous functions, supra compact functions, supra rigid sets, almost supra closed functions, and supra perfect functions. We state and prove several results concerning them.
Intrusion detection systems (IDS) are useful tools that help security administrators in the demanding task of securing the network and alerting on any possible harmful event. An IDS can be classified as either misuse-based or anomaly-based, depending on its detection methodology. A misuse IDS can recognize known attacks by their signatures; the main disadvantage of these systems is that they cannot detect new attacks. An anomaly IDS, in contrast, models normal behaviour; the main advantage of this approach is its ability to discover new attacks, while its main drawback is a high false alarm rate. Therefore, a hybrid IDS, combining misuse and anomaly detection, acts as a solution to overcome the dis
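The two-stage decision logic of such a hybrid IDS can be sketched as follows. The signature set, normal-traffic profile, and deviation threshold are invented placeholders, not any real system's values.

```python
# Hypothetical hybrid IDS: misuse stage first, then an anomaly stage.
KNOWN_SIGNATURES = {"SYN-flood", "port-scan"}              # illustrative signatures
NORMAL_PROFILE = {"pkt_rate": 100.0, "avg_size": 512.0}    # illustrative baseline

def classify(event):
    # Misuse stage: known attacks are caught immediately by signature match.
    if event.get("signature") in KNOWN_SIGNATURES:
        return "attack (known)"
    # Anomaly stage: large relative deviation from the normal profile
    # flags previously unseen attacks (at the cost of false alarms).
    deviation = sum(abs(event[k] - v) / v for k, v in NORMAL_PROFILE.items())
    return "attack (novel)" if deviation > 1.0 else "normal"
```

The ordering matters: the cheap, precise misuse check runs first, and only unmatched traffic pays the cost (and false-alarm risk) of the anomaly check.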
A three-stage learning algorithm for a deep multilayer perceptron (DMLP) with effective weight initialisation based on a sparse auto-encoder is proposed in this paper, which aims to overcome the difficulties of training deep neural networks with limited training data in a high-dimensional feature space. At the first stage, unsupervised learning with a sparse auto-encoder is used to obtain the initial weights of the DMLP's feature extraction layers. At the second stage, error back-propagation trains the DMLP while the weights obtained at the first stage for its feature extraction layers are kept fixed. At the third stage, all the weights of the DMLP obtained at the second stage are refined by error back-propagation. Network structures an
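The three stages can be sketched with a minimal one-hidden-layer network in NumPy. This is a deliberately tiny analogue: the sparsity penalty is omitted, the architecture is shallow, and all data, sizes, and learning rates are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((64, 20))                       # toy data: 64 samples, 20 features
y = (X.sum(axis=1) > 10).astype(float)         # toy binary labels

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Stage 1: unsupervised pre-training of the feature layer with a
# tied-weight auto-encoder (sparsity penalty omitted for brevity).
def pretrain(X, hidden=8, steps=300, lr=0.01):
    W = rng.normal(0.0, 0.1, (X.shape[1], hidden))
    for _ in range(steps):
        H = np.tanh(X @ W)
        err = H @ W.T - X                      # reconstruction error
        grad = X.T @ ((err @ W) * (1 - H ** 2)) + err.T @ H
        W -= lr * grad / len(X)
    return W

# Stage 2: train only the output weights; stage-1 feature weights stay fixed.
def train_head(X, y, W, steps=300, lr=0.2):
    H = np.tanh(X @ W)                         # frozen features
    v = np.zeros(W.shape[1])
    for _ in range(steps):
        p = sigmoid(H @ v)
        v -= lr * H.T @ (p - y) / len(y)
    return v

# Stage 3: refine all weights jointly by error back-propagation.
def finetune(X, y, W, v, steps=300, lr=0.05):
    W, v = W.copy(), v.copy()
    for _ in range(steps):
        H = np.tanh(X @ W)
        delta = sigmoid(H @ v) - y
        grad_v = H.T @ delta / len(y)
        grad_W = X.T @ (np.outer(delta, v) * (1 - H ** 2)) / len(y)
        v -= lr * grad_v
        W -= lr * grad_W
    return W, v
```

Freezing the pre-trained weights at stage 2 lets the output layer adapt to good features before stage 3 risks distorting them, which is the rationale the paper gives for staging the training.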
The consensus algorithm is the core mechanism of blockchain and is used to ensure data consistency among blockchain nodes. The PBFT (Practical Byzantine Fault Tolerance) consensus algorithm is widely used in consortium chains because it tolerates Byzantine faults. However, PBFT still suffers from random master node selection and complicated communication. This study proposes an enhanced consensus algorithm, IBFT, based on node trust values and BLS (Boneh–Lynn–Shacham) aggregate signatures. In IBFT, multi-level indicators are used to calculate the trust value of each node, and on the basis of this calculation some nodes are selected to take part in network consensus. The master node is chosen
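The trust-value calculation and consensus-set selection can be sketched as below. The three indicators and their weights are hypothetical stand-ins for the paper's multi-level indicators.

```python
# Hypothetical trust scoring for IBFT-style node selection.
def trust_value(node):
    """Weighted combination of behaviour indicators, each in [0, 1]."""
    return (0.5 * node["honest_ratio"]      # share of valid messages sent
            + 0.3 * node["uptime"]          # availability over time
            + 0.2 * node["latency_score"])  # responsiveness

def select_consensus_nodes(nodes, k):
    """Pick the k most trusted nodes for consensus;
    the highest-trust node serves as the master (primary)."""
    ranked = sorted(nodes, key=trust_value, reverse=True)
    consensus_set = ranked[:k]
    return consensus_set[0], consensus_set
```

Selecting by trust rather than at random is what lets the scheme shrink the consensus set (reducing communication) while keeping unreliable nodes out of the primary role.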
Abstract. In this work, some new concepts are introduced and the relationships between them are studied. These concepts are filter directed-toward, nano-closure directed-toward, and nano-closure convergence to a point, and some theorems and results about them are presented. Definitions of almost-nano-convergence for a set, an almost-nano-cluster point, and quasi-nano-Hausdorff-closed (also called nano-Hausdorff-closed relative) are also given, together with several theorems related to these definitions and the relationships between them. We also provide other generalizations, including nano closure continuous mappings, also called nano-weakly-continuous mappings, as well as providing a definit