A novel median filter based on the crow optimization algorithm (OMF) is proposed to reduce random salt-and-pepper noise and improve the quality of RGB color and grayscale images. The fundamental idea of the approach is that the crow optimization algorithm first detects noisy pixels and then replaces them with an optimal median value chosen by maximizing a fitness function. Finally, the standard measures peak signal-to-noise ratio (PSNR), Structural Similarity (SSIM), absolute error, and mean square error are used to evaluate how well the filters (the original and the improved median filter) remove noise from images. The simulation is carried out in MATLAB R2019b, and the results show that the improved median filter with the crow optimization algorithm is more effective than the original median filter and some recent methods; the suggested process is robust in reducing error and removing noise because of the candidate selection within the median filter. The results show a mean square error of at most 1.38, an absolute error of at most 0.22, a Structural Similarity (SSIM) of 0.9856, and a PSNR above 46 dB. Thus, the overall percentage of improvement is 25%.
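The abstract describes the method only at a high level. As a rough illustration, the sketch below (Python/NumPy, not the authors' MATLAB code) flags extreme-intensity pixels as salt-and-pepper candidates and replaces each with the median of its noise-free neighbors, growing the window when needed; the crow-search step that optimizes the median choice against a PSNR-style fitness is simplified away, and the helper names (`sp_noise_mask`, `adaptive_median_denoise`, `psnr`) are illustrative.

```python
import numpy as np

def sp_noise_mask(img):
    """Flag probable salt-and-pepper pixels (extreme intensities 0 or 255).
    Simplified stand-in for the paper's crow-optimization noise detection."""
    return (img == 0) | (img == 255)

def adaptive_median_denoise(img, max_win=7):
    """Replace each flagged pixel of a grayscale image with the median of its
    noise-free neighbors, widening the window until clean neighbors appear.
    Medians are read from the original (padded) image, so corrections do not
    propagate between noisy pixels; a pixel whose every neighborhood is fully
    noisy is left unchanged."""
    out = img.astype(np.float64).copy()
    noisy = sp_noise_mask(img)
    pad = max_win // 2
    padded = np.pad(out, pad, mode="reflect")
    pmask = np.pad(noisy, pad, mode="reflect")
    for r, c in zip(*np.nonzero(noisy)):
        for w in range(3, max_win + 1, 2):
            h = w // 2
            win = padded[r + pad - h : r + pad + h + 1, c + pad - h : c + pad + h + 1]
            wmask = pmask[r + pad - h : r + pad + h + 1, c + pad - h : c + pad + h + 1]
            clean = win[~wmask]
            if clean.size:
                out[r, c] = np.median(clean)
                break
    return out.astype(img.dtype)

def psnr(ref, test):
    """PSNR in dB for 8-bit images; the quantity the fitness function maximizes."""
    mse = np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2)
    return 10 * np.log10(255.0 ** 2 / mse)

# example: denoised = adaptive_median_denoise(noisy_img); print(psnr(clean_img, denoised))
```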
The conventional FCM algorithm does not fully utilize the spatial information in an image. In this research, we use an FCM algorithm that incorporates spatial information into the membership function for clustering. The spatial function is the summation of the membership function over the neighborhood of each pixel under consideration. The advantages of the method are that it is less sensitive to noise than other techniques and that it yields more homogeneous regions than other methods. This technique is a powerful method for noisy image segmentation.
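A minimal sketch of the spatial-FCM idea just described, assuming one common formulation in which the spatial function h (the neighborhood sum of memberships) re-weights the standard membership as u^p · h^q before renormalization; the function name, exponents, and window size are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def spatial_fcm(img, n_clusters=3, m=2.0, p=1, q=1, n_iter=50, win=3):
    """FCM on pixel intensities with a spatial membership modification."""
    x = img.astype(np.float64).ravel()
    rng = np.random.default_rng(0)
    u = rng.random((n_clusters, x.size))
    u /= u.sum(axis=0)                      # memberships sum to 1 per pixel
    for _ in range(n_iter):
        um = u ** m
        centers = um @ x / um.sum(axis=1)   # standard FCM cluster centers
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        u = d ** (-2.0 / (m - 1.0))         # standard FCM membership update
        u /= u.sum(axis=0)
        # spatial function: box mean over each pixel's neighborhood, which is
        # proportional to the neighborhood sum (the constant cancels below)
        h = np.stack([uniform_filter(ui.reshape(img.shape), size=win).ravel()
                      for ui in u])
        u = (u ** p) * (h ** q)             # fold spatial info into membership
        u /= u.sum(axis=0)
    return u.argmax(axis=0).reshape(img.shape), centers

img = np.random.rand(64, 64)                # stand-in grayscale image
labels, centers = spatial_fcm(img)
```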
Image compression plays an important role in reducing the size and storage of data while significantly increasing its transmission speed over the Internet. Image compression has been an important research topic for several decades, and recently, with the great successes achieved by deep learning in many areas of image processing, its use in image compression has been increasing gradually. Deep neural networks have also achieved great success in processing and compressing various images of different sizes. In this paper, we present a structure for image compression based on a Convolutional AutoEncoder (CAE) for deep learning, inspired by the diversity of the human eye…
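The abstract does not give the exact architecture; as an illustrative sketch of the general CAE structure it refers to, here is a minimal convolutional autoencoder in PyTorch, where the encoder downsamples the image to a compact latent code and the decoder reconstructs it. Layer counts and channel widths are assumptions.

```python
import torch
import torch.nn as nn

class ConvAutoEncoder(nn.Module):
    """Minimal convolutional autoencoder for lossy image compression:
    the encoder maps an image to a small latent feature map (the code),
    the decoder reconstructs the image from that code."""
    def __init__(self, channels=3, latent=16):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(channels, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, latent, 4, stride=2, padding=1),   # 8x downsampled code
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(latent, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, channels, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

# one training step: minimize reconstruction MSE on a stand-in batch in [0, 1]
model = ConvAutoEncoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.rand(8, 3, 64, 64)
loss = nn.functional.mse_loss(model(x), x)
opt.zero_grad(); loss.backward(); opt.step()
```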
Background: Obesity is increasingly prevalent in modern societies and constitutes a significant public health problem with an increased risk of cardiovascular diseases.
Objective: This study aims to determine the agreement between actual and perceived body image in the general population.
Methods: A descriptive cross-sectional study was conducted with a sample size of 300. The data were collected from eight major populated areas of the Northern district of Karachi, Sindh, over a period of six months (10th January 2020 to 21st June 2020). The Figure Rating Scale (FRS) questionnaire was applied to collect the demographic data and perceptions about body weight, and body mass index (BMI) was used for assessment…
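For reference, a tiny sketch of the BMI computation with the standard WHO adult cut-offs; the abstract does not state which categories the study used, so these are assumptions.

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight in kilograms divided by height in metres squared."""
    return weight_kg / height_m ** 2

def who_category(b):
    """Standard WHO adult BMI categories (assumed; not stated in the abstract)."""
    if b < 18.5:
        return "underweight"
    if b < 25.0:
        return "normal"
    if b < 30.0:
        return "overweight"
    return "obese"

print(who_category(bmi(82, 1.70)))   # BMI ~= 28.4 -> "overweight"
```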
The laser micro-cutting process is among the most widely applied machining processes and can be used on practically all metallic and non-metallic materials. However, it faces challenges in cutting-quality criteria such as geometrical precision, surface quality, and numerous others. This article investigates laser micro-cutting of PEEK composite material using a nano-fiber laser, owing to the significant importance and efficiency of lasers in various manufacturing processes. A design-of-experiments tool based on Response Surface Methodology (RSM) with a Central Composite Design (CCD) was used to generate the statistical model. This method was employed to analyze the influence of parameters including laser speed, …
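The abstract names RSM with a CCD but gives no construction details. The sketch below builds a generic rotatable central composite design in coded units for three hypothetical laser parameters; the `central_composite` helper is illustrative, not the authors' tool.

```python
import numpy as np
from itertools import product

def central_composite(k, alpha=None):
    """Central composite design in coded units for k factors: 2^k factorial
    corner points, 2k axial (star) points at +/- alpha, and a center point.
    alpha = (2^k)^(1/4) gives the standard rotatable design."""
    if alpha is None:
        alpha = (2 ** k) ** 0.25
    corners = np.array(list(product([-1.0, 1.0], repeat=k)))
    axial = np.zeros((2 * k, k))
    for i in range(k):
        axial[2 * i, i] = -alpha
        axial[2 * i + 1, i] = alpha
    center = np.zeros((1, k))
    return np.vstack([corners, axial, center])

# e.g. three coded laser parameters (speed, power, frequency): 8 + 6 + 1 runs
design = central_composite(3)
print(design.shape)   # (15, 3)
```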
Researchers have shown increasing interest in recent years in determining the optimal sample size that yields sufficient accuracy of estimation and high-precision parameters when evaluating a large number of tests in the field of diagnosis at the same time. In this research, two methods were used to determine the optimal sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. The nonlinear logistic regression model is estimated, at the sample size given by each method, on high-dimensional data using artificial intelligence, namely an artificial neural network (ANN), as it gives a high-precision estimate commensurate with the data…
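The abstract does not reproduce the bound it uses; for context, Bennett's inequality in its standard form states that for independent zero-mean random variables X_1, …, X_n with |X_i| ≤ a and average variance σ²,

```latex
\[
  \Pr\!\Big(\frac{1}{n}\sum_{i=1}^{n} X_i \ge \varepsilon\Big)
  \le \exp\!\Big(-\frac{n\sigma^{2}}{a^{2}}\, h\!\Big(\frac{a\varepsilon}{\sigma^{2}}\Big)\Big),
  \qquad h(u) = (1+u)\log(1+u) - u .
\]
```

Setting the right-hand side equal to a target tail probability δ and solving for n gives the smallest sample size guaranteeing accuracy ε, which is the role such a bound plays in sample-size determination.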
Data communication has been growing rapidly in recent years; consequently, data encryption has become essential for secure data transmission and storage and for protecting data contents from intruders and unauthorized persons. In this paper, a fast technique for text encryption based on a genetic algorithm is presented. The encryption is achieved with the genetic operators crossover and mutation: the plaintext characters are divided into pairs, the crossover operation is applied between them, and the mutation operation then follows to produce the encrypted text. The experimental results show that the proposal provides an important improvement in encryption rate with a comparatively high processing speed…
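A toy sketch of the described pair/crossover/mutation scheme. It is illustrative only and not cryptographically secure; the key stream, crossover-point rule, and padding are assumptions, since the abstract does not specify them. Both operators are self-inverse given the key, so decryption simply replays them in reverse order.

```python
import random

def _keystream(key, n):
    """Deterministic pseudo-random bytes derived from the key
    (illustrative only; not a cryptographically secure PRNG)."""
    rng = random.Random(key)
    return [rng.randrange(256) for _ in range(n)]

def crossover(a, b, point):
    """Single-point crossover on two 8-bit values: swap the low `point` bits.
    Applying it twice with the same point restores the originals."""
    mask = (1 << point) - 1
    return (a & ~mask) | (b & mask), (b & ~mask) | (a & mask)

def encrypt(text, key):
    data = list(text.encode("utf-8"))
    if len(data) % 2:                      # pad to a whole number of pairs
        data.append(0)
    ks = _keystream(key, len(data))
    out = []
    for i in range(0, len(data), 2):
        point = ks[i] % 7 + 1              # key-derived crossover point (1..7)
        x, y = crossover(data[i], data[i + 1], point)
        out += [x ^ ks[i], y ^ ks[i + 1]]  # mutation: key-derived XOR mask
    return bytes(out)

def decrypt(blob, key):
    data = list(blob)
    ks = _keystream(key, len(data))
    out = []
    for i in range(0, len(data), 2):
        x, y = data[i] ^ ks[i], data[i + 1] ^ ks[i + 1]   # undo mutation
        point = ks[i] % 7 + 1
        a, b = crossover(x, y, point)                      # undo crossover
        out += [a, b]
    return bytes(out).rstrip(b"\x00").decode("utf-8")

ct = encrypt("hello world", key=42)
assert decrypt(ct, key=42) == "hello world"
```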