A novel optimized median filter (OMF) based on the crow optimization algorithm is proposed to reduce random salt-and-pepper noise and improve the quality of RGB color and grayscale images. The fundamental idea of the approach is that the crow optimization algorithm first detects the noisy pixels and then replaces them with an optimum median value chosen by maximizing a fitness function. The standard measures peak signal-to-noise ratio (PSNR), structural similarity (SSIM), absolute error, and mean square error (MSE) are used to evaluate the performance of the two filters (the original and the improved median filter) in removing noise from images. The simulation is carried out in MATLAB R2019b, and the results show that the improved median filter with the crow optimization algorithm is more effective than the original median filter and several recent methods; the proposed process is robust in reducing error and removing noise thanks to its optimized choice of median candidate. The results yield a mean square error of at most 1.38, an absolute error of at most 0.22, a structural similarity (SSIM) of 0.9856, and a PSNR above 46 dB, corresponding to an improvement of about 25%.
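To make the replacement and evaluation steps concrete, the following is a minimal Python sketch, not the authors' implementation: the crow-search-driven choice of the optimum median is replaced here by a plain 3x3 median over the detected noisy pixels, and the reported quality measures (MSE, absolute error, PSNR) are computed directly; all function names are illustrative.

```python
# Minimal sketch (not the authors' code): detect salt-and-pepper pixels,
# replace them with a local median, and score the result with the metrics
# reported in the abstract (MSE, absolute error, PSNR).
import numpy as np

def detect_sp_noise(img):
    """Salt-and-pepper pixels are assumed to sit at the intensity extremes."""
    return (img == 0) | (img == 255)

def median_restore(img, noise_mask, k=3):
    """Replace each flagged pixel with the median of its k-by-k neighbourhood."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = img.copy()
    for r, c in zip(*np.nonzero(noise_mask)):
        window = padded[r:r + k, c:c + k]
        out[r, c] = int(np.median(window))
    return out

def mse(a, b):
    return float(np.mean((a.astype(float) - b.astype(float)) ** 2))

def mae(a, b):
    return float(np.mean(np.abs(a.astype(float) - b.astype(float))))

def psnr(a, b, peak=255.0):
    e = mse(a, b)
    return float("inf") if e == 0 else 10.0 * np.log10(peak ** 2 / e)

# Usage: noisy and clean are uint8 grayscale arrays of the same shape.
# restored = median_restore(noisy, detect_sp_noise(noisy))
# print(mse(clean, restored), mae(clean, restored), psnr(clean, restored))
```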
In this study, an optical fiber chemical sensor based on surface plasmon resonance (SPR) was designed and implemented to estimate the age of the oil used in electrical transformers. The study relies on the refractive indices of the oil. The sensor was created by embedding the central portion of the optical fiber in a resin block, followed by polishing and tapering; the tapering time was 50 min. The multi-mode optical fiber was coated with a 60 nm thick gold layer over a deposition length of 4 cm. The sensor's resonance wavelength was 415 nm. The primary sensor parameters were calculated, including sensitivity (6.25), signal-to-noise ratio (2.38), figure of merit (4.88), and accuracy (3.2).
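For reference, the figures of merit named above are commonly defined from the resonance-wavelength shift and the width of the SPR dip. The sketch below uses those general definitions as assumptions; the abstract does not state the paper's exact expressions or units, so this is illustrative only.

```python
# Hedged sketch of commonly used SPR-sensor figures of merit (assumed
# definitions from the general SPR literature, not the paper's own formulas).
def sensitivity(d_lambda_res, d_n):
    """Spectral sensitivity: resonance-wavelength shift per unit refractive-index change."""
    return d_lambda_res / d_n

def signal_to_noise(d_lambda_res, fwhm):
    """SNR (detection accuracy): resonance shift relative to the width of the SPR dip."""
    return d_lambda_res / fwhm

def figure_of_merit(sens, fwhm):
    """FOM: sensitivity normalised by the resonance-dip width."""
    return sens / fwhm

# Illustrative call with placeholder numbers (not measurements from the paper):
s = sensitivity(d_lambda_res=10.0, d_n=2.0e-3)   # e.g. nm per RIU
print(s, signal_to_noise(10.0, 4.2), figure_of_merit(s, 4.2))
```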
Background: Measurement of hemoglobin A1c (A1C) is a well-established approach for gauging long-term glycemic control and contributes substantially to the quality of care in diabetic patients. The concept of targets is open to criticism: they may be unattainable, they may limit what could be attained, and they may be economically difficult to reach. However, without some form of targeted control of an asymptomatic condition, it becomes difficult to promote care at all. Objectives: The present article aims to address the most recent evidence-based global guidelines on A1C targets for glycemic control in Type 2 Diabetes Mellitus (T2D). Key messages: The rationale for A1C treatment targets includes evidence for microvascular and macrovascular …
With the increasing integration of computers and smartphones into our daily lives, and the numerous benefits this offers over traditional paper-based ways of conducting affairs, it has become necessary to bring one of the most essential institutions into this integration, namely colleges. The traditional approach to conducting affairs in colleges is mostly paper-based, which increases time and workload and is relatively decentralized. This project provides educational and management services for the university environment, targeting the staff, the student body, and the lecturers, on two of the most widely used platforms: smartphones and reliable web applications by clo…
This work focuses on the significance of wireless body area networks (WBANs) as a cutting-edge, autonomous technology that has attracted substantial attention from researchers. The central challenge faced by WBANs is maintaining quality of service (QoS) in rapidly evolving sectors such as healthcare. The task of managing diverse traffic types with limited resources further compounds this challenge. In medical WBANs in particular, the prioritization of vital data is crucial to ensure prompt delivery of critical information. Given the stringent requirements of these systems, data loss or delays are untenable, necessitating the implementation of intelligent algorithms. These algorithms play a pivotal role …
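As a generic illustration of the priority-aware handling described above, and not the scheme proposed in this work, the sketch below dequeues critical medical traffic ahead of routine monitoring data; the traffic classes and names are assumptions made for illustration.

```python
# Generic sketch of priority-aware packet handling in a WBAN node
# (illustrative only; not the algorithm proposed in the abstract).
import heapq
from itertools import count

PRIORITY = {"emergency": 0, "vital_sign": 1, "routine": 2}  # assumed classes

class WbanQueue:
    """Dequeue higher-priority medical traffic first; FIFO within a class."""
    def __init__(self):
        self._heap = []
        self._seq = count()  # tie-breaker preserves arrival order

    def enqueue(self, traffic_class, payload):
        heapq.heappush(self._heap, (PRIORITY[traffic_class], next(self._seq), payload))

    def dequeue(self):
        _, _, payload = heapq.heappop(self._heap)
        return payload

# Usage: routine readings wait while an emergency alert goes out first.
q = WbanQueue()
q.enqueue("routine", b"temperature=36.8")
q.enqueue("emergency", b"ECG anomaly detected")
assert q.dequeue() == b"ECG anomaly detected"
```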
During COVID-19, wearing a mask was mandated globally in various workplaces, departments, and offices. New deep learning convolutional neural network (CNN) based classifiers were proposed to increase the validation accuracy of face mask detection. This work introduces a face mask model that can recognize whether a person is wearing a mask or not. The proposed model has two stages to detect and recognize the face mask: in the first stage, a Haar cascade detector is used to detect the face, while in the second stage, the proposed CNN model, built from scratch, is used as the classifier. The experiment was applied to the masked faces (MAFA) dataset with RGB images of 160x160 pixels. The model achieved …
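The two-stage pipeline described above can be sketched as follows; this is an illustrative reconstruction, not the paper's code, and the CNN layer configuration is an assumption since the abstract does not specify the architecture.

```python
# Hedged sketch of the two-stage pipeline: OpenCV's Haar cascade finds the
# face, then a small CNN classifies mask / no-mask on 160x160 RGB crops.
import cv2
import numpy as np
from tensorflow.keras import layers, models

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def build_mask_classifier(input_shape=(160, 160, 3)):
    """Tiny CNN trained from scratch (illustrative layers, not the paper's)."""
    return models.Sequential([
        layers.Conv2D(32, 3, activation="relu", input_shape=input_shape),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # 1 = mask, 0 = no mask
    ])

def classify_faces(bgr_image, model):
    """Stage 1: detect faces; stage 2: classify each 160x160 crop."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    results = []
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.1, 4):
        crop = cv2.resize(bgr_image[y:y + h, x:x + w], (160, 160)) / 255.0
        prob = float(model.predict(crop[np.newaxis], verbose=0)[0, 0])
        results.append(((x, y, w, h), prob))
    return results
```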
Computer models are used in the study of electrocardiography to provide insight into physiological phenomena that are difficult to measure in the laboratory or in a clinical environment.
The electrocardiogram is an important tool for the clinician in that it changes characteristically in a number of pathological conditions. Many illnesses can be detected by this measurement. By simulating the electrical activity of the heart, one obtains a quantitative relationship between the electrocardiogram and different anomalies.
Because of the inhomogeneous fibrous structure of the heart and the irregular geometry of the body, the finite element method is used to study the electrical properties of the heart.
This work describes the …
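As a toy illustration of the finite element approach mentioned above, the sketch below solves a one-dimensional volume-conductor problem with linear elements and a spatially varying conductivity; it is a simplified stand-in under stated assumptions, not the three-dimensional heart/torso model this work refers to.

```python
# Minimal 1D finite-element sketch (illustrative only, far simpler than a
# 3D heart/torso model): linear elements for the volume-conductor equation
#   -d/dx( sigma(x) * dphi/dx ) = f(x)   on [0, 1],  phi(0) = phi(1) = 0,
# allowing a spatially varying (inhomogeneous) conductivity sigma.
import numpy as np

def solve_1d_fem(sigma, f, n_elements=50):
    """Assemble and solve the stiffness system K phi = F with linear elements."""
    n_nodes = n_elements + 1
    x = np.linspace(0.0, 1.0, n_nodes)
    h = x[1] - x[0]
    K = np.zeros((n_nodes, n_nodes))
    F = np.zeros(n_nodes)
    for e in range(n_elements):
        xm = 0.5 * (x[e] + x[e + 1])  # element midpoint
        ke = sigma(xm) / h * np.array([[1.0, -1.0], [-1.0, 1.0]])
        fe = f(xm) * h / 2.0 * np.array([1.0, 1.0])
        K[e:e + 2, e:e + 2] += ke
        F[e:e + 2] += fe
    # Dirichlet boundary conditions phi(0) = phi(1) = 0
    K[0, :], K[-1, :] = 0.0, 0.0
    K[0, 0], K[-1, -1] = 1.0, 1.0
    F[0], F[-1] = 0.0, 0.0
    return x, np.linalg.solve(K, F)

# Usage with a conductivity jump mimicking inhomogeneous tissue (toy values):
x, phi = solve_1d_fem(sigma=lambda x: 1.0 if x < 0.5 else 0.2,
                      f=lambda x: 1.0)
```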