A novel optimized median filter (OMF) based on the crow optimization algorithm is suggested to reduce random salt-and-pepper noise and improve the quality of RGB color and gray images. The fundamental idea of the approach is that the crow optimization algorithm first detects the noisy pixels and then replaces them with an optimum median value selected by maximizing a fitness function. Finally, the standard measures peak signal-to-noise ratio (PSNR), structural similarity (SSIM), absolute error, and mean square error are used to test the performance of the suggested filters (the original and the improved median filter) in removing noise from images. The simulation is carried out in MATLAB R2019b, and the results show that the improved median filter with the crow optimization algorithm is more effective than the original median filter and some recent methods; the suggested process is robust in reducing error and removing noise thanks to the candidate median values. The results give a mean square error equal to or less than 1.38, an absolute error equal to or less than 0.22, a structural similarity (SSIM) of 0.9856, and a PSNR of more than 46 dB. Thus, the percentage of improvement achieved is 25%.
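As a rough illustration of the measurement side of this comparison, the Python sketch below detects salt-and-pepper pixels by their extreme values, replaces them with a plain local median, and reports MSE, absolute error, and PSNR. The crow-search selection of the optimum median candidate described above is not reproduced, and the function and parameter names are hypothetical (the paper's own simulation was done in MATLAB).

```python
import numpy as np

def denoise_and_score(noisy, clean, win=3):
    """Replace detected salt-and-pepper pixels (value 0 or 255) with a local
    median, then report MSE, mean absolute error and PSNR against the clean
    image. Minimal sketch only; the crow-search choice of the optimal median
    candidate is not reproduced here."""
    img = noisy.astype(float).copy()
    pad = win // 2
    padded = np.pad(img, pad, mode="edge")
    noisy_mask = (noisy == 0) | (noisy == 255)        # assumed noise detector
    for r, c in zip(*np.where(noisy_mask)):
        window = padded[r:r + win, c:c + win]
        img[r, c] = np.median(window)                 # plain median candidate
    mse = np.mean((img - clean.astype(float)) ** 2)
    mae = np.mean(np.abs(img - clean.astype(float)))
    psnr = 10 * np.log10(255.0 ** 2 / mse) if mse > 0 else float("inf")
    return img, mse, mae, psnr
```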
Gas-lift reactors are employed in several bio-applications due to their cost-effectiveness and high efficiency. However, nutrient and thermal gradients are one of the obstacles that stand in the way of their widespread use in biological applications. The diagnosis, analysis, and tracking of fluid paths in an external draft tube gas-lift bioreactor are the main topics of the current study. Several parameters were considered to assess the mixing efficiency, such as the downcomer-to-riser diameter ratio (Ded/Dr), the ratio of the diffuser position to the bioreactor height (Pd/Lr), and the gas bubble size (Db). The multiple regression of liquid velocity indicates the optimal setting: Ded/Dr is 0.5, Pd/Lr is 0.02, and Db
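As a minimal sketch of the multiple-regression step, assuming placeholder measurements (the values below are not the study's data), liquid velocity can be regressed on Ded/Dr, Pd/Lr, and Db with ordinary least squares:

```python
import numpy as np

# Hypothetical design matrix: columns are Ded/Dr, Pd/Lr and Db (mm); the
# response is the measured liquid velocity (m/s). Placeholder values only.
X = np.array([[0.4, 0.02, 3.0],
              [0.5, 0.02, 2.0],
              [0.5, 0.04, 3.0],
              [0.6, 0.06, 4.0],
              [0.4, 0.04, 2.0]])
y = np.array([0.21, 0.27, 0.24, 0.19, 0.22])

# Ordinary least-squares fit of v = b0 + b1*(Ded/Dr) + b2*(Pd/Lr) + b3*Db
A = np.column_stack([np.ones(len(y)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("intercept and coefficients:", coef)
```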
To ensure that a software/hardware product is of sufficient quality and functionality, it is essential to conduct thorough testing and evaluation of the numerous individual software components that make up the application. Many different approaches exist for testing software, including combinatorial testing and covering arrays. Because of the difficulty of dealing with problems such as the two-way combinatorial explosion, this raises yet another issue: time. Using a client-server architecture, this research introduces a parallel implementation of the TWGH algorithm. Several experiments were conducted to demonstrate the efficiency of this technique. Their findings were used to determine the increase in speed and co
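As a minimal illustration of the pairwise (2-way) coverage idea behind such combinatorial testing, and of splitting the work across workers in the spirit of a client-server setup, the sketch below checks which 2-way value combinations a hypothetical test suite misses; it does not reproduce the TWGH algorithm itself, and all parameter names and domains are invented for the example.

```python
from itertools import combinations, product
from concurrent.futures import ProcessPoolExecutor

# Parameter domains of a hypothetical system under test (not from the paper).
domains = {"os": ["win", "linux"], "db": ["mysql", "sqlite"], "ui": ["web", "cli"]}

def uncovered_pairs(args):
    """Return the 2-way value combinations of one parameter pair that the
    given test suite does not exercise."""
    (p1, p2), suite = args
    needed = set(product(domains[p1], domains[p2]))
    covered = {(t[p1], t[p2]) for t in suite}
    return (p1, p2), needed - covered

if __name__ == "__main__":
    suite = [{"os": "win", "db": "mysql", "ui": "web"},
             {"os": "linux", "db": "sqlite", "ui": "cli"}]
    pairs = list(combinations(domains, 2))
    with ProcessPoolExecutor() as pool:        # each parameter pair checked in parallel
        for pair, missing in pool.map(uncovered_pairs, [(p, suite) for p in pairs]):
            print(pair, "missing:", missing)
```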
A simple, low-cost co-precipitation method was used for the preparation of novel nickel oxide (NiO) nanoparticle thin films at two different pH values, 6 and 12, and the effect of pH on their structural and optical properties as an active optical filter was investigated. X-ray diffraction (XRD) results showed that the nickel oxide nanoparticles at both pH = 6 and pH = 12 have a polycrystalline structure, with a smaller average particle size of about 8.5 nm for pH = 6 in comparison with pH = 12. Morphological studies using scanning electron microscopy (SEM) and atomic force microscopy (AFM) show a uniform nanorod distribution for pH = 6, with a smaller average diameter and average roughness as compared with NiO with
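As an illustrative aside, assuming the average crystallite size was estimated from XRD peak broadening (the abstract does not state which method was used), a Scherrer-type calculation looks like this, with placeholder peak values:

```python
import math

def scherrer_size(wavelength_nm, fwhm_deg, two_theta_deg, k=0.9):
    """Estimate crystallite size (nm) from an XRD peak via the Scherrer
    relation D = k*lambda / (beta*cos(theta)). Illustrative only; the
    peak values below are placeholders, not the study's measurements."""
    beta = math.radians(fwhm_deg)            # FWHM converted to radians
    theta = math.radians(two_theta_deg / 2)  # Bragg angle
    return k * wavelength_nm / (beta * math.cos(theta))

# Example with Cu K-alpha radiation and placeholder peak parameters.
print(round(scherrer_size(0.15406, 1.0, 43.3), 1), "nm")
```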
Achieving reliable operation under the influence of deep-submicrometer noise sources, including crosstalk noise, at low-voltage operation is a major challenge for network-on-chip links. In this paper, we propose a coding scheme that simultaneously addresses crosstalk effects on signal delay and detects up to seven random errors through wire duplication and simple parity checks calculated over the rows and columns of the two-dimensional data. This high error-detection capability enables a reduction of the operating voltage on the wire, leading to energy savings. The results show that the proposed scheme reduces energy consumption by up to 53% compared with other schemes at iso-reliability performance, despite the increase in the overhead number o
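A minimal sketch of the row/column parity part of such a scheme is shown below; wire duplication, the crosstalk-avoidance constraints, and the exact seven-error detection bound of the proposed code are not reproduced, and the matrix dimensions are arbitrary.

```python
import numpy as np

def encode_2d_parity(bits, rows, cols):
    """Arrange the payload as a rows x cols bit matrix and append one even
    parity bit per row and per column. Only the parity part of the scheme
    is sketched; wire duplication for crosstalk avoidance is omitted."""
    data = np.array(bits, dtype=np.uint8).reshape(rows, cols)
    row_par = data.sum(axis=1) % 2
    col_par = data.sum(axis=0) % 2
    return data, row_par, col_par

def check_2d_parity(data, row_par, col_par):
    """Return True if the received matrix is consistent with both parity sets."""
    ok_rows = np.array_equal(data.sum(axis=1) % 2, row_par)
    ok_cols = np.array_equal(data.sum(axis=0) % 2, col_par)
    return ok_rows and ok_cols

data, rp, cp = encode_2d_parity([1, 0, 1, 1, 0, 0, 1, 0], rows=2, cols=4)
corrupted = data.copy()
corrupted[0, 2] ^= 1                       # flip one bit in flight
print(check_2d_parity(corrupted, rp, cp))  # False: error detected
```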
Text clustering consists of grouping objects of similar categories. The initial centroids influence the operation of the system, with the potential to become trapped in local optima. The second issue pertains to the impact of a huge number of features on the determination of optimal initial centroids. The problem of dimensionality may be reduced by feature selection. Therefore, Wind Driven Optimization (WDO) was employed for feature selection to remove unimportant words from the text. In addition, the current study integrated a novel clustering optimization technique, WDO (Wasp Swarm Optimization), to effectively determine the most suitable initial centroids. The results showed that the new meta-heuristic, WDO, employed as t
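As a shape-of-the-pipeline sketch only, with the meta-heuristic feature selection and centroid search replaced by ordinary TF-IDF pruning and k-means++ seeding (so it does not implement WDO or Wasp Swarm Optimization), the clustering flow could look like this:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Toy corpus; the paper's meta-heuristic feature selection and centroid
# search are replaced here by simple TF-IDF term pruning and k-means++
# seeding, purely to illustrate the pipeline shape.
docs = ["stock market rises", "market falls on earnings",
        "team wins the match", "the match ends in a draw"]

vec = TfidfVectorizer(stop_words="english", max_features=10)  # crude term pruning
X = vec.fit_transform(docs)

km = KMeans(n_clusters=2, init="k-means++", n_init=10, random_state=0)
labels = km.fit_predict(X)
print(dict(zip(docs, labels)))
```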
The university course timetable problem (UCTP) is typically a combinatorial optimization problem. Manually producing a usable timetable requires many days of effort, and the results are still unsatisfactory. Various state-of-the-art methods (heuristic and meta-heuristic) are used to solve UCTP satisfactorily. However, these approaches typically yield instance-specific solutions. The hyper-heuristic framework adequately addresses this complex problem. This research proposes a Particle Swarm Optimizer-based Hyper-Heuristic (HH PSO) to solve UCTP efficiently. PSO is used as a higher-level method that selects the low-level heuristic (LLH) sequence which further generates an optimal solution. The proposed a
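The hyper-heuristic idea, a higher-level search choosing which low-level heuristic (LLH) to apply next, can be sketched as below; the PSO particle machinery is replaced by fixed random weights, and the timetable, cost function, and LLHs are toy stand-ins, not the paper's formulation.

```python
import random

def swap_events(timetable):
    """LLH 1: swap the slots of two randomly chosen courses."""
    t = timetable[:]
    i, j = random.sample(range(len(t)), 2)
    t[i], t[j] = t[j], t[i]
    return t

def move_event(timetable):
    """LLH 2: move one course to a random timeslot."""
    t = timetable[:]
    i = random.randrange(len(t))
    t[i] = (t[i][0], random.randrange(5))
    return t

def conflicts(timetable):
    """Hypothetical cost: number of courses sharing a timeslot."""
    slots = [slot for _, slot in timetable]
    return len(slots) - len(set(slots))

llhs = [swap_events, move_event]
timetable = [("c1", 0), ("c2", 0), ("c3", 1), ("c4", 1)]   # (course, slot) pairs
weights = [random.random() for _ in llhs]                   # stand-in for a PSO particle

for _ in range(200):
    llh = random.choices(llhs, weights=weights)[0]          # higher-level selection
    candidate = llh(timetable)
    if conflicts(candidate) <= conflicts(timetable):        # accept non-worsening moves
        timetable = candidate

print(timetable, "conflicts:", conflicts(timetable))
```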
The current research aims to examine the effectiveness of a training program, based on the Picture Exchange Communication System, for children with autism and their mothers in confronting some basic disorders in a sample of children with autism. The study sample consisted of 16 children with autism and their mothers from different centers in Taif and Tabuk. The researcher used the quasi-experimental approach, in which two groups were employed: an experimental group and a control group. Children's ages ranged from 6 to 9 years. In addition, the following tools were used: a list for estimating the basic disorders of a child with autism aged 6 to 9 years, and a training program for children with autism
A simple, rapid, and sensitive extractive spectrophotometric method has been described for the analysis of diphenhydramine-HCl (DPH) in pure form and in pharmaceutical formulations. The method is based on the formation of a chloroform-soluble ion-pair complex with bromophenol blue (BPB) in a phthalate buffer at pH 3.0. The extracted complex shows maximum absorbance at 410 nm. Beer's law is obeyed in the concentration range 0.2-25.0 µg.ml^-1. The molar absorptivity and Sandell's sensitivity of the system are 2.416x10^4 L.mol^-1.cm^-1 and 0.012 µg.cm^-2, respectively. The limit of detection was found to be 0.155 µg.ml^-1. The proposed me
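As a quick consistency check of the reported figures, assuming a molar mass of about 291.8 g/mol for diphenhydramine hydrochloride (a value not given in the abstract) and a 1 cm optical path, Sandell's sensitivity follows from the molar absorptivity via Beer's law:

```python
molar_mass = 291.8      # g/mol, assumed for diphenhydramine hydrochloride
epsilon = 2.416e4       # L.mol^-1.cm^-1, reported molar absorptivity

c = 0.001 / epsilon                  # mol/L giving A = 0.001 for a 1 cm path (Beer's law)
sandell = c * molar_mass * 1e3       # g/L -> µg/mL, i.e. µg/cm^2 for a 1 cm cell
print(round(sandell, 3))             # ~0.012, consistent with the reported 0.012 µg.cm^-2
```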
In this paper, the theoretical cross section of the pre-equilibrium nuclear reaction has been studied for the reaction at an energy of 22.4 MeV. Ericson's formula of the partial level density (PLD) and its corrections (Williams' correction and the spin correction) have been substituted into the theoretical cross section and compared with the experimental data for the nucleus. It has been found that the theoretical cross section with the one-component PLD from Ericson's formula does not agree with the experimental value when and when . There is little agreement with the experimental cross section only at the high end of the energy range. The theoretical cross section that depends on the one-component Williams formula and the one-component corrected for spi
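For reference, the one-component Ericson partial level density has the standard form omega(p, h, E) = g(gE)^(n-1) / (p! h! (n-1)!) with n = p + h. The sketch below evaluates it for an assumed single-particle state density g = A/13 with A = 56; these values are chosen only for illustration, since the abstract does not name the nucleus.

```python
from math import factorial

def ericson_pld(p, h, E, g):
    """One-component Ericson partial level density for p particles and h holes
    at excitation energy E (MeV) with single-particle state density g (MeV^-1):
    omega(p, h, E) = g * (g*E)**(n-1) / (p! * h! * (n-1)!), n = p + h.
    Pairing, spin and Williams corrections are not included in this sketch."""
    n = p + h
    return g * (g * E) ** (n - 1) / (factorial(p) * factorial(h) * factorial(n - 1))

# Example: 2-particle-1-hole states at 22.4 MeV with an assumed g = A/13, A = 56.
print(ericson_pld(2, 1, 22.4, 56 / 13))
```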
Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data to the graph concept frame (CFG). As is well known, the DBSCAN method groups data points of the same kind into clusters, while points lying outside the clusters are treated as noise or anomalies. The DBSCAN algorithm can detect abnormal points that lie beyond a certain threshold (extreme values). However, the anomalies are not only those cases that are abnormal, unusual, or far from a specific group; there is also a type of data that does not occur repeatedly but is considered abnormal with respect to the known group. The analysis showed that DBSCAN using the
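As a baseline illustration of the DBSCAN noise labelling that this work builds on (the conversion to the graph concept frame is not reproduced, and the data are placeholders), points that DBSCAN cannot assign to any cluster receive the label -1 and can be flagged as anomalies:

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Toy 2-D data: two dense groups plus a few isolated points (placeholder data,
# not the study's). DBSCAN marks points it cannot assign to any cluster with
# the label -1; those points are treated here as anomalies.
rng = np.random.default_rng(0)
cluster_a = rng.normal(loc=[0, 0], scale=0.3, size=(50, 2))
cluster_b = rng.normal(loc=[5, 5], scale=0.3, size=(50, 2))
outliers = np.array([[2.5, 2.5], [8.0, 0.0], [-4.0, 6.0]])
X = np.vstack([cluster_a, cluster_b, outliers])

labels = DBSCAN(eps=0.8, min_samples=5).fit_predict(X)
anomalies = X[labels == -1]
print(len(anomalies), "points flagged as noise/anomalies")
```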