Numeral recognition is an essential preliminary step for optical character recognition, document understanding, and related tasks. Although several handwritten numeral recognition algorithms have been proposed, achieving adequate recognition accuracy and execution time remains challenging. In particular, recognition accuracy depends on the feature extraction mechanism. A fast and robust numeral recognition method is therefore needed, one that meets the desired accuracy by extracting features efficiently while maintaining a fast execution time. Furthermore, most existing studies evaluate their methods only in clean environments, limiting understanding of their potential in more realistic noisy environments. Finding a feasible handwritten numeral recognition method that remains accurate in the more practical noisy setting is therefore crucial. To this end, this paper proposes a new scheme for handwritten numeral recognition based on hybrid orthogonal polynomials. Gradient and smoothed features are extracted using the hybrid orthogonal polynomial; to reduce the complexity of feature extraction, the embedded image kernel technique is adopted. A support vector machine (SVM) is then used to classify the extracted features into the different numerals. The proposed scheme is evaluated on three numeral recognition datasets: Roman, Arabic, and Devanagari. We compare its accuracy with that achieved by state-of-the-art recognition methods, including a recent convolutional neural network (CNN). The results show that the proposed method achieves nearly the highest recognition accuracy among the existing methods in all scenarios considered.
Importantly, the results demonstrate that the proposed method is robust against noise distortion and outperforms the CNN considerably, which signifies the feasibility and effectiveness of the proposed approach in comparison to state-of-the-art recognition methods under both clean and more realistic noisy environments.
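The SVM classification stage can be illustrated with a minimal sketch. This uses scikit-learn's bundled 8×8 digit images, with raw pixel intensities standing in for the paper's hybrid-orthogonal-polynomial gradient and smoothed features, which are not reproduced here.

```python
# Minimal sketch of the SVM classification stage, assuming scikit-learn's
# bundled 8x8 digit dataset; raw pixels stand in for the paper's
# hybrid-orthogonal-polynomial features.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

clf = SVC(kernel="rbf", gamma="scale")  # RBF-kernel support vector machine
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
print(f"test accuracy: {accuracy:.3f}")
```

In practice the pixel features would be replaced by the extracted polynomial features before training the same classifier.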
In this study, the quality assurance of the linear accelerator at the Baghdad Center for Radiation Therapy and Nuclear Medicine was verified using Star Track and Perspex. The study was conducted from August to December 2018. It showed an acceptable variation in the dose output of the linear accelerator: ±2%, which is within the permissible range according to the recommendations of the accelerator's manufacturer (Elekta).
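The dose-output check described above amounts to computing a percent deviation from the expected output and comparing it against the ±2% tolerance. A small sketch (the function name and values are illustrative, not taken from the study):

```python
def dose_within_tolerance(measured, expected, tol_percent=2.0):
    """Return (pass/fail, percent deviation) for a dose-output QA check."""
    deviation = (measured - expected) / expected * 100.0
    return abs(deviation) <= tol_percent, deviation

# e.g. a measured output 1.5% above the expected value passes the +/-2% limit
ok, dev = dose_within_tolerance(measured=101.5, expected=100.0)
print(ok, round(dev, 2))
```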
Photocatalytic degradation of methylene blue was studied using CdS and ZnS as catalysts. The photocatalytic activity of each specimen was assessed by exposing it to UV radiation. The results show that the dye degradation efficiency after 7 hours was 92% for the CdS micro-particles and 88.29% for the ZnS micro-particles over the same time interval.
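The reported efficiencies follow the standard degradation-efficiency formula, η = (C₀ − Cₜ)/C₀ × 100, where C₀ and Cₜ are the dye concentrations before and after irradiation. A small sketch (the concentration values are illustrative):

```python
def degradation_efficiency(c0, ct):
    """Percent of dye degraded, from initial (c0) and final (ct) concentrations."""
    return (c0 - ct) / c0 * 100.0

# e.g. a dye concentration dropping from 100 to 8 units over the exposure
print(degradation_efficiency(100.0, 8.0))
```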
Numerical simulations are carried out to assess the quality of circular and square apodized apertures for observing extrasolar planets. On a logarithmic scale, the normalized point spread function of these apertures shows a sharp decline in the radial frequency components, reaching 10⁻³⁶ and 10⁻³⁴ respectively, demonstrating promising results. This decline is accompanied by an increase in the full width of the point spread function. A trade-off must therefore be made between this full width and the radial frequency components to overcome the problem of imaging extrasolar planets.
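The point spread function of an aperture can be computed numerically as the squared magnitude of the Fourier transform of the pupil. A minimal NumPy sketch for a hard-edged circular aperture (no apodization taper is applied here, so the sidelobe floor is far higher than the 10⁻³⁶ level reported for the apodized designs; grid size and radius are illustrative):

```python
import numpy as np

N = 512
x = np.linspace(-1.0, 1.0, N)
X, Y = np.meshgrid(x, x)
pupil = (X**2 + Y**2 <= 0.25).astype(float)   # circular aperture, radius 0.5

# PSF = |FFT(pupil)|^2, normalized so the central peak equals 1
psf = np.abs(np.fft.fftshift(np.fft.fft2(pupil)))**2
psf /= psf.max()
log_psf = np.log10(psf + 1e-40)               # log scale exposes the sidelobes
```

An apodized aperture would multiply `pupil` by a smooth taper (e.g. a Gaussian), lowering the sidelobe floor at the cost of a wider PSF core, which is exactly the trade-off noted above.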
Knowledge of lexical items can be understood as including relations of meaning across words. Words that share a similar meaning are said to be synonymous, and words with contrary meanings are said to be antonymous. Both are universal linguistic phenomena that exist in the linguistic system of every language. The present study aims to find the areas of difficulty that Iraqi EFL learners encounter in the use of synonymy and antonymy, at both the recognition and production levels, and to identify the main reasons behind such difficulties. A diagnostic test of two parts, namely recognition and production, is designed. The test is built to cover two linguistic phenomena: synonymy and antonymy.
Meloxicam (MLX) is a non-steroidal anti-inflammatory drug that is poorly water-soluble yet highly permeable, and the rate of its oral absorption is often controlled by its dissolution rate in the gastrointestinal tract. Solid dispersion (SD) is an effective technique for enhancing the solubility and dissolution rate of such drugs.
The present study aims to enhance the solubility and dissolution rate of MLX by the SD technique, using the solvent evaporation method with sodium alginate (SA), hyaluronic acid (HA), collagen, and xyloglucan (XG) as gastro-protective hydrophilic natural polymers.
Twelve formulas were prepared at different drug:polymer ratios and evaluated for their percentage yield, drug content, and water solubility.
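The percentage-yield and drug-content evaluations follow standard formulas: yield compares the recovered dispersion mass with the total input mass, and drug content compares the measured drug with the amount added. A small sketch (the masses are illustrative, not from the study):

```python
def percentage_yield(sd_mass, drug_mass, polymer_mass):
    """Recovered solid-dispersion mass vs. theoretical input mass, in %."""
    return sd_mass / (drug_mass + polymer_mass) * 100.0

def drug_content(actual_drug, theoretical_drug):
    """Measured drug in the dispersion vs. the amount added, in %."""
    return actual_drug / theoretical_drug * 100.0

# e.g. 1.9 g of dispersion recovered from 0.5 g MLX + 1.5 g polymer
print(round(percentage_yield(1.9, 0.5, 1.5), 1))  # 95.0
```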
Retinopathy of prematurity (ROP) can cause blindness in premature neonates. It is diagnosed when new blood vessels form abnormally in the retina. People at high risk of ROP can benefit significantly from early detection and treatment, so early diagnosis of ROP is vital in averting visual impairment. However, owing to a lack of medical expertise in detecting this condition, many people forgo treatment; this is especially troublesome given the rising number of ROP cases. To address this problem, we trained three transfer learning models (VGG-19, ResNet-50, and EfficientNetB5) and a convolutional neural network (CNN) to identify the zones of ROP in preterm newborns. The dataset to train th
Optimizing system performance in dynamic, heterogeneous environments and managing computational tasks efficiently are crucial. This paper therefore examines task scheduling and resource allocation algorithms in depth. The work evaluates five algorithms: Genetic Algorithms (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), the Firefly Algorithm (FA), and Simulated Annealing (SA), across various workloads obtained by varying the task-to-node ratio. The paper identifies Finish Time and Deadline as two key performance metrics for gauging the efficacy of an algorithm, and a comprehensive investigation of the behavior of these algorithms across different workloads was carried out. Results from the experiment
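Of the five metaheuristics, simulated annealing is the simplest to sketch. The toy below assigns tasks to heterogeneous nodes so as to minimize Finish Time (the makespan); the task lengths, node speeds, and cooling schedule are illustrative assumptions, not the paper's experimental setup.

```python
import math
import random

def makespan(assign, task_len, node_speed):
    """Finish Time: the latest completion time across all nodes."""
    load = [0.0] * len(node_speed)
    for task, node in enumerate(assign):
        load[node] += task_len[task] / node_speed[node]
    return max(load)

def anneal(task_len, node_speed, steps=20000, t0=5.0, seed=0):
    """Simulated annealing over task->node assignments."""
    rng = random.Random(seed)
    n_nodes = len(node_speed)
    cur = [rng.randrange(n_nodes) for _ in task_len]          # random start
    cur_cost = makespan(cur, task_len, node_speed)
    best, best_cost = cur[:], cur_cost
    for step in range(steps):
        temp = max(t0 * (1.0 - step / steps), 1e-9)           # linear cooling
        cand = cur[:]
        cand[rng.randrange(len(cand))] = rng.randrange(n_nodes)  # move one task
        cost = makespan(cand, task_len, node_speed)
        # accept improvements always, worse moves with Boltzmann probability
        if cost < cur_cost or rng.random() < math.exp((cur_cost - cost) / temp):
            cur, cur_cost = cand, cost
            if cost < best_cost:
                best, best_cost = cand[:], cost
    return best, best_cost

# five tasks on two equal-speed nodes; the optimal Finish Time is 6.0
assignment, finish_time = anneal([4.0, 3.0, 2.0, 2.0, 1.0], [1.0, 1.0])
print(finish_time)
```

GA, PSO, ACO, and FA explore the same assignment space with population-based moves instead of a single cooling trajectory.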
In this research, two algorithms are applied: the Fuzzy C-Means (FCM) algorithm and the hard K-means (HKM) algorithm, in order to determine which of the two performs better. Both algorithms are applied to a dataset collected from the Ministry of Planning on the water turbidity of five areas in Baghdad, to determine which of these areas has the clearest (least turbid) water and which months of the year show the least turbidity in the specified area.
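A minimal NumPy sketch of the fuzzy C-means step, with synthetic one-dimensional values standing in for the turbidity measurements (an illustration of the technique, not the study's implementation):

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Fuzzy C-means on an (n, d) array X; returns centers and memberships."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)               # each row sums to 1
    for _ in range(iters):
        W = U ** m                                  # fuzzified memberships
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        # distances from every point to every center
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)    # membership update
    return centers, U

# two clearly separated synthetic "turbidity" groups around 2 and 10
X = np.array([[1.8], [2.1], [2.4], [9.6], [10.0], [10.5]])
centers, U = fuzzy_c_means(X)
print(np.sort(centers.ravel()))
```

Hard K-means differs only in the assignment step: each point belongs entirely to its nearest center instead of holding graded memberships across all centers.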