Numeral recognition is considered an essential preliminary step for optical character recognition, document understanding, and related tasks. Although several handwritten numeral recognition algorithms have been proposed, achieving adequate recognition accuracy and execution time remains challenging. In particular, recognition accuracy depends on the feature extraction mechanism. A fast and robust numeral recognition method is therefore essential, one that meets the desired accuracy by extracting features efficiently while maintaining a fast execution time. Furthermore, most existing studies evaluate their methods only in clean environments, which limits understanding of their potential application in more realistic noisy environments. Therefore, finding a feasible handwritten numeral recognition method that remains accurate in the more practical noisy environment is crucial. To this end, this paper proposes a new scheme for handwritten numeral recognition using hybrid orthogonal polynomials. Gradient and smoothed features are extracted using the hybrid orthogonal polynomials. To reduce the complexity of feature extraction, the embedded image kernel technique is adopted. In addition, a support vector machine is used to classify the extracted features into the different numerals. The proposed scheme is evaluated on three numeral recognition datasets: Roman, Arabic, and Devanagari. We compare the accuracy of the proposed method with that of state-of-the-art recognition methods, including a recent convolutional neural network. The results show that the proposed method achieves nearly the highest recognition accuracy among the existing methods in all the scenarios considered. Importantly, the results demonstrate that the proposed method is robust against noise distortion and outperforms the convolutional neural network considerably, which signifies the feasibility and effectiveness of the proposed approach relative to state-of-the-art recognition methods in both noise-free and more realistic noisy environments.
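As an illustration of the classification stage only, the sketch below trains a support vector machine on handwritten digit features with scikit-learn; the hybrid orthogonal polynomial feature extraction and the embedded image kernel described above are not reproduced here, and the raw pixels of scikit-learn's small digits dataset stand in for the extracted gradient and smoothed features.

```python
# Minimal sketch of the SVM classification stage only. The hybrid orthogonal
# polynomial features of the paper are NOT reproduced; flattened pixels of the
# small scikit-learn digits dataset stand in for the extracted features.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

digits = load_digits()                                   # 8x8 grey-scale digit images
X = digits.images.reshape(len(digits.images), -1)        # flatten images to feature vectors
y = digits.target

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

clf = SVC(kernel="rbf", C=10, gamma="scale")             # RBF-kernel SVM classifier
clf.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```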
Root research requires high-throughput phenotyping methods that provide meaningful information on root depth if the full potential of the genomic revolution is to be translated into strategies that maximise the capture of water deep in soils by crops. A very simple, low-cost method of assessing root depth of seedlings using a layer of herbicide (
A novel artificial neural network (ANN) model was constructed as a multivariate calibration model for the simultaneous quantitative analysis of a quaternary mixture composed of carbamazepine, carvedilol, diazepam, and furosemide. Eighty-four mixing formulas were prepared and analyzed spectrophotometrically. Each analyte was formulated in six samples at different concentrations; thus, twenty-four samples for the four analytes were tested. A neural network with 10 hidden neurons was capable of fitting the data with 100% accuracy. The suggested model can be applied to the quantitative chemical analysis of the proposed quaternary mixture.
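The sketch below illustrates the general calibration idea under stated assumptions: a small feed-forward network with 10 hidden neurons is fitted to synthetic absorbance spectra of a four-component mixture. The wavelength grid, the Gaussian pure-component spectra, and the concentration ranges are hypothetical stand-ins for the real spectrophotometric data, and the training setup is not the authors' exact one.

```python
# Hedged sketch of multivariate ANN calibration: a 10-hidden-neuron network
# maps synthetic mixture spectra back to the four component concentrations.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
wavelengths = np.linspace(200, 400, 101)                 # hypothetical wavelength grid (nm)

# Hypothetical pure-component spectra (Gaussian bands) for the four analytes.
centers = [240, 270, 300, 330]
pure = np.stack([np.exp(-((wavelengths - c) / 15.0) ** 2) for c in centers])

conc = rng.uniform(0.0, 1.0, size=(84, 4))               # 84 synthetic mixture formulas
spectra = conc @ pure + rng.normal(0, 0.005, (84, len(wavelengths)))  # additive mixing + noise

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
model.fit(spectra, conc)                                 # calibrate: spectrum -> concentrations

print("R^2 on the calibration set:", model.score(spectra, conc))
```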
The present study investigated drought in Iraq using rainfall data obtained from 39 meteorological stations over the past 30 years (1980-2010). The drought coefficient was calculated on the basis of the standardized precipitation index (SPI), and the characteristics of drought magnitude, duration, and intensity were then analyzed. The correlation and regression between drought magnitude and duration were obtained according to the SPI. The results show that drought magnitude values were greater in the northeast region of Iraq.
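As a minimal sketch of how an SPI value can be derived from a rainfall series, the code below fits a gamma distribution to a synthetic monthly series and maps its cumulative probability through the inverse standard normal; zero-rainfall handling, the aggregation time scales, and the actual station data used in the study are omitted.

```python
# Hedged SPI sketch: fit a gamma distribution to accumulated precipitation and
# transform its CDF to standard-normal quantiles. The series below is synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
rain = rng.gamma(shape=2.0, scale=30.0, size=360)        # 30 years of synthetic monthly totals (mm)

shape, loc, scale = stats.gamma.fit(rain, floc=0)        # fit gamma with location fixed at 0
cdf = stats.gamma.cdf(rain, shape, loc=loc, scale=scale)
spi = stats.norm.ppf(cdf)                                # SPI = standard-normal quantile of the CDF

print("months classified as drought (SPI < -1):", int(np.sum(spi < -1)))
```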
Forest fires continue to increase during the dry season, and they are difficult to stop. High temperatures in the dry season can raise the drought index to the point where the forest could burn at any time. Thus, the government should conduct surveillance throughout the dry season. Continuous surveillance without a focus on particular times is ineffective and inefficient, because preventive measures are carried out without knowledge of the potential fire risk. In the Keetch-Byram Drought Index (KBDI), the Drought Factor formulation is used only to calculate today's drought from current weather conditions and yesterday's drought index. However, to determine the drought factor one day ahead, the data
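A hedged sketch of one commonly quoted daily KBDI update (in imperial units) is given below; the constants and the rainfall carry-over rule are simplified and should be checked against the original Keetch and Byram formulation, and the drought-factor step built on top of the index is not shown.

```python
# Hedged sketch of a daily KBDI update in imperial units (temperature in deg F,
# rainfall in inches), following one commonly quoted formulation; constants and
# the per-event rainfall carry-over bookkeeping are simplified here.
import math

def kbdi_update(kbdi_yesterday, t_max_f, rain_in, annual_rain_in):
    """Return today's KBDI given yesterday's index and today's weather."""
    # Net rainfall: only rain in excess of 0.20 in reduces the index.
    net_rain = max(rain_in - 0.20, 0.0)
    q = max(kbdi_yesterday - 100.0 * net_rain, 0.0)

    # Drought increment driven by maximum temperature and mean annual rainfall.
    dq = ((800.0 - q)
          * (0.968 * math.exp(0.0486 * t_max_f) - 8.30) * 1e-3
          / (1.0 + 10.88 * math.exp(-0.0441 * annual_rain_in)))
    return min(max(q + dq, 0.0), 800.0)

# Example: a hot, dry day following a moderately dry day.
print(kbdi_update(kbdi_yesterday=300.0, t_max_f=95.0, rain_in=0.0, annual_rain_in=50.0))
```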
In this paper, we derive an estimator of the reliability function for the two-parameter Laplace distribution using the Bayes method with a squared-error loss function, Jeffreys' formula, and the conditional probability random variable of the observation. The main objective of this study is to assess the efficiency of the derived Bayesian estimator compared with the maximum likelihood and moment estimators of this function, using Monte Carlo simulation under different Laplace distribution parameters and sample sizes. The results show that the Bayes estimator is more efficient than the maximum likelihood and moment estimators for all sample sizes.
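The sketch below reproduces only the Monte Carlo comparison framework, contrasting maximum-likelihood and moment plug-in estimators of the Laplace reliability function R(t) = P(X > t) by empirical mean squared error; the Bayes estimator derived in the paper (squared-error loss, Jeffreys' formula) is not reproduced.

```python
# Hedged sketch of the Monte Carlo efficiency comparison for estimators of the
# Laplace reliability function R(t) = P(X > t); MLE and moment plug-ins only.
import numpy as np

def reliability(t, mu, b):
    """Exact Laplace reliability R(t) = P(X > t)."""
    z = (t - mu) / b
    return 1.0 - 0.5 * np.exp(z) if t < mu else 0.5 * np.exp(-z)

def mc_mse(mu=0.0, b=1.0, t=1.0, n=30, reps=5000, seed=0):
    rng = np.random.default_rng(seed)
    true_r = reliability(t, mu, b)
    err_mle, err_mom = [], []
    for _ in range(reps):
        x = rng.laplace(mu, b, size=n)
        # MLE: location = sample median, scale = mean absolute deviation.
        mu_mle, b_mle = np.median(x), np.mean(np.abs(x - np.median(x)))
        # Moments: location = sample mean, scale from Var(X) = 2 b^2.
        mu_mom, b_mom = np.mean(x), np.sqrt(np.var(x, ddof=1) / 2.0)
        err_mle.append((reliability(t, mu_mle, b_mle) - true_r) ** 2)
        err_mom.append((reliability(t, mu_mom, b_mom) - true_r) ** 2)
    return np.mean(err_mle), np.mean(err_mom)

print("empirical MSE (MLE, moments):", mc_mse())
```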
This research deals with the use of a number of statistical methods, such as the kernel method, watershed, histogram, and cubic spline, to improve the contrast of digital images. The results, evaluated using the RMSE and NCC metrics, show that the spline method gives the most accurate results compared with the other statistical methods.
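For illustration, the sketch below computes the two evaluation metrics named above, RMSE and normalized cross-correlation (NCC), for a reference image and an enhanced image; the enhancement used here is a simple percentile contrast stretch, not the spline, kernel, watershed, or histogram methods compared in the paper.

```python
# Hedged sketch of the RMSE and NCC image-quality metrics applied to a
# stand-in grey-scale image and a simple contrast-stretched version of it.
import numpy as np

def rmse(a, b):
    return float(np.sqrt(np.mean((a.astype(float) - b.astype(float)) ** 2)))

def ncc(a, b):
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    return float(np.sum(a * b) / np.sqrt(np.sum(a ** 2) * np.sum(b ** 2)))

rng = np.random.default_rng(0)
reference = rng.integers(0, 256, size=(64, 64))                        # stand-in image
low, high = np.percentile(reference, (2, 98))
enhanced = np.clip((reference - low) * 255.0 / (high - low), 0, 255)   # simple contrast stretch

print("RMSE:", rmse(reference, enhanced), "NCC:", ncc(reference, enhanced))
```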
Background: One of the most commonly encountered problems is postburn contracture, which has both functional and aesthetic impacts on patients. Various surgical methods have been proposed to treat this problem. Aim: To evaluate the effectiveness of the square flap in the management of postburn contracture in several parts of the body. Patients and methods: From April 2019 to June 2020, a total of 20 patients who had postburn contracture in various parts of the body underwent scar contracture release using the square flap. The follow-up period ranged from 6 to 12 months. Results: All of our patients achieved complete release of the contracture band with maximal postoperative motion together with an acceptable aesthetic outcome. A
Root-finding is one of the oldest classical problems, and it remains an important research topic due to its impact on computational algebra and geometry. In communication systems, when the channel impulse response is minimum phase, the state complexity of the equalization algorithm is reduced and the spectral efficiency is improved. To make the channel impulse response minimum phase, a prefilter called the minimum-phase filter is used, and the adaptation of this filter requires a root-finding algorithm. In this paper, the VHDL implementation of the root-finding algorithm introduced by Clark and Hau is presented.
A VHDL program is used in this work to find the roots of two channels and make them minimum phase; the obtained output results are
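As a floating-point sketch of the underlying idea only (not the fixed-point Clark and Hau algorithm or its VHDL implementation), the code below finds the roots of a channel polynomial and reflects any root outside the unit circle to its conjugate reciprocal, producing a minimum-phase channel with the same magnitude response.

```python
# Hedged sketch: convert an FIR channel to minimum phase by root reflection.
import numpy as np

def minimum_phase(h):
    """Return a minimum-phase version of the FIR channel taps h."""
    roots = np.roots(h)
    gain = h[0]
    reflected = []
    for r in roots:
        if np.abs(r) > 1.0:            # root outside the unit circle
            gain *= np.abs(r)          # rescale to preserve the magnitude response
            r = 1.0 / np.conj(r)       # reflect to its conjugate reciprocal
        reflected.append(r)
    return np.real_if_close(gain * np.poly(reflected))

h = np.array([0.5, 1.0, 0.3])          # example non-minimum-phase channel
print(minimum_phase(h))                # energy is now concentrated in the leading taps
```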