Krawtchouk polynomials (KPs) and their moments are promising tools for applications in information theory, coding theory, and signal processing, owing to the special capabilities of KPs in feature extraction and classification. The main challenge in existing KP recurrence algorithms is the numerical error that occurs when computing the coefficients for large polynomial sizes, particularly when the KP parameter p deviates from 0.5 toward 0 or 1. To this end, this paper proposes a new recurrence relation for computing the coefficients of KPs at high orders. In particular, the paper develops a new algorithm and presents a new mathematical model for computing the initial value of the KP parameter. In addition, a new diagonal recurrence relation is introduced and used in the proposed algorithm; it is derived from the existing n-direction and x-direction recurrence relations. The diagonal and existing recurrence relations are then exploited to compute the KP coefficients: the coefficients are first computed for one partition after dividing the KP plane into four, and the symmetry relations are exploited to obtain the coefficients in the other partitions. The performance of the proposed recurrence algorithm was evaluated through comparisons with state-of-the-art works in terms of reconstruction error, polynomial size, and computation cost. The results indicate that the proposed algorithm is reliable and computes fewer coefficients than the existing algorithms across wide ranges of the parameter p and the polynomial size N. The results also show that the improvement ratio of the computed coefficients ranges from 18.64% to 81.55% in comparison with the existing algorithms. Moreover, the proposed algorithm can generate polynomials of an order ∼8.5 times larger than those generated by state-of-the-art algorithms.
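For orientation, the sketch below implements the classical three-term recurrence in the n direction for the Krawtchouk polynomials K_n(x; p, N) = 2F1(-n, -x; -N; 1/p); this is the textbook relation that the paper's diagonal recurrence builds on, not the proposed algorithm itself, and the function name is illustrative:

    def krawtchouk_column(x, p, N):
        """Return [K_0(x), ..., K_N(x)] for a fixed x via the classical
        n-direction recurrence, where K_n(x; p, N) = 2F1(-n, -x; -N; 1/p)."""
        K = [0.0] * (N + 1)
        K[0] = 1.0
        K[1] = 1.0 - x / (p * N)
        for n in range(1, N):
            # p(N-n) K_{n+1} = [p(N-n) + n(1-p) - x] K_n - n(1-p) K_{n-1}
            K[n + 1] = ((p * (N - n) + n * (1 - p) - x) * K[n]
                        - n * (1 - p) * K[n - 1]) / (p * (N - n))
        return K

For large N, or for p far from 0.5, this direct recurrence accumulates exactly the numerical errors the paper sets out to avoid.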
Humans, under the pressures of normal life, are exposed to several types of heart disease resulting from different factors. Therefore, in order to determine whether a given case ends in death or not, the outcome is modeled using a binary logistic regression model. This research uses one of the most important nonlinear regression models, which is used extensively in modeling statistical applications, namely the binary logistic regression model, to model heart disease data. When estimating the parameters of this model using statistical estimation methods, another problem appears in estimating its parameters, especially when the number …
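For reference, a minimal sketch of the binary logistic model and the log-likelihood that maximum-likelihood estimation works with (names are illustrative; this is the standard model, not this paper's specific estimator):

    import math

    def logistic_p(beta0, beta, x):
        """P(Y = 1 | x) under the binary logistic model:
        1 / (1 + exp(-(beta0 + beta . x)))."""
        z = beta0 + sum(b * xi for b, xi in zip(beta, x))
        return 1.0 / (1.0 + math.exp(-z))

    def neg_log_likelihood(beta0, beta, X, y):
        """Negative log-likelihood minimized by maximum-likelihood estimation."""
        nll = 0.0
        for xi, yi in zip(X, y):
            p = logistic_p(beta0, beta, xi)
            nll -= yi * math.log(p) + (1 - yi) * math.log(1 - p)
        return nll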
Obesity-related deaths continue to rise, and thus losing weight in overweight and obese patients is critical to prevent complications. Anredera cordifolia (Ten.) Steenis, a succulent plant species of the family Basellaceae, is widely used in herbal medicine to decrease body weight. This study evaluated the potential benefits of Anredera cordifolia ethanol extract for reducing body weight in a high-fat-diet-induced obesity rat model. This was an experimental study with a post-test-only control group design involving 36 obese rats. They were divided into six groups: three control groups (K1, K2, K3) and three treatment groups (P1, P2, P3). All the groups were induced with a high-fat diet, except the K1 control group, which received a standard di…
Reading is one of the essential components of the English language. Learners in countries that use English as a second language (ESL) sometimes have difficulties with reading and comprehension. According to much research, the mother tongue has been shown to interfere with learning a second language. This study investigated the reading difficulties of young second-language learners in terms of accuracy, comprehension, and rate using the Neale Analysis of Reading Ability test. The study was carried out in one of the High Schools for Boys in Hyderabad, India, and included Grade 5 students aged 10-12 years. In order to understand the reading difficulties of English as a second language, a qualitative approach was employed. Interviews, reading tes…
Land Use / Land Cover (LULC) classification is considered one of the basic tasks that decision makers and map makers rely on to evaluate infrastructure, using different types of satellite data, despite the large spectral differences, or overlap of spectra, within the same land cover class, in addition to the problems of aberration and the degree of inclination of the images, which may negatively affect classification performance. The main objective of this study is to develop a working method for classifying land cover from high-resolution satellite images using an object-based method. Maximum-likelihood pixel-based supervised classification as well as object-based approaches were examined on a QuickBird satellite image of Karbala, Iraq. This study illustrated that…
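As background, maximum-likelihood pixel-based classification assigns each pixel to the class whose fitted multivariate Gaussian scores it highest; a minimal sketch (the class statistics and names are illustrative, not the study's implementation):

    import numpy as np

    def ml_classify(pixels, class_means, class_covs):
        """Assign each pixel (rows of an (n, bands) array) to the class
        with the highest Gaussian log-likelihood."""
        scores = []
        for mu, cov in zip(class_means, class_covs):
            inv = np.linalg.inv(cov)
            _, logdet = np.linalg.slogdet(cov)
            d = pixels - mu
            # log N(x; mu, cov) up to a constant: -0.5 * (logdet + d' inv d)
            mahal = np.einsum('ij,jk,ik->i', d, inv, d)
            scores.append(-0.5 * (logdet + mahal))
        return np.argmax(np.stack(scores, axis=1), axis=1)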
This work is based on developing a robust and feasible algorithm for estimating vehicle travel times on a highway from traffic information extracted from roadside camera image sequences. The estimation of vehicle travel times relies on the identification of the traffic state. Individual vehicle velocities are obtained from detected vehicle positions in two consecutive images by computing the distance covered during the elapsed time, fusing the extracted traffic-flow data, and developing a scheme to accurately predict vehicle travel times. An Erbil road database is used to identify road regions around road segments, which are projected into the calibrated camera…
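The core computation described, recovering speed from vehicle positions in two consecutive frames and converting it into a segment travel time, can be sketched as follows (the homography H and the function names are assumptions for illustration, not the paper's exact pipeline):

    import numpy as np

    def image_to_road(H, uv):
        """Project an image point to road-plane coordinates (metres)
        via a 3x3 homography H."""
        w = H @ np.array([uv[0], uv[1], 1.0])
        return w[:2] / w[2]

    def segment_travel_time(H, track_uv, dt, segment_len_m):
        """Estimate travel time over a road segment from a tracked vehicle's
        image positions in consecutive frames taken dt seconds apart."""
        pts = np.array([image_to_road(H, uv) for uv in track_uv])
        speeds = np.linalg.norm(np.diff(pts, axis=0), axis=1) / dt
        return segment_len_m / speeds.mean()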
... Show MoreMany of accurate inertial guided missilc systems need to use more complex mathematical calculations and require a high speed processing to ensure the real-time opreation. This will give rise to the need of developing an effcint
Cuneiform images require many processing steps in order to recognize their contents, using image enhancement to clarify the objects (symbols) found in the image. The vector used for classifying a symbol, called the symbol structural vector (SSV), is built from the information wedges in the symbol. The experimental tests cover a number of samples of varying relevance, including various drawings, in an online method. The results of this research show high accuracy, and the methods and algorithms were programmed using Visual Basic 6.0. In this research, more than one method was applied to extract information from digital images of cuneiform tablets, in order to identify most of the signs of Sumerian cuneiform.
Background: Sprite coding is a very effective technique for clarifying the background video object. Sprite generation is an open issue because the foreground objects prevent precise camera-motion estimation and blur the created sprite. Objective: In this paper, a quick and basic static method for sprite area detection in video data is presented. Two statistical measures are applied, the mean and standard deviation of every pixel (over a whole group of video frames), to determine whether the pixel is part of the selected static sprite range or not. A binary map array is built to mark the allocated sprite pixels (as 1) and the non-sprite pixels (as 0). Likewise, a holes-and-gaps filling strategy was utilized to re…
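The per-pixel temporal statistics described can be sketched as below; the threshold and its name are illustrative assumptions, since the paper's exact decision rule is not given in this excerpt:

    import numpy as np

    def sprite_mask(frames, std_thresh):
        """Mark pixels whose temporal standard deviation over a group of
        frames stays below std_thresh as static sprite (1), others as 0."""
        stack = np.stack(frames).astype(np.float64)  # shape (T, H, W)
        return (stack.std(axis=0) <= std_thresh).astype(np.uint8)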
Color image compression is a good way to encode digital images by decreasing the number of bits needed to store the image. The main objectives are to reduce storage space, reduce transmission costs, and maintain good quality. In the current research work, a simple, effective methodology is proposed for compressing color art digital images and obtaining a low bit rate by compressing the matrix resulting from the scalar quantization process (reducing the number of bits from 24 to 8 bits) using displacement coding and then compressing the remainder using the Lempel-Ziv-Welch (LZW) algorithm. The proposed methodology maintains the quality of the reconstructed image. Macroscopic and…
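For reference, a minimal sketch of LZW encoding (the dictionary-growing step only; the paper's full pipeline of scalar quantization plus displacement coding is not reproduced here):

    def lzw_encode(data: bytes) -> list[int]:
        """Minimal LZW encoder: grows a dictionary of byte sequences
        and emits one integer code per longest known prefix."""
        table = {bytes([i]): i for i in range(256)}
        next_code = 256
        w = b""
        out = []
        for b in data:
            wb = w + bytes([b])
            if wb in table:
                w = wb
            else:
                out.append(table[w])
                table[wb] = next_code
                next_code += 1
                w = bytes([b])
        if w:
            out.append(table[w])
        return out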