This paper discusses the modeling and synthesis of digital finite impulse response (FIR) filters, taking into account the possibilities of their implementation on digital platforms with integer computation arithmetic. The formulation of the problem of multifunctional synthesis of cascade FIR filters using the methods of integer nonlinear mathematical programming is given. The efficiency of this approach is illustrated by examples of solving the problem of synthesizing integer FIR filters with a minimum coefficient word length, and the characteristics of the synthesized filters are analyzed.
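The effect that motivates short-word-length synthesis can be seen directly by quantizing a conventional FIR design. The Python sketch below is not the integer nonlinear programming method of the paper; the filter order, cutoff, and 8-bit word length are assumed values. It rounds the coefficients of a reference lowpass filter and reports the loss of stopband attenuation.

import numpy as np
from scipy.signal import firwin, freqz

taps = firwin(31, 0.3)                    # reference lowpass FIR, floating-point coefficients
bits = 8                                  # assumed coefficient word length
scale = 2 ** (bits - 1)
taps_q = np.round(taps * scale) / scale   # round to 'bits'-bit integers, then rescale

for name, h in (("float", taps), (f"{bits}-bit", taps_q)):
    w, H = freqz(h, worN=2048)
    stopband = np.abs(H[w > 0.45 * np.pi])   # assumed stopband edge for this design
    print(f"{name:7s} peak stopband level: {20 * np.log10(stopband.max()):6.1f} dB")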
Finding similarities between texts is important in many areas, such as information retrieval, automated article scoring, and short answer categorization. Evaluating short answers is not an easy task due to the variability of natural language. Methods for calculating the similarity between texts rely on semantic or grammatical aspects. This paper discusses a method for evaluating short answers using semantic networks to represent the typical (correct) answer and students' answers. A semantic network of nodes and relationships represents the text (the answers). Moreover, grammatical aspects are captured by measuring the similarity of parts of speech between the answers. In addition, finding hierarchical relationships between nodes in networks
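As a rough illustration of the grammatical side of such a comparison, the sketch below tags a model answer and a student answer with NLTK parts of speech and scores their overlap with a Jaccard-style measure. The example sentences, the tagger, and the similarity formula are assumptions rather than the paper's exact method, and the standard NLTK tokenizer and tagger data are assumed to be installed.

from nltk import word_tokenize, pos_tag   # assumes NLTK tokenizer/tagger data is already downloaded

def pos_profile(text):
    # Set of part-of-speech tags occurring in the text.
    return {tag for _, tag in pos_tag(word_tokenize(text))}

def pos_similarity(model_answer, student_answer):
    # Jaccard similarity between the POS tag sets of the two answers (assumed measure).
    a, b = pos_profile(model_answer), pos_profile(student_answer)
    return len(a & b) / len(a | b) if (a | b) else 0.0

print(pos_similarity("The heart pumps blood through the body.",
                     "Blood is pumped around the body by the heart."))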
Image retrieval is an active research area in image processing, pattern recognition, and computer vision. In the proposed method, there are two techniques to extract the feature vector: the first applies the transform algorithm to the whole image, and the second divides the image into four blocks and then applies the transform algorithm to each part of the image. In each technique, three transform algorithms are applied (DCT, Walsh Transform, and Kekre's Wavelet Transform); the similarity is then found and the images are indexed using the correlation between the feature vector of the query image and those of the images in the database. The retrieval method returns the images with the highest correlation (index) values.
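A minimal sketch of the two feature-extraction routes described above, using the 2-D DCT as the transform (the Walsh and Kekre's Wavelet transforms would slot into the same place); the size of the retained coefficient block and the random test images are assumptions for illustration only.

import numpy as np
from scipy.fft import dctn

def dct_features(img, keep=8):
    # Feature vector: the top-left keep x keep block of 2-D DCT coefficients.
    coeffs = dctn(img.astype(np.float64), norm="ortho")
    return coeffs[:keep, :keep].ravel()

def block_dct_features(img, keep=8):
    # Apply the same transform to each of the four image quadrants and concatenate.
    h, w = img.shape
    blocks = [img[:h//2, :w//2], img[:h//2, w//2:], img[h//2:, :w//2], img[h//2:, w//2:]]
    return np.concatenate([dct_features(b, keep) for b in blocks])

def similarity(query_vec, db_vec):
    # Correlation between two feature vectors; candidates are ranked by this value.
    return np.corrcoef(query_vec, db_vec)[0, 1]

rng = np.random.default_rng(1)
query, candidate = rng.random((128, 128)), rng.random((128, 128))
print(similarity(dct_features(query), dct_features(candidate)))
print(similarity(block_dct_features(query), block_dct_features(candidate)))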
The linear attenuation coefficients of a polymer composite for beta particles and bremsstrahlung rays were investigated as a function of absorber thickness and energy. The attenuation coefficients were obtained using a NaI(Tl) energy-selective scintillation counter with a 90Sr/90Y beta source having an energy range of 0.1-1.1 MeV. The results show the capability of this composite to absorb beta particles and the bremsstrahlung rays they produce, which means the composite is a useful choice for beta-ray radiation shielding at low thickness.
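Attenuation measurements of this kind are commonly reduced with the exponential attenuation law I(x) = I0·exp(-μx). The sketch below extracts μ and the half-value layer from hypothetical count-rate data; the numbers are invented for illustration, not taken from the paper.

import numpy as np

x = np.array([0.0, 0.05, 0.10, 0.15, 0.20, 0.25])   # absorber thickness, cm (hypothetical)
I = np.array([5120, 3460, 2370, 1610, 1100, 750])   # net count rate behind the absorber (hypothetical)

# I(x) = I0 * exp(-mu * x)  =>  ln I is linear in x with slope -mu
slope, intercept = np.polyfit(x, np.log(I), 1)
mu = -slope                 # linear attenuation coefficient, cm^-1
hvl = np.log(2) / mu        # half-value layer, cm

print(f"mu  = {mu:.2f} cm^-1")
print(f"HVL = {hvl:.3f} cm")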
In the field of data security, the critical challenge of preserving sensitive information during its transmission through public channels takes centre stage. Steganography, a method employed to conceal data within various carrier objects such as text, can be proposed to address these security challenges. Text, owing to its extensive usage and constrained bandwidth, stands out as an optimal medium for this purpose. Despite the richness of the Arabic language in its linguistic features, only a small number of studies have explored Arabic text steganography. Arabic text, characterized by its distinctive script and linguistic features, has gained notable attention as a promising domain for steganographic ventures. Arabic text steganography harn
Students’ feedback is crucial for educational institutions to assess the performance of their teachers. Most opinions are expressed in the students' native language, especially in South Asian regions. In Pakistan, people use Roman Urdu to express their reviews, and this applies in the education domain, where students use Roman Urdu to express their feedback. Handling qualitative opinions manually is a very time-consuming and labor-intensive process. Additionally, it can be difficult to determine sentence semantics in text written in a colloquial style such as Roman Urdu. This study proposes an enhanced word embedding technique and investigates neural word embeddings (Word2Vec and GloVe) to determine which performs better.
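A minimal sketch of the Word2Vec side of such a comparison (not the enhanced embedding proposed in the study): gensim is trained on a couple of invented, tokenized Roman Urdu reviews, and word vectors are averaged to obtain one vector per review.

import numpy as np
from gensim.models import Word2Vec

reviews = [                                   # invented Roman Urdu feedback, tokenized
    "teacher bohat acha parhata hai".split(),
    "lecture boring tha samajh nahi aya".split(),
]

model = Word2Vec(sentences=reviews, vector_size=100, window=5, min_count=1, sg=1)

def review_vector(tokens, model):
    # Average the embeddings of in-vocabulary tokens to get one vector per review.
    vecs = [model.wv[t] for t in tokens if t in model.wv]
    return np.mean(vecs, axis=0) if vecs else np.zeros(model.wv.vector_size)

print(review_vector(reviews[0], model).shape)   # (100,)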
This dissertation studies the application of the equivalence theory developed by Mona Baker to translation from Persian to Arabic. Among various translation methodologies, Mona Baker’s bottom-up equivalence approach is unique in several ways. Baker’s translation approach is a multistep process: it starts with studying the smallest linguistic unit, “the word”, and then moves above the word level, leading to the translation of the entire text. Equivalence at the word level, i.e., the word-for-word method, is the core of Baker’s approach.
This study evaluates the use of Baker’s approach in translation from Persian to Arabic, mainly because finding the correct equivalence is a major challenge in this translation. Additionall
The aim of this paper is to compare classical and fuzzy filters for removing different types of noise from grayscale images. The processing consists of three steps. First, different types of noise are added to the original image to produce a noisy image (with different noise ratios). Second, classical and fuzzy filters are used to filter the noisy image. Finally, the resulting images are compared using a quantitative measure, the Peak Signal-to-Noise Ratio (PSNR), to determine the best filter in each case.
The image used in this paper is 512 × 512 pixels, and all filters use a square window of size 3 × 3. Results indicate that fuzzy filters achieve varying success in noise reduction compared to classical filters.
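The PSNR measure used for the comparison is straightforward to compute. The sketch below applies it to a synthetic grayscale image corrupted with salt-and-pepper noise and filtered with a 3×3 median filter standing in for one of the classical filters; the image, noise ratio, and filter choice are assumptions for illustration.

import numpy as np
from scipy.ndimage import median_filter

def psnr(reference, test, max_val=255.0):
    # Peak Signal-to-Noise Ratio in dB between two grayscale images.
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(max_val ** 2 / mse)

rng = np.random.default_rng(0)
img = np.tile(np.linspace(0, 255, 512).astype(np.uint8), (512, 1))   # smooth synthetic test image
noisy = img.copy()
mask = rng.random(img.shape)
noisy[mask < 0.02] = 0                    # salt-and-pepper noise on ~4% of pixels
noisy[mask > 0.98] = 255

filtered = median_filter(noisy, size=3)   # 3x3 median filter as a classical example
print("PSNR noisy   :", round(psnr(img, noisy), 2), "dB")
print("PSNR filtered:", round(psnr(img, filtered), 2), "dB")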
In this research, a comparison has been made between the robust (M) estimators for the cubic smoothing splines technique, used to avoid the problem of non-normality or contamination in the errors, and the traditional estimation method of the cubic smoothing splines technique, using two evaluation criteria (MADE and WASE) for different sample sizes and contamination levels, in order to estimate the time-varying coefficient functions for balanced longitudinal data. Such data are characterized by observations obtained from (n) independent subjects, each of which is measured repeatedly at a set of specific time points (m); since the repeated measurements within subjects are almost connected an
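As a rough illustration only (the paper's robust M-estimators and the exact MADE and WASE definitions are not reproduced here), the sketch below fits an ordinary cubic smoothing spline to data containing one contaminated observation and scores it with a mean-absolute-deviation-style criterion.

import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 60)
truth = np.sin(2 * np.pi * t)
y = truth + rng.normal(scale=0.1, size=t.size)
y[30] += 3.0                                              # one contaminated observation

spline = UnivariateSpline(t, y, k=3, s=t.size * 0.1**2)   # cubic smoothing spline
made = np.mean(np.abs(spline(t) - truth))                 # MADE-style criterion (assumed form)
print(round(made, 4))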
Based on a finite element analysis using Matlab coding, an eigenvalue problem has been formulated and solved for the buckling analysis of non-prismatic columns. Different numbers of elements per column length have been used to assess the rate of convergence of the model. The proposed model has then been used to determine the critical buckling load factor for idealized supported columns, based on a comparison of their buckling loads with those of the corresponding hinge-supported columns. Finally, in this study the critical buckling factor under an end force (P) increases by about 3.71% for each 10% increment in the taper ratio for columns with different end supports, and the relationship between normalized critical load and slenderness ratio was g
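A minimal Python sketch of the finite element buckling eigenvalue problem for a pinned-pinned, linearly tapered column (the paper itself uses Matlab coding; the element count, material data, and taper law below are assumed for illustration). The critical load factor is the smallest λ solving (K - λ·Kg)·φ = 0, obtained here from the equivalent problem Kg·φ = μ·K·φ with λ = 1/μ_max.

import numpy as np
from scipy.linalg import eigh

def beam_matrices(EI, L):
    # Elastic (Ke) and geometric (Kg) stiffness of a 2-node Euler-Bernoulli beam element.
    Ke = (EI / L**3) * np.array([
        [ 12.0,   6*L, -12.0,   6*L],
        [  6*L, 4*L*L,  -6*L, 2*L*L],
        [-12.0,  -6*L,  12.0,  -6*L],
        [  6*L, 2*L*L,  -6*L, 4*L*L]])
    Kg = (1.0 / (30*L)) * np.array([
        [ 36.0,   3*L, -36.0,   3*L],
        [  3*L, 4*L*L,  -3*L,  -L*L],
        [-36.0,  -3*L,  36.0,  -3*L],
        [  3*L,  -L*L,  -3*L, 4*L*L]])
    return Ke, Kg

def critical_load(E, I0, L, n_el=20, taper=0.0):
    # Buckling load of a pinned-pinned column whose EI tapers linearly by 'taper' over the length.
    n_dof = 2 * (n_el + 1)
    K, Kg = np.zeros((n_dof, n_dof)), np.zeros((n_dof, n_dof))
    Le = L / n_el
    for e in range(n_el):
        EI = E * I0 * (1.0 - taper * (e + 0.5) * Le / L)
        ke, kg = beam_matrices(EI, Le)
        dofs = [2*e, 2*e + 1, 2*e + 2, 2*e + 3]
        K[np.ix_(dofs, dofs)] += ke
        Kg[np.ix_(dofs, dofs)] += kg
    keep = [i for i in range(n_dof) if i not in (0, n_dof - 2)]   # pin deflection at both ends
    Kr, Kgr = K[np.ix_(keep, keep)], Kg[np.ix_(keep, keep)]
    mu = eigh(Kgr, Kr, eigvals_only=True)   # Kg*phi = mu*K*phi, so P_cr = 1/mu_max
    return 1.0 / mu.max()

E, I0, L = 200e9, 1.0e-6, 3.0
P_euler = np.pi**2 * E * I0 / L**2
print("FE / Euler ratio, prismatic :", round(critical_load(E, I0, L) / P_euler, 4))
print("FE / Euler ratio, 10% taper :", round(critical_load(E, I0, L, taper=0.1) / P_euler, 4))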