Image compression has become one of the most important applications of the image processing field because of the rapid growth in computer power, the corresponding growth in the multimedia market, and the advent of the World Wide Web, which makes the Internet easily accessible to everyone. Since the early 1980s, digital image sequence processing has been an attractive research area, because an image sequence, as a collection of images, may allow much greater compression than a single image frame, while the increased computational complexity and memory space required for image sequence processing have, in fact, become more attainable. This research adopts the Absolute Moment Block Truncation Coding (AMBTC) compression technique, which builds on the strong points of other techniques, and an algorithm for efficient block positioning has been adopted to realize it. A modified orthogonal search algorithm (OSA) has also been introduced as the searching scheme, which contributes to decreasing the motion-search time over successive inter frames.
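As background, a minimal sketch of standard AMBTC block coding is given below; the paper's specific variant, its block-positioning algorithm, and the modified OSA motion search are not shown in the excerpt, so the function names and the 4x4 block size are illustrative assumptions.

```python
import numpy as np

def ambtc_block(block):
    """Encode one square block with Absolute Moment Block Truncation Coding:
    keep a 1-bit plane plus the low and high means of the block."""
    mean = block.mean()
    bitmap = block >= mean                              # 1 bit per pixel
    high = block[bitmap].mean() if bitmap.any() else mean
    low = block[~bitmap].mean() if (~bitmap).any() else mean
    return bitmap, low, high

def ambtc_decode(bitmap, low, high):
    """Rebuild the block from its bit plane and the two quantisation levels."""
    return np.where(bitmap, high, low)

def ambtc_image(img, k=4):
    """Compress and immediately reconstruct a grayscale image block by block
    (edge remainders smaller than k are simply left unchanged)."""
    h, w = img.shape
    out = img.astype(float).copy()
    for r in range(0, h - h % k, k):
        for c in range(0, w - w % k, k):
            bitmap, low, high = ambtc_block(img[r:r+k, c:c+k].astype(float))
            out[r:r+k, c:c+k] = ambtc_decode(bitmap, low, high)
    return out
```

Each block is reduced to a one-bit plane plus two means, which is where the compression comes from; the motion search across inter frames is a separate stage not sketched here.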
The widespread use of the Internet all over the world, in addition to the huge and increasing number of users who exchange important information over it, highlights the need for new methods to protect this important information from corruption or modification by intruders. This paper suggests a new method that ensures that the text of a given document cannot be modified by intruders. The method mainly consists of a mixture of three steps. The first step borrows some concepts of the "Quran" security system to detect certain types of change(s) occurring in a given text, where a key for each paragraph is extracted from the group of letters in that paragraph whose positions are multiples of a given prime number. This step cannot detect the ch ...
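The excerpt only states that the key comes from letters whose positions are multiples of a given prime; a minimal sketch under that assumption follows, where the choice of prime and the SHA-256 hashing step are placeholder choices of mine rather than details taken from the paper.

```python
import hashlib

def paragraph_key(paragraph: str, prime: int = 7) -> str:
    """Illustrative only: collect the letters whose (1-based) positions in the
    paragraph are multiples of the chosen prime, then hash them so that any
    later edit touching those positions changes the stored key."""
    letters = [ch for i, ch in enumerate(paragraph, start=1)
               if i % prime == 0 and ch.isalpha()]
    # The paper does not say how the selected letters become a key;
    # SHA-256 is used here purely as an example digest.
    return hashlib.sha256("".join(letters).encode("utf-8")).hexdigest()

def verify(paragraph: str, stored_key: str, prime: int = 7) -> bool:
    """Recompute the key and compare it with the one stored for the paragraph."""
    return paragraph_key(paragraph, prime) == stored_key
```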
Most recognition systems for human facial emotions are assessed solely on accuracy, even though other performance criteria, such as sensitivity, precision, F-measure, and G-mean, are also considered important in the evaluation process. Moreover, the most common problem that must be resolved in face emotion recognition systems is the feature extraction method, which is comparable to traditional manual feature extraction methods. These traditional methods are not able to extract features efficiently; in other words, they produce a redundant amount of features that are considered insignificant, which affects classification performance. In this work, a new system to recognize human facial emotions from images is proposed. The HOG (Histograms of Oriented Gradients) ...
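A minimal sketch of the HOG feature-extraction stage is shown below using scikit-image; the image size, HOG parameters, and the SVM classifier are assumptions for illustration, since the excerpt does not specify the downstream classifier or dataset.

```python
import numpy as np
from skimage.feature import hog
from skimage.transform import resize
from sklearn.svm import SVC

def hog_features(gray_image, size=(64, 64)):
    """Resize a grayscale face crop and describe it with HOG descriptors."""
    img = resize(gray_image, size, anti_aliasing=True)
    return hog(img, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), block_norm="L2-Hys")

def train_emotion_classifier(face_crops, labels):
    """Hypothetical training stage: face_crops is a list of grayscale face
    images and labels holds the emotion class of each one."""
    X = np.array([hog_features(im) for im in face_crops])
    clf = SVC(kernel="rbf")          # placeholder classifier choice
    clf.fit(X, labels)
    return clf
```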
Fluoroscopic imaging is a field of medical imaging in which correct diagnosis depends on image quality; the main difficulty is de-noising and how to keep the balance between suppressing the degradation of the noisy image, on one side, and preserving edges and fine details, on the other, especially when fluoroscopic images contain black-and-white (salt-and-pepper) type noise at high density. Previous filters can usually handle low/medium densities of black-and-white noise, at the expense of edge and fine-detail preservation, and fail at the high noise densities that corrupt the images. Therefore, this paper proposes a new Multi-Line algorithm that deals with images highly corrupted by high-density black-and-white noise. The experiments achieved i ...
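The Multi-Line algorithm itself is not given in the excerpt; as a point of reference, a common baseline for this kind of noise is a switching median filter that replaces only pixels stuck at 0 or 255 using their noise-free neighbours, sketched below. This is a generic baseline, not the authors' method.

```python
import numpy as np

def switching_median(img, win=3):
    """Baseline switching median filter for black-and-white (salt-and-pepper)
    noise: only pixels stuck at 0 or 255 are replaced, and only noise-free
    neighbours contribute to the median."""
    pad = win // 2
    padded = np.pad(img, pad, mode="edge")
    out = img.copy()
    noisy = (img == 0) | (img == 255)
    for r, c in zip(*np.nonzero(noisy)):
        window = padded[r:r + win, c:c + win]
        clean = window[(window != 0) & (window != 255)]
        if clean.size:                     # use clean neighbours when available
            out[r, c] = np.median(clean)
        else:                              # otherwise fall back to a plain median
            out[r, c] = np.median(window)
    return out
```

At very high noise densities the `clean` set is often empty, which is exactly the regime where simple filters break down and more elaborate schemes are needed.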
The shapes of even-even nuclei (54Xe, 118 ≤ A ≤ 140, and 82Pb, 204 ≤ A ≤ 210) have been studied and investigated through the deformation parameters β2 and δ, which were calculated by two different methods. The first gives the nuclear quadrupole deformation parameter β2 from the reduced transition probability B(E2)↑ for the 0+ → 2+1 transitions, and the second gives the nuclear quadrupole deformation parameter δ from the quadrupole moment Q0. The relationship between the two deformation parameters (β2, δ) and the neutron magic numbers (N = 82 and 126) was studied by plotting the deformation parameters (β2, δ) as a function of neutron number; from this relationship we can see very clearly that the deformation of the nucleus decreased when th ...
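The excerpt does not quote the formulas; the conventional relation between B(E2)↑ and β2 (as tabulated, e.g., by Raman et al.) and a commonly used relation between Q0 and δ read as follows, with ⟨r²⟩ the mean-square charge radius. Whether the paper uses exactly these conventions is an assumption.

```latex
\beta_2 = \frac{4\pi}{3\,Z R_0^{2}}\left[\frac{B(E2)\uparrow}{e^{2}}\right]^{1/2},
\qquad R_0^{2} = 0.0144\,A^{2/3}\ \mathrm{b},
\qquad
\delta = \frac{3}{4}\,\frac{Q_0}{Z\,\langle r^{2}\rangle}.
```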
This book is the second edition of a textbook intended for an undergraduate/postgraduate course in mathematical statistics. In order to achieve the goals of the book, it is divided into the following chapters. Chapter One introduces events and a review of probability. Chapter Two is devoted to random variables of the two types, discrete and continuous, together with the definitions of the probability mass function, probability density function, and cumulative distribution function. Chapter Three discusses mathematical expectation and its special forms, such as moments, the moment generating function, and other related topics. Chapter Four deals with some special discrete distributions (Discrete Uniform, Bernoulli, Binomial, Poisson, Geometric, Negative Binomial, ...) ...
Some chaotic maps of the chaotic firefly algorithm were selected to perform variable selection on blood-disease and blood-vessel data obtained from Nasiriyah General Hospital. The data were tested and found to follow a Gamma distribution, and it was concluded that the Chebyshev map method is more efficient than the Sinusoidal map method according to the mean square error criterion.
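For reference, the two chaotic maps compared here are commonly defined as below; the specific parameter values (order 4, a = 2.3, x0 = 0.7) are typical defaults from the chaotic-metaheuristics literature, not values taken from the paper.

```python
import numpy as np

def chebyshev_map(n_steps, order=4, x0=0.7):
    """Chebyshev chaotic map: x_{k+1} = cos(order * arccos(x_k)), x in [-1, 1]."""
    x = np.empty(n_steps)
    x[0] = x0
    for k in range(n_steps - 1):
        x[k + 1] = np.cos(order * np.arccos(np.clip(x[k], -1.0, 1.0)))
    return x

def sinusoidal_map(n_steps, a=2.3, x0=0.7):
    """Sinusoidal chaotic map: x_{k+1} = a * x_k**2 * sin(pi * x_k)."""
    x = np.empty(n_steps)
    x[0] = x0
    for k in range(n_steps - 1):
        x[k + 1] = a * x[k] ** 2 * np.sin(np.pi * x[k])
    return x

# In a chaotic firefly algorithm, sequences like these replace the uniform
# random numbers that drive the randomization step (not shown here).
```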
In this paper, we estimate the survival function for lung cancer patients using different nonparametric estimation methods, based on a sample of complete real data describing the survival duration of patients who suffer from lung cancer, measured from the diagnosis of the disease or the patient's entry to a hospital, over a period of two years (starting with 2012 to the end of 2013). Comparisons between the mentioned estimation methods have been performed using the mean squared error as a statistical indicator, concluding that the shrinkage method gives the best estimate of the survival function for lung cancer.
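As an illustration of the nonparametric baseline only (the shrinkage estimator preferred here is not shown in the excerpt), a Kaplan-Meier fit with the lifelines package might look like the following; the durations are hypothetical.

```python
from lifelines import KaplanMeierFitter

# Hypothetical data: survival time in months for each lung-cancer patient;
# with complete data (no censoring) every event indicator is 1.
durations = [3, 7, 7, 12, 15, 20, 24]
events = [1] * len(durations)

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=events)
print(kmf.survival_function_)      # estimated S(t) at each observed time
```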
In this study, different methods were used for estimating the location and scale parameters of the extreme value distribution, such as maximum likelihood estimation (MLE), the method of moments (ME), and approximate estimators based on percentiles, known as White's method, since the extreme value distribution is one of the exponential-type distributions. Ordinary least squares estimation (OLS), weighted least squares estimation (WLS), ridge regression estimation (Rig), and adjusted ridge regression estimation (ARig) were also used. Two parameters for the expected value to the percentile as estimation for the distribution f ...
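Of the listed methods, the MLE of the extreme value (Gumbel) location and scale parameters can be obtained directly with scipy, as sketched below on synthetic data; the moment, percentile (White), and regression-type estimators are not shown, and the parameter values used to generate the sample are arbitrary.

```python
from scipy.stats import gumbel_r

# Synthetic sample from a Gumbel distribution with known parameters.
sample = gumbel_r.rvs(loc=10.0, scale=2.0, size=200, random_state=0)

# Maximum likelihood estimates of the location and scale parameters.
loc_hat, scale_hat = gumbel_r.fit(sample)
print(loc_hat, scale_hat)
```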
This paper proposes an improvement of the structure of the neural controller based on an identification model for nonlinear systems. The goal of this work is to employ the structure of the Modified Elman Neural Network (MENN) model within the NARMA-L2 structure, instead of the Multi-Layer Perceptron (MLP) model, in order to construct a new hybrid neural structure that can be used as an identification model and as a nonlinear controller for SISO linear or nonlinear systems. Two learning algorithms are used to adjust the weight parameters of the hybrid neural structure in its serial-parallel configuration; the first is a supervised learning algorithm based on the Back Propagation Algorithm (BPA) and the second is an intelligent algorithm n ...
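A minimal sketch of the recurrent part of such a structure is given below: an Elman-style network whose context units carry a self-feedback coefficient alpha, which is the usual "modification". The exact architecture, activation functions, and the NARMA-L2 coupling into separate f and g sub-networks are not reproduced here, so all names and sizes are illustrative.

```python
import numpy as np

class ModifiedElman:
    """Forward pass of an Elman-style recurrent net whose context units keep
    a self-feedback coefficient alpha (a common 'modified Elman' variant)."""
    def __init__(self, n_in, n_hidden, n_out, alpha=0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.Wxh = rng.normal(0, 0.1, (n_hidden, n_in))      # input   -> hidden
        self.Wch = rng.normal(0, 0.1, (n_hidden, n_hidden))  # context -> hidden
        self.Who = rng.normal(0, 0.1, (n_out, n_hidden))     # hidden  -> output
        self.alpha = alpha
        self.context = np.zeros(n_hidden)

    def step(self, x):
        h = np.tanh(self.Wxh @ x + self.Wch @ self.context)
        # Context units copy the hidden state while decaying their old value.
        self.context = h + self.alpha * self.context
        return self.Who @ h
```

In a NARMA-L2 scheme, two such identifier blocks would approximate the f and g terms of the model, and the control signal is computed from them; that coupling is beyond this sketch.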
Multi-document summarization is an optimization problem demanding the optimization of more than one objective function simultaneously. The proposed work concerns balancing the two significant objectives, content coverage and diversity, when generating summaries from a collection of text documents.
Any automatic text summarization system has the challenge of producing a high-quality summary. Despite the existing efforts on designing and evaluating the performance of many text summarization techniques, their formulations lack the introduction of any model that can give an explicit representation of coverage and diversity, the two contradictory semantics of any summary. In this work, the design of ...
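To make the two objectives concrete, the sketch below scores sentences with a simple greedy coverage-versus-redundancy trade-off in the style of Maximal Marginal Relevance; this is only an illustration of the coverage/diversity tension, not the multi-objective formulation proposed in the work, and the sentence vectors are assumed to come from any standard representation.

```python
import numpy as np

def mmr_summary(sent_vecs, doc_vec, k=3, lam=0.7):
    """Greedily pick k sentences, trading off coverage (similarity to the
    document centroid) against diversity (dissimilarity to chosen sentences)."""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    chosen, remaining = [], list(range(len(sent_vecs)))
    while remaining and len(chosen) < k:
        def score(i):
            coverage = cos(sent_vecs[i], doc_vec)
            redundancy = max((cos(sent_vecs[i], sent_vecs[j]) for j in chosen),
                             default=0.0)
            return lam * coverage - (1 - lam) * redundancy
        best = max(remaining, key=score)
        chosen.append(best)
        remaining.remove(best)
    return chosen
```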