In this paper, an algorithm for binary codebook design in vector quantization is used to improve the performance of the absolute moment block truncation coding (AMBTC) method. Vector quantization (VQ) is used to compress the bitmap output by the first stage (AMBTC). The binary codebook is generated for many images by randomly choosing code vectors from a set of binary image vectors, and this codebook is then used to compress the bitmaps of all these images. An image's bitmap is selected for compression with this codebook according to the average bitmap replacement error (ABPRE) criterion. The proposed approach is suitable for reducing bit rates
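As a rough illustration of the first stage, the sketch below encodes one image block with AMBTC: the block mean and first absolute moment are kept, and each pixel is reduced to a single bitmap bit. The block size and pixel values are hypothetical, and the paper's codebook stage is not reproduced.

```python
import numpy as np

def ambtc_encode_block(block):
    """AMBTC encoding of one block: keep the mean, the first absolute
    moment, and a binary bitmap marking pixels at or above the mean."""
    m = block.mean()
    bitmap = block >= m
    alpha = np.abs(block - m).mean()     # first absolute moment
    q = int(bitmap.sum())                # number of high pixels
    n = block.size
    # reconstruction levels that preserve the mean and absolute moment
    low = m - n * alpha / (2 * (n - q)) if q < n else m
    high = m + n * alpha / (2 * q) if q > 0 else m
    return bitmap, low, high

def ambtc_decode_block(bitmap, low, high):
    return np.where(bitmap, high, low)

# hypothetical 4x4 block with a dark and a bright region
block = np.array([[10., 12., 200., 202.],
                  [11., 13., 201., 203.],
                  [10., 12., 200., 202.],
                  [11., 13., 201., 203.]])
bitmap, low, high = ambtc_encode_block(block)
reconstructed = ambtc_decode_block(bitmap, low, high)
# the two-level reconstruction preserves the block mean exactly
```

The bitmap is what the VQ codebook stage would then compress; only `low`, `high`, and the codebook index of the bitmap need be transmitted per block.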
Massive multiple-input multiple-output (MaMi) systems have attracted much research attention during the last few years. This is because MaMi systems achieve a remarkable improvement in data rate and can thus meet the immense ongoing traffic demands of future wireless networks. To date, the downlink training sequence (DTS) for frequency division duplex (FDD) MaMi communication systems has been designed under the idealistic assumption of a white noise environment. However, it is essential and more practical to consider colored noise environments when designing an efficient DTS for channel estimation. To this end, this paper proposes a new DTS design that exploits the joint use of spatial channel and n
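The paper's DTS design itself is not reproduced here, but the role the noise covariance plays in channel estimation can be illustrated with a standard LMMSE estimator, which reduces to the familiar white-noise form when the noise covariance is a scaled identity. All dimensions and the AR(1) colored-noise model below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
M, T = 4, 8                                 # antennas, training length (hypothetical)
S = rng.normal(size=(T, M)) / np.sqrt(M)    # illustrative training sequence
R_h = np.eye(M)                             # channel covariance
# AR(1) colored-noise covariance instead of the white-noise sigma^2 * I
rho, sigma2 = 0.7, 0.1
idx = np.arange(T)
R_n = sigma2 * rho ** np.abs(np.subtract.outer(idx, idx))

h = rng.normal(size=M)                      # true channel
noise = np.linalg.cholesky(R_n) @ rng.normal(size=T)
y = S @ h + noise                           # received training observations

# LMMSE estimate: h_hat = R_h S^T (S R_h S^T + R_n)^{-1} y
W = R_h @ S.T @ np.linalg.inv(S @ R_h @ S.T + R_n)
h_hat = W @ y
```

Because `R_n` enters the estimator directly, a training sequence optimized assuming white noise is generally suboptimal when the noise is colored, which motivates the redesign discussed in the abstract.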
The current research studies the variables in the promotion process that influence the structure of advertisement design, since the design and construction process is subject to many variables, whether intellectual or technological, internal or external. These variables may overlap to form a comprehensive system of artistic configuration: whether a design reaches the highest levels of perfection depends on the extent of its compliance with and approximation to these variables. That is why we find their reflections deeply rooted in the individual's mind, especially that of the designer, who is influenced by everything surrounding him, forming knowledge systems res
The theory of probabilistic programming may be conceived in several different ways. As a programming method, it analyses the implications of probabilistic variations in the parameter space of a linear or nonlinear programming model. Such probabilistic variations in economic models may arise from incomplete information about changes in demand, production, and technology; from specification errors in the econometric relations presumed for different economic agents; from uncertainty of various sorts; and from the consequences of imperfect aggregation or disaggregation of economic variables. In this research we discuss the probabilistic programming problem when the coefficient b_i is a random variable
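One standard way to handle a random right-hand side b_i, sketched below under the assumption that b_i is normally distributed (the paper's exact formulation is not given in the abstract), is to replace the chance constraint P(a^T x <= b_i) >= alpha with its deterministic equivalent. All numerical values are illustrative.

```python
import numpy as np

# Chance constraint P(a^T x <= b) >= alpha with b ~ N(mu, sigma^2).
# Deterministic equivalent: a^T x <= mu - z_alpha * sigma,
# where z_alpha is the standard normal quantile Phi^{-1}(alpha).
mu, sigma, alpha = 10.0, 2.0, 0.95          # hypothetical parameters
z_alpha = 1.6449                            # Phi^{-1}(0.95), from tables
rhs = mu - z_alpha * sigma                  # tightened right-hand side

# Monte Carlo check: a point with a^T x exactly at the tightened bound
# satisfies the original random constraint with probability ~ alpha.
rng = np.random.default_rng(0)
b_samples = rng.normal(mu, sigma, 100_000)
coverage = (rhs <= b_samples).mean()
```

The tightened constraint can then be handed to any ordinary linear programming solver, turning the probabilistic program into a deterministic one.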
Classification of imbalanced data is an important issue. Many algorithms have been developed for classification, such as Back Propagation (BP) neural networks, decision trees, and Bayesian networks, and have been used repeatedly in many fields. These algorithms suffer from the problem of imbalanced data, where some classes have far more instances than others. Imbalanced data result in poor performance and a bias toward the majority class at the expense of the other classes. In this paper, we propose three techniques based on Over-Sampling (O.S.) for processing an imbalanced dataset, redistributing it, and converting it into a balanced dataset. These techniques are the Improved Synthetic Minority Over-Sampling Technique (Improved SMOTE), Border
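For orientation, a minimal sketch of the basic SMOTE procedure that the first technique builds on: each synthetic minority sample is an interpolation between a minority point and one of its k nearest minority neighbours. The data and parameter values are illustrative only, not the paper's.

```python
import numpy as np

def smote(X_min, n_new, k=3, seed=0):
    """Basic SMOTE: each synthetic sample lies on the segment between a
    random minority point and one of its k nearest minority neighbours."""
    rng = np.random.default_rng(seed)
    X_min = np.asarray(X_min, dtype=float)
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        d = np.linalg.norm(X_min - X_min[i], axis=1)    # distances to others
        d[i] = np.inf                                   # exclude the point itself
        j = rng.choice(np.argsort(d)[:k])               # pick a near neighbour
        gap = rng.random()                              # interpolation factor in [0, 1]
        synthetic.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    return np.array(synthetic)

# tiny illustrative minority class
X_minority = np.array([[1.0, 1.0], [1.2, 0.9], [0.8, 1.1], [1.1, 1.2]])
X_new = smote(X_minority, n_new=6)
# synthetic points stay inside the bounding box of the minority class
```

Appending `X_new` to the minority class rebalances the dataset without simply duplicating existing points, which is the core idea the improved variants refine.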
Statisticians often use parametric, nonparametric, and semi-parametric regression models to represent economic and social phenomena; these models explain the relationships between the variables involved in those phenomena. One parametric technique is conic projection regression, which finds the most important slopes for multidimensional data by using prior information about the regression parameters to derive the most efficient estimator. Algorithms written in the R language simplify this complex method; they are based on quadratic programming, which makes the estimates more accurate.
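The abstract does not spell out conic projection regression, but the flavor of the quadratic-programming step can be sketched with a closely related construction: restricting the slope vector to a convex cone (here the nonnegative orthant, standing in for whatever cone the prior information defines) and solving the resulting quadratic program by projected gradient. All data below are simulated.

```python
import numpy as np

# Simulated data whose true slopes lie in the cone (b >= 0)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
b_true = np.array([2.0, 0.0, 1.5])
y = X @ b_true + rng.normal(0.0, 0.1, 200)

# Quadratic program: min ||y - X b||^2  s.t.  b >= 0,
# solved by projected gradient: a gradient step, then projection onto the cone.
b = np.zeros(3)
step = 1.0 / np.linalg.norm(X.T @ X, 2)   # step size below 1/L (L = largest eigenvalue)
for _ in range(2000):
    grad = X.T @ (X @ b - y)
    b = np.maximum(b - step * grad, 0.0)  # np.maximum projects onto b >= 0
```

When the prior constraint is correct, as here, the constrained estimate matches the true slopes closely while respecting the cone, which is the sense in which such estimators gain efficiency.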
This article surveys edge detection methods and algorithms for digital images, a basic process in the field of image processing and analysis. The purpose of edge detection is to discover the borders that separate distinct areas of an image, which contributes to a better understanding of the image content and to extracting structural information. The article starts by clarifying the idea of an edge and its importance in image analysis, then studies the most prominent edge detection methods used in this field (e.g., the Sobel, Prewitt, and Canny filters), along with other schemes based on detecting abrupt changes in light intensity and color gradient. The research also discuss
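A minimal sketch of one of the filters named above, the Sobel operator: the image is correlated with horizontal and vertical 3x3 kernels, and the two gradient components are combined into an edge magnitude. The test image is a synthetic vertical step edge.

```python
import numpy as np

def sobel_edges(img):
    """Sobel edge magnitude: correlate with the horizontal and vertical
    3x3 kernels and combine the two gradient components."""
    kx = np.array([[-1., 0., 1.],
                   [-2., 0., 2.],
                   [-1., 0., 1.]])
    ky = kx.T
    h, w = img.shape
    mag = np.zeros((h, w))
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = img[y - 1:y + 2, x - 1:x + 2]
            gx = (patch * kx).sum()          # horizontal gradient
            gy = (patch * ky).sum()          # vertical gradient
            mag[y, x] = np.hypot(gx, gy)
    return mag

# synthetic vertical step edge: left half dark, right half bright
img = np.zeros((8, 8))
img[:, 4:] = 255.0
edges = sobel_edges(img)
# the magnitude peaks on the two columns adjacent to the step
```

Prewitt differs only in its kernel weights, while Canny adds smoothing, non-maximum suppression, and hysteresis thresholding on top of such a gradient.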
Radiation therapy plays an important role in treating breast cancer cases. In order to obtain an appropriate estimate of the number of radiation doses given to the patient after tumor removal, several nonparametric regression methods were compared. The kernel method with the Nadaraya-Watson estimator was used to find the estimated regression function for smoothing the data, based on the smoothing parameter h chosen by the Normal Scale Method (NSM), the Least Squares Cross-Validation method (LSCV), and the Golden Rate Method (GRM). These methods were compared by simulation for samples of three sizes; NSM proved to be the best according to the average of the Mean Squared Error criterion, and LSCV proved to be the best according to the Average of Mean Absolu
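The Nadaraya-Watson estimator mentioned above has a compact closed form; the sketch below implements it with a Gaussian kernel on simulated data. The bandwidth h is fixed by hand here rather than selected by NSM, LSCV, or GRM, and the data are illustrative, not the paper's.

```python
import numpy as np

def nadaraya_watson(x0, x, y, h):
    """Nadaraya-Watson estimate m(x0) = sum_i K((x0 - x_i)/h) * y_i
    / sum_i K((x0 - x_i)/h), using a Gaussian kernel."""
    x0 = np.atleast_1d(x0)
    u = (x0[:, None] - x[None, :]) / h
    K = np.exp(-0.5 * u ** 2)           # Gaussian kernel (constants cancel)
    return (K * y).sum(axis=1) / K.sum(axis=1)

# simulated noisy curve
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 2.0 * np.pi, 200))
y = np.sin(x) + rng.normal(0.0, 0.2, 200)

h = 0.3                                  # bandwidth fixed by hand for illustration
m_hat = nadaraya_watson(np.pi / 2, x, y, h)
# m_hat should be close to the true value sin(pi/2) = 1
```

The bandwidth selection methods compared in the abstract differ only in how they choose `h`; the estimator itself is unchanged.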
This paper considers the approximate solution of the one-dimensional hyperbolic wave equation with nonlocal mixed boundary conditions by improved methods based on the assumption that the solution is a double power series in orthogonal polynomials, such as the Bernstein, Legendre, and Chebyshev polynomials. The solution is then compared with that of the original method, which is based on standard polynomials, by calculating the absolute error to verify the validity and accuracy of the performance.
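As a loose illustration of why orthogonal-polynomial bases can beat standard polynomials in absolute error (this is not the paper's wave-equation scheme), the classic Runge example compares high-degree interpolation at equispaced points against interpolation at the Chebyshev nodes.

```python
import numpy as np

def runge(x):
    return 1.0 / (1.0 + 25.0 * x ** 2)

n = 14
x_equi = np.linspace(-1.0, 1.0, n + 1)                               # equispaced nodes
x_cheb = np.cos((2 * np.arange(n + 1) + 1) * np.pi / (2 * (n + 1)))  # Chebyshev nodes

x_fine = np.linspace(-1.0, 1.0, 1001)
err = {}
for name, nodes in [("equispaced", x_equi), ("chebyshev", x_cheb)]:
    coeffs = np.polyfit(nodes, runge(nodes), n)      # interpolating polynomial
    err[name] = np.max(np.abs(np.polyval(coeffs, x_fine) - runge(x_fine)))
# the Chebyshev-node maximum error is far smaller than the equispaced one
```

The equispaced interpolant oscillates wildly near the endpoints (the Runge phenomenon), while the Chebyshev nodes cluster toward the endpoints and suppress it; the same stability advantage motivates expanding a PDE solution in orthogonal polynomials.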