This work compares the Convolutional Encoding (CE), parallel Turbo code, and Low-Density Parity-Check (LDPC) coding schemes in a Multi-User Single-Output (MUSO) Multi-Carrier Code Division Multiple Access (MC-CDMA) system over multipath fading channels. Iterative decoding was used in the simulation, since it gives maximum efficiency at higher iteration counts. The modulation scheme used is Quadrature Amplitude Modulation (QAM). Eight pilot carriers were used to compensate for the channel effect with the Least Squares estimation method. The channel model used is the Long Term Evolution (LTE) channel of Technical Specification TS 25.101 v2.10 with 5 MHz bandwidth, including the indoor-to-outdoor/pedestrian channel and the vehicular channel. The results showed that the proposed system performed better when LDPC was used as the coding technique.
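The pilot-based Least Squares channel estimation named above can be sketched as follows; this is a minimal illustration only, and the subcarrier count, pilot symbols, and noise level are assumptions, not the paper's simulation parameters.

```python
import numpy as np

# Assumed system sizes (illustrative, not from the paper)
n_sub = 64                               # total subcarriers
pilot_idx = np.arange(0, n_sub, 8)       # 8 equally spaced pilot carriers
x_pilot = np.ones(8, dtype=complex)      # known pilot symbols

rng = np.random.default_rng(0)
h_true = (rng.standard_normal(n_sub) + 1j * rng.standard_normal(n_sub)) / np.sqrt(2)
noise = 0.01 * (rng.standard_normal(8) + 1j * rng.standard_normal(8))
y_pilot = h_true[pilot_idx] * x_pilot + noise   # received pilots

# LS estimate at the pilot positions: H_hat = Y / X
h_ls = y_pilot / x_pilot

# Interpolate real/imag parts to estimate the channel on all subcarriers
h_interp = np.interp(np.arange(n_sub), pilot_idx, h_ls.real) \
         + 1j * np.interp(np.arange(n_sub), pilot_idx, h_ls.imag)
print(np.max(np.abs(h_ls - h_true[pilot_idx])))  # small error at pilots
```

The LS estimate is exact at the pilots up to noise; accuracy between pilots depends on how fast the channel varies across subcarriers.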
The transportation model is a well-recognized algorithm applied to the distribution of products in the logistics operations of enterprises. It admits multiple algorithmic and technological forms of solution, which are applied to determine the optimal allocation of a single type of product. This research presents the general formulation of the transportation model by means of linear programming, where the optimal solution is integrated for different types of related products, and develops understanding of it through a dynamic, easy-to-follow digital illustration in the Excel QM computer program, supporting the implementation of the model in an organization.
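The transportation model described above can be sketched as a linear program, in the spirit of what Excel QM automates; the costs, supplies, and demands below are made-up illustrative numbers, not data from the research.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical balanced problem: 2 plants x 3 stores
cost = np.array([[4.0, 6.0, 8.0],    # unit shipping costs
                 [5.0, 3.0, 7.0]])
supply = [60.0, 40.0]
demand = [30.0, 50.0, 20.0]          # totals balance: both sum to 100

m, n = cost.shape
A_eq, b_eq = [], []
for i in range(m):                   # each plant ships exactly its supply
    row = np.zeros(m * n); row[i*n:(i+1)*n] = 1
    A_eq.append(row); b_eq.append(supply[i])
for j in range(n):                   # each store receives exactly its demand
    row = np.zeros(m * n); row[j::n] = 1
    A_eq.append(row); b_eq.append(demand[j])

res = linprog(cost.ravel(), A_eq=np.array(A_eq), b_eq=b_eq, bounds=(0, None))
plan = res.x.reshape(m, n)           # optimal shipment quantities
print(res.fun)                       # minimal total shipping cost: 460.0
```

For this data the optimum ships plant 2's entire supply to store 2 (its cheapest route) and covers the rest from plant 1, for a total cost of 460.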
Research on the emotional evaluation of industrial products is internationally new and non-existent in Arabic-speaking countries, which constitutes the crux of the problem in the current research, in addition to the need of designers and design students to know how to measure emotional responses to an industrial product in order to benefit from them in their designs. The research objective is to develop a tool that uses emojis to measure emotional responses to products. The researcher designed an emotional verbal wheel and an emoji wheel. The research sample consisted of (7) chairs differing in design and use, and the respondents were (89) students. The most important results are:
1- Desi
The aim of this research is to analyze the effect of changes in (GDA, g, inflation) using the standard econometric approach to the composition of the models, relying on the SPSS program for the analysis and on the data available from the Central Bank of Iraq for the period from 2003 to 2018. Using OLS to estimate the equation, the results showed a statistically significant relation at the 5% significance level; the R² value of 92.1% indicates that changes in the independent variables explain 92% of the changes in unemployment, while the effects of the independent variables are very limited according to the estimated parameters of the model, respectively (0.986, 0.229, -0.060). The research recommended the necessity to activate the inve
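The OLS estimation step described above can be sketched as follows; the data below are random placeholders, not the Central Bank of Iraq series, and the coefficient values are reused only to make the sketch concrete.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 16                                   # e.g. annual observations, 2003-2018
# Design matrix: intercept plus three standardized regressors (synthetic)
X = np.column_stack([np.ones(n), rng.standard_normal((n, 3))])
beta_true = np.array([5.0, 0.986, 0.229, -0.060])   # illustrative parameters
y = X @ beta_true + 0.1 * rng.standard_normal(n)    # dependent variable

# OLS estimate and coefficient of determination R^2
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat
r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
print(beta_hat, r2)
```

With a high R² like the 92.1% reported, most of the variation in the dependent variable is explained, even though individual coefficients can still be small in magnitude.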
It is well known that the wreath product is the endomorphism monoid of a free S-act with n generators. If S is a trivial semigroup, then it is isomorphic to . The extension from to , where is an independent algebra, has been investigated. In particular, we consider to be , where is a free left S-act of n generators. The eventual goal of this paper is to show that is an endomorphism monoid of a free left S-act of n generators and to prove that it is embedded in the wreath product .
In this paper, an adaptive polynomial compression technique based on hard and soft thresholding of a transformed residual image is introduced, one that efficiently exploits both the spatial and frequency domains: the technique starts by applying polynomial coding in the spatial domain, followed in the frequency domain by a discrete wavelet transform (DWT) used to decompose the residual image on a hard- and soft-thresholding basis. The results showed that the adaptive technique improves on the traditional polynomial coding technique.
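The hard and soft thresholding operations at the core of the scheme can be sketched as follows; the coefficient values and threshold are arbitrary illustrations, not the paper's adaptive choices.

```python
import numpy as np

def hard_threshold(c, t):
    """Keep coefficients with magnitude >= t, zero out the rest."""
    return np.where(np.abs(c) >= t, c, 0.0)

def soft_threshold(c, t):
    """Shrink every coefficient toward zero by t, zeroing the small ones."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

# Toy transform coefficients (e.g. one DWT detail subband, flattened)
coeffs = np.array([-3.0, -0.5, 0.2, 1.5, 4.0])
print(hard_threshold(coeffs, 1.0))
print(soft_threshold(coeffs, 1.0))
```

Hard thresholding preserves the surviving coefficients exactly, while soft thresholding also shrinks them, which trades some fidelity for smoother reconstructions.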
Plane cubic curves may be classified up to isomorphism or projective equivalence. In this paper, the inequivalent elliptic cubic curves, that is, the non-singular plane cubic curves, are classified projectively over the finite field of order nineteen, and it is determined whether they are complete or incomplete as arcs of degree three. Also, the maximum size of a complete elliptic curve that can be constructed from each incomplete elliptic curve is given.
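As a small illustration of working with such curves (not the paper's classification), the following counts the projective points of one assumed-nonsingular Weierstrass cubic over the field of order nineteen; the particular curve is an arbitrary choice.

```python
# Count points of y^2 = x^3 + a*x + b over GF(19) by brute force
p = 19
a, b = 1, 1

# Non-singularity check: discriminant factor 4a^3 + 27b^2 != 0 (mod p)
assert (4 * a**3 + 27 * b**2) % p != 0

points = 1                       # start with the single point at infinity
for x in range(p):
    rhs = (x**3 + a * x + b) % p
    for y in range(p):
        if (y * y) % p == rhs:
            points += 1
print(points)
```

By the Hasse bound the count must lie within 2*sqrt(19) of p + 1 = 20, which is a quick sanity check on any such enumeration.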
This paper introduces an algorithm for lossless compression of natural and medical images. It is based on utilizing various causal fixed predictors of one or two dimensions to remove the correlation, or spatial redundancy, embedded between image pixel values; a recursive polynomial model of a linear base is then used.
The experimental results of the proposed compression method are promising in terms of preserving the details and quality of the reconstructed images as well as improving the compression ratio compared with the results of a traditional linear predictive coding system.
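One member of the family of causal fixed predictors mentioned above can be sketched as follows: a one-dimensional previous-pixel predictor whose residuals are what a subsequent model would encode. The image values are made up for illustration.

```python
import numpy as np

# Tiny toy image (illustrative values)
img = np.array([[52, 55, 61, 66],
                [64, 59, 55, 90],
                [85, 69, 72, 93]], dtype=np.int32)

# Causal 1-D predictor: the left neighbor predicts the current pixel
pred = np.zeros_like(img)
pred[:, 1:] = img[:, :-1]
resid = img - pred               # decorrelated residuals, smaller magnitude

# Lossless reconstruction: a cumulative sum along rows undoes the predictor
recon = np.cumsum(resid, axis=1)
print(np.array_equal(recon, img))   # True
```

Because the predictor is causal and integer-valued, the decoder can rebuild every pixel exactly from the residual stream, which is what makes the overall scheme lossless.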
Source and channel coding for wireless data transmission can reduce distortion, complexity, and delay in multimedia services. In this paper, a joint source-channel coding scheme is proposed for orthogonal frequency division multiplexing - interleave division multiple access (OFDM-IDMA) systems to transmit compressed images over noisy channels. OFDM-IDMA combines the advantages of both OFDM and IDMA: OFDM removes inter-symbol interference (ISI) problems and IDMA removes multiple access interference (MAI). Convolutional coding is used as the channel coding, while a hybrid compression method is used as the source coding scheme. The hybrid compression scheme is based on wavelet transform, bit plane slicing, polynomi
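The OFDM half of the system can be sketched as follows: symbols are mapped onto subcarriers with an IFFT and a cyclic prefix is prepended, which is what lets the receiver sidestep ISI. The FFT size, prefix length, and QPSK mapping are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

n_fft, n_cp = 64, 16                     # assumed sizes
rng = np.random.default_rng(2)
bits = rng.integers(0, 2, 2 * n_fft)
# Simple QPSK mapping: pairs of bits -> {+-1 +- 1j}
symbols = (1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])

time_sig = np.fft.ifft(symbols)                    # one OFDM symbol
tx = np.concatenate([time_sig[-n_cp:], time_sig])  # prepend cyclic prefix

# Receiver over an ideal channel: drop the prefix, FFT back to subcarriers
rx = np.fft.fft(tx[n_cp:])
print(np.allclose(rx, symbols))   # True
```

Over a multipath channel the cyclic prefix turns linear convolution into circular convolution, so each subcarrier sees only a single complex gain and ISI is avoided.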
A new blind restoration algorithm is presented and shows high-quality restoration. This is achieved by enforcing a Wiener filtering approach in the Fourier domains of both the image and the PSF environments.
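The Wiener filtering step in the Fourier domain can be sketched as follows; this is not the blind-estimation part of the algorithm, and the box PSF and noise-to-signal constant are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
img = rng.random((32, 32))               # toy "true" image

# Simple normalized 3x3 box blur PSF, zero-padded to the image size
psf = np.zeros_like(img)
psf[:3, :3] = 1.0 / 9.0
H = np.fft.fft2(psf)                     # PSF in the Fourier domain

blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * H))

k = 1e-3                                 # assumed noise-to-signal ratio
# Wiener deconvolution filter: conj(H) / (|H|^2 + k)
wiener = np.conj(H) / (np.abs(H) ** 2 + k)
restored = np.real(np.fft.ifft2(np.fft.fft2(blurred) * wiener))
print(np.mean((restored - img) ** 2))    # restoration error
```

The constant k regularizes frequencies where the PSF response is weak; a true Wiener filter would set it from the noise and image power spectra.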
For several applications, it is very important to have an edge detection technique that matches human visual contour perception and is less sensitive to noise. The edge detection algorithm described in this paper is based on the results obtained by Maximum a Posteriori (MAP) and Maximum Entropy (ME) deblurring algorithms. The technique makes a trade-off between sharpening and smoothing the noisy image. One advantage of the described algorithm is that it is less sensitive to noise than the Marr and Geuen techniques, which are considered to be among the best edge detection algorithms in terms of matching human visual contour perception.
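The sharpening-versus-smoothing trade-off the detector balances can be illustrated with a plain gradient-based sketch (not the MAP/ME method itself): blur lightly to tame noise, then take the gradient magnitude and threshold it. The kernels and threshold are assumptions.

```python
import numpy as np

# Toy image with a vertical step edge at column 8
img = np.zeros((16, 16))
img[:, 8:] = 1.0

# Smoothing step: light 3x3 mean filter with edge-replicated borders
pad = np.pad(img, 1, mode='edge')
smooth = sum(pad[i:i + 16, j:j + 16] for i in range(3) for j in range(3)) / 9.0

# Sharpening/derivative step: central-difference gradients
gx = np.zeros_like(smooth)
gy = np.zeros_like(smooth)
gx[:, 1:-1] = (smooth[:, 2:] - smooth[:, :-2]) / 2.0
gy[1:-1, :] = (smooth[2:, :] - smooth[:-2, :]) / 2.0

edges = np.hypot(gx, gy) > 0.2           # thresholded edge map
print(edges[:, 7:9].all())               # True: edge found at the step
```

More smoothing suppresses noise but also weakens the gradient at true contours, which is exactly the balance a perceptually matched detector must strike.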