Simulation experiments are a problem-solving tool in many fields: a model of a real system is designed so that its behavior can be traced and identified through particular models and formulas, coded in a repetitive style and run over a number of iterations. The aim of this study is to build a model that handles heteroskedastic behavior by studying the APGARCH and NAGARCH models with Gaussian and non-Gaussian error distributions for sample sizes of 500, 1000, 1500 and 2000, following the standard stages of time series analysis (identification, estimation, diagnostic checking and prediction). Data were generated using the parameter estimates obtained by fitting these models to the return series of the Iraqi dinar / US dollar (IQD/USD) exchange rate for the period from 21/7/2011 to 21/7/2021. Identification was carried out with the Ljung-Box and ARCH tests over 1000 replicates; the results showed the presence of autocorrelation and heteroskedasticity, both of which became more pronounced as the sample size increased, with NAGARCH performing best under the normal distribution and APGARCH under the generalized error distribution. The maximum likelihood method was used to estimate the model parameters, with the best results at the largest sample size (2000). In the diagnostic checking phase, both NAGARCH and APGARCH proved able to remove the autocorrelation and heteroskedasticity, with the best result obtained by the APGARCH model when the errors follow the generalized error distribution.
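A minimal sketch of the identification stage described above: simulate a NAGARCH(1,1) series with Gaussian innovations for each of the four sample sizes and test it for autocorrelation (Ljung-Box on squared returns) and ARCH effects (Engle's LM test). The parameter values here are illustrative placeholders, not the estimates reported in the study.

```python
import numpy as np
from statsmodels.stats.diagnostic import acorr_ljungbox, het_arch

def simulate_nagarch(n, omega=0.05, alpha=0.10, beta=0.85, theta=0.3, seed=0):
    """NAGARCH(1,1): h_t = omega + beta*h_{t-1} + alpha*h_{t-1}*(z_{t-1} - theta)**2."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n)                 # Gaussian innovations
    h = np.empty(n)
    r = np.empty(n)
    h[0] = omega / (1 - beta - alpha * (1 + theta**2))  # unconditional variance
    r[0] = np.sqrt(h[0]) * z[0]
    for t in range(1, n):
        h[t] = omega + beta * h[t-1] + alpha * h[t-1] * (z[t-1] - theta)**2
        r[t] = np.sqrt(h[t]) * z[t]
    return r

for n in (500, 1000, 1500, 2000):              # sample sizes used in the study
    r = simulate_nagarch(n)
    lb = acorr_ljungbox(r**2, lags=[10], return_df=True)  # Ljung-Box on squared returns
    lm_stat, lm_pval, _, _ = het_arch(r, nlags=10)        # Engle's ARCH-LM test
    print(f"n={n}: Ljung-Box p={lb['lb_pvalue'].iloc[0]:.4f}, ARCH-LM p={lm_pval:.4f}")
```

Small p-values at larger n reproduce the pattern reported above: the evidence of autocorrelation and heteroskedasticity strengthens as the sample size grows.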
A new algorithm is proposed to compress speech signals using the wavelet transform and linear predictive coding (LPC). Compression is based on selecting a small number of approximation coefficients produced by wavelet decomposition (Haar and db4) at a suitably chosen level while discarding the detail coefficients; the approximation coefficients are then windowed with a rectangular window and fed to the linear predictor. The Levinson-Durbin algorithm is used to compute the LP coefficients, reflection coefficients and prediction error. The compressed files contain the LP coefficients and the previous samples; these files are very small compared to the original signals. The compression ratio is calculated from the size of th…
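A hedged sketch of the pipeline described above: keep only the wavelet approximation coefficients (db4 at level 3 here; the level, LP order and signal are illustrative choices, not the paper's settings), apply a rectangular window, and model the coefficients by linear prediction via a hand-written Levinson-Durbin recursion. Requires PyWavelets.

```python
import numpy as np
import pywt

def levinson_durbin(r, order):
    """Solve the Yule-Walker equations given autocorrelation sequence r."""
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]                                 # prediction error energy
    k = np.zeros(order)                        # reflection coefficients
    for i in range(1, order + 1):
        acc = r[i] + np.dot(a[1:i], r[i-1:0:-1])
        k[i-1] = -acc / err
        a_prev = a.copy()
        for j in range(1, i):
            a[j] = a_prev[j] + k[i-1] * a_prev[i-j]
        a[i] = k[i-1]
        err *= 1.0 - k[i-1] ** 2
    return a, k, err

x = np.random.default_rng(1).standard_normal(4000)   # stand-in for a speech frame
coeffs = pywt.wavedec(x, 'db4', level=3)
approx = coeffs[0]                                   # keep approximation, drop details
frame = approx * np.ones(len(approx))                # rectangular window (identity)
r = np.correlate(frame, frame, mode='full')[len(frame) - 1:]
a, k, err = levinson_durbin(r, order=10)
print(len(x), "samples ->", len(a) - 1, "LP coefficients, prediction error", err)
```

The "compressed file" in the abstract's sense would then store only the LP coefficients plus the previous samples needed to restart the predictor.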
This paper examines the use of one of the most common linguistic devices: hyperbole. It shows how hyperbolic devices serve as a form of exaggeration or overstatement for extra effect, whereby the speaker adds something extra to a situation in order to amplify an idea or utterance. Like other figures of speech, it is used to express a negative or positive attitude toward a specific unit of language. The paper is thus set against the background of hyperbole in two main fields (advertisements and propaganda), and its use is examined by analyzing instances with respect to their meaning (literal and non-literal). Methodology of this…
... Show MoreThe analysis of the classic principal components are sensitive to the outliers where they are calculated from the characteristic values and characteristic vectors of correlation matrix or variance Non-Robust, which yields an incorrect results in the case of these data contains the outliers values. In order to treat this problem, we resort to use the robust methods where there are many robust methods Will be touched to some of them.
The robust estimators include direct robust estimation of the eigenvalues through the eigenvectors, without relying on robust estimates of the variance and covariance matrices. Also, the analysis of the princ…
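One robust route alluded to above, sketched under illustrative assumptions: replace the classical covariance matrix with the Minimum Covariance Determinant (MCD) estimate and extract principal components from it, so that outliers carry little weight. The data and contamination below are synthetic.

```python
import numpy as np
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(0)
X = rng.multivariate_normal([0, 0, 0], np.diag([3.0, 1.0, 0.2]), size=200)
X[:10] += 25.0                                  # inject a block of outliers

# Classical PCA: eigen-decomposition of the ordinary covariance matrix
evals_c, evecs_c = np.linalg.eigh(np.cov(X, rowvar=False))

# Robust PCA: same decomposition, but of the MCD covariance estimate
mcd = MinCovDet(random_state=0).fit(X)
evals_r, evecs_r = np.linalg.eigh(mcd.covariance_)

print("classical eigenvalues:", np.round(evals_c[::-1], 2))   # inflated by outliers
print("robust eigenvalues:   ", np.round(evals_r[::-1], 2))   # close to diag(3, 1, 0.2)
```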
The present study investigates intensifiers as linguistic devices employed by Charles Dickens in Hard Times. For ease of analysis, the data were obtained through rigorous observation of spontaneously occurring intensifiers in the text. The study aims to explore the pragmatic functions and aesthetic impact of intensifiers in Hard Times. It is mainly descriptive-analytical and is based on analyzing and interpreting the use of intensifiers in terms of Holmes's (1984) and Cacchiani's (2009) models. The findings show that the novelist overuses intensifiers, to the extent that 280 appear in the text; of these, 218 are undistinguished…
Estimation and the selection of significant variables are crucial steps in semi-parametric modeling. At the beginning of the modeling process there are often many candidate explanatory variables, and to avoid losing any that may be important, variable selection becomes necessary; it serves not only to simplify the model and its interpretation but also to improve prediction. This research uses several semi-parametric methods (LASSO-MAVE, MAVE, and the proposed Adaptive LASSO-MAVE) to select variables and estimate the semi-parametric single-index model (SSIM) at the same time.
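The adaptive-LASSO component of the proposed method can be illustrated in isolation (the MAVE direction-estimation step is omitted here). Adaptive LASSO reweights each predictor by an initial estimate so that truly relevant variables are penalized less; the data, gamma and alpha below are illustrative.

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
n, p = 200, 8
X = rng.standard_normal((n, p))
beta_true = np.array([2.0, 0.0, 0.0, 1.5, 0.0, 0.0, 0.0, -1.0])
y = X @ beta_true + 0.5 * rng.standard_normal(n)

# Step 1: an initial (OLS) fit gives the adaptive weights w_j = 1/|b_j|^gamma
b_init = LinearRegression().fit(X, y).coef_
gamma = 1.0
w = 1.0 / np.abs(b_init) ** gamma

# Step 2: ordinary LASSO on rescaled predictors X_j / w_j, then undo the scaling
lasso = Lasso(alpha=0.05).fit(X / w, y)
beta_adaptive = lasso.coef_ / w

print("selected variables:", np.nonzero(np.abs(beta_adaptive) > 1e-8)[0])
```

The rescaling trick is the standard reduction of adaptive LASSO to plain LASSO; in the proposed Adaptive LASSO-MAVE this penalty would act on the single-index direction estimated by MAVE rather than on a linear regression.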
In this paper, a fusion of K models of full-rank weighted nonnegative tensor factor two-dimensional deconvolution (K-wNTF2D) is proposed to separate acoustic sources that have been mixed in an underdetermined reverberant environment. The model is adapted in an unsupervised manner under a hybrid framework of generalized expectation-maximization and multiplicative update algorithms. The derivation of the algorithm and the development of the proposed full-rank K-wNTF2D are presented. The algorithm also encodes a set of variable sparsity parameters, derived from a Gibbs distribution, into the K-wNTF2D model; this optimizes each sub-model in K-wNTF2D with the sparsity required to model the time-varying variances of the sources in the s…
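The full K-wNTF2D model is beyond a short sketch, but the multiplicative-update principle it builds on can be shown on plain nonnegative matrix factorization with a Euclidean cost: factors are updated by elementwise ratios, which keeps them nonnegative throughout. The spectrogram and rank below are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
V = np.abs(rng.standard_normal((64, 100)))    # stand-in magnitude spectrogram
r = 4                                         # number of latent components
W = np.abs(rng.standard_normal((64, r)))      # spectral bases
H = np.abs(rng.standard_normal((r, 100)))     # temporal activations

eps = 1e-12                                   # guards against division by zero
for _ in range(200):
    # Lee-Seung multiplicative updates minimizing ||V - WH||_F^2
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

print("relative error:", np.linalg.norm(V - W @ H) / np.linalg.norm(V))
```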
Hyperbole is an obvious and intentional exaggeration: it takes things to such an extreme that the audience goes too far and then pulls itself back to a more reasonable position; that is, it is an extravagant statement or figure of speech not intended to be taken literally. This paper focuses on the formal and functional perspectives in the analysis of the hyperbole that American candidates produce in their electoral campaign speeches, on the hypothesis that candidates use hyperbolic expressions excessively to persuade voters of the objectives of their campaign programs. Hence, it aims to analyze hyperbole in context to determine the range of pragmatic func…
The article critically analyzes traditional translation models. The most influential models of translation of the second half of the 20th century are reviewed, among them the theory of formal and dynamic equivalence, the theory of regular correspondences, and the informative, situational-denotative and functional-pragmatic theories of communication levels. The selected models are analyzed from the point of view of how universally they apply to different types and kinds of translation, as well as their ability to capture the deep links established between the original and the translation.
Malaysia is served by a high-speed fiber internet service called TM UniFi. TM UniFi has become a familiar medium for applying the Small Office Home Office (SOHO) concept during the COVID-19 pandemic. Most communication vendors offer a variety of network services to meet customers' needs and satisfaction during the pandemic, and quality of service is queried by most users given that the number of users keeps increasing over time. It is therefore crucial to know how network performance behaves as the number of devices connected to the TM UniFi network grows. The main objective of this research is to analyze TM UniFi performance under the impact of multiple connected devices or user services. The study was conducted…
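A minimal sketch of the kind of probe such a performance study relies on: repeatedly measure TCP connect latency to a reference host, so the figures can be compared as more devices join the network. The host, port and sample count are illustrative, not the study's setup.

```python
import socket
import statistics
import time

def tcp_connect_latency(host="8.8.8.8", port=53, samples=10):
    """Return TCP connect latencies in milliseconds over `samples` attempts."""
    latencies = []
    for _ in range(samples):
        t0 = time.perf_counter()
        with socket.create_connection((host, port), timeout=2):
            pass                              # connection established, close immediately
        latencies.append((time.perf_counter() - t0) * 1000.0)
        time.sleep(0.2)                       # pace the probes
    return latencies

lat = tcp_connect_latency()
print(f"mean={statistics.mean(lat):.1f} ms  stdev={statistics.stdev(lat):.1f} ms")
```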