In this research, we studied multiple linear regression models with two explanatory variables in the presence of autocorrelation among the error-term observations, where the errors follow a general logistic distribution. The autoregressive model is used to study and analyze the relationship between the variables, and this relationship is then used to forecast their values. A simulation technique was used to compare the estimation methods (Generalized Least Squares, M robust, and Laplace) according to the mean square error criterion, for sample sizes of 20, 40, 60, 80, 100, and 120. The M robust method proved to be the best for all values of the autocorrelation coefficient (ϕ = -0.9, -0.5, 0.5, 0.9). It was therefore applied to data obtained from the Iraqi Ministry of Planning / Central Organization for Statistics, representing the consumer price index for the years 2004-2016. The results confirmed that the dollar exchange rate is directly affected by increases in annual inflation rates and by the ratio of currency to the money supply.
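A minimal sketch of the kind of Monte Carlo comparison described above, assuming a two-regressor model with AR(1) logistic errors and interpreting the Laplace method as least absolute deviations (median) regression; the coefficient values, sample size, and replication count are illustrative, not the paper's settings.

```python
# Illustrative sketch only: compare GLS, Huber M estimation, and LAD (Laplace)
# regression by total mean square error of the coefficient estimates when the
# errors follow an AR(1) process with logistic innovations.
import numpy as np
import statsmodels.api as sm
from statsmodels.regression.quantile_regression import QuantReg

rng = np.random.default_rng(0)
beta = np.array([1.0, 2.0, -1.5])     # intercept and two slopes (assumed values)
phi, n, n_rep = 0.9, 60, 500          # AR(1) coefficient, sample size, replications

def ar1_logistic_errors(n, phi):
    """AR(1) errors whose innovations are drawn from a logistic distribution."""
    e = np.empty(n)
    e[0] = rng.logistic()
    for t in range(1, n):
        e[t] = phi * e[t - 1] + rng.logistic()
    return e

mse = {"GLS": 0.0, "M robust": 0.0, "Laplace": 0.0}
for _ in range(n_rep):
    X = sm.add_constant(rng.normal(size=(n, 2)))
    y = X @ beta + ar1_logistic_errors(n, phi)
    # GLS with the AR(1) correlation structure (known up to a scale factor)
    Sigma = phi ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    fits = {
        "GLS": sm.GLS(y, X, sigma=Sigma).fit().params,
        "M robust": sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit().params,
        "Laplace": QuantReg(y, X).fit(q=0.5).params,   # median (LAD) regression
    }
    for name, b in fits.items():
        mse[name] += np.sum((b - beta) ** 2) / n_rep

print(mse)   # the smallest total MSE identifies the preferred estimator
```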
The main objective of this paper is to study the behavior of Non-Prismatic Reinforced Concrete (NPRC) beams with and without rectangular openings, whether or not exposed to fire. The experimental program involves casting and testing 9 NPRC beams divided into 3 main groups. These groups were categorized according to heating temperature (ambient temperature, 400°C, and 700°C), with each group containing 3 NPRC beams (solid beams and beams with 6 and 8 trapezoidal openings). For beams with similar geometry, increasing the burning temperature results in their deterioration, as reflected in their increasing mid-span deflection throughout the fire exposure period and their residual deflection after cooling. Meanwhile, the existing openings …
Linear programming currently occupies a prominent position in various fields and has wide applications; its importance lies in being a means of studying the behavior of a large number of systems. It is also among the simplest and easiest types of models that can be built to address industrial, commercial, military, and other problems and to obtain the optimal quantitative value. In this research, we dealt with the post-optimality solution, also known as sensitivity analysis, using the principle of shadow prices. Reaching the optimal solution does not by itself complete the scientific treatment of a problem, since any change in the values of the model constants, known as the model inputs, that will change …
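As a hedged illustration of reading shadow prices from an LP solver (not the model studied in the paper), the following sketch solves a small assumed product-mix problem with SciPy's HiGHS backend and recovers the dual values of the resource constraints; the objective and constraint data are invented for the example.

```python
# Sketch: solve a small LP and read the shadow prices (dual values) used in
# post-optimality / sensitivity analysis.  All numbers are assumed.
from scipy.optimize import linprog

# Maximize profit 3*x1 + 5*x2 -> linprog minimizes, so negate the objective.
c = [-3.0, -5.0]
# Resource constraints  A_ub @ x <= b_ub
A_ub = [[1.0, 0.0],    # resource 1
        [0.0, 2.0],    # resource 2
        [3.0, 2.0]]    # resource 3
b_ub = [4.0, 12.0, 18.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2, method="highs")

print("optimal plan:", res.x, "profit:", -res.fun)
# HiGHS reports the marginals (duals) of the inequality constraints.  Because we
# minimized -profit, the shadow price of each resource is -marginal: the extra
# profit obtainable from one more unit of that resource's right-hand side.
shadow_prices = [-m for m in res.ineqlin.marginals]
print("shadow prices:", shadow_prices)
```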
This research is carried out to investigate the behavior of self-compacting concrete (SCC) two-way slabs with a central square opening under uniformly distributed loads. The experimental part of this research is based on casting and testing six SCC simply supported square slabs having the same dimensions and reinforcement. One of these slabs was cast without an opening as a control slab, while the other five slabs have opening ratios (OR) of 2.78%, 6.25%, 11.11%, 17.36% and 25.00%. From the experimental results it is found that the maximum percentage decreases in the cracking and ultimate uniform loads were 31.82% and 12.17% compared to the control slab for opening ratios (OR…
Error control schemes have become a necessity in network-on-chip (NoC) designs to improve reliability, as on-chip interconnect errors increase with the continuous shrinking of geometry. Accordingly, many researchers are trying to present multi-bit error correction coding schemes that achieve high error correction capability with the simplest possible design, so as to minimize area and power consumption. A recent work, Multi-bit Error Correcting Coding with Reduced Link Bandwidth (MECCRLB), showed a huge reduction in area and power consumption compared to a well-known scheme, namely Hamming product code (HPC) with Type-II HARQ. Moreover, the authors showed that the proposed scheme can correct 11 random errors, which is considered a high…
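For background only, the sketch below shows syndrome decoding for a single-error-correcting Hamming(7,4) code, the basic building block that product-code schemes such as HPC extend to multi-bit correction; it is not an implementation of MECCRLB or HPC, and the matrices are the standard textbook ones.

```python
# Background sketch: Hamming(7,4) encoding and single-error correction by
# syndrome decoding.  Not the MECCRLB or HPC scheme discussed in the paper.
import numpy as np

# Parity-check matrix H: column i is the binary representation of position i+1.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])
# Generator matrix G for the same (7,4) code (data bits at positions 3,5,6,7).
G = np.array([[1, 1, 1, 0, 0, 0, 0],
              [1, 0, 0, 1, 1, 0, 0],
              [0, 1, 0, 1, 0, 1, 0],
              [1, 1, 0, 1, 0, 0, 1]])

def encode(data4):
    return (np.array(data4) @ G) % 2

def correct(received7):
    syndrome = (H @ received7) % 2
    pos = int("".join(map(str, syndrome)), 2)   # 0 means no error detected
    if pos:
        received7 = received7.copy()
        received7[pos - 1] ^= 1                 # flip the single erroneous bit
    return received7

codeword = encode([1, 0, 1, 1])
noisy = codeword.copy(); noisy[2] ^= 1          # inject a single-bit error
assert (correct(noisy) == codeword).all()
```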
TV drama has gone through many developmental stages until it reached and settled into the form of a TV series of thirty episodes. Alongside the development of TV technologies and the spread of satellite channels, the form of TV drama has changed: a series came to consist of two parts, and then the parts multiplied until they amounted to ten parts and more. This form of TV drama has become an artistic phenomenon: once a series is shown on one of the channels and achieves noticeable success, its producers work to produce a second part of that series, and so on. Yet this form of TV drama has remained largely unresearched.
This has prompted the researcher to carry out this study, entitled (TV series of multiple parts fro…
The undetected error probability is an important measure for assessing the communication reliability provided by any error coding scheme. Two error coding schemes, namely Joint crosstalk avoidance and Triple Error Correction (JTEC) and JTEC with Simultaneous Quadruple Error Detection (JTEC-SQED), provide both crosstalk reduction and multi-bit error correction/detection features. The available undetected error probability model yields an upper-bound value that does not give an accurate estimate of the reliability provided. This paper presents an improved mathematical model to estimate the undetected error probability of these two joint coding schemes. According to the decoding algorithm, the errors are classified into patterns and their decoding…
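As a generic illustration of how such a model is evaluated (the pattern counts below are placeholders, not the actual JTEC or JTEC-SQED classification), the undetected error probability over a binary symmetric channel can be computed from the number of error patterns of each weight that escape detection:

```python
# Generic sketch: undetected error probability on a binary symmetric channel.
# `undetectable` maps error weight w to A_w, the number of weight-w error
# patterns the decoder fails to detect or miscorrects; the example values are
# hypothetical, not the JTEC/JTEC-SQED pattern counts.
def undetected_error_probability(n: int, undetectable: dict[int, int], p: float) -> float:
    """P_ud = sum over w of A_w * p**w * (1 - p)**(n - w)."""
    return sum(a_w * p**w * (1 - p) ** (n - w) for w, a_w in undetectable.items())

# Hypothetical 39-bit codeword with a few undetectable weight-4 and weight-5
# patterns, evaluated at a bit-error rate of 1e-3.
print(undetected_error_probability(39, {4: 10, 5: 120}, 1e-3))
```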