In this research, we studied the two-variable multiple linear regression model in the presence of autocorrelation in the error term, where the errors follow a general logistic distribution. The autoregressive model is used to study and analyze the relationship between the variables, and through this relationship forecasts of the variables' values are made. A simulation technique is used to compare the estimation methods (Generalized Least Squares, M-robust, and Laplace) according to the mean square error criterion, for sample sizes of 20, 40, 60, 80, 100, and 120. The M-robust method proved to be the best method for all values of the autocorrelation coefficient (ϕ = -0.9, -0.5, 0.5, 0.9). We therefore applied it to data obtained from the Iraqi Ministry of Planning / Central Organization for Statistics, representing the consumer price index for the years 2004-2016, and confirmed that the dollar exchange rate is directly affected by the increase in annual inflation rates and by the ratio of currency to the money supply.
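A minimal simulation sketch of the kind of comparison described above, assuming illustrative values for the sample size, the autocorrelation coefficient ϕ, and the true coefficients, and using statsmodels' GLSAR and Huber M-estimation as stand-ins for the estimators studied:

```python
# Hypothetical sketch: compare GLS and M-robust estimators under AR(1) errors
# with a logistic error term. Sample size, phi, and betas are illustrative assumptions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, phi = 100, 0.9
beta = np.array([1.0, 2.0, -1.5])           # intercept and two slopes

def simulate():
    X = sm.add_constant(rng.normal(size=(n, 2)))
    e = np.zeros(n)
    shocks = rng.logistic(scale=1.0, size=n)  # logistic innovations
    for t in range(1, n):
        e[t] = phi * e[t - 1] + shocks[t]
    return X, X @ beta + e

mse_gls, mse_rlm = [], []
for _ in range(200):                         # Monte Carlo replications
    X, y = simulate()
    b_gls = sm.GLSAR(y, X, rho=1).iterative_fit(maxiter=10).params
    b_rlm = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit().params
    mse_gls.append(np.mean((b_gls - beta) ** 2))
    mse_rlm.append(np.mean((b_rlm - beta) ** 2))

print("MSE (GLS):     ", np.mean(mse_gls))
print("MSE (M-robust):", np.mean(mse_rlm))
```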
A Cox regression model has been used to estimate the proportional hazards model for patients with hepatitis recorded in the Gastrointestinal and Hepatic Diseases Hospital in Iraq for the period 2002-2005. The data consist of age, gender, survival time, and terminal state. The Kaplan-Meier method has been applied to estimate the survival function and the hazard function.
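For illustration only, a short sketch of a Kaplan-Meier estimate and a Cox proportional hazards fit with the lifelines package, using its bundled Rossi dataset as a stand-in for the hospital data (which is not reproduced here):

```python
# Illustrative sketch: Kaplan-Meier survival curve and Cox PH model with lifelines.
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.datasets import load_rossi

df = load_rossi()                        # columns include week (time) and arrest (event)

kmf = KaplanMeierFitter()
kmf.fit(df["week"], event_observed=df["arrest"])
print(kmf.survival_function_.head())     # estimated survival function S(t)

cph = CoxPHFitter()
cph.fit(df[["week", "arrest", "age", "prio"]], duration_col="week", event_col="arrest")
cph.print_summary()                      # hazard ratios for the covariates
```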
In many scientific fields, Bayesian models are commonly used in recent research. This research presents a new Bayesian model for estimating parameters and forecasting using the Gibbs sampler algorithm. Posterior distributions are generated using the inverse gamma distribution and the multivariate normal distribution as prior distributions. The new method was used to investigate and summarize the posterior distribution of the Bayesian statistics. The theory and derivation of the posterior distribution are explained in detail in this paper. The proposed approach is applied to three simulated datasets of 100, 300, and 500 sample sizes. The procedure was also extended to a real dataset, the rock intensity dataset. The actual dataset is collected
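A minimal Gibbs sampler sketch for Bayesian linear regression with a multivariate normal prior on the coefficients and an inverse gamma prior on the error variance; the hyperparameters, sample size, and true coefficients below are assumptions for illustration, not the paper's settings:

```python
# Minimal Gibbs sampler sketch: Normal prior on beta, inverse-gamma prior on sigma^2.
import numpy as np

rng = np.random.default_rng(1)
n, p = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([1.0, 0.5, -0.8])
y = X @ beta_true + rng.normal(scale=0.7, size=n)

# Priors: beta ~ N(m0, V0), sigma^2 ~ InvGamma(a0, b0)   (assumed hyperparameters)
m0, V0_inv = np.zeros(p), np.eye(p) * 1e-2
a0, b0 = 2.0, 1.0

draws, sigma2 = [], 1.0
for it in range(3000):
    # beta | sigma^2, y  ~  N(m_n, V_n)
    V_n = np.linalg.inv(V0_inv + X.T @ X / sigma2)
    m_n = V_n @ (V0_inv @ m0 + X.T @ y / sigma2)
    beta = rng.multivariate_normal(m_n, V_n)
    # sigma^2 | beta, y  ~  InvGamma(a0 + n/2, b0 + SSR/2)
    resid = y - X @ beta
    sigma2 = 1.0 / rng.gamma(a0 + n / 2, 1.0 / (b0 + resid @ resid / 2))
    if it >= 1000:                       # discard burn-in draws
        draws.append(np.r_[beta, sigma2])

draws = np.array(draws)
print("posterior means (beta, sigma^2):", draws.mean(axis=0))
```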
Lacing reinforcement plays a critical role in the design and performance of reinforced concrete (RC) slabs by distributing the applied loads more evenly across the slab, ensuring that no specific area of the slab is overloaded. In this study, nine slabs, divided into three groups according to the investigated parameters, were carefully designed and evaluated to study the interplay between the lacing reinforcement and other key parameters. Each slab was simply supported and was subjected to both static and repeated two-point load tests. The lacing reinforcement had an angle of 45° with various tension and lacing steel. The specimens with lacing reinforcement tested under repeated loading exhibited smaller ductility than those of s
Teen-Computer Interaction (TeenCI) is in its infancy and developing along a positive path. Compared to Human-Computer Interaction (generally dedicated to adults) and Child-Computer Interaction, TeenCI receives less interest in terms of research efforts and publications. This leaves extensive prospects for researchers to explore and contribute to the area of computer design and evaluation for teens specifically. As a subclass of HCI and a complement to CCI, TeenCI, which serves the teen group, should be given significant attention with respect to its context, nature, development, characteristics, and architecture. This paper aims to discover the contribution of teens' emotions as a first attempt towards building a conceptual model for TeenC
The information revolution has become a new language shared by all the peoples of the world, through which data are handled, exchanged, and shared in all key areas (economic, cultural, and scientific). As part of this revolution, accounting has seen most traditional (manual) systems in companies transformed into automated systems. This transformation requires auditors to extend their traditional examination to automated systems, and therefore tools are needed to help auditors keep pace with these developments. Since there is no local audit guidance dedicated to automated systems, this research provides evidence that helps guide auditors within the framework of COBIT, which provides detailed audit procedures and inf
Steganography is the art of hiding the very presence of communication by embedding a secret message into an innocuous-looking cover document, such as digital images, videos, sound files, and other computer files that contain perceptually irrelevant or redundant information, used as covers or carriers to hide secret messages.
In this paper, a new Least Significant Bit (LSB) nonsequential embedding technique in wave audio files is introduced. To support the immunity of the proposed hiding system, and in order to address some weak aspects inherent in the pure implementation of stego-systems, some auxiliary processes were suggested and investigated, including the use of a hidden-text jumping process and a stream ciphering algorithm. Besides, the suggested
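A minimal sketch of plain LSB embedding in 16-bit WAV samples using only the standard library and NumPy; the paper's nonsequential "jumping" and stream-cipher refinements are not reproduced here, and the file names and the fixed 32-bit length header are assumptions for illustration:

```python
# Sketch of sequential LSB embedding in a 16-bit WAV cover file.
import wave
import numpy as np

def embed(cover_wav, stego_wav, message: bytes):
    with wave.open(cover_wav, "rb") as w:
        params = w.getparams()
        samples = np.frombuffer(w.readframes(w.getnframes()), dtype=np.int16).copy()

    # Prefix the payload with a 4-byte length header, then flatten to bits.
    payload = len(message).to_bytes(4, "big") + message
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    if bits.size > samples.size:
        raise ValueError("message too long for this cover file")

    samples[:bits.size] = (samples[:bits.size] & ~1) | bits   # overwrite the LSBs
    with wave.open(stego_wav, "wb") as w:
        w.setparams(params)
        w.writeframes(samples.tobytes())

# embed("cover.wav", "stego.wav", b"secret text")
```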
Enhancing the quality of image fusion was proposed using new algorithms for auto-focus image fusion. The first algorithm is based on determining the standard deviation to combine two images. The second algorithm concentrates on the contrast at edge points and uses the correlation method as the criterion for the quality of the resulting image. This algorithm considers three blocks of different sizes in a homogeneous region and moves each block 10 pixels within the same region. These blocks examine the statistical properties of the block and automatically decide the next step. The resulting combined image is better in contrast
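A rough sketch of the standard-deviation fusion idea from the first algorithm, assuming two registered grayscale images and an illustrative block size; this is not the paper's full procedure:

```python
# Block-wise fusion: keep, for each block, the source image with the higher local
# standard deviation (a simple focus/detail measure). Block size is an assumption.
import numpy as np

def fuse_by_std(img_a: np.ndarray, img_b: np.ndarray, block: int = 16) -> np.ndarray:
    assert img_a.shape == img_b.shape, "images must be registered and equally sized"
    fused = img_a.copy()
    h, w = img_a.shape
    for r in range(0, h, block):
        for c in range(0, w, block):
            a = img_a[r:r + block, c:c + block]
            b = img_b[r:r + block, c:c + block]
            if b.std() > a.std():          # the sharper (more detailed) block wins
                fused[r:r + block, c:c + block] = b
    return fused
```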
Multilevel models are among the most important models widely used in the analysis of data in which observations take a hierarchical form. In this research we examined the multilevel logistic regression model (the random-intercept and random-slope model). The importance of the research lies in the fact that usual regression models calculate only the total variance of the model and cannot decompose the variation between levels, whereas in multilevel regression models the total variance alone is an inaccurate summary, so these models calculate the variation at each level of the model. The research aims to estimate the parameters of this model
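A hedged sketch of a random-intercept, random-slope logistic model using statsmodels' Bayesian mixed GLM; the file name, variable names (y, x), and grouping variable ("group") are assumptions for illustration, not the paper's data:

```python
# Random-intercept and random-slope logistic regression (variational Bayes fit).
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

# Assumed columns: binary outcome y, predictor x, grouping factor "group".
df = pd.read_csv("multilevel_data.csv")        # hypothetical data file

vc = {"intercept": "0 + C(group)",             # random intercept per group
      "slope":     "0 + C(group):x"}           # random slope of x per group
model = BinomialBayesMixedGLM.from_formula("y ~ x", vc, df)
result = model.fit_vb()                        # variational Bayes estimation
print(result.summary())
```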
In the present work, the magnetic dipole and electric quadrupole moments for some sodium isotopes have been calculated using the shell model, considering the effect of the two-body effective interactions and the single-particle potentials. These isotopes are 21Na (3/2+), 23Na (3/2+), 25Na (5/2+), 26Na (3+), 27Na (5/2+), 28Na (1+), and 29Na (3/2+). The one-body transition density matrix elements (OBDM) have been calculated using the USDA, USDB, HBUMSD, and W two-body effective interactions carried out in the sd-shell model space. The sd shell model space consists of the active 2s1/2, 1d5/2,