In this paper, we discuss the performance of Bayesian computational approaches for estimating the parameters of a logistic regression model. Markov Chain Monte Carlo (MCMC) algorithms serve as the base estimation procedure. We present two algorithms, Random Walk Metropolis (RWM) and Hamiltonian Monte Carlo (HMC), and apply both to a real data set.
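As an illustration of the RWM update for this model, here is a minimal sketch in Python; the Gaussian prior, proposal scale, and iteration count are hypothetical placeholders, not the paper's actual settings.

import numpy as np

def log_posterior(beta, X, y, prior_sd=10.0):
    """Log-posterior of logistic regression with independent N(0, prior_sd^2) priors."""
    eta = X @ beta
    # log-likelihood: sum of y*eta - log(1 + exp(eta)), computed stably
    loglik = np.sum(y * eta - np.logaddexp(0.0, eta))
    logprior = -0.5 * np.sum((beta / prior_sd) ** 2)
    return loglik + logprior

def random_walk_metropolis(X, y, n_iter=5000, step=0.1, seed=0):
    """Random Walk Metropolis with a spherical Gaussian proposal."""
    rng = np.random.default_rng(seed)
    beta = np.zeros(X.shape[1])
    logp = log_posterior(beta, X, y)
    samples = np.empty((n_iter, X.shape[1]))
    for i in range(n_iter):
        proposal = beta + step * rng.standard_normal(beta.size)
        logp_new = log_posterior(proposal, X, y)
        if np.log(rng.random()) < logp_new - logp:  # Metropolis accept/reject
            beta, logp = proposal, logp_new
        samples[i] = beta
    return samples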
This study represents an attempt to develop a model that demonstrates the relationship between HRM practices, governmental support, and the organizational performance of small businesses. Furthermore, this study attempts to unpack the so-called "Black Box" to clarify the ambiguous relationship between HRM practices and organizational performance by considering the pathway of logical sequential influence. The model of this study consists of two parts: the first part is devoted to examining the causal relationships among HRM practices, employees' outcomes, and organizational performance; the second part assesses the direct relationship between governmental support and organizational performance. It is hypothesized that HRM practices positively influence …
Traumatic Brain Injury (TBI) is still considered a leading worldwide cause of mortality and morbidity. Over the last decades, different modalities have been used to assess severity and outcome, including the Glasgow Coma Scale (GCS), imaging modalities, and even genetic polymorphisms; however, determining the prognosis of TBI victims remains challenging, requiring the emergence of more accurate and more applicable tools to supersede the older modalities.
Iris research is focused on developing techniques for identifying and locating relevant biometric features, accurate segmentation, and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system when used as a real-time system. This paper introduces a novel parameterized technique for iris segmentation. The method is based on a number of steps, starting from converting the grayscale eye image to a bit-plane representation and selecting the most significant bit planes, followed by a parameterization of the iris location, resulting in an accurate segmentation of the iris from the original …
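As a sketch of the bit-plane decomposition step described above (the stand-in image and the choice of which planes count as "most significant" are illustrative assumptions, not the paper's parameters):

import numpy as np

def bit_planes(gray_img: np.ndarray) -> np.ndarray:
    """Decompose an 8-bit grayscale image into its 8 binary bit planes.

    Returns an array of shape (8, H, W); plane 7 is the most significant bit.
    """
    planes = np.stack([(gray_img >> b) & 1 for b in range(8)])
    return planes.astype(np.uint8)

# Illustrative selection of the two most significant planes (bits 7 and 6),
# which carry most of the coarse intensity structure used to locate the iris.
img = np.random.randint(0, 256, (240, 320), dtype=np.uint8)  # stand-in eye image
msb_planes = bit_planes(img)[6:8]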
The purpose of this study is to discuss the effect of corporate governance on tax planning in a sample of Iraqi industrial shareholding companies listed on the Iraq Stock Exchange (ISE) for the period from 2008 to 2012. The study used the experimental research approach. It also used the Modified Jones Model (1995) to measure the extent to which corporate governance is practised in the sample companies. To measure tax planning, it used the model adopted in tax studies and research on tax reform, analyzing the companies' financial statements to arrive at a measurement of the two study variables. …
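For reference, the Modified Jones Model (Dechow, Sloan and Sweeney, 1995) invoked above is commonly specified as the following cross-sectional regression, with the discretionary component taken as the residual; the notation here follows the standard accruals literature rather than the paper itself:

\frac{TA_{it}}{A_{i,t-1}} = \alpha_1 \frac{1}{A_{i,t-1}} + \alpha_2 \frac{\Delta REV_{it} - \Delta REC_{it}}{A_{i,t-1}} + \alpha_3 \frac{PPE_{it}}{A_{i,t-1}} + \varepsilon_{it}

where TA denotes total accruals, A lagged total assets, \Delta REV the change in revenues, \Delta REC the change in receivables, and PPE gross property, plant and equipment.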
Statisticians often use regression models, whether parametric, nonparametric, or semi-parametric, to represent economic and social phenomena, since these models explain the relationships between the variables involved. One parametric technique is conic projection regression, which helps to find the most important slopes for multidimensional data, using prior information about the regression parameters to obtain the most efficient estimator. Algorithms written in the R language simplify this complex method; they are based on quadratic programming, which makes the estimates more accurate.
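As a rough illustration of casting regression estimation as a quadratic program (written in Python rather than R, with a hypothetical non-negativity restriction standing in for the prior information on the slopes):

import numpy as np
from scipy import optimize

# Least squares as a quadratic program:
#   minimize 0.5 * b' (X'X) b - (X'y)' b   subject to  b >= 0
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 3))
y = X @ np.array([1.5, 0.0, 2.0]) + 0.1 * rng.standard_normal(100)

P = X.T @ X
q = X.T @ y
res = optimize.minimize(
    fun=lambda b: 0.5 * b @ P @ b - q @ b,
    x0=np.zeros(3),
    jac=lambda b: P @ b - q,
    bounds=[(0, None)] * 3,   # b >= 0: illustrative prior restriction
    method="L-BFGS-B",
)
print(res.x)                  # constrained estimate of the slopes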
Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important part of cryptography with secret-key algorithms, is the key: it plays a central role in achieving a higher level of secure communication. To increase the level of security in any communication, both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weak due to its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong. A stronger encryption key enhances the security of the Triple Data Encryption Standard algorithm. This paper proposes a combination of two efficient encryption algorithms to …
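One common way to strengthen key generation for a cipher such as 3DES is to derive the key from a shared secret through a key-derivation function; a minimal sketch follows, where the secret, salt, and iteration count are placeholders and the whole snippet is a generic illustration, not the paper's proposed scheme:

import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

salt = os.urandom(16)                    # random per-key salt
kdf = PBKDF2HMAC(
    algorithm=hashes.SHA256(),
    length=24,                           # 24 bytes = three 64-bit DES keys
    salt=salt,
    iterations=600_000,                  # slows brute-force key search
)
key_3des = kdf.derive(b"shared-secret")  # placeholder shared secret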
Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed for analyzing this type of data.
In this research, the focus was on grouping and analyzing these data, as cluster analysis plays an important role in identifying and grouping co-expressed sub-profiles over time and in fitting them with the nonparametric smoothing cubic B-spline model, which provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope. It is also more flexible and can capture more complex patterns and fluctuations in the data.
The balanced longitudinal data profiles were compiled into subgroups …
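As a sketch of the cubic smoothing-spline fit described above (the time grid, noise level, and smoothing parameter are illustrative, not the study's settings):

import numpy as np
from scipy.interpolate import UnivariateSpline

t = np.linspace(0, 10, 50)               # stand-in observation times
y = np.sin(t) + 0.2 * np.random.default_rng(2).standard_normal(t.size)

# k=3 gives a cubic spline with continuous first and second derivatives;
# s controls the trade-off between smoothness and fidelity to the data.
spline = UnivariateSpline(t, y, k=3, s=1.0)
smooth_curve = spline(t)
slope = spline.derivative(1)(t)           # continuous first derivative
curvature = spline.derivative(2)(t)       # continuous second derivative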
A two-time-step stochastic multi-variable, multi-site hydrological data forecasting model was developed and verified using a case study. The philosophy of this model is to use the cross-variable correlations, cross-site correlations, and two-step time-lag correlations simultaneously to estimate the parameters of the model, which are then modified using the mutation process of a genetic algorithm optimization model. The objective function to be minimized is the Akaike test value. The case study covers four variables and three sites. The variables are the monthly air temperature, humidity, precipitation, and evaporation; the sites are Sulaimania, Chwarta, and Penjwin, located in northern Iraq. The model performance was …
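For reference, the Akaike Information Criterion minimized above is conventionally defined as

\mathrm{AIC} = 2k - 2\ln\hat{L}

where k is the number of estimated parameters and \hat{L} the maximized value of the likelihood, so the criterion penalizes model complexity while rewarding fit.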