In order to obtain a model with high significance and accuracy, it is necessary to find a method that selects the most important variables to be included in the model, especially when the data under study suffer from multicollinearity as well as high dimensionality. This research aims to compare, using simulation, methods of selecting explanatory variables and estimating the parameters of the regression model, namely Bayesian ridge regression (unbiased) and the adaptive lasso regression model. MSE was used to compare the methods.
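As a rough illustration of such a comparison (not the study's actual simulation design), the sketch below fits scikit-learn's BayesianRidge and an adaptive-style lasso (a lasso refit with weights taken from an initial fit) on simulated collinear, high-dimensional data and compares coefficient MSE; the sample size, penalties, and data-generating settings are assumptions.

```python
import numpy as np
from sklearn.linear_model import BayesianRidge, Lasso
from sklearn.metrics import mean_squared_error

# Simulated collinear, high-dimensional data (assumed settings, for illustration only)
rng = np.random.default_rng(0)
n, p = 60, 100                                   # more variables than observations
base = rng.normal(size=(n, 5))
X = base[:, rng.integers(0, 5, p)] + 0.05 * rng.normal(size=(n, p))  # strong collinearity
beta = np.zeros(p); beta[:5] = [3, -2, 1.5, 1, -1]
y = X @ beta + rng.normal(scale=0.5, size=n)

# Bayesian ridge regression
br = BayesianRidge().fit(X, y)

# Adaptive-style lasso: rescale columns by weights from the initial estimate,
# fit an ordinary lasso, then undo the scaling to recover the coefficients.
w = 1.0 / (np.abs(br.coef_) + 1e-3)
alasso = Lasso(alpha=0.05, max_iter=50_000).fit(X / w, y)
coef_alasso = alasso.coef_ / w

# Compare estimation accuracy by coefficient MSE
for name, est in [("Bayesian ridge", br.coef_), ("adaptive lasso", coef_alasso)]:
    print(name, "coefficient MSE:", mean_squared_error(beta, est))
```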
The logistic regression model is regarded as one of the important regression models and has been among the most interesting subjects in recent studies, as it plays an increasingly advanced role in statistical analysis.
Ordinary estimation methods fail in dealing with data that contain outlier values, and the presence of such values has an undesirable effect on the results.
Purpose: The concept of the complete street is one of the modern trends concerned with diversifying means of transportation and reducing the disadvantages of mechanical transportation modes. This paper discusses the role that complete streets can play in developing the urban environment in the Alyarmok District of Baghdad. Method/design/approach: The linear regression method was used to analyze the opinions of 100 respondents surveyed in the study area in order to find the relationship between the urban environment and the complete street elements. Theoretical framework: Modern trends in urban planning aim to find alternatives to traditional transportation planning policies that focus on vehicular mobility.
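A minimal sketch of how such an analysis could be set up, assuming hypothetical Likert-scale ratings of complete-street elements and an overall urban-environment score; the variable names and data below are illustrative, not the survey's.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical survey data: 100 respondents rate 3 complete-street elements
# (e.g. sidewalks, cycle lanes, greenery) on a 1-5 Likert scale.
rng = np.random.default_rng(0)
ratings = rng.integers(1, 6, size=(100, 3))
# Assumed urban-environment score, generated only so the example runs end to end
env_score = ratings @ np.array([0.5, 0.3, 0.4]) + rng.normal(0, 0.5, 100)

# Linear regression relating the urban environment to the street elements
model = LinearRegression().fit(ratings, env_score)
print("coefficients:", model.coef_, "R^2:", model.score(ratings, env_score))
```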
The ability to produce load-bearing masonry units adopting the ACI 211.1 mix design using (1:3.2:2.5) as (cement : fine aggregate : coarse aggregate) with a slump range of (25-50 mm), which can conform to the (dimension, absorption, and compressive strength) requirements of IQS 1077/1987 type A, was the main goal of the study. The ability to use a low cement content (300 kg/m3) to keep our products at market price was encouraging, since most consumption is in wall construction for low-cost buildings. The use of (10 and 20%) LECA as partial volume replacement of coarse aggregate to reduce the heavy weight of masonry blocks can also be recommended. The types of production of the load-bearing masonry units were A and B for (
Text categorization refers to the process of grouping texts or documents into classes or categories according to their content. The text categorization process consists of three phases: preprocessing, feature extraction, and classification. In comparison to the English language, only a few studies have been done to categorize and classify the Arabic language. For a variety of applications, such as text classification and clustering, Arabic text representation is a difficult task because the Arabic language is noted for its richness, diversity, and complicated morphology. This paper presents a comprehensive analysis and comparison of research from the last five years based on the dataset, year, algorithms, and the accuracy achieved.
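As a minimal illustration of the three phases on a toy Arabic corpus (the documents, labels, and the choice of TF-IDF features with a naive Bayes classifier are assumptions, not drawn from the surveyed studies):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy Arabic corpus with invented labels, for illustration only
docs = ["الاقتصاد ينمو هذا العام", "الفريق فاز بالمباراة",
        "ارتفاع الأسعار في السوق", "اللاعب سجل هدفين"]
labels = ["economy", "sports", "economy", "sports"]

# Preprocessing/feature extraction (TF-IDF) followed by classification (naive Bayes)
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(docs, labels)
print(model.predict(["المباراة انتهت بالتعادل"]))
```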
In this paper, reliable computational methods (RCMs) based on the monomial standard polynomials have been executed to solve the problem of Jeffery-Hamel flow (JHF). In addition, convenient base functions, namely Bernoulli, Euler, and Laguerre polynomials, have been used to enhance the reliability of the computational methods. Using such functions turns the problem into a set of solvable nonlinear algebraic equations that Mathematica®12 can solve. The JHF problem has been solved with the help of improved reliable computational methods (I-RCMs), and a review of the methods has been given. Published results are also used to make comparisons. As further evidence of the accuracy and dependability of the proposed methods, the maximum error remainders are computed.
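A rough sketch of the monomial-basis idea, assuming the classical dimensionless Jeffery-Hamel equation f''' + 2αRe f f' + 4α²f' = 0 with f(0)=1, f'(0)=0, f(1)=0 (an assumed standard form, not quoted from the paper), and using SciPy's fsolve in place of Mathematica; the parameters and truncation order are illustrative only.

```python
import numpy as np
from numpy.polynomial import Polynomial
from scipy.optimize import fsolve

# Assumed example parameters: half-angle alpha (radians), Reynolds number, polynomial degree
alpha, Re, N = np.deg2rad(5.0), 50.0, 8
nodes = np.linspace(0.0, 1.0, N)[1:-1]          # interior collocation points

def residuals(c):
    # Candidate solution as a monomial-basis polynomial: f(x) = sum c_k x^k
    f = Polynomial(c)
    f1, f3 = f.deriv(1), f.deriv(3)
    ode = f3(nodes) + 2*alpha*Re*f(nodes)*f1(nodes) + 4*alpha**2*f1(nodes)
    bcs = [f(0.0) - 1.0, f1(0.0), f(1.0)]        # f(0)=1, f'(0)=0, f(1)=0
    return np.concatenate([ode, bcs])

c0 = np.zeros(N + 1); c0[0] = 1.0                # start from f ≈ 1
c = fsolve(residuals, c0)                        # nonlinear algebraic system
print("f(0.5) ≈", Polynomial(c)(0.5))
```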
Sensitive and important data have increased rapidly in recent decades due to the tremendous development of networking infrastructure and communications. Securing these data becomes necessary as their volume grows; to achieve this, different cipher techniques and methods are used to ensure the security goals of integrity, confidentiality, and availability. This paper presents a proposed hybrid text cryptography method to encrypt sensitive data using different encryption algorithms, such as Caesar, Vigenère, Affine, and multiplicative ciphers. Using this hybrid text cryptography method aims to make the encryption process more secure and effective. The hybrid text cryptography method depends on a circular queue.
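A minimal sketch of one way a circular queue could cycle these classical ciphers over a message; the keys, rotation rule, and letter handling below are assumptions for illustration, not the proposed method's exact design.

```python
from collections import deque

# Classical ciphers over lowercase Latin letters (keys are assumed examples)
A = 26
def caesar(ch, k=3):          return chr((ord(ch) - 97 + k) % A + 97)
def multiplicative(ch, k=7):  return chr(((ord(ch) - 97) * k) % A + 97)
def affine(ch, a=5, b=8):     return chr(((ord(ch) - 97) * a + b) % A + 97)
def vigenere(ch, key="key", i=0):
    return chr((ord(ch) - 97 + ord(key[i % len(key)]) - 97) % A + 97)

def hybrid_encrypt(plaintext):
    # Circular queue of ciphers: the front cipher encrypts the next letter,
    # then the queue rotates so the ciphers take turns.
    queue = deque(["caesar", "vigenere", "affine", "multiplicative"])
    out = []
    for i, ch in enumerate(plaintext.lower()):
        if not ch.isalpha():
            out.append(ch)
            continue
        cipher = queue[0]
        if cipher == "caesar":       out.append(caesar(ch))
        elif cipher == "vigenere":   out.append(vigenere(ch, i=i))
        elif cipher == "affine":     out.append(affine(ch))
        else:                        out.append(multiplicative(ch))
        queue.rotate(-1)
    return "".join(out)

print(hybrid_encrypt("attack at dawn"))
```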
Some degree of noise is always present in any electronic device that transmits or receives a signal. For televisions, this signal is the broadcast data transmitted over cable or received at the antenna; for digital cameras, the signal is the light that hits the camera sensor. In any case, noise is unavoidable. In this paper, electronic noise has been generated on TV-satellite images by using variable resistors connected to the transmitting cable, and the contrast of edges has been determined. The method has been applied by capturing images from a TV-satellite channel (Al-arabiya channel) with different resistors. The results show that increasing the resistance always produced higher noise.
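A small sketch of one way edge contrast could be estimated from a captured frame, using the mean gradient magnitude as an assumed proxy; the stand-in image and the measure itself are illustrative only, not the paper's exact procedure.

```python
import numpy as np

def edge_contrast(gray):
    """Mean gradient magnitude of a 2-D grayscale frame (values 0..255)."""
    gy, gx = np.gradient(gray.astype(float))
    return np.hypot(gx, gy).mean()

# Stand-in frame; in practice this would be a captured TV-satellite image
frame = np.random.default_rng(0).integers(0, 256, size=(256, 256))
print(f"edge contrast ≈ {edge_contrast(frame):.2f}")
```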
In this paper, we discuss the performance of Bayesian computational approaches for estimating the parameters of a logistic regression model. Markov Chain Monte Carlo (MCMC) algorithms were the base estimation procedure. We present two algorithms: Random Walk Metropolis (RWM) and Hamiltonian Monte Carlo (HMC). We also apply these approaches to a real data set.
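A minimal Random Walk Metropolis sketch for a logistic regression with one covariate, assuming a flat prior, a fixed step size, and simulated data rather than the paper's real data set:

```python
import numpy as np

# Simulated data: intercept plus one covariate (assumed setup, for illustration)
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
true_beta = np.array([-0.5, 1.2])
y = rng.binomial(1, 1 / (1 + np.exp(-X @ true_beta)))

def log_post(beta):
    # Log-posterior under a flat prior = logistic log-likelihood
    eta = X @ beta
    return np.sum(y * eta - np.log1p(np.exp(eta)))

beta, draws, step = np.zeros(2), [], 0.15
for _ in range(5000):
    prop = beta + step * rng.normal(size=2)        # symmetric random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(beta):
        beta = prop                                 # Metropolis accept step
    draws.append(beta)

print("posterior mean:", np.mean(draws[1000:], axis=0))  # discard burn-in
```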
In this work, an optical technique (the laser speckle technique) for measuring surface roughness was applied using the statistical properties of the speckle pattern from the point of view of computer image texture analysis. Four calibration relationships were used to cover a wide measurement range with the same laser speckle technique. The first is based on the intensity contrast of the speckle, the second on analysis of the speckle binary image, the third on the size of the speckle pattern spot, and the last on characterization of the energy feature of the gray-level co-occurrence matrices of the speckle pattern. With these calibration relationships, the surface roughness of an object's surface can be evaluated within the covered range.
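As an illustration of two of these descriptors, the sketch below computes the speckle intensity contrast and the GLCM energy feature with scikit-image's graycomatrix/graycoprops; the synthetic input and parameter choices are assumptions, not the calibration used in the work.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def speckle_contrast(img):
    """Intensity contrast C = standard deviation / mean of the speckle image."""
    img = img.astype(float)
    return img.std() / img.mean()

def glcm_energy(img, levels=256):
    """Energy feature of the gray-level co-occurrence matrix (distance 1, angle 0)."""
    glcm = graycomatrix(img, distances=[1], angles=[0], levels=levels,
                        symmetric=True, normed=True)
    return graycoprops(glcm, "energy")[0, 0]

# Synthetic stand-in speckle pattern, for illustration only
speckle = np.random.default_rng(0).integers(0, 256, (128, 128), dtype=np.uint8)
print("contrast:", speckle_contrast(speckle), "GLCM energy:", glcm_energy(speckle))
```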