Mixture experiments suffer from high correlation and linear multicollinearity between the explanatory variables because of the unit-sum constraint on the components and the interaction terms in the model, which strengthens the links between the explanatory variables; this is diagnosed by the variance inflation factor (VIF). The L-pseudo-component transformation is used to reduce the dependence between the components of the mixture.
To estimate the parameters of the mixture model, we used methods that introduce bias in order to reduce variance, namely the ridge regression method and the Least Absolute Shrinkage and Selection Operator (LASSO) method.
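As an illustration of these ideas (simulated data, not the paper's mixture experiment; all variable names and the penalty value are assumptions), the sketch below computes VIFs for a deliberately collinear design and fits the closed-form ridge estimate. LASSO has no closed form, so only its soft-thresholding building block is shown.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.1, size=n)   # nearly collinear with x1
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])
y = 2.0 * x1 - 1.0 * x3 + rng.normal(scale=0.5, size=n)

def vif(X, j):
    """VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing
    column j on the remaining columns plus an intercept."""
    A = np.column_stack([np.ones(len(X)), np.delete(X, j, axis=1)])
    beta, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
    resid = X[:, j] - A @ beta
    r2 = 1.0 - resid @ resid / ((X[:, j] - X[:, j].mean()) ** 2).sum()
    return 1.0 / (1.0 - r2)

def ridge(X, y, lam):
    """Closed-form ridge estimate (no intercept): (X'X + lam I)^{-1} X'y."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def soft_threshold(z, lam):
    """The shrinkage operator at the core of LASSO coordinate descent."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

vifs = [vif(X, j) for j in range(3)]
beta_ridge = ridge(X, y, lam=1.0)
```

The VIF of the nearly collinear column comes out large, while the independent column stays near 1, which is exactly the diagnostic the abstract refers to.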
Zygapophyseal (facet) joints are plane synovial joints located between the articular facet processes of the vertebral arch; they are freely movable gliding joints. Ten dried vertebrae from the lumbar region were used, taking L4 as a sample, to reveal the stress pathways across the joints using a finite element analysis model in the ANSYS program under different loading conditions. The results obtained from the ANSYS program are important for understanding the boundary conditions of the load analysis and the points of stress concentration, which are explained from an anatomical point of view and linked to the muscle and ligament attachments. This model can be used as a computational tool for joint biomechanics and for prosthetic implants.
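As a generic illustration of the finite-element idea behind such models (a two-element spring system, not the ANSYS vertebra model; stiffness and load values are placeholders), one assembles a global stiffness matrix, applies the boundary conditions, and solves K u = f:

```python
import numpy as np

k1, k2 = 2.0e3, 1.0e3           # element stiffnesses (N/mm), placeholders
K = np.array([[ k1,     -k1,  0.0],
              [-k1, k1 + k2,  -k2],
              [0.0,     -k2,   k2]])
f = np.array([0.0, 0.0, 10.0])  # 10 N applied at the free end node

# Fixed boundary condition at node 0: reduce the system to the free nodes.
u_free = np.linalg.solve(K[1:, 1:], f[1:])

# Element force from the relative displacement of its two nodes.
force_elem2 = k2 * (u_free[1] - u_free[0])
```

Equilibrium requires the force in the last element to equal the applied load, which is a useful sanity check on any such model.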
In this work, a joint quadrature for the numerical solution of double integrals is presented. The method combines two rules of the same precision level to form a rule of higher precision. Numerical results of the present method with a lower precision level are presented and compared with those of the existing high-precision Gauss-Legendre five-point rule in two variables, which requires the same number of function evaluations. The efficiency of the proposed method is demonstrated with numerical examples. From an application point of view, the determination of the center of gravity is given special consideration in the present scheme. A convergence analysis is provided to validate the method.
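The comparison baseline mentioned above, the five-point Gauss-Legendre product rule in two variables, can be sketched as follows (this illustrates the reference rule only, not the paper's joint quadrature):

```python
import numpy as np

def gauss2d(f, n=5):
    """Approximate the integral of f(x, y) over [-1, 1] x [-1, 1]
    with an n-point Gauss-Legendre rule in each variable."""
    nodes, weights = np.polynomial.legendre.leggauss(n)
    X, Y = np.meshgrid(nodes, nodes)
    W = np.outer(weights, weights)
    return float(np.sum(W * f(X, Y)))

# The exact integral of x^2 * y^2 over the square is (2/3)^2 = 4/9,
# and a 5-point rule is exact for polynomials up to degree 9.
approx = gauss2d(lambda x, y: x**2 * y**2)
```

The rule uses n² function evaluations, which is the "same functional evaluation" count against which the proposed joint rule is compared.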
Multiple elimination (de-multiple) is one of the seismic processing steps used to remove the effects of multiples and delineate the correct primary reflectors. Applying normal moveout to flatten the primaries is the basis for eliminating multiples after transforming the data to the frequency-wavenumber (f-k) domain: the flattened primaries align with the zero axis of the f-k domain, while other events (multiples and random noise) are distributed elsewhere. A dip filter applied to pass the aligned data and reject the others then separates primaries from multiples once the data are transformed back from the f-k domain to the time-distance domain. For that reason, a suggested name for this technique is the normal-moveout frequency-wavenumber (NMO-f-k) domain method.
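A hedged sketch of the f-k dip-filter idea: after NMO flattening, primaries map near wavenumber k = 0, so a mask passing a narrow band around k = 0 rejects dipping events. The mask width and array sizes below are assumptions, not the paper's parameters.

```python
import numpy as np

def fk_dip_filter(section, k_pass=1):
    """section: 2D array (time samples x traces). Pass only the k_pass
    lowest integer wavenumbers around k = 0; reject dipping events."""
    n_tr = section.shape[1]
    fk = np.fft.fft2(section)
    k_index = np.abs(np.fft.fftfreq(n_tr) * n_tr)   # 0, 1, ..., n/2, ...
    fk *= (k_index <= k_pass)[np.newaxis, :]
    return np.real(np.fft.ifft2(fk))

# A flattened (NMO-corrected) primary is constant across traces, so it sits
# exactly at k = 0 and passes through the filter unchanged.
flat = np.tile(np.sin(np.linspace(0.0, 6.0, 64))[:, None], (1, 16))
passed = fk_dip_filter(flat, k_pass=0)
```

Any event with residual moveout spreads across nonzero wavenumbers and is attenuated by the same mask.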
There are many face recognition techniques that compare the desired face image with a set of face images stored in a database. Most of these techniques fail if the face images are corrupted by high-density noise, so a robust method is needed to recognize a face image corrupted by such noise. In this work, a face recognition algorithm is suggested that combines a de-noising filter with PCA. Many studies have shown that PCA can cope with noisy images and reduce dimensionality; however, when face images are exposed to heavy noise, PCA alone is ineffective at removing it, and adding a strong filter helps to improve the result.
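A minimal sketch of such a combination, assuming a median filter as the de-noising stage (well suited to high-density impulse noise) and PCA computed via SVD; the image sizes and component count are illustrative, not the paper's settings.

```python
import numpy as np

def median_denoise(img, size=3):
    """Simple median filter: effective against salt-and-pepper noise."""
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.median(padded[i:i + size, j:j + size])
    return out

def pca_fit(X, n_components):
    """PCA via SVD: X is (n_samples, n_features); returns mean, components."""
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:n_components]

def pca_project(X, mean, components):
    return (X - mean) @ components.T

# A single impulse ("salt") pixel is removed entirely by the median filter.
noisy = np.zeros((8, 8))
noisy[4, 4] = 255.0
clean = median_denoise(noisy)

rng = np.random.default_rng(0)
faces = rng.normal(size=(10, 64))         # 10 flattened 8x8 "face" vectors
mean, comps = pca_fit(faces, n_components=3)
coded = pca_project(faces, mean, comps)
```

In the full pipeline, recognition would compare `coded` vectors of a probe image against those of the database images, e.g. by nearest neighbour.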
This research aimed to predict the permanent deformation (rutting) in conventional and rubberized asphalt mixes under repeated load conditions using the Finite Element Method (FEM). A three-dimensional (3D) model was developed to simulate the Wheel Track Testing (WTT) loading. The study was conducted using the Abaqus/Standard finite element software. The pavement slab was simulated using a nonlinear creep (time-hardening) model at 40 °C. The responses of the viscoplastic model under the trapezoidal amplitude of moving wheel loadings were determined for different speeds and numbers of cycles. The results indicated that increasing the wheel speed from 0.5 km/h to 1.0 km/h decreased the rut depth by about 22% and 24% in the conventional and rubberized mixes, respectively.
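For reference, the time-hardening creep law behind such models gives the creep strain rate as d(eps)/dt = A·sigma^n·t^m; at constant stress it integrates to the closed form below. The material constants here are placeholders, not the paper's calibrated values.

```python
def creep_strain(sigma, t, A=1e-6, n=0.8, m=-0.5):
    """Integrated time-hardening creep strain at constant stress sigma:
    eps_cr = A * sigma**n * t**(m+1) / (m+1), valid for m > -1.
    A, n, m are illustrative placeholders."""
    return A * sigma**n * t ** (m + 1) / (m + 1)
```

With m = -0.5 the accumulated strain grows like sqrt(t), i.e. the deformation rate decays with time, which is the hardening behaviour the model name refers to.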
This study investigates the use of a Taguchi design to estimate the minimum corrosion rate of mild steel in a cooling tower that uses saline solutions of different concentrations. The experiments were set up on the basis of Taguchi's L16 orthogonal array. The runs were carried out under different conditions of inlet saline-solution concentration, temperature, and flow rate. The signal-to-noise ratio and ANOVA were used to quantify the impact of the cooling-tower operating conditions on the corrosion rate. A regression model was fitted and optimized to identify the optimum levels of the operating parameters, which were found to be 13% NaCl, 35 °C, and 1 L/min. A confirmation run was also carried out to establish the p…
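When the response is to be minimized, as with a corrosion rate, Taguchi analysis uses the "smaller-the-better" signal-to-noise ratio, sketched below (the replicate values are illustrative):

```python
import numpy as np

def sn_smaller_better(y):
    """Smaller-the-better S/N ratio: -10 * log10(mean(y_i^2)).
    A larger S/N means a lower, more consistent response."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y**2))
```

The level of each factor that maximizes the mean S/N across the L16 runs is taken as its optimum setting, which is how values such as 13% NaCl, 35 °C, and 1 L/min are identified.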
The objective of the study is to determine which has the better predictive ability, the logistic regression model or the linear discriminant function, first using the original data and then using the principal components to reduce the dimensionality of the variables. The data are from the socio-economic survey of the family for Baghdad province in 2012 and comprise a sample of 615 observations with 13 variables, 12 of which are explanatory; the dependent variable is the number of workers and unemployed. A comparison of the two methods was conducted, and it showed that the logistic regression model is better than the linear discriminant function.
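The two competing classifiers can be sketched side by side on simulated data (not the survey data; sample sizes and class means are assumptions): logistic regression fitted by gradient ascent on the log-likelihood, and Fisher's linear discriminant in closed form.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
X0 = rng.normal(0.0, 1.0, (n, 2))   # class 0
X1 = rng.normal(1.5, 1.0, (n, 2))   # class 1
X = np.vstack([X0, X1])
y = np.r_[np.zeros(n), np.ones(n)]

def fit_logistic(X, y, lr=0.1, steps=2000):
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w += lr * Xb.T @ (y - p) / len(y)   # gradient ascent step
    return w

def lda_direction(X, y):
    """Fisher discriminant direction: w = Sw^{-1} (mu1 - mu0)."""
    m0, m1 = X[y == 0].mean(0), X[y == 1].mean(0)
    Sw = np.cov(X[y == 0].T) + np.cov(X[y == 1].T)
    return np.linalg.solve(Sw, m1 - m0)

w_log = fit_logistic(X, y)
w_lda = lda_direction(X, y)

Xb = np.column_stack([np.ones(len(X)), X])
acc_log = np.mean((1.0 / (1.0 + np.exp(-Xb @ w_log)) > 0.5) == y)
```

The practical difference the study measures is predictive accuracy on held-out data; logistic regression makes no normality assumption, which is one common reason it outperforms the discriminant function.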
A group acceptance sampling plan for testing products was designed for the case where the lifetime of an item follows a log-logistic distribution. The minimum number of groups (k) required for a given group size and acceptance number is determined when various values of the consumer's risk and the test termination time are specified. All results for these sampling plans and the probability of acceptance are presented in tables.
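The acceptance probability underlying such tables can be sketched as follows, under the common group-plan convention that the lot is accepted only if every one of the k groups of size r has at most c failures by the termination time t0 (the parameter names and values are illustrative):

```python
from math import comb

def loglogistic_cdf(t, alpha, beta):
    """P(lifetime <= t) for a log-logistic(alpha, beta) distribution;
    alpha is the scale (median) and beta the shape parameter."""
    return 1.0 / (1.0 + (t / alpha) ** (-beta))

def p_accept(k, r, c, t0, alpha, beta):
    """Lot acceptance probability: each of the k groups of size r must
    show at most c failures by time t0 (binomial within each group)."""
    p = loglogistic_cdf(t0, alpha, beta)
    group_ok = sum(comb(r, i) * p**i * (1 - p) ** (r - i)
                   for i in range(c + 1))
    return group_ok ** k

pa = p_accept(k=4, r=5, c=1, t0=1.0, alpha=2.0, beta=2.0)
```

Tabulating the smallest k for which the acceptance probability falls below the stated consumer's risk, for each (r, c, t0) combination, reproduces the structure of the tables the abstract describes.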