Region-based association analysis has been proposed to capture the collective behavior of sets of variants by testing the association of each set, rather than of individual variants, with the disease. Such an analysis typically involves a list of unphased multi-locus genotypes with potentially sparse frequencies in cases and controls. To tackle the problem of sparse distributions, a two-stage approach was proposed in the literature: in the first stage, haplotypes are computationally inferred from genotypes and then co-classified; in the second stage, the association analysis is performed on the inferred haplotype groups. If a haplotype is unevenly distributed between the case and control samples, it is labeled a risk haplotype. Unfortunately, the in-silico reconstruction of haplotypes may produce a proportion of false haplotypes that hamper the detection of rare but true haplotypes. To address this issue, we propose an alternative approach: in Stage 1, we cluster genotypes instead of inferred haplotypes and estimate the risk genotypes based on a finite mixture model; in Stage 2, we infer risk haplotypes from the risk genotypes identified in the previous stage. To estimate the finite mixture model, we propose an EM algorithm with a novel data-partition-based initialization. The performance of the proposed procedure is assessed by simulation studies and a real data analysis. Compared with the existing multiple Z-test procedure, we find that the power of genome-wide association studies can be increased by using the proposed procedure.
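The abstract does not give the EM updates; the sketch below shows an EM algorithm for a two-component one-dimensional Gaussian mixture with a simple partition-based initialization (splitting the sample at its median). The mixture family, component count, and all variable names are illustrative assumptions standing in for the authors' finite mixture model, not their actual method.

```python
import numpy as np

def em_two_component(x, n_iter=200, tol=1e-8):
    """EM for a 2-component 1-D Gaussian mixture (illustrative stand-in
    for a finite mixture model used to separate risk genotypes)."""
    x = np.asarray(x, dtype=float)

    # Partition-based initialization: split the sample at its median and
    # use each half's moments as the starting component parameters.
    lo, hi = x[x <= np.median(x)], x[x > np.median(x)]
    mu = np.array([lo.mean(), hi.mean()])
    var = np.array([lo.var() + 1e-6, hi.var() + 1e-6])
    pi = np.array([lo.size, hi.size]) / x.size

    ll_old = -np.inf
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point
        dens = (pi / np.sqrt(2 * np.pi * var)
                * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)))
        resp = dens / dens.sum(axis=1, keepdims=True)

        # M-step: update mixing weights, means, and variances
        nk = resp.sum(axis=0)
        pi = nk / x.size
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6

        ll = np.log(dens.sum(axis=1)).sum()
        if ll - ll_old < tol:
            break
        ll_old = ll
    return pi, mu, var, resp

# Example use on synthetic data drawn from two components.
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0, 1, 300), rng.normal(4, 1, 100)])
weights, means, variances, resp = em_two_component(data)
```

In a Stage 1 of the kind described, observations whose posterior responsibility for the "risk" component exceeds a chosen threshold would be flagged as risk genotypes.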
Shatt Al-Hilla is considered one of the important branches of the Euphrates River, supplying irrigation water to millions of dunams of planted area. It is important to control the velocity and water level along the river in order to maintain the level required for easily diverting water to the branches located along it. Therefore, in this research a numerical model was developed to simulate gradually varied unsteady flow in Shatt Al-Hilla. The study solves the continuity and momentum (Saint-Venant) equations numerically, using the Galerkin finite element method, to predict the hydraulic characteristics of the river. A computer program was designed and built in the FORTRAN-77 programming language. Fifty kilometers was considered …
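For reference, the one-dimensional Saint-Venant equations that such a model discretizes with the Galerkin finite element method are, in their standard form (A flow area, Q discharge, h water depth, q lateral inflow per unit length, S0 bed slope, Sf friction slope); the paper's exact formulation and choice of dependent variables may differ:

```latex
\begin{aligned}
&\text{Continuity:} && \frac{\partial A}{\partial t} + \frac{\partial Q}{\partial x} = q,\\[4pt]
&\text{Momentum:} && \frac{\partial Q}{\partial t}
  + \frac{\partial}{\partial x}\!\left(\frac{Q^{2}}{A}\right)
  + gA\,\frac{\partial h}{\partial x}
  - gA\,\bigl(S_{0} - S_{f}\bigr) = 0.
\end{aligned}
```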
The estimation of the reliability function depends on the accuracy of the data used to estimate the parameters of the probability distribution. Because some data sets are skewed, estimating the parameters and calculating the reliability function in the presence of such skewness requires a distribution that is flexible enough to handle the data. This is the case for the data of Diyala Company for Electrical Industries, where positive skewness was observed in the data collected from the Power and Machinery Department. This called for a distribution that accommodates such data and for methods that address this problem and lead to accurate estimates of the reliability function, …
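The abstract is cut off before naming the distribution that was adopted; as a purely illustrative sketch, the snippet below fits a Weibull distribution (a common choice for positively skewed lifetime data) to a sample of failure times and evaluates the reliability function R(t) = 1 − F(t). The distribution choice, data, and variable names are assumptions, not taken from the paper.

```python
import numpy as np
from scipy import stats

# Hypothetical failure-time data (hours); positively skewed by construction.
failure_times = np.array([120., 135., 150., 160., 175., 190., 210., 240., 300., 420.])

# Fit a two-parameter Weibull (location fixed at 0) by maximum likelihood.
shape, loc, scale = stats.weibull_min.fit(failure_times, floc=0)

# Reliability function R(t) = 1 - F(t), i.e. the survival function of the fit.
t = np.linspace(0, 500, 6)
reliability = stats.weibull_min.sf(t, shape, loc=loc, scale=scale)
print(dict(zip(t.round(0), reliability.round(3))))
```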
The phenomenon of poverty is a substantial topic that shapes the future of societies and governments and the way they deal with education, health, and the economy; poverty sometimes takes multidimensional forms through education and health. This research aims to study multidimensional poverty in Iraq using penalized regression methods to analyze big data sets from demographic surveys collected by the Central Statistical Organization in Iraq. We choose a classical penalized regression method, ridge regression, and another penalized method, the Smooth Integration of Counting and Absolute Deviation (SICA), to analyze big data sets related to the different forms of poverty in Iraq. Euclidean distance …
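SICA is not available in mainstream Python libraries, so the sketch below illustrates only the ridge (L2-penalized) regression half of the comparison on synthetic data; the data, penalty value, and variable names are assumptions made for illustration.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical survey-style design matrix: many correlated indicators
# (education, health, living standards, ...) and a poverty-score response.
X = rng.normal(size=(1000, 50))
beta_true = np.zeros(50)
beta_true[:5] = [1.5, -1.0, 0.8, 0.5, -0.4]
y = X @ beta_true + rng.normal(scale=0.5, size=1000)

# Standardize, then fit ridge regression; alpha controls the L2 penalty strength.
X_std = StandardScaler().fit_transform(X)
model = Ridge(alpha=10.0).fit(X_std, y)

print("largest |coefficients|:", np.sort(np.abs(model.coef_))[-5:].round(3))
```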
In light of the globalization that surrounds the business environment and whose impact is reflected on industrial economic units, the whole world has become a single market whose variables affect all units and which is affected by the economic contribution of each economic unit in proportion to its share. The problem of this research is whether the use of Pareto analysis enables industrial economic units to diagnose the risks surrounding them; accordingly, the main objective of the research was to classify risks into internal and external types and to identify any risks that require more attention.
The research was based on the hypothesis that, when Pareto analysis is used, risks can be identified and addressed before they occur.
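As an illustration of the Pareto (80/20) screening described above, the sketch below ranks hypothetical risk categories by their recorded impact and flags those that jointly account for roughly 80% of the total as the risks requiring the most attention; the category names and figures are invented for the example.

```python
# Minimal Pareto-analysis sketch: rank risks by impact and keep the subset
# that accumulates ~80% of the total impact (the "vital few").
risks = {                      # hypothetical impact scores per risk category
    "supply interruption": 420,
    "currency fluctuation": 310,
    "equipment failure":    260,
    "regulatory change":     90,
    "staff turnover":        60,
    "minor quality defects": 40,
}

total = sum(risks.values())
cumulative, vital_few = 0.0, []
for name, impact in sorted(risks.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += impact
    vital_few.append((name, round(cumulative / total, 2)))
    if cumulative / total >= 0.80:
        break

print(vital_few)   # risks to prioritize, with cumulative share of total impact
```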
The current study aims to compare the estimates of the Rasch model's parameters obtained from missing and complete data under various ways of handling the missing data. To achieve this aim, the researcher followed these steps: preparing Philip Carter's spatial-ability test, which consists of (20) items, and administering it to a group of (250) sixth scientific stage students in the Baghdad Education Directorates of Al-Rusafa (1st, 2nd and 3rd) for the academic year (2018-2019). The researcher then relied on the single-parameter model to analyze the data and used the BILOG-MG3 software to check the hypotheses and the data and to examine their fit to the model. In addition, …
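For context, the single-parameter (Rasch) model referred to above gives the probability that a person with ability θ answers an item of difficulty b correctly as P = exp(θ − b) / (1 + exp(θ − b)). A minimal sketch, with illustrative ability and difficulty values rather than estimates from the study:

```python
import numpy as np

def rasch_prob(theta, b):
    """Rasch (one-parameter logistic) probability of a correct response."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

# Hypothetical abilities for three examinees and difficulties for four items.
theta = np.array([-1.0, 0.0, 1.5])     # person abilities (logits)
b = np.array([-0.5, 0.0, 0.8, 1.2])    # item difficulties (logits)

# Probability matrix: rows = persons, columns = items.
P = rasch_prob(theta[:, None], b[None, :])
print(P.round(3))
```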
In this paper, the finite element method is used to study the dynamic behavior of a damaged rotating composite blade. Three-dimensional finite element programs were developed using a nine-node laminated shell as the discretization element for the blade structure (the same element type is used for the damaged and undamaged structures). The analysis includes the initial-stress effect (geometric stiffness) and the other rotational effects except the Coriolis acceleration effect. The investigation covers the effects of rotation speed, aspect ratio, skew angle, pre-twist angle, radius-to-length ratio, layer lamination, and fiber orientation of the composite blade. After modeling an undamaged rotating composite blade, the work procedure was to ap…
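A dynamic analysis of this kind ultimately reduces to a generalized eigenvalue problem in which the geometric (initial-stress) stiffness, which grows with rotation speed, is added to the elastic stiffness. The sketch below shows only that final step on small made-up matrices; the matrices, their sizes, and the Ω² scaling of the geometric stiffness are illustrative assumptions, not the paper's actual shell model.

```python
import numpy as np
from scipy.linalg import eigh

# Hypothetical reduced matrices for a rotating blade model (3 DOF):
K  = np.array([[ 4., -1.,  0.],
               [-1.,  3., -1.],
               [ 0., -1.,  2.]])        # elastic stiffness
Kg = np.eye(3)                          # geometric stiffness per unit Omega^2
M  = np.eye(3)                          # mass matrix

for omega in (0.0, 0.5, 1.0):           # rotation speeds (rad/s), illustrative
    K_eff = K + omega**2 * Kg           # centrifugal stiffening raises K_eff
    eigvals = eigh(K_eff, M, eigvals_only=True)
    freqs = np.sqrt(eigvals) / (2 * np.pi)   # natural frequencies (Hz)
    print(f"Omega = {omega:>3}: natural frequencies = {np.round(freqs, 3)}")
```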
The issue of increasing the range covered by a wireless sensor network (WSN) with a restricted number of sensors is addressed using an improved cuckoo search (CS) that employs the PSO algorithm and opposition-based learning (ICS-PSO-OBL). First, the iteration is carried out by updating the old solution dimension by dimension, so that updates are independent across dimensions in the high-dimensional optimization problem. The PSO operator is then incorporated to reduce the imbalance between exploration and exploitation ability in the preference random-walk stage. Exceptional individuals are selected from the population using OBL to boost the chance of finding the optimal solution based on the fitness value. The ICS-PSO-OBL is used to maximize coverage in the WSN by converting r…
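The opposition-based learning step mentioned above evaluates, for each candidate x in bounds [lb, ub], its opposite point lb + ub − x and keeps whichever has the better fitness. A minimal sketch of that step in isolation; the bounds, fitness function, and population are illustrative assumptions rather than the paper's WSN coverage objective.

```python
import numpy as np

def opposition_based_selection(pop, fitness, lb, ub):
    """Keep, per individual, the better of x and its opposite point
    lb + ub - x (minimization).  pop: (n, d) array of candidates."""
    opposite = lb + ub - pop                       # opposite population
    f_pop = np.apply_along_axis(fitness, 1, pop)
    f_opp = np.apply_along_axis(fitness, 1, opposite)
    keep_opposite = f_opp < f_pop                  # boolean per individual
    return np.where(keep_opposite[:, None], opposite, pop)

# Illustrative use: minimize the sphere function over [-5, 5]^4.
rng = np.random.default_rng(1)
lb, ub = -5.0, 5.0
population = rng.uniform(lb, ub, size=(6, 4))
better = opposition_based_selection(population, lambda x: np.sum(x**2), lb, ub)
print(better.round(2))
```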
The purpose of this work is to concurrently analyze the UV-visible spectra of binary combinations of piroxicam and mefenamic acid using a chemometric approach. Spectral data from 73 samples (wavelengths between 200 and 400 nm) were employed to create the model. A two-layer artificial neural network model was created, with fourteen neurons in the hidden layer and two neurons in the output layer, and was trained to model the concentrations and spectra of piroxicam and mefenamic acid. For piroxicam and mefenamic acid, respectively, the Levenberg-Marquardt algorithm with feed-forward back-propagation learning produced root mean square errors of prediction of 0.1679 μg/mL and 0.1154 μg/mL, with coefficients of determination of …
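A minimal sketch of a comparable two-layer network (14 hidden neurons, 2 outputs) mapping spectra to the two analyte concentrations. scikit-learn does not provide the Levenberg-Marquardt optimizer used in the paper, so L-BFGS is substituted here, and the synthetic spectra and all names are illustrative assumptions, not the study's data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Hypothetical data: 73 mixture spectra at 201 wavelengths (200-400 nm), built as
# concentration-weighted sums of two made-up pure-component spectra plus noise.
wavelengths = np.linspace(200, 400, 201)
pure_a = np.exp(-((wavelengths - 270) / 20) ** 2)     # stand-in for piroxicam
pure_b = np.exp(-((wavelengths - 330) / 25) ** 2)     # stand-in for mefenamic acid
conc = rng.uniform(1, 10, size=(73, 2))               # concentrations in ug/mL
spectra = conc @ np.vstack([pure_a, pure_b]) + rng.normal(0, 0.01, (73, 201))

X_train, X_test, y_train, y_test = train_test_split(spectra, conc, random_state=0)

# Two-layer ANN: one hidden layer of 14 neurons, 2 outputs (one per analyte).
net = MLPRegressor(hidden_layer_sizes=(14,), solver="lbfgs",
                   max_iter=5000, random_state=0)
net.fit(X_train, y_train)

rmsep = np.sqrt(mean_squared_error(y_test, net.predict(X_test),
                                   multioutput="raw_values"))
print("RMSEP per analyte (ug/mL):", rmsep.round(3))
```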