Region-based association analysis has been proposed to capture the collective behavior of sets of variants by testing the association of each set, rather than individual variants, with the disease. Such an analysis typically involves a list of unphased multiple-locus genotypes with potentially sparse frequencies in cases and controls. To tackle the problem of the sparse distribution, a two-stage approach was proposed in the literature: in the first stage, haplotypes are computationally inferred from genotypes, followed by a haplotype co-classification; in the second stage, the association analysis is performed on the inferred haplotype groups. If a haplotype is unevenly distributed between the case and control samples, it is labeled a risk haplotype. Unfortunately, the in-silico reconstruction of haplotypes may produce a proportion of false haplotypes that hamper the detection of rare but true haplotypes. Here, to address this issue, we propose an alternative approach: in Stage 1, we cluster genotypes instead of inferred haplotypes and estimate the risk genotypes based on a finite mixture model; in Stage 2, we infer risk haplotypes from the risk genotypes identified in the previous stage. To estimate the finite mixture model, we propose an EM algorithm with a novel data-partition-based initialization. The performance of the proposed procedure is assessed by simulation studies and a real data analysis. Compared to the existing multiple Z-test procedure, we find that the power of genome-wide association studies can be increased by using the proposed procedure.
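The core machinery here is an EM algorithm for a finite mixture with a partition-based initializer. The paper's model operates on multi-locus genotypes; as a minimal stand-in, the sketch below runs EM on a two-component 1-D Gaussian mixture and initializes by splitting the data at the median (a simple data-partition scheme, not the authors' actual initializer).

```python
import numpy as np

def em_two_component(x, n_iter=200, tol=1e-8):
    """EM for a two-component 1-D Gaussian mixture.

    Initialization is a simple data-partition scheme (split at the
    median), standing in for the paper's partition-based initializer.
    """
    x = np.asarray(x, dtype=float)
    # Partition-based initialization: split the data at the median.
    lo, hi = x[x <= np.median(x)], x[x > np.median(x)]
    mu = np.array([lo.mean(), hi.mean()])
    var = np.array([lo.var() + 1e-6, hi.var() + 1e-6])
    pi = np.array([0.5, 0.5])
    ll_old = -np.inf
    for _ in range(n_iter):
        # E-step: component densities and responsibilities.
        dens = pi * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) \
               / np.sqrt(2 * np.pi * var)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: update mixing weights, means, variances.
        nk = resp.sum(axis=0)
        pi = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
        ll = np.log(dens.sum(axis=1)).sum()
        if ll - ll_old < tol:   # log-likelihood has converged
            break
        ll_old = ll
    return pi, mu, var
```

The partition-based start matters because random initializations of EM on sparse genotype data can land in poor local optima; splitting on an observable ordering gives a deterministic, data-driven starting point.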
A data-centric technique for data aggregation, based on a fuzzy clustering algorithm combined with a Voronoi diagram and called the modified Voronoi Fuzzy Clustering Algorithm (VFCA), is presented in this paper. In the modified algorithm, the sensed area is divided into a number of Voronoi cells by applying a Voronoi diagram, and these cells are clustered by the fuzzy C-means (FCM) method to reduce the transmission distance. An appropriate cluster head (CH) is then elected for each cluster. Three parameters are used in this election: the energy, the distance between the CH and its neighboring sensors, and the packet-loss value. Furthermore, data aggregation is employed in each CH to reduce the amount of data transmission, which le…
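The clustering step of the pipeline above is standard fuzzy C-means. As a minimal sketch (operating on raw 2-D sensor coordinates rather than Voronoi cells, and omitting the CH-election and aggregation stages), the FCM update alternates between recomputing centers from fuzzy memberships and memberships from distances:

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, seed=0):
    """Basic fuzzy C-means on points X (n, d).

    Returns cluster centers (c, d) and the membership matrix U (n, c),
    where each row of U sums to 1. m > 1 is the fuzzifier.
    """
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)          # fuzzy memberships sum to 1
    for _ in range(n_iter):
        Um = U ** m
        # Centers: membership-weighted means of the points.
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # Distances from every point to every center.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        # Membership update: u_ij proportional to d_ij^(-2/(m-1)).
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U
```

In a VFCA-style deployment, the items being clustered would be Voronoi cells of the sensed area, and the resulting soft memberships would feed the CH election; that wiring is not shown here.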
In this paper, a statistical model of the Saudi financial market is built using GARCH models, which account for price volatility during trading periods. The effect of the distribution of the random errors of the time series on the accuracy of the statistical model was also studied; two distributions were considered, the normal distribution and Student's t distribution. Application to measured data showed that the best model for the Saudi market is the GARCH(1,1) model with t-distributed random errors.
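The selected specification, GARCH(1,1) with Student-t innovations, has the variance recursion sigma²_t = ω + α·r²_{t-1} + β·σ²_{t-1}. A minimal simulator (illustrative parameter values, not the paper's fitted estimates; in practice one would estimate ω, α, β by maximum likelihood, e.g. with the `arch` package):

```python
import numpy as np

def simulate_garch11(omega, alpha, beta, n, nu=8, seed=0):
    """Simulate a GARCH(1,1) process with standardized Student-t shocks.

    sigma2_t = omega + alpha * r_{t-1}**2 + beta * sigma2_{t-1}
    Requires alpha + beta < 1 for a finite unconditional variance.
    """
    rng = np.random.default_rng(seed)
    r = np.empty(n)
    sigma2 = np.empty(n)
    # Start at the unconditional variance omega / (1 - alpha - beta).
    sigma2[0] = omega / (1.0 - alpha - beta)
    # Scale t draws to unit variance so sigma2 is the conditional variance.
    z = rng.standard_t(nu, n) * np.sqrt((nu - 2) / nu)
    r[0] = np.sqrt(sigma2[0]) * z[0]
    for t in range(1, n):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
        r[t] = np.sqrt(sigma2[t]) * z[t]
    return r, sigma2
```

The t innovations are what the paper's distribution comparison is about: with heavy-tailed shocks, a normal-error GARCH underestimates the probability of extreme returns even when the variance dynamics are correct.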
This paper applies the principal component analysis (PCA) method to dimensionality reduction via linear combinations in digital image processing and analysis. PCA is a statistical technique that compresses a multivariate data set of inter-correlated variables into a data set of uncorrelated linear combinations, while ensuring the least possible loss of useful information. The method was applied to a group of satellite images of an area in the province of Basra covering the mouth of the Tigris and Euphrates rivers at the Shatt al-Arab.
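The "uncorrelated linear combinations with least loss of information" description maps directly onto an SVD of the centered data matrix. A minimal sketch (rows would be pixels, columns spectral bands in the satellite-image setting; the data below is synthetic):

```python
import numpy as np

def pca_reduce(X, k):
    """Project rows of X (n samples, p variables) onto the top-k
    principal components via SVD of the centered data.

    Returns scores (n, k), components (k, p), and the fraction of
    total variance retained by the k components.
    """
    Xc = X - X.mean(axis=0)                 # center each variable (band)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:k].T                  # uncorrelated linear combinations
    explained = (S[:k] ** 2).sum() / (S ** 2).sum()
    return scores, Vt[:k], explained
```

For multiband imagery, each image band is flattened into a column before calling this, and the first few score images typically carry most of the scene variance, which is the "least possible loss" guarantee of PCA.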
The purchase of a home and access to housing are among the most important requirements for an individual's life and living stability. House prices in general, and in Baghdad in particular, are affected by several factors, including the floor area of the house, the age of the house, the neighborhood in which the housing is located, and the availability of basic services. A state space model (SSM) was used to model house prices over the period 2000 to 2018 and to forecast them until 2025. The research is concerned with demonstrating the importance of this model compared to the models commonly used in time-series analysis, after obtaining the…
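A state space model for a univariate price series is typically filtered with the Kalman recursions. As a minimal sketch (the simplest SSM, a local level model with assumed noise variances, not the paper's fitted specification), the filter tracks the latent level behind noisy observations:

```python
import numpy as np

def local_level_filter(y, sigma_eps2, sigma_eta2, a0=0.0, p0=1e7):
    """Kalman filter for the local level state space model:

        y_t = mu_t + eps_t,        eps_t ~ N(0, sigma_eps2)
        mu_{t+1} = mu_t + eta_t,   eta_t ~ N(0, sigma_eta2)

    Returns the filtered estimates of the level mu_t. The diffuse
    prior (large p0) lets the first observation dominate the start.
    """
    a, p = a0, p0                       # predicted state mean and variance
    filtered = []
    for yt in y:
        f = p + sigma_eps2              # prediction-error variance
        k = p / f                       # Kalman gain
        a = a + k * (yt - a)            # filtered state mean
        p = p * (1 - k) + sigma_eta2    # next predicted state variance
        filtered.append(a)
    return np.array(filtered)
```

Multi-year forecasts like the paper's 2018-to-2025 horizon come from iterating the transition equation forward from the last filtered state; for the local level model that forecast is flat at the last level, while richer SSMs (e.g. local linear trend) extrapolate a slope.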
The aim of this research is to describe the concept of risk, its types, and methods of measuring it, and to clarify the impact of these risks on the expected cash flow statement and on the preparation of a target cash flow statement that takes these risks into consideration. Because the local economic environment is exposed to many risks, this statement will be predictive, helping the economic unit make administrative decisions, especially decisions related to operational, investment, and financing activities. The research problem is therefore based on the fact that most local economic units prepare the cash flow statement on an actual basis rather than a discretionary basis (bud…
The research aims to measure and analyze the state of liquidity in Rasheed Bank and to determine its impact on risk and return, in order to assess how efficiently the bank manages its liquidity and employs it in profitable investment areas. It also analyzes the match between the liquidity gap and the balance-sheet gap (sensitive to interest rates), and how net interest is affected by price changes, through adoption of the maturity-ladder approach to assets and liabilities recommended by the Central Bank of Iraq. This is an important and vital aspect of commercial bank management: when the bank's available resources are commonly used optimally, it means that bank management is…
The disposal of textile effluents into surface water bodies is a critical issue, since these effluents can negatively impact such bodies due to the dyes in their composition. Biological remediation methods such as constructed wetlands are more cost-effective and environmentally friendly than traditional methods. The ability of vertical subsurface-flow constructed wetland units to treat simulated wastewater polluted with Congo red dye was studied in this work. The units were packed with a waterworks-sludge bed, either unplanted or planted with Phragmites australis and Typha domingensis. The efficacy of the units was evaluated by monitoring DO, temperature, COD…
Iraq suffers from a shortage of water-resource supply because the headwaters of its rivers lie outside its borders and upstream countries control the quantities of flowing water; pressure on the available water is further increased by population growth and by failure to adopt the principle of rationalization, with misuse, wastage, and the lack of a strategic vision for treating and managing water use in accordance with its economic implications. This is reflected in water security and subsequently in national and food security, while the use of water resources is a top development priority in different countries of the world because of the importance of water to the security of indivi…
The 3D electro-Fenton technique is, due to its high efficiency, one of the technologies suggested for eliminating organic pollutants from wastewater. The type of particle electrode used in the 3D electro-Fenton process is one of the most crucial variables because of its effect on the formation of reactive species and as the source of iron ions. The electrolytic cell in the current study consisted of a graphite anode, a carbon fiber (CF) cathode modified with graphene, and iron foam particles as the third electrode. A response surface methodology (RSM) approach was used to optimize the 3D electro-Fenton process. The RSM results revealed that the quadratic model has a high R² of 99.05%. At 4 g L⁻¹ iron foam particles, a time of 5 h, and…
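The RSM "quadratic model" quoted above is an ordinary least-squares fit of a second-order polynomial in the process factors, with R² measuring how much response variance the surface captures. A minimal two-factor sketch (synthetic data; the factor names and ranges are illustrative, not the study's design):

```python
import numpy as np

def fit_quadratic_rsm(x1, x2, y):
    """Fit the second-order response surface
       y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
    by ordinary least squares. Returns coefficients and R^2."""
    A = np.column_stack([np.ones_like(x1), x1, x2,
                         x1 ** 2, x2 ** 2, x1 * x2])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    yhat = A @ beta
    ss_res = ((y - yhat) ** 2).sum()
    ss_tot = ((y - y.mean()) ** 2).sum()
    return beta, 1.0 - ss_res / ss_tot
```

In an RSM optimization, the fitted surface is then maximized (analytically or on a grid) over the factor region to locate settings such as the dosage and time optimum reported in the abstract.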