Region-based association analysis has been proposed to capture the collective behavior of sets of variants by testing the association of each set, rather than of individual variants, with the disease. Such an analysis typically involves a list of unphased multiple-locus genotypes with potentially sparse frequencies in cases and controls. To tackle the problem of this sparse distribution, a two-stage approach was proposed in the literature: in the first stage, haplotypes are computationally inferred from genotypes and then co-classified; in the second stage, the association analysis is performed on the inferred haplotype groups. If a haplotype is unevenly distributed between the case and control samples, it is labeled a risk haplotype. Unfortunately, the in-silico reconstruction of haplotypes can produce a proportion of false haplotypes that hamper the detection of rare but true ones. To address this issue, we propose an alternative approach: in Stage 1, we cluster genotypes instead of inferred haplotypes and estimate the risk genotypes based on a finite mixture model; in Stage 2, we infer risk haplotypes from the risk genotypes identified in the previous stage. To estimate the finite mixture model, we propose an EM algorithm with a novel data partition-based initialization. The performance of the proposed procedure is assessed by simulation studies and a real data analysis. Compared to the existing multiple Z-test procedure, we find that the power of genome-wide association studies can be increased by using the proposed procedure.
Sequence covering array (SCA) generation has been an active research area in recent years. Unlike sequence-less covering arrays (CAs), the order of events matters in the test case generation process. This paper reviews the state of the art of SCA strategies. Earlier works reported that finding a minimal-size test suite is an NP-hard problem. In addition, most of the existing strategies for SCA generation have a high order of complexity because they generate all combinatorial interactions in a one-test-at-a-time fashion. Reducing this complexity by adopting a one-parameter-at-a-time approach for SCA generation is challenging. In addition, this reduction facilitates support for a higher strength of
The smart city concept has attracted high research attention in recent years across diverse application domains, such as crime suspect identification, border security, transportation, and aerospace. A specific focus has been on increased automation using data-driven approaches, while leveraging remote sensing and real-time streaming of heterogeneous data from various sources, including unmanned aerial vehicles, surveillance cameras, and low-earth-orbit satellites. One of the core challenges in exploiting such high-temporal-rate data streams, specifically videos, is the trade-off between the quality of video streaming and the limited transmission bandwidth. An optimal compromise is needed between video quality and, subsequently, rec
Dust storms occur in barren and dry regions all over the world. They may be caused by intense ground winds that lift dust and sand from soft, arid land surfaces into the air. These phenomena can harm health, climate, infrastructure, and transportation. GIS and remote sensing have played a key role in dust detection studies. This study was conducted in Iraq with the objective of validating dust detection. These techniques have been used to derive dust indices, namely the Normalized Difference Dust Index (NDDI) and the Middle East Dust Index (MEDI), which are based on images from MODIS and on in-situ observations based on hourly wi
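The abstract does not give the index formulas. NDDI is commonly computed as a normalized difference between MODIS band 7 (shortwave infrared, ~2.13 µm) and band 3 (blue, ~0.469 µm) reflectances; the sketch below assumes that definition, and the dust threshold is scene-dependent rather than fixed (MEDI, which typically involves thermal bands, is omitted):

```python
def nddi(swir_b7, blue_b3):
    """Normalized Difference Dust Index from MODIS reflectances.

    swir_b7: band 7 reflectance (~2.13 um); blue_b3: band 3 (~0.469 um).
    Pixels with NDDI above a scene-dependent threshold are commonly
    flagged as dust-affected.
    """
    denom = swir_b7 + blue_b3
    if denom == 0:
        raise ValueError("reflectances sum to zero")
    return (swir_b7 - blue_b3) / denom
```

Dust raises shortwave-infrared reflectance relative to blue, so dusty pixels tend toward positive NDDI, while water and clouds tend toward negative values.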
Developing an efficient algorithm for automated Magnetic Resonance Imaging (MRI) segmentation that characterizes tumor abnormalities in an accurate and reproducible manner is in ever-growing demand. This paper presents an overview of the recent developments and challenges of the energy-minimizing active contour segmentation model, called the snake, for MRI. This model has been used successfully in contour detection for object recognition, computer vision, and graphics, as well as in biomedical image processing, including X-ray, MRI, and ultrasound images. Snakes are deformable, well-defined curves in the image domain that move under the influence of internal forces, arising from the curve itself, and external forces, derived from the image data. We underscore a critical appraisal
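For reference, the internal and external forces mentioned above correspond, in the classical snake formulation of Kass, Witkin, and Terzopoulos, to minimizing an energy functional over a parametric curve $v(s) = (x(s), y(s))$ (the exact variant used by the reviewed methods may differ):

```latex
E_{\text{snake}} = \int_0^1 \Big[
  \tfrac{1}{2}\big(\alpha \, |v'(s)|^2 + \beta \, |v''(s)|^2\big)
  + E_{\text{ext}}\big(v(s)\big)
\Big] \, ds
```

The first two terms are the internal energy, penalizing stretching ($\alpha$) and bending ($\beta$) of the curve, while $E_{\text{ext}}$ is the external energy derived from image data, e.g. $E_{\text{ext}} = -|\nabla I(x,y)|^2$ to attract the contour toward strong edges.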
In this article, the lattice Boltzmann method with two relaxation times (TRT) for the D2Q9 model is used to obtain numerical results for 2D flow. The simulation examines the dissipation rate of kinetic energy and its relationship to enstrophy growth for a 2D dipole-wall collision. The investigation covers normal collision and oblique incidence at an angle of . We demonstrate the accuracy of moment-based boundary conditions with slip and Navier-Maxwell slip conditions for simulating this flow. These conditions are subject to Burnett-order stress conditions that are consistent with the discrete Boltzmann equation. Stable results are found by using this kind of boundary condition, where d
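The D2Q9 lattice named in the abstract is standard; a minimal sketch of its velocity set, weights, and second-order equilibrium distribution is given below (the TRT collision step itself, which relaxes the symmetric and antisymmetric parts of the populations at two separate rates, and the paper's moment-based boundary conditions are not shown):

```python
import numpy as np

# D2Q9 lattice: nine discrete velocities e_i and their weights w_i
E = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
W = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

def f_eq(rho, u):
    """Second-order equilibrium populations for the D2Q9 lattice.

    rho: scalar density; u: velocity vector of shape (2,).
    Returns the nine equilibrium populations f_i^eq.
    """
    eu = E @ u                # e_i . u for each direction
    usq = u @ u               # |u|^2
    return W * rho * (1 + 3 * eu + 4.5 * eu**2 - 1.5 * usq)
```

By construction the equilibrium reproduces the density and momentum exactly: summing the populations recovers rho, and summing f_i e_i recovers rho*u, which is what the collision step conserves.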
The Internet provides vital communications between millions of individuals. It is also increasingly used as a commerce tool; thus, security is of high importance for securing communications and protecting vital information. Cryptography algorithms are essential in the field of security. Brute-force attacks are the major attacks on the Data Encryption Standard (DES), and this is the main reason that warranted an improved structure for the DES algorithm. This paper proposes a new, improved structure for DES to make it secure and immune to attacks. The improved structure was accomplished using the standard DES with a new way of two key gene
Statistical research is scarce on phenomena involving (p) exogenous input variables that act as causes in a time series and yield (q) output variables as results, forming a conceptual idea similar to classical linear regression, which studies the relationship between a dependent variable and explanatory variables. This highlights the importance of providing a full analysis of this kind of phenomenon, which is important for consumer price inflation in Iraq. Several influential variables with a direct connection to the phenomenon were taken and analyzed after treating the problem of outliers in the observations by the (EM) approach, and expanding the sample size (n = 36) to
Advanced strategies for production forecasting, operational optimization, and decision-making enhancement have been employed through reservoir management and machine learning (ML) techniques. A hybrid model is established to predict future gas output in a gas reservoir from historical production data, including reservoir pressure, cumulative gas production, and cumulative water production, over 67 months. The procedure starts with data preprocessing and applies seasonal exponential smoothing (SES) to capture seasonality and trends in the production data, while an Artificial Neural Network (ANN) captures complicated spatiotemporal connections. The accuracy of history replication in the models is quantified through metrics such as m
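The abstract does not detail the SES variant or its parameters, so the following is only a minimal sketch of the core exponential-smoothing recursion (the seasonal and trend components, and the ANN stage, are omitted; `alpha` is an illustrative smoothing parameter):

```python
def exp_smooth(series, alpha=0.3):
    """Exponential smoothing: s_t = alpha*y_t + (1 - alpha)*s_{t-1}.

    series: observed values (e.g., monthly gas production);
    alpha in (0, 1] weights recent observations against the running level.
    Returns the smoothed level at each step.
    """
    out = [series[0]]                      # initialize level at first value
    for y in series[1:]:
        out.append(alpha * y + (1 - alpha) * out[-1])
    return out
```

In a hybrid setup like the one described, the smoothed series would capture the slow trend/seasonal signal, and a neural network would be trained on the remaining structure.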