Region-based association analysis has been proposed to capture the collective behavior of sets of variants by testing the association of each set, rather than of individual variants, with the disease. Such an analysis typically involves a list of unphased multiple-locus genotypes with potentially sparse frequencies in cases and controls. To tackle the problem of the sparse distribution, a two-stage approach was proposed in the literature: in the first stage, haplotypes are computationally inferred from genotypes, followed by a haplotype co-classification; in the second stage, the association analysis is performed on the inferred haplotype groups. If a haplotype is unevenly distributed between the case and control samples, it is labeled as a risk haplotype. Unfortunately, the in-silico reconstruction of haplotypes may produce a proportion of false haplotypes that hamper the detection of rare but true haplotypes. To address this issue, we propose an alternative approach: in Stage 1, we cluster genotypes instead of inferred haplotypes and estimate the risk genotypes based on a finite mixture model; in Stage 2, we infer risk haplotypes from the risk genotypes identified in the previous stage. To estimate the finite mixture model, we propose an EM algorithm with a novel data partition-based initialization. The performance of the proposed procedure is assessed by simulation studies and a real data analysis. Compared with the existing multiple Z-test procedure, we find that the power of genome-wide association studies can be increased by using the proposed procedure.
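As a rough illustration of the Stage 1 machinery, the sketch below fits a finite mixture of product-multinomial components to multi-locus genotypes (coded 0/1/2) with EM; the block-wise initialization only approximates the idea of a data partition-based start, and all function and variable names are ours rather than the paper's.

```python
# Minimal sketch (not the authors' implementation): EM for a finite mixture of
# product-multinomial components over multi-locus genotypes coded 0/1/2.
# The partition-based initialization is approximated by splitting the sample
# into K blocks and using block-wise genotype frequencies as starting values.
import numpy as np

def em_genotype_mixture(G, K, n_iter=100, tol=1e-6, eps=1e-9):
    """G: (n, L) integer array of genotypes in {0, 1, 2}; K: number of clusters."""
    n, L = G.shape
    onehot = np.eye(3)[G]                                  # (n, L, 3) one-hot codes

    # Partition-based initialization: block-wise category frequencies.
    blocks = np.array_split(np.arange(n), K)
    theta = np.stack([onehot[b].mean(axis=0) for b in blocks])  # (K, L, 3)
    pi = np.full(K, 1.0 / K)                               # mixing proportions

    prev_ll = -np.inf
    for _ in range(n_iter):
        # E-step: posterior responsibility of each cluster for each genotype.
        log_lik = np.einsum('nlc,klc->nk', onehot, np.log(theta + eps))
        weighted = np.log(pi + eps) + log_lik
        ll = np.logaddexp.reduce(weighted, axis=1).sum()   # observed-data log-likelihood
        resp = np.exp(weighted - weighted.max(axis=1, keepdims=True))
        resp /= resp.sum(axis=1, keepdims=True)

        # M-step: update mixing weights and per-locus category probabilities.
        pi = resp.mean(axis=0)
        theta = np.einsum('nk,nlc->klc', resp, onehot)
        theta /= theta.sum(axis=2, keepdims=True)

        if abs(ll - prev_ll) < tol:
            break
        prev_ll = ll
    return pi, theta, resp
```

Clusters whose membership is dominated by cases would then be the candidates for risk genotypes carried into Stage 2.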
A novel robust finite-time disturbance observer (RFTDO) based on an independent output finite-time composite control (FTCC) scheme is proposed for temperature and humidity regulation of an air-conditioning system. The variable air volume (VAV) system is represented by two first-order mathematical models for the temperature and humidity dynamics. In the temperature loop, an RFTDO for temperature (RFTDO-T) and an FTCC for temperature (FTCC-T) are designed to estimate and reject the lumped disturbances of the temperature subsystem. In the humidity loop, a robust output FTCC for humidity (FTCC-H) and an RFTDO for humidity (RFTDO-H) are likewise designed to estimate and reject the lumped disturbances of the humidity subsystem. Based on Lyapunov theory
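As a purely illustrative companion, the sketch below simulates a first-order temperature loop with a basic asymptotic disturbance observer and a composite controller that cancels the estimated lumped disturbance. It is not the paper's finite-time RFTDO/FTCC design, and the plant parameters, gains, and disturbance profile are assumed values.

```python
# Illustrative sketch only: first-order temperature loop dT/dt = a*T + b*u + d,
# with a standard (asymptotic) disturbance observer and composite control.
import numpy as np

a, b = -0.05, 0.8          # assumed first-order temperature dynamics
l, k = 5.0, 0.5            # observer gain and tracking gain (assumed)
dt, T_ref = 0.01, 22.0     # time step [s] and temperature set point [deg C]

T, z = 18.0, 0.0           # initial temperature and internal observer state
for step in range(int(300 / dt)):
    t = step * dt
    d = 0.3 * np.sin(0.05 * t) + 0.2           # lumped disturbance (assumed heat load)
    d_hat = z + l * T                          # disturbance estimate
    u = (-a * T + k * (T_ref - T) - d_hat) / b # composite control: nominal + rejection
    # observer dynamics: d_hat converges to d with time constant 1/l
    z += dt * (-l * (z + l * T) - l * (a * T + b * u))
    T += dt * (a * T + b * u + d)              # plant update (Euler integration)

print(f"final T = {T:.2f} deg C, residual estimation error = {d - d_hat:.3f}")
```

A finite-time observer would replace the linear correction term with fractional-power terms to obtain convergence in finite time rather than asymptotically.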
A database is an arrangement of data that is organized and distributed in a way that allows the user to access the stored data easily and conveniently. However, in the era of big data, traditional data analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear enhancement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r
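A minimal sketch of the MapReduce idea in Hadoop Streaming style is given below: the mapper and reducer read stdin and emit tab-separated key/value pairs, here computing the per-channel mean EEG amplitude. The "channel,value" input layout and the script name are assumptions for illustration, not the paper's actual EEG record format.

```python
# eeg_stats.py -- hypothetical Hadoop Streaming mapper/reducer sketch.
import sys

def mapper():
    """Emit (channel, value) pairs from raw EEG lines of the form 'channel,value'."""
    for line in sys.stdin:
        try:
            channel, value = line.strip().split(',')
            print(f"{channel}\t{float(value)}")
        except ValueError:
            continue  # skip malformed lines

def reducer():
    """Aggregate per-channel mean amplitude (keys arrive grouped/sorted by Hadoop)."""
    current, total, count = None, 0.0, 0
    for line in sys.stdin:
        channel, value = line.strip().split('\t')
        if channel != current:
            if current is not None:
                print(f"{current}\t{total / count}")
            current, total, count = channel, 0.0, 0
        total += float(value)
        count += 1
    if current is not None:
        print(f"{current}\t{total / count}")

if __name__ == "__main__":
    # run as: python eeg_stats.py map   or   python eeg_stats.py reduce
    mapper() if sys.argv[1] == "map" else reducer()
```

The two roles would be submitted to the cluster with the Hadoop Streaming jar, so the same script scales from a single node to the cloud deployment described above.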
Atrial fibrillation is associated with an elevated risk of stroke. The simplest stroke risk assessment schemes are the CHADS2 and CHA2DS2-VASc scores. Aspirin and oral anticoagulants are recommended for stroke prevention in such patients.
The aim of this study was to assess the status of the CHADS2 and CHA2DS2-VASc scores in Iraqi atrial fibrillation patients and to report the current status of stroke prevention with either warfarin or aspirin in these patients in relation to these scores.
This prospective cross-sectional study was carried out at Tikrit, Samarra, Sharqat, Baquba, and AL-Numaan hospitals from July 2017 to October 2017. CHADS2
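For reference, the two scores mentioned above can be computed from the standard published criteria as in the small sketch below; the patient-record field names are our own assumptions.

```python
# Standard CHADS2 and CHA2DS2-VASc scoring; dictionary keys are hypothetical.
def chads2(p):
    """CHADS2: CHF, hypertension, age >= 75, diabetes (1 pt each); prior stroke/TIA (2 pts)."""
    return (p["chf"] + p["hypertension"] + (p["age"] >= 75) + p["diabetes"]
            + 2 * p["stroke_tia"])

def cha2ds2_vasc(p):
    """CHA2DS2-VASc adds vascular disease, age 65-74, and female sex; age >= 75 scores 2."""
    return (p["chf"] + p["hypertension"] + 2 * (p["age"] >= 75) + p["diabetes"]
            + 2 * p["stroke_tia"] + p["vascular_disease"]
            + (65 <= p["age"] <= 74) + (p["sex"] == "female"))

patient = {"chf": 0, "hypertension": 1, "age": 70, "diabetes": 1,
           "stroke_tia": 0, "vascular_disease": 0, "sex": "female"}
print(chads2(patient), cha2ds2_vasc(patient))   # -> 2 and 4
```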
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance. Unfortunately, many applications have only small or inadequate datasets for training DL frameworks. Manual labeling is usually needed to provide labeled data, which typically involves human annotators with extensive background knowledge. This annotation process is costly, time-consuming, and error-prone. Every DL framework must usually be fed a significant amount of labeled data to automatically learn representations. Ultimately, a larger amount of data generally yields a better DL model, although performance is also application-dependent. This issue is the main barrier for
In this study, we briefly review the ARIMA(p, d, q), EWMA, and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation (AC) structure of the data. We consider recursive estimation and prediction algorithms based on Bayesian and Kalman filtering (KF) techniques for correlated observations. We investigate the effect on the MSE of these procedures and compare them using generated data.
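As an illustration of the recursive Bayes/KF machinery for a local-level DLM, the sketch below runs the one-step-ahead prediction and update recursions and reports the prediction MSE; the noise variances and the simulated series are assumptions, not the study's data.

```python
# Kalman filter recursions for a local-level DLM: y_t = mu_t + v_t, mu_t = mu_{t-1} + w_t.
import numpy as np

rng = np.random.default_rng(0)
n, V, W = 200, 1.0, 0.1                          # assumed observation/evolution variances
mu = np.cumsum(rng.normal(0, np.sqrt(W), n))     # latent level (random walk)
y = mu + rng.normal(0, np.sqrt(V), n)            # correlated observations

m, C = 0.0, 10.0                                 # prior mean and variance of the level
pred_err = []
for t in range(n):
    # one-step-ahead prediction
    a, R = m, C + W                              # prior for mu_t given y_{1:t-1}
    f, Q = a, R + V                              # forecast mean/variance for y_t
    pred_err.append(y[t] - f)
    # Bayes/Kalman update with the new observation
    K = R / Q                                    # Kalman gain
    m = a + K * (y[t] - f)
    C = (1 - K) * R

print("one-step-ahead prediction MSE:", np.mean(np.square(pred_err)))
```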
Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively. It therefore makes sense to pursue research on developing algorithms that can use the available network capacity most effectively. It is also important to consider the security aspect, since the data being transmitted is vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption on the same set of data, performing the two operations simultaneously. This is achieved by embedding encryption into compression algorithms, since both cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p
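To make the compress-then-encrypt idea concrete, the sketch below compresses a byte string with zlib and then XORs it with a SHA-256 counter-mode keystream; this stand-in cipher is chosen only to keep the example dependency-free and is not the scheme developed in the work.

```python
# Illustrative compress-then-encrypt pipeline over the same data.
import zlib, hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Derive n keystream bytes from a shared key (toy construction for illustration)."""
    out, counter = bytearray(), 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:n])

def compress_encrypt(plaintext: bytes, key: bytes) -> bytes:
    compressed = zlib.compress(plaintext, level=9)           # entropy coding stage
    ks = keystream(key, len(compressed))
    return bytes(c ^ k for c, k in zip(compressed, ks))      # stream-cipher stage

def decrypt_decompress(ciphertext: bytes, key: bytes) -> bytes:
    ks = keystream(key, len(ciphertext))
    return zlib.decompress(bytes(c ^ k for c, k in zip(ciphertext, ks)))

msg = b"secure compression example " * 20
blob = compress_encrypt(msg, b"shared-secret")
assert decrypt_decompress(blob, b"shared-secret") == msg
print(len(msg), "->", len(blob), "bytes after compression + encryption")
```

Compressing before encrypting preserves the redundancy the entropy coder needs, while the encrypted output remains close to the compressed size.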
Two-dimensional meso-scale concrete modeling was used in the finite element analysis of a plain concrete beam subjected to bending. Plane-stress 4-noded quadrilateral elements were utilized to model the coarse aggregate and cement mortar. The effects of the aggregate fraction distribution and of the percentage of pores in the total area, resulting from air voids entrapped in the concrete during placement, on the behavior of the plain concrete beam in flexure were investigated. Aggregate size fractions were randomly distributed across the profile area of the beam. The Extended Finite Element Method (XFEM) was employed to treat the discontinuity problems arising from the two phases of concrete and from the cracking encountered during the finite element analysis of the concrete beam. Crac
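The random distribution of aggregate size fractions can be illustrated with a simple take-and-place sketch such as the one below, which drops circular particles of assumed size fractions into the beam profile while rejecting overlaps; the dimensions and particle counts are illustrative, not the paper's mix design.

```python
# Take-and-place random aggregate generation for a 2D beam profile (illustrative values).
import math, random

BEAM_W, BEAM_H = 500.0, 150.0                    # beam profile dimensions [mm] (assumed)
fractions = [(19.0, 12), (12.5, 30), (9.5, 60)]  # (particle diameter [mm], target count)

def overlaps(x, y, r, placed):
    """True if a circle at (x, y) with radius r intersects any already-placed particle."""
    return any((x - px) ** 2 + (y - py) ** 2 < (r + pr) ** 2 for px, py, pr in placed)

random.seed(1)
placed = []
for diameter, count in fractions:
    r, accepted = diameter / 2.0, 0
    while accepted < count:
        x = random.uniform(r, BEAM_W - r)        # keep the particle inside the profile
        y = random.uniform(r, BEAM_H - r)
        if not overlaps(x, y, r, placed):        # rejection test against existing particles
            placed.append((x, y, r))
            accepted += 1

agg_area = sum(math.pi * r * r for _, _, r in placed)
print(f"placed {len(placed)} particles, aggregate area fraction = "
      f"{agg_area / (BEAM_W * BEAM_H):.2%}")
```

The remaining area is assigned to mortar, and a chosen share of positions can instead be marked as pores to represent the entrapped air voids.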
Background. Implant insertion in regions with poor bone quantity, such as the posterior maxilla, is potentially associated with an increased rate of implant failure. Calcium sulfate can be used as a coating material for commercially pure titanium (CpTi) and, when bound to eggshell powder, as a bone graft material around implants to enhance the bone quality and quantity of bone defect regions. This study performed a torque removal test to evaluate the effectiveness of eggshell powder as a bone substitute for filling bone defects around CpTi implants coated with nanocrystalline calcium sulfate. Materials and Methods. Eighty screw-design implants were used in the tibiae of 20 white New Zealand rabbits. A total of 20 uncoated s
Objective: This study aimed to investigate the relationship between high blood pressure and variables such as weight, smoking, the amount of salt and water taken daily, and the number of hours of natural sleep per person among young people.
Methodology: The study was conducted on students of the Technical Institute in Baquba and the University of Diyala during the period from September 2015 until June 2016. The patients ranged in age from 18 to 24 years. All data were collected through a questionnaire that covered the main causes and periodic follow-up of the disease.
Results: The total number of samples was 450. The results showed that 33% of all samples had high blood pressure. The rel