Region-based association analysis has been proposed to capture the collective behavior of sets of variants by testing the association of each set, rather than individual variants, with the disease. Such an analysis typically involves a list of unphased multiple-locus genotypes with potentially sparse frequencies in cases and controls. To tackle the problem of this sparse distribution, a two-stage approach was proposed in the literature: in the first stage, haplotypes are computationally inferred from genotypes, followed by a haplotype coclassification; in the second stage, the association analysis is performed on the inferred haplotype groups. If a haplotype is unevenly distributed between the case and control samples, it is labeled a risk haplotype. Unfortunately, in silico reconstruction of haplotypes may produce a proportion of false haplotypes that hamper the detection of rare but true haplotypes. Here, to address this issue, we propose an alternative approach: in Stage 1, we cluster genotypes instead of inferred haplotypes and estimate the risk genotypes based on a finite mixture model; in Stage 2, we infer risk haplotypes from the risk genotypes identified in the previous stage. To estimate the finite mixture model, we propose an EM algorithm with a novel data-partition-based initialization. The performance of the proposed procedure is assessed by simulation studies and a real data analysis. Compared to the existing multiple Z-test procedure, we find that the power of genome-wide association studies can be increased by using the proposed procedure.
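The Stage-1 estimation step can be illustrated with a minimal EM sketch. This is not the authors' model: it fits a generic two-component Gaussian mixture to a one-dimensional score, and the initialization shown (sort the data, split at the median, seed each component from one half) is only a hypothetical stand-in for the paper's data-partition-based initialization.

```python
import math
import random

def em_two_component(x, n_iter=200):
    """Fit a two-component Gaussian mixture by EM.

    Initialization (a stand-in for partition-based init): sort the
    observations, split at the median, and use each half's mean and
    variance as one component's starting parameters.
    """
    xs = sorted(x)
    half = len(xs) // 2

    def mean_var(v):
        m = sum(v) / len(v)
        return m, sum((t - m) ** 2 for t in v) / len(v) + 1e-6

    mu, var = [0.0, 0.0], [1.0, 1.0]
    mu[0], var[0] = mean_var(xs[:half])
    mu[1], var[1] = mean_var(xs[half:])
    pi = [0.5, 0.5]  # mixing weights

    def pdf(t, m, v):
        return math.exp(-(t - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)

    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point
        resp = []
        for t in x:
            w = [pi[k] * pdf(t, mu[k], var[k]) for k in (0, 1)]
            s = w[0] + w[1]
            resp.append([w[0] / s, w[1] / s])
        # M-step: re-estimate weights, means, and variances
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(x)
            mu[k] = sum(r[k] * t for r, t in zip(resp, x)) / nk
            var[k] = sum(r[k] * (t - mu[k]) ** 2 for r, t in zip(resp, x)) / nk + 1e-6
    return pi, mu, var
```

In the two-stage setting, the fitted posterior responsibilities would then classify each genotype cluster as "risk" or "non-risk" before haplotypes are inferred in Stage 2.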
In the current paradigms of information technology, cloud computing is among the most essential kinds of computing service. It satisfies the needs of high-volume customers, provides flexible computing capabilities for a range of applications such as database archiving and business analytics, and supplies the extra computing resources that deliver financial value for cloud providers. The purpose of this investigation is to assess the viability of performing data audits remotely in a cloud computing setting. The theory behind cloud computing and distributed storage systems is discussed, as well as the method of remote data auditing. This research addresses safeguarding data that is outsourced and stored on cloud servers…
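The remote data auditing idea can be illustrated with a naive hash-based spot-check. This is an assumption-laden sketch, not the protocol discussed in the paper: the client keeps salted per-block hashes before outsourcing, then periodically challenges the server for blocks and verifies them against the stored tags.

```python
import hashlib
import os
import random

def fingerprint_blocks(data: bytes, block_size: int = 256):
    """Client side, before outsourcing: split data into blocks and keep a
    salted SHA-256 tag per block. The salt stops the server from
    precomputing answers without actually holding the data."""
    blocks = [data[i:i + block_size] for i in range(0, len(data), block_size)]
    salts = [os.urandom(8) for _ in blocks]
    tags = [hashlib.sha256(s + b).digest() for s, b in zip(salts, blocks)]
    return blocks, salts, tags  # blocks go to the server; client keeps salts+tags

def audit(server_blocks, salts, tags, n_challenges=5, rng=random):
    """Spot-check: challenge random block indices; the server returns each
    block, and the client recomputes the salted hash and compares."""
    for i in rng.sample(range(len(tags)), n_challenges):
        response = server_blocks[i]  # what the server claims to store
        if hashlib.sha256(salts[i] + response).digest() != tags[i]:
            return False  # tampering or data loss detected
    return True
```

Real remote-auditing schemes replace the per-block hashes with homomorphic tags so the client need not download whole blocks; this sketch only conveys the challenge-response shape.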
In recent years, data centre (DC) networks have rapidly improved their exchange capabilities. Software-defined networking (SDN) has been introduced to change the conventional networking paradigm by segregating the control plane from the data plane. SDN overcomes the limitations of traditional DC networks caused by the rapidly increasing numbers of applications, websites, data storage needs, etc. Software-defined networking data centres (SDN-DC), based on the OpenFlow (OF) protocol, are used to achieve superior behaviour when executing traffic load-balancing (LB) jobs. The LB function divides the traffic-flow demands between the end devices to avoid link congestion. In short, SDN is proposed to manage more effective configurations…
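The LB function described above can be sketched as a controller-side greedy heuristic. The link names, capacities, and the least-loaded placement rule here are illustrative assumptions, not the paper's OF-based algorithm.

```python
def assign_flows(flows, links):
    """Greedy least-loaded assignment, as an SDN controller's LB module
    might do it: each flow (demand in Mbps) is placed on the candidate
    link with the most spare capacity.

    flows: dict flow_id -> demand (Mbps)
    links: dict link_id -> capacity (Mbps)
    """
    load = {l: 0.0 for l in links}
    placement = {}
    # Place the largest demands first to reduce congestion hot spots.
    for flow_id, demand in sorted(flows.items(), key=lambda kv: -kv[1]):
        best = max(links, key=lambda l: links[l] - load[l])  # most headroom
        load[best] += demand
        placement[flow_id] = best
    return placement, load
```

A real SDN-DC balancer would install the resulting placements as OpenFlow rules and re-balance as measured link utilization changes.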
Permeability data are of major importance and should be handled carefully in all reservoir simulation studies. The importance of permeability data increases in mature oil and gas fields due to its sensitivity to the requirements of some specific improved-recovery methods. However, the industry has a huge stock of air permeability measurements against a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation to convert the air permeability data that are conventionally measured during laboratory core analysis into liquid permeability. This correlation provides a feasible estimate in cases of loose and poorly consolidated formations, or in cases…
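The study's own correlation is not reproduced in this excerpt. As an illustration of the kind of air-to-liquid conversion involved, the classical Klinkenberg relation k_air = k_liquid · (1 + b/P_mean) can be inverted; the slippage factor b below is an assumed value, not a fitted coefficient from the paper.

```python
def liquid_permeability(k_air_md: float, mean_pressure_atm: float,
                        b_atm: float = 0.8) -> float:
    """Klinkenberg-type conversion (illustrative, not the paper's
    correlation): gas slippage inflates air-measured permeability, so
    the equivalent liquid permeability is k_air / (1 + b / P_mean).

    k_air_md: air permeability in millidarcies
    mean_pressure_atm: mean flowing pressure of the gas test, atm
    b_atm: gas-slippage factor (assumed), atm
    """
    return k_air_md / (1.0 + b_atm / mean_pressure_atm)
```

Note that the correction matters most at low mean pressure, where slippage is strongest; at high pressure the two permeabilities converge.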
Business organizations have faced many challenges in recent times, the most important of which is information technology, because it is widely spread and easy to use. Its use has led to an increase in the amount of data that business organizations deal with in an unprecedented manner. The amount of data available through the internet is a problem that many parties seek to address: why is it available there, in such huge amounts, so haphazardly? Forecasts suggested that by 2017 the number of devices connected to the internet would be an estimated three times the population of the Earth, and that in 2015 more than one and a half billion gigabytes of data were transferred every minute globally. Thus, so-called data mining emerged as a…
An acceptable bit error rate can be maintained by adapting design parameters such as modulation, symbol rate, constellation size, and transmit power to the channel state.
An estimate of HF propagation effects can be used to design an adaptive data transmission system over an HF link. The proposed system combines the well-known Automatic Link Establishment (ALE) with a variable-rate transmission system. The standard ALE is modified to suit the required goal of selecting the best carrier frequency (channel) for a given transmission. This is based on measuring SINAD (Signal plus Noise plus Distortion to Noise plus Distortion), RSL (Received Signal Level), multipath phase distortion, and BER (Bit Error Rate) for…
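The channel-and-rate selection logic can be sketched as follows. The metric used (SINAD alone), the threshold values, and the modulation set are assumptions for illustration, not the modified-ALE procedure itself, which also weighs RSL, multipath distortion, and BER.

```python
# Assumed SINAD thresholds (dB) for each modulation scheme, best first.
MODULATION_THRESHOLDS = [
    (24.0, "16-QAM"),
    (16.0, "QPSK"),
    (8.0,  "BPSK"),
]

def select_channel_and_rate(channels):
    """Pick the best-quality HF channel, then the fastest modulation
    whose SINAD threshold it satisfies.

    channels: dict carrier frequency (kHz) -> measured SINAD (dB)
    Returns (frequency, scheme); (None, None) if no channels,
    (frequency, None) if the best link supports no scheme.
    """
    if not channels:
        return None, None
    freq = max(channels, key=channels.get)  # highest-SINAD channel
    sinad = channels[freq]
    for threshold, scheme in MODULATION_THRESHOLDS:
        if sinad >= threshold:
            return freq, scheme
    return freq, None  # link too poor for any configured scheme
```

In an adaptive system this selection would be re-run as sounding measurements update, stepping the constellation size up or down with channel quality.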
Objective: Using two complementary techniques for detecting human papillomavirus (HPV) [hybrid capture (CH) and polymerase chain reaction (PCR)], to relate the cytological study and/or cervical biopsy to the presence of high-risk HPV (HPV-HR) genotypes, as well as to their viral load (VL). Methods: 272 women who presented cell alterations compatible with cervical HPV lesions; high-risk HPV was detected in all by the CH method, with HPV genotype detection by PCR. Results: HPV DNA was not detected in 22% of the patients. Genotype 16 and/or 18 was prevalent, found in 33% of the 212 women studied, while mixed infections with several genotypes were found in 25%. As for the histological lesions found, in 61 patients…
Objective: The study aims to evaluate secondary school students' exposure to risk factors in Al-Najaf City. Methodology: A descriptive study was conducted in Al-Najaf City, Iraq, on secondary school students aged (12-24) years, for the period from the 13th of November 2015 to the 4th of August 2015. The sample included (540) students (intermediate and secondary) from those schools: (270) males and (270) females, selected randomly. Data were collected through a constructed questionnaire, and the reliability and content validity of the instrument were determined. Data…
Background: Appendectomy is still one of the most commonly performed emergency surgical procedures worldwide. Avoiding delays in diagnosis in these patients may play a role in reducing the observed morbidity. Aim of study: To analyze the clinico-pathological profile and outcomes of patients undergoing emergency appendectomy and to determine risk factors influencing complications. Type of the study: A prospective analytic study. Patients and Methods: The study involved 108 patients who underwent emergency appendectomy at Al-Kindy Teaching Hospital from April 2014 to March 2015. Appendicitis was categorized into two groups, perforated and non-perforated appendicitis. A comparison between them was made with regard to gender, age, clinical presentation, inve…
Management of Foreign Exchange Rate Exposure by Using Financial Hedging: An Analytical Empirical Study. The main purpose of this research is to investigate the ability to reduce the effect of exchange-rate fluctuations on firm value by using appropriate hedging strategies, giving firms the capacity to adapt to the complex and highly uncertain conditions characteristic of most financial markets. The field of this study is five giant multinationals: Nokia, Toyota Motor, Intel, Coca-Cola, and Microsoft. The practical analysis supports all of the study's hypotheses, and it reaches many conclusions, the most important of which stems from unexpected fluctuations in nominal ex…
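As a minimal illustration of financial hedging of transaction exposure (not the study's methodology or data), a forward contract locks in the conversion rate for a hedged fraction of a foreign-currency receivable; all figures and the hedge-ratio parameter are hypothetical.

```python
def hedged_receivable_value(amount_fc: float, forward_rate: float,
                            spot_at_maturity: float,
                            hedge_ratio: float = 1.0) -> float:
    """Home-currency value of a foreign-currency receivable when a
    fraction `hedge_ratio` is sold forward at `forward_rate` and the
    remainder is converted at the realized spot rate."""
    hedged = hedge_ratio * amount_fc * forward_rate
    unhedged = (1.0 - hedge_ratio) * amount_fc * spot_at_maturity
    return hedged + unhedged
```

With a full hedge the realized value is fixed regardless of where the spot rate ends up, which is exactly the insulation of firm value from unexpected nominal exchange-rate fluctuations that the abstract describes.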
Background: Obesity is nowadays a pandemic condition. Obese subjects are commonly characterized by musculoskeletal disorders, particularly non-specific low back pain (LBP). However, the relationship between obesity and LBP remains to date unsupported by objective measurement of the mechanical behavior of the spine and its morphology in obese subjects. Objectives: To identify the relationship between obesity and LBP regarding height, weight, sleeping, chronic diseases, smoking, and steroid use. Method: A cross-sectional study was conducted from the first of January 2016 to January 2018 in obese…