Region-based association analysis has been proposed to capture the collective behavior of sets of variants by testing the association of each set, instead of individual variants, with the disease. Such an analysis typically involves a list of unphased multiple-locus genotypes with potentially sparse frequencies in cases and controls. To tackle the problem of the sparse distribution, a two-stage approach was proposed in the literature: in the first stage, haplotypes are computationally inferred from genotypes, followed by a haplotype co-classification; in the second stage, the association analysis is performed on the inferred haplotype groups. If a haplotype is unevenly distributed between the case and control samples, it is labeled as a risk haplotype. Unfortunately, the in silico reconstruction of haplotypes may produce a proportion of false haplotypes which hamper the detection of rare but true haplotypes. Here, to address this issue, we propose an alternative approach: in Stage 1, we cluster genotypes instead of inferred haplotypes and estimate the risk genotypes based on a finite mixture model; in Stage 2, we infer risk haplotypes from the risk genotypes identified in the previous stage. To estimate the finite mixture model, we propose an EM algorithm with a novel data partition-based initialization. The performance of the proposed procedure is assessed by simulation studies and a real data analysis. Compared to the existing multiple Z-test procedure, we find that the power of genome-wide association studies can be increased by using the proposed procedure.
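To make the mixture-estimation step concrete, the following minimal sketch runs EM on a two-component binomial mixture of per-genotype case counts. The binomial likelihood, the two-component restriction, the median-split starting values and all variable names are illustrative assumptions, not the authors' exact model or their data partition-based initialization.

```python
import numpy as np

def em_binomial_mixture(case_counts, total_counts, n_iter=200, tol=1e-8):
    # Assumed model (for illustration only): each genotype i contributes a case
    # count x_i out of a total n_i, drawn from a mixture of a "background" and
    # a "risk" binomial component with case probabilities p[0] and p[1].
    x = np.asarray(case_counts, dtype=float)
    n = np.asarray(total_counts, dtype=float)

    # Data-partition-style initialization (assumed form): split genotypes at the
    # median case fraction and start each component from one half of the data.
    frac = x / np.maximum(n, 1.0)
    lo, hi = frac <= np.median(frac), frac > np.median(frac)
    p = np.array([frac[lo].mean(), frac[hi].mean() if hi.any() else frac.max()])
    w = np.array([lo.mean(), 1.0 - lo.mean()])

    def log_binom(p_comp):
        q = np.clip(p_comp, 1e-12, 1 - 1e-12)
        return x * np.log(q) + (n - x) * np.log(1 - q)   # constant term dropped

    prev_ll = -np.inf
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each genotype.
        logr = np.log(w)[None, :] + np.column_stack([log_binom(p[0]), log_binom(p[1])])
        m = logr.max(axis=1, keepdims=True)
        r = np.exp(logr - m)
        ll = float(np.sum(m.ravel() + np.log(r.sum(axis=1))))
        r /= r.sum(axis=1, keepdims=True)

        # M-step: update the mixing weights and the component case probabilities.
        w = r.mean(axis=0)
        p = (r * x[:, None]).sum(axis=0) / (r * n[:, None]).sum(axis=0)

        if abs(ll - prev_ll) < tol:
            break
        prev_ll = ll

    return w, p, r    # r[:, 1] ~ posterior probability that a genotype is "risk"

# Toy usage: 8 genotypes, the last two over-represented among cases.
cases  = np.array([5, 6, 4, 5, 6, 5, 30, 28])
totals = np.array([50, 52, 48, 51, 50, 49, 52, 50])
weights, probs, resp = em_binomial_mixture(cases, totals)
print(np.round(resp[:, 1], 2))
```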
In this work, we first construct Hermite wavelets on the interval [0,1) together with their product. An operational matrix of integration of dimension 2^k M × 2^k M is derived and used to solve nonlinear variational problems by reducing them to a system of algebraic equations with the aid of a direct method. Finally, some examples are given to illustrate the efficiency and performance of the presented method.
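As a rough illustration of what an operational matrix of integration is, the sketch below computes one numerically for a small wavelet basis on [0,1), i.e. a matrix P with ∫_0^t Φ(s) ds ≈ P Φ(t). Haar wavelets stand in for the Hermite wavelets (whose construction is not reproduced here), and the grid size and least-squares projection are assumptions made for illustration.

```python
import numpy as np

def haar_basis(J):
    """Return the first 2**J Haar functions on [0, 1) as callables (stand-in basis)."""
    funcs = [lambda t: np.ones_like(t)]                 # scaling function
    for j in range(J):
        for k in range(2 ** j):
            def psi(t, j=j, k=k):
                s = 2.0 ** j * t - k
                return 2.0 ** (j / 2) * (((0 <= s) & (s < 0.5)).astype(float)
                                         - ((0.5 <= s) & (s < 1)).astype(float))
            funcs.append(psi)
    return funcs

def operational_matrix(funcs, n_grid=4096):
    t = np.linspace(0.0, 1.0, n_grid, endpoint=False)
    B = np.vstack([f(t) for f in funcs])                # basis values, shape (N, T)
    dt = t[1] - t[0]
    I = np.cumsum(B, axis=1) * dt - 0.5 * B * dt        # numerical ∫_0^t of each basis function
    # Solve I ≈ P B in the least-squares sense: P = I Bᵀ (B Bᵀ)⁻¹.
    return I @ B.T @ np.linalg.inv(B @ B.T)

P = operational_matrix(haar_basis(J=3))                 # 8 x 8 matrix in this toy case
print(np.round(P, 3))
```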
Ziegler and Nichols proposed the well-known Ziegler-Nichols method to tune the coefficients of a PID controller. This tuning method is simple but gives fixed values for the coefficients, so the PID controller adapts poorly to variations in model parameters and changes in operating conditions. In order to achieve an adaptive controller, a Neural Network (NN) self-tuning PID control is proposed in this paper, which combines a conventional PID controller with the learning capabilities of a neural network. The proportional, integral and derivative gains (KP, KI, KD) are self-tuned on-line by the NN output, which is obtained from the error on the desired output of the system under control. The conventional …
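The sketch below illustrates the general idea of NN self-tuned PID gains, not the paper's specific network or training rule: a small one-hidden-layer network maps the error signals to positive (KP, KI, KD), and its weights are nudged with a gradient-style rule that replaces the unknown plant Jacobian by the sign of the input gain. The plant model, network size, gain scaling and learning rate are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.3, size=(6, 3))   # hidden layer: 3 inputs -> 6 units
W2 = rng.normal(scale=0.3, size=(3, 6))   # output layer: 6 units -> Kp, Ki, Kd
lr = 1e-3

def nn_gains(e, de, ie):
    x = np.array([e, de, ie])
    h = np.tanh(W1 @ x)
    g = np.log1p(np.exp(W2 @ h))          # softplus keeps the gains positive
    return g, h, x

# Demo plant (assumed): stable first-order discrete system y[k+1] = a*y[k] + b*u[k].
a, b = 0.9, 0.1
y, ie, prev_e = 0.0, 0.0, 0.0
setpoint, dt = 1.0, 0.05

for k in range(400):
    e = setpoint - y
    de = (e - prev_e) / dt
    ie += e * dt

    (Kp, Ki, Kd), h, x = nn_gains(e, de, ie)
    u = Kp * e + Ki * ie + Kd * de        # PID law with NN-supplied gains

    # Gradient-style update using sign(dy/du) ~ sign(b) in place of the unknown
    # plant Jacobian (a common self-tuning heuristic, assumed here).
    dJ_du = -e * np.sign(b)               # J = 0.5*e^2, so dJ/du = -e * dy/du
    du_dg = np.array([e, ie, de])         # du/d[Kp, Ki, Kd]
    sig = 1.0 / (1.0 + np.exp(-(W2 @ h))) # derivative of softplus is the sigmoid
    delta_g = dJ_du * du_dg * sig
    W2 -= lr * np.outer(delta_g, h)
    W1 -= lr * np.outer((W2.T @ delta_g) * (1 - h**2), x)

    y = a * y + b * u                     # advance the plant
    prev_e = e

print("final output ~", round(y, 3), "gains:", np.round(nn_gains(e, de, ie)[0], 3))
```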
The production of fatty acid esters (biodiesel) from oleic acid and 2-ethylhexanol using sulfated zirconia as a solid catalyst was investigated in this work.
The parameters studied were reaction temperature (100 to 130 °C), alcohol to free fatty acid molar ratio (1:1 to 3:1), catalyst concentration (0.5 to 3 wt%), mixing speed (500 to 900 rpm) and type of sulfated zirconia (i.e. modified, commercial, catalyst prepared according to the literature, and reused catalyst). The results show that the best conversion to biodiesel was 97.74%, obtained at 130 °C, a 3:1 molar ratio, 2 wt% catalyst and 650 rpm using the modified catalyst. Also, the modified c…
The cinematographer mediates, through the means of cinema and television, a set of elements that complement each other in light of developments in the various sciences, culture and the arts, for the purpose of conveying meaning to the recipient and achieving aesthetic taste. Despite the diversity of cinematographic media in their multiple forms, the researcher started from the principle of defining and understanding a technical phenomenon that emerged in the cinematographic medium: the treatment of dramatic events through directorial solutions in which the narration of events in a single place contributes to attracting the spectator's interest. Since this phenomenon is of interest in the medium, a question arises about the adoption of that vis…
Graphite nanoparticles were successfully synthesized using a mixture of H2O2/NH4OH in three oxidation steps. The oxidation process was analyzed by XRD and optical microscope images, which show a clear change in the particle size of the graphite after every oxidation step. The method depends on treating the graphite with H2O2 in two steps and then completing the last step by reacting it with equal quantities of H2O2 and NH4OH. The process does not reduce the number of graphite sheets but disperses the aggregates of multi-sheet carbon by removing the van der Waals forces through the oxidation process.
The present study aims at assessing the status of heavy metals such as nickel, cadmium and lead that pollute some areas of Baghdad city. In this study, a spectral absorption device and the ArcGIS 10.2 program were used. Soil samples were taken from five different locations in Baghdad, including Ameriya, Kadhimiya, Palestine Street, Jadiriyah and Taji, from the 5 cm depth layer on both sides of the road. The work on the soil samples was completed in two phases: (1) preparation of the samples, in which the solid material is converted into an extract containing the elements in the form of single ions that can be estimated by the device; (2) determination of the elements, in which the prepared samples are introduced to the device…
Today, the role of cloud computing in our day-to-day lives is very prominent. The cloud computing paradigm makes it possible to provide resources on demand. Cloud computing has changed the way organizations manage resources due to its robustness, low cost, and pervasive nature. Data security is usually realized using different methods such as encryption. However, the privacy of data is another important challenge that should be considered when transporting, storing, and analyzing data in the public cloud. In this paper, a new method is proposed to track malicious users who use their private key to decrypt data in a system, share it with others and cause system information leakage. Security policies are also considered to be int…
Malicious software (malware) performs a malicious function that compromises a computer system's security. Many methods have been developed to improve the security of computer system resources, among them the use of firewalls, encryption, and Intrusion Detection Systems (IDS). An IDS can detect newly unrecognized attack attempts and raise an early alarm to inform the system about the suspicious intrusion attempt. This paper proposes a hybrid IDS for detecting intrusions, especially malware, that considers both network packet and host features. The hybrid IDS is designed using Data Mining (DM) classification methods for their ability to detect new, previously unseen intrusions accurately and automatically. It uses both anomaly and misuse detection …
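To illustrate the anomaly-plus-misuse idea in a hedged way (the paper's actual features, classifiers and fusion rule are not shown here), the sketch below trains a supervised classifier on labelled traffic for misuse detection, fits an IsolationForest on normal traffic for anomaly detection, and ORs their verdicts. The synthetic "network + host" feature vectors and the scikit-learn models are assumptions made for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, IsolationForest

rng = np.random.default_rng(1)
n_normal, n_attack, n_features = 500, 60, 10
X_normal = rng.normal(0.0, 1.0, size=(n_normal, n_features))   # synthetic normal traffic
X_attack = rng.normal(2.5, 1.0, size=(n_attack, n_features))   # synthetic known attacks
X = np.vstack([X_normal, X_attack])
y = np.hstack([np.zeros(n_normal), np.ones(n_attack)])         # 1 = known attack

# Misuse detection: supervised classifier trained on labelled traffic.
misuse = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Anomaly detection: unsupervised model fitted on normal traffic only,
# so it can flag previously unseen (zero-day style) deviations.
anomaly = IsolationForest(contamination=0.05, random_state=0).fit(X_normal)

def hybrid_alert(samples):
    """Raise an alert if either detector fires (simple OR fusion, assumed)."""
    known = misuse.predict(samples) == 1
    unseen = anomaly.predict(samples) == -1     # -1 marks outliers
    return known | unseen

new_traffic = rng.normal(2.5, 1.0, size=(5, n_features))        # attack-like samples
print(hybrid_alert(new_traffic))
```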
Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important part of cryptography with secret-key algorithms, is the key. The key plays an important role in achieving a higher level of secure communication. To increase the level of security in any communication, both parties must have a copy of the secret key, which, unfortunately, is not that easy to achieve. The Triple Data Encryption Standard algorithm is weak due to its weak key generation, so the key must be reconfigured to make this algorithm more secure, effective, and strong. The encryption key enhances the security of the Triple Data Encryption Standard algorithm. This paper proposes a combination of two efficient encryption algorithms to …
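As a hedged illustration of strengthening the 3DES key, rather than the paper's actual combination of algorithms, the sketch below derives a 24-byte, parity-adjusted key from a shared passphrase with SHA-256 and uses it for CBC-mode Triple DES via pycryptodome. The SHA-256 derivation and the fixed demo IV are assumptions; a real deployment would use a random IV per message.

```python
import hashlib
from Crypto.Cipher import DES3                      # pycryptodome
from Crypto.Util.Padding import pad, unpad

def derive_3des_key(passphrase: bytes) -> bytes:
    # Assumed key-enhancement step: hash the passphrase and keep 24 bytes,
    # then fix the DES parity bits as 3DES expects.
    digest = hashlib.sha256(passphrase).digest()     # 32 bytes
    return DES3.adjust_key_parity(digest[:24])

def encrypt(plaintext: bytes, passphrase: bytes, iv: bytes) -> bytes:
    cipher = DES3.new(derive_3des_key(passphrase), DES3.MODE_CBC, iv)
    return cipher.encrypt(pad(plaintext, DES3.block_size))

def decrypt(ciphertext: bytes, passphrase: bytes, iv: bytes) -> bytes:
    cipher = DES3.new(derive_3des_key(passphrase), DES3.MODE_CBC, iv)
    return unpad(cipher.decrypt(ciphertext), DES3.block_size)

iv = bytes(8)                                        # demo IV only; use a random IV in practice
ct = encrypt(b"secret message", b"shared passphrase", iv)
print(decrypt(ct, b"shared passphrase", iv))
```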