Region-based association analysis has been proposed to capture the collective behavior of sets of variants by testing the association of each set, rather than of individual variants, with the disease. Such an analysis typically involves a list of unphased multiple-locus genotypes with potentially sparse frequencies in cases and controls. To tackle the problem of the sparse distribution, a two-stage approach was proposed in the literature: in the first stage, haplotypes are computationally inferred from genotypes, followed by a haplotype coclassification; in the second stage, the association analysis is performed on the inferred haplotype groups. If a haplotype is unevenly distributed between the case and control samples, it is labeled as a risk haplotype. Unfortunately, the in-silico reconstruction of haplotypes may produce a proportion of false haplotypes, which hampers the detection of rare but true haplotypes. Here, to address this issue, we propose an alternative approach: in Stage 1, we cluster genotypes instead of inferred haplotypes and estimate the risk genotypes based on a finite mixture model; in Stage 2, we infer risk haplotypes from the risk genotypes obtained in the previous stage. To estimate the finite mixture model, we propose an EM algorithm with a novel data partition-based initialization. The performance of the proposed procedure is assessed by simulation studies and a real data analysis. Compared to the existing multiple Z-test procedure, we find that the power of genome-wide association studies can be increased by using the proposed procedure.
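The idea of seeding EM with a data partition can be sketched on a simple case. The following is a minimal illustration only, assuming a two-component one-dimensional Gaussian mixture rather than the paper's genotype-mixture model: the sorted sample is split in half to initialize the component parameters, then standard EM updates are run.

```python
import numpy as np

# Two-component Gaussian mixture fitted by EM, seeded by partitioning the
# sorted sample in half (a simple stand-in for a data partition-based
# initialization). The Gaussian, 1-D setting is an illustrative assumption.
def em_mixture(x, n_iter=200, tol=1e-8):
    x = np.sort(np.asarray(x, dtype=float))
    half = len(x) // 2
    parts = [x[:half], x[half:]]               # partition-based initialization
    mu = np.array([p.mean() for p in parts])
    var = np.array([p.var() + 1e-6 for p in parts])
    pi = np.array([0.5, 0.5])
    ll_old = -np.inf
    for _ in range(n_iter):
        # E-step: responsibilities under the current parameters
        dens = (pi / np.sqrt(2 * np.pi * var)
                * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)))
        total = dens.sum(axis=1, keepdims=True)
        r = dens / total
        # M-step: weighted parameter updates
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
        ll = np.log(total).sum()               # log-likelihood for convergence
        if ll - ll_old < tol:
            break
        ll_old = ll
    return pi, mu, var

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 300), rng.normal(5, 1, 200)])
pi, mu, var = em_mixture(x)
```

The partition seed places the two components on opposite ends of the sample, which avoids the label-collapse that a fully random initialization can suffer from.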
In this research, a simple experiment in the field of agriculture was studied with respect to the effect of out-of-control noise, arising from several causes including environmental conditions, on the observations of agricultural experiments. The discrete wavelet transform was used, specifically the Coiflets transform of orders 1 to 2 and the Daubechies transform of orders 2 to 3, based on two decomposition levels, (J-4) and (J-5), applying the hard, soft, and non-negative threshold rules. The wavelet transform methods were compared using real data from an experiment of 26 observations, and the application was carried out through a program written in MATLAB. The researcher concluded that
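The three thresholding rules compared in the study can be sketched as plain functions (a Python stand-in for the MATLAB program; the coefficient vector and threshold value below are hypothetical, and in practice the rules are applied to DWT detail coefficients such as those of coif1/coif2 or db2/db3):

```python
import numpy as np

def hard_threshold(w, t):
    # Hard rule: keep coefficients whose magnitude exceeds t, zero the rest.
    return np.where(np.abs(w) > t, w, 0.0)

def soft_threshold(w, t):
    # Soft rule: also shrink surviving coefficients toward zero by t.
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def garrote_threshold(w, t):
    # Non-negative garrote: w - t^2/w on surviving coefficients,
    # a compromise between hard and soft shrinkage.
    out = np.zeros_like(w, dtype=float)
    keep = np.abs(w) > t
    out[keep] = w[keep] - t ** 2 / w[keep]
    return out

w = np.array([-3.0, -0.5, 0.2, 1.0, 4.0])   # hypothetical detail coefficients
t = 1.0                                     # hypothetical threshold
```

Hard thresholding preserves the magnitude of large coefficients; soft thresholding biases them toward zero; the garrote rule shrinks large coefficients only slightly while behaving like the soft rule near the threshold.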
The maintenance-and-replacement approach is one of the techniques of operations research that deals with the failures experienced by many production lines, which consist of sets of machines and equipment that are in turn exposed to failures or work stoppages over their lifetime. This requires reducing the lost working time of these machines or equipment as far as possible, by conducting maintenance once in a while, replacing one part of a machine, or replacing one of the machines in the production line. This research studies the failures that occur in some parts of one of the machines of the General Company for Vege
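A standard formulation of this trade-off is the age-replacement model, sketched below as an illustration only (not necessarily the model used in this research): a part is replaced preventively at age T at cost c_p, or correctively on failure at a higher cost c_f, and the expected cost per unit time is minimized over T. The Weibull lifetime parameters and costs are hypothetical.

```python
import numpy as np

def cost_rate(T, c_p=100.0, c_f=500.0, shape=2.0, scale=100.0):
    # Expected cost per unit time for age replacement at age T, assuming
    # Weibull(shape, scale) lifetimes:
    #   C(T) = [c_p * R(T) + c_f * F(T)] / E[min(X, T)]
    ts = np.linspace(0.0, T, 2000)
    R = np.exp(-(ts / scale) ** shape)                  # survival function
    # E[min(X, T)] = integral of R over [0, T], via the trapezoid rule
    expected_cycle = np.sum((R[1:] + R[:-1]) / 2.0 * np.diff(ts))
    F_T = 1.0 - R[-1]                                   # P(failure before T)
    return (c_p * (1.0 - F_T) + c_f * F_T) / expected_cycle

# Grid search for the replacement age that minimizes the cost rate.
Ts = np.linspace(10, 300, 291)
rates = [cost_rate(T) for T in Ts]
T_opt = float(Ts[int(np.argmin(rates))])
```

With an increasing failure rate (Weibull shape > 1) and c_f > c_p, the cost rate has an interior minimum: replacing too early wastes part life, replacing too late incurs frequent expensive failures.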
Background: Appreciation of the crucial role of risk factors in the development of coronary artery disease (CAD) is one of the most significant advances in the understanding of this important disease. Extensive epidemiological research has established cigarette smoking, diabetes, hyperlipidemia, and hypertension as independent risk factors for CAD. Objective: To determine the prevalence of the four conventional risk factors (cigarette smoking, diabetes, hyperlipidemia, and hypertension) among patients with CAD, and to determine the correlation of the Thrombolysis in Myocardial Infarction (TIMI) risk score with the extent of CAD in patients with unstable angina/non-ST-elevation myocardial infarction (UA/NSTEMI). Methods: We
The study used a specially calibrated loop (Erwa) to determine the colony, or viable (Alayoshi), count of bacteria, counting by the pour-plate method in addition to the drop method; the results showed a significant match between the methods used.
The research aims to identify the requirements of banking entrepreneurship in Saudi Arabia and Singapore. Banking entrepreneurship is an important way to lead employees to acquire the experience and knowledge required by the banking environment; hence we note the pursuit by bank management of new technology, proactively and distinctively, to compete with others through the introduction of modern technologies that help senior management develop new banking methods adaptable to the surrounding environmental changes. The research problem highlights the extent to which the requirements of banking entrepreneurship are applied in Saudi Arabia and the Republic of Singapore, and it will be addressed through three investigation
The flexible job-shop scheduling problem (FJSP) is one of the problem instances arising in flexible manufacturing systems and is considered very complex to control; hence generating a control system for this problem domain is difficult. FJSP inherits the characteristics of the job-shop scheduling problem, with an additional decision level beyond sequencing that allows each operation to be processed on any machine among a set of available machines at a facility. In this article, we present an Artificial Fish Swarm Algorithm with Harmony Search for solving the flexible job-shop scheduling problem. It is based on a new harmony improvised from results obtained by the artificial fish swarm algorithm. This improvised solution is sent for comparison to an overall best
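The machine-assignment decision level, and the harmony-search side of such a hybrid, can be sketched on a toy instance. This is an illustrative simplification only: the paper hybridizes harmony search with an artificial fish swarm algorithm, whereas the standalone HS below searches over machine assignments with a fixed job-order sequencing rule, and the instance data and parameters are hypothetical.

```python
import random

# Toy FJSP instance: each job is a list of operations; each operation maps
# its candidate machines to processing times (the FJSP routing flexibility).
JOBS = [
    [{0: 3, 1: 5}, {1: 2, 2: 4}],      # job 0: two operations
    [{0: 4, 2: 2}, {0: 3, 1: 3}],      # job 1
    [{1: 4, 2: 3}, {0: 2, 2: 5}],      # job 2
]
OPS = [(j, k) for j, job in enumerate(JOBS) for k in range(len(job))]

def makespan(assign):
    # Greedy schedule: operations are processed in job order; each starts
    # when both its machine and its job predecessor are free.
    mach_free, job_free = {}, {}
    for (j, k), m in zip(OPS, assign):
        start = max(mach_free.get(m, 0), job_free.get(j, 0))
        end = start + JOBS[j][k][m]
        mach_free[m], job_free[j] = end, end
    return max(job_free.values())

def harmony_search(iters=300, hms=8, hmcr=0.9, par=0.3, seed=1):
    rng = random.Random(seed)
    choices = [list(JOBS[j][k]) for j, k in OPS]   # candidate machines per op
    hm = [[rng.choice(c) for c in choices] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for i, c in enumerate(choices):
            if rng.random() < hmcr:
                v = rng.choice(hm)[i]              # memory consideration
                if rng.random() < par:
                    v = rng.choice(c)              # pitch adjustment
            else:
                v = rng.choice(c)                  # random selection
            new.append(v)
        worst = max(range(hms), key=lambda i: makespan(hm[i]))
        if makespan(new) < makespan(hm[worst]):
            hm[worst] = new                        # replace the worst harmony
    return min(hm, key=makespan)

best = harmony_search()
```

In the full hybrid, the improvised harmony would be compared against (and refined by) the fish-swarm population rather than only the harmony memory.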
There is great operational risk in the day-to-day management of water treatment plants, so water companies are looking for ways to predict how the treatment processes may be improved, given the increased pressure to remain competitive. This study focused on the mathematical modeling of water treatment processes, with the primary motivation of providing tools that can be used to predict the performance of the treatment and enable better control of uncertainty and risk. The research included choosing the most important variables affecting quality standards using a correlation test. According to this test, it was found that the important parameters of raw water were: Total Hardn
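Correlation-based variable screening of this kind can be sketched as follows. The data below are synthetic stand-ins (real inputs would be plant measurements), and the 0.5 cutoff is a hypothetical choice, not the study's threshold.

```python
import numpy as np

# Synthetic raw-water variables and a quality response driven mainly by one
# of them; the names and coefficients are illustrative only.
rng = np.random.default_rng(42)
n = 60
turbidity = rng.normal(10, 2, n)
hardness = rng.normal(300, 30, n)
noise_var = rng.normal(0, 1, n)                  # unrelated variable
quality = 0.8 * turbidity + 0.02 * hardness + rng.normal(0, 0.5, n)

# Pearson correlation of each candidate variable with the quality standard.
candidates = {"turbidity": turbidity, "hardness": hardness, "noise": noise_var}
corrs = {name: float(np.corrcoef(x, quality)[0, 1])
         for name, x in candidates.items()}

# Keep variables whose absolute correlation exceeds a chosen cutoff.
selected = [name for name, r in corrs.items() if abs(r) > 0.5]
```

Variables passing the screen would then enter the predictive model, while weakly correlated ones are dropped.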
Survival analysis is widely applied to data describing the lifetime of an item until the occurrence of an event of interest, such as death or another event under study. The purpose of this paper is to use a dynamic approach in the deep-learning neural-network method: a dynamic neural network that suits the nature of discrete survival data with time-varying effects. This neural network is trained with the Levenberg-Marquardt (L-M) algorithm, and the method is called the Proposed Dynamic Artificial Neural Network (PDANN). A comparison was then made with another method that depends entirely on the Bayesian methodology, called the Maximum A Posteriori (MAP) method. This method was carried out using numerical algorithms re
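The discrete-time survival representation such a network operates on can be sketched as follows. This is an illustration of the data setup only: each subject's (time, event) pair is expanded into person-period rows, and a simple logistic hazard is fit by plain gradient descent, whereas the paper trains a dynamic ANN with Levenberg-Marquardt. All data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
n, max_t = 200, 8
times = rng.integers(1, max_t + 1, n)            # observed discrete times
events = rng.random(n) < 0.7                     # True = event, False = censored

# Person-period expansion: one row per subject per period at risk; the label
# is 1 only in the period where the event occurs.
rows, labels = [], []
for t_i, e_i in zip(times, events):
    for t in range(1, t_i + 1):
        rows.append([1.0, t / max_t])            # intercept + scaled time
        labels.append(1.0 if (e_i and t == t_i) else 0.0)
X, y = np.array(rows), np.array(labels)

# Fit the logistic hazard h(t) = sigmoid(w . x_t) by gradient descent on
# the log-loss (a stand-in for the L-M training used in the paper).
w = np.zeros(2)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - y) / len(y)
hazard = 1.0 / (1.0 + np.exp(-X @ w))
```

A time-varying effect would enter here as additional period-dependent features; a network replaces the linear score with a nonlinear one while keeping the same person-period likelihood.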