Region-based association analysis has been proposed to capture the collective behavior of sets of variants by testing the association of each set, rather than of individual variants, with the disease. Such an analysis typically involves a list of unphased multi-locus genotypes with potentially sparse frequencies in cases and controls. To tackle the problem of sparse distributions, a two-stage approach has been proposed in the literature: in the first stage, haplotypes are computationally inferred from genotypes, followed by a haplotype co-classification; in the second stage, the association analysis is performed on the inferred haplotype groups. If a haplotype is unevenly distributed between the case and control samples, it is labeled a risk haplotype. Unfortunately, in-silico reconstruction of haplotypes may produce a proportion of false haplotypes, which hampers the detection of rare but true haplotypes. Here, to address this issue, we propose an alternative approach: in Stage 1, we cluster genotypes instead of inferred haplotypes and estimate the risk genotypes based on a finite mixture model; in Stage 2, we infer risk haplotypes from the risk genotypes identified in the previous stage. To estimate the finite mixture model, we propose an EM algorithm with a novel data-partition-based initialization. The performance of the proposed procedure is assessed by simulation studies and a real data analysis. Compared to the existing multiple Z-test procedure, we find that the power of genome-wide association studies can be increased by using the proposed procedure.
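As a concrete illustration of Stage 1, the sketch below fits a two-component finite mixture by EM with a data-partition-based initialization. It is a minimal stand-in, not the authors' model: each genotype g is assumed to carry x_g case individuals out of n_g total, null and risk genotypes are modelled as two binomial components, and the initial partition splits genotypes at the pooled case fraction (assumed to leave both groups non-empty); all names are hypothetical.

```python
import numpy as np

def em_binomial_mixture(x, n, max_iter=200, tol=1e-8):
    """EM for a two-component binomial mixture over genotype counts.

    x[g]: case carriers of genotype g; n[g]: total carriers.
    Component 0 = null genotypes, component 1 = risk genotypes.
    """
    x, n = np.asarray(x, float), np.asarray(n, float)
    # Partition-based initialization: split genotypes at the pooled case fraction.
    pooled = x.sum() / n.sum()
    hi = (x / n) > pooled
    pi = max(hi.mean(), 1e-3)            # mixing weight of the risk component
    p = np.array([x[~hi].sum() / n[~hi].sum(), x[hi].sum() / n[hi].sum()])
    p = np.clip(p, 1e-9, 1 - 1e-9)
    ll_old = -np.inf
    for _ in range(max_iter):
        # E-step: posterior probability that each genotype is a risk genotype
        log_lik = x[:, None] * np.log(p) + (n - x)[:, None] * np.log1p(-p)
        log_w = np.log([1 - pi, pi]) + log_lik
        log_norm = np.logaddexp(log_w[:, 0], log_w[:, 1])
        r = np.exp(log_w[:, 1] - log_norm)
        # M-step: update mixing weight and the two binomial success rates
        pi = r.mean()
        p[0] = ((1 - r) * x).sum() / ((1 - r) * n).sum()
        p[1] = (r * x).sum() / (r * n).sum()
        p = np.clip(p, 1e-9, 1 - 1e-9)
        ll = log_norm.sum()
        if abs(ll - ll_old) < tol:
            break
        ll_old = ll
    return pi, p, r

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = rng.integers(20, 80, size=200)
    risk = rng.random(200) < 0.1
    x = rng.binomial(n, np.where(risk, 0.8, 0.5))
    print(em_binomial_mixture(x, n)[0])  # estimated risk-genotype proportion
```

Genotypes with a high posterior r would then feed the Stage 2 haplotype inference.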
Proxy-based sliding mode control (PSMC) is an improved version of PID control that combines the features of PID and sliding mode control (SMC) with continuously dynamic behaviour. However, the stability of the control architecture may not be well addressed. Consequently, this work focuses on modifying the original version of PSMC by adding an adaptive approximation compensator (AAC) term for vibration control of an Euler-Bernoulli beam. The role of the AAC term is to compensate for unmodelled dynamics and to simplify the stability proof. The stability of the proposed control algorithm is systematically proved using Lyapunov theory. The multi-modal equation of motion is derived using the Galerkin method.
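A minimal single-mode sketch of the PSMC idea follows, under loud assumptions: a small proxy mass replaces the massless proxy of the theory, the AAC term and the multi-modal Galerkin model are omitted, and all gains are illustrative rather than tuned. A saturated sliding-mode force drives a virtual proxy to rest while a PID virtual coupling transmits the motion to one beam mode.

```python
import numpy as np

def simulate_psmc(T=5.0, dt=1e-4):
    m, c, k = 1.0, 0.4, 400.0          # modal mass, damping, stiffness
    mp = 1e-2                          # small proxy mass (massless in the theory)
    F, lam, phi = 50.0, 20.0, 1e-2     # SMC force bound, surface slope, boundary layer
    Kp, Ki, Kd = 2e3, 1e3, 40.0        # PID virtual-coupling gains (illustrative)
    q, qd = 0.01, 0.0                  # plant mode starts deflected
    qp, qpd = q, qd                    # proxy starts on the plant
    ei = 0.0                           # integral of the coupling error
    for _ in range(int(T / dt)):
        e = qp - q                                 # proxy-plant coupling error
        u = Kp * e + Ki * ei + Kd * (qpd - qd)     # PID force applied to the plant
        sigma = qpd + lam * qp                     # sliding surface (regulate to rest)
        fs = -F * np.clip(sigma / phi, -1.0, 1.0)  # saturated sliding-mode force
        qdd = (-c * qd - k * q + u) / m            # one Galerkin mode of the beam
        qpdd = (fs - u) / mp                       # proxy feels the coupling reaction
        qd += qdd * dt; q += qd * dt               # semi-implicit Euler updates
        qpd += qpdd * dt; qp += qpd * dt
        ei += e * dt
    return q

print("final modal deflection:", simulate_psmc())  # should decay toward zero
```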
The aim of this research is to estimate the parameters of the linear regression model with errors following an ARFIMA model, using the wavelet method based on maximum likelihood, approximate generalized least squares, and ordinary least squares. We applied the estimators to real data, namely the monthly inflation and dollar exchange rate series obtained from the Central Statistical Organization (CSO) for the period from 1/2005 to 12/2015. The results showed that the wavelet maximum likelihood (WML) estimator was more reliable and efficient than the other estimators, and that changing the fractional differencing parameter (d) did not affect the results.
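The paper's WML procedure is not reproduced here; as a hedged illustration, the sketch below estimates the fractional differencing parameter d from OLS residuals by an Abry-Veitch-style regression of log2 wavelet-detail variances on level, using PyWavelets. The inflation/exchange-rate series are simulated, not the CSO data, and the crude fractional-integration filter exists only for the demo.

```python
import numpy as np
import pywt

def wavelet_d_estimate(resid, wavelet="db4", levels=6):
    """Estimate d of a long-memory error series from the scaling of
    wavelet-detail variances: log2 Var(d_j) ~ const + 2d * j."""
    coeffs = pywt.wavedec(resid, wavelet, level=levels)
    details = coeffs[1:][::-1]              # levels j = 1..J, fine to coarse
    j = np.arange(1, len(details) + 1)
    logvar = np.array([np.log2(np.mean(c ** 2)) for c in details])
    slope = np.polyfit(j, logvar, 1)[0]     # slope ~ 2d for spectrum ~ |lam|^(-2d)
    return slope / 2.0

rng = np.random.default_rng(1)
n = 512
x = rng.normal(size=n)
# Crude ARFIMA(0,d,0)-like errors via a truncated fractional-integration filter
d_true = 0.3
w = np.cumprod(np.r_[1.0, (d_true + np.arange(n - 1)) / np.arange(1, n)])
e = np.convolve(rng.normal(size=n), w)[:n]
y = 2.0 + 0.5 * x + e
beta = np.polyfit(x, y, 1)                  # OLS fit of y on x
resid = y - np.polyval(beta, x)
print("rough d estimate:", wavelet_d_estimate(resid))
```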
Multiple linear regression is concerned with studying and analyzing the relationship between a dependent variable and a set of explanatory variables, from which the values of the dependent variable are predicted. In this paper, a multiple linear regression model with three covariates was studied in the presence of autocorrelated errors, with the random errors following an exponential distribution. Three methods were compared: generalized least squares, the M-robust method, and the Laplace robust method. We carried out simulation studies and computed the mean squared error criterion for sample sizes (15, 30, 60, 100). Further, we applied the best method to real experimental data.
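A minimal sketch of such a comparison with statsmodels follows, under stated assumptions: GLSAR stands in for generalized least squares with AR(1) errors, RLM with a Huber norm for the M-robust method, and median (LAD) regression via QuantReg for the Laplace robust method; the design, coefficients, and replication count are illustrative.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
beta_true = np.array([1.0, 2.0, -1.5, 0.5])   # intercept + three covariates
rho, n, reps = 0.6, 60, 200
mse = {"GLS": 0.0, "M-robust": 0.0, "Laplace": 0.0}

for _ in range(reps):
    X = sm.add_constant(rng.normal(size=(n, 3)))
    # AR(1) errors driven by centered exponential innovations
    eps = rng.exponential(1.0, size=n) - 1.0
    u = np.empty(n)
    u[0] = eps[0]
    for t in range(1, n):
        u[t] = rho * u[t - 1] + eps[t]
    y = X @ beta_true + u
    fits = {
        "GLS": sm.GLSAR(y, X, rho=1).iterative_fit(maxiter=5),
        "M-robust": sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit(),
        "Laplace": sm.QuantReg(y, X).fit(q=0.5),
    }
    for name, res in fits.items():
        mse[name] += np.mean((res.params - beta_true) ** 2) / reps

for name, v in mse.items():
    print(f"{name}: MSE = {v:.5f}")
```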
A geographic information system (GIS) is a very effective tool for managing and analyzing data tied to geographic locations. The use of artificial neural networks (ANNs) for the interpretation of natural resource data has been shown to be beneficial, and back-propagation networks are among the most widespread and prevalent designs. Combining geographic information systems with artificial neural networks provides a way to reduce the cost of landscape-change studies by shortening the time required to evaluate data. Numerous designs and kinds of ANNs have been created, the majority of them PC-based. Using the ArcGIS Network Analyst extension, service areas around any location on a network can be located.
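As a hedged illustration of the back-propagation component (unrelated to the Network Analyst workflow), the sketch below trains a one-hidden-layer network from scratch on synthetic per-pixel attributes that stand in for stacked GIS raster layers; every name and threshold is invented for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for per-pixel GIS attributes (elevation, slope, NDVI, ...)
X = rng.normal(size=(1000, 4))
y = (X[:, 0] + 0.5 * X[:, 1] ** 2 > 0.5).astype(float)[:, None]  # change / no change

# One hidden layer, sigmoid activations, trained by plain back-propagation.
W1 = rng.normal(0, 0.5, (4, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 0.5

for epoch in range(2000):
    h = sigmoid(X @ W1 + b1)            # forward pass
    p = sigmoid(h @ W2 + b2)
    g2 = (p - y) / len(X)               # output delta (cross-entropy + sigmoid)
    g1 = (g2 @ W2.T) * h * (1 - h)      # back-propagated hidden delta
    W2 -= lr * h.T @ g2; b2 -= lr * g2.sum(0)
    W1 -= lr * X.T @ g1; b1 -= lr * g1.sum(0)

print("training accuracy:", ((p > 0.5) == y).mean())
```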
In information security, fingerprint verification is one of the most common recent approaches for verifying human identity through a distinctive pattern. The verification process works by comparing a pair of fingerprint templates and measuring the similarity between them. Several research studies have applied different techniques to the matching process, such as fuzzy vault and image filtering approaches. Yet these approaches still suffer from imprecise representation of the biometric's distinctive patterns. Deep learning architectures such as the convolutional neural network (CNN) have been used extensively for image processing and object detection tasks and have shown outstanding performance compared with these earlier approaches.
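A minimal Siamese-style sketch of CNN-based verification is shown below with an illustrative architecture: a shared embedding network maps each template image to a unit vector and a cosine-similarity threshold decides the match. The input size, depth, and threshold are assumptions, and the network is untrained; it is not the specific model of any study cited above.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FingerprintEmbedder(nn.Module):
    """Shared CNN mapping a 1x96x96 fingerprint template to a 128-d embedding."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(64, 128)

    def forward(self, x):
        z = self.features(x).flatten(1)
        return F.normalize(self.fc(z), dim=1)   # unit-norm embedding

def verify(model, img_a, img_b, threshold=0.8):
    """Declare a match when cosine similarity of embeddings exceeds threshold."""
    with torch.no_grad():
        za, zb = model(img_a), model(img_b)
        sim = (za * zb).sum(dim=1)              # cosine similarity of unit vectors
    return sim > threshold, sim

model = FingerprintEmbedder().eval()
a, b = torch.rand(1, 1, 96, 96), torch.rand(1, 1, 96, 96)
match, score = verify(model, a, b)
print(match.item(), float(score))
```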
Artificial intelligence algorithms have been used in recent years in many scientific fields. We suggest employing the flower pollination algorithm in the environmental field to find the best estimate of the semi-parametric regression function with measurement errors in both the explanatory variables and the dependent variable; measurement errors, rather than exact measurements, appear frequently in fields such as chemistry, the biological sciences, medicine, and epidemiological studies. We estimate the regression function of the semi-parametric model by estimating its parametric and non-parametric parts; the parametric part is estimated using instrumental-variable methods (Wald's method, Bartlett's method, and Durbin's method).
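A generic flower pollination algorithm is sketched below as a hedged illustration: Levy flights via Mantegna's method for global pollination and a random mix of flowers for local pollination, minimizing a plain least-squares loss for the parametric part. The instrumental-variable machinery is omitted, and the objective, bounds, and population settings are simplified assumptions.

```python
import numpy as np
from math import gamma, pi, sin

def levy_step(dim, rng, beta=1.5):
    """Mantegna's algorithm for Levy-flight step lengths."""
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

def flower_pollination(loss, dim, bounds, n=25, iters=400, p_switch=0.8, seed=0):
    """Generic FPA: global pollination via Levy flights toward the best
    flower, local pollination as a random mix of two flowers."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, (n, dim))
    fit = np.array([loss(s) for s in pop])
    b = fit.argmin()
    best, best_f = pop[b].copy(), fit[b]
    for _ in range(iters):
        for i in range(n):
            if rng.random() < p_switch:
                cand = pop[i] + levy_step(dim, rng) * (best - pop[i])
            else:
                j, k = rng.integers(0, n, size=2)
                cand = pop[i] + rng.random() * (pop[j] - pop[k])
            cand = np.clip(cand, lo, hi)
            f = loss(cand)
            if f < fit[i]:
                pop[i], fit[i] = cand, f
                if f < best_f:
                    best, best_f = cand.copy(), f
    return best, best_f

# Hypothetical demo: recover parametric coefficients of a regression by
# minimizing a least-squares loss (simulated data, illustrative only).
rng = np.random.default_rng(3)
X = rng.normal(size=(100, 2))
y = X @ np.array([1.5, -0.7]) + 0.1 * rng.normal(size=100)
beta_hat, f = flower_pollination(lambda b: np.sum((y - X @ b) ** 2), 2, (-5, 5))
print(beta_hat, f)
```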
This paper introduces a non-conventional approach with multi-dimensional random sampling to solve a cocaine abuse model with statistical probability. The mean Latin hypercube finite difference (MLHFD) method is proposed for the first time via hybrid integration of the classical numerical finite difference (FD) formula with the Latin hypercube sampling (LHS) technique, creating a random distribution for the model parameters, which are dependent on time t. The LHS technique enables the MLHFD method to produce rapid variation of the parameter values across a number of multidimensional simulations (100, 1000, and 5000). The generated Latin hypercube sample, which is random or non-deterministic in nature, is further integrated with the finite difference scheme.
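The sketch below conveys the MLHFD idea under loud assumptions: scipy's Latin hypercube sampler draws parameter sets, a forward-difference solve is run per sample, and the trajectories are averaged. A toy one-equation prevalence model stands in for the paper's cocaine abuse model, and the parameter ranges are invented for the demo.

```python
import numpy as np
from scipy.stats import qmc

def fd_solve(beta, gamma, T=20.0, dt=0.01):
    """Forward-difference solve of a toy prevalence model:
    U' = beta*U*(1-U) - gamma*U, a stand-in for the actual model."""
    steps = int(T / dt)
    U = np.empty(steps + 1)
    U[0] = 0.05
    for k in range(steps):
        U[k + 1] = U[k] + dt * (beta * U[k] * (1 - U[k]) - gamma * U[k])
    return U

def mlhfd_mean(n_sim, seed=0):
    """Mean trajectory over a Latin hypercube of (beta, gamma) samples."""
    sampler = qmc.LatinHypercube(d=2, seed=seed)
    # Scale unit-cube samples to illustrative parameter ranges.
    params = qmc.scale(sampler.random(n_sim), [0.3, 0.05], [0.9, 0.25])
    runs = np.array([fd_solve(b, g) for b, g in params])
    return runs.mean(axis=0)

for n in (100, 1000):
    print(n, "simulations -> final mean prevalence:", mlhfd_mean(n)[-1])
```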
Pure CZTSe and Ag-doped CZTSe (CAZTSe) thin films with Ag contents of 0.1 and 0.2 were fabricated on Corning glass substrates at room temperature, with a thickness of 800 nm, by the thermal evaporation method. The optical characteristics of the pure and Ag-alloyed thin films were compared by measuring and analyzing the absorbance and transmittance spectra in the range of (400-1100) nm. The effect of annealing temperature at 373 K and 473 K on these characteristics was also studied. The results indicated that all films had high absorbance and low transmittance in the visible region, and that the direct band gap of the films decreases with increasing Ag content and annealing temperature. Optical parameters such as the extinction coefficient and refractive index were also calculated.
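The standard textbook relations behind these quantities are sketched below on a synthetic spectrum (not the measured data): the absorption coefficient from absorbance and film thickness, the extinction coefficient, and a direct-allowed Tauc extrapolation for the band gap. The demo absorption edge and the fitting fraction are assumptions.

```python
import numpy as np

def optical_constants(wavelength_nm, absorbance, thickness_nm):
    """Standard relations: alpha = 2.303*A/t, k = alpha*lambda/(4*pi),
    and the direct-allowed Tauc variable (alpha*h*nu)^2 vs h*nu."""
    t_cm = thickness_nm * 1e-7
    alpha = 2.303 * absorbance / t_cm                 # absorption coefficient (1/cm)
    E = 1239.84 / wavelength_nm                       # photon energy h*nu (eV)
    k = alpha * (wavelength_nm * 1e-7) / (4 * np.pi)  # extinction coefficient
    tauc = (alpha * E) ** 2                           # direct transition variable
    return E, alpha, k, tauc

def direct_band_gap(E, tauc, frac=0.5):
    """Extrapolate the linear high-absorption part of the Tauc plot to
    (alpha*h*nu)^2 = 0; 'frac' selects the top fraction of points used."""
    idx = tauc >= frac * tauc.max()
    slope, intercept = np.polyfit(E[idx], tauc[idx], 1)
    return -intercept / slope                         # Eg at the energy-axis intercept

# Synthetic demo spectrum over (400-1100) nm for an 800 nm film.
wl = np.linspace(400, 1100, 200)
E_demo = 1239.84 / wl
A = 0.02 + 0.9 * np.sqrt(np.clip(E_demo - 1.5, 0.0, None)) / E_demo  # edge near 1.5 eV
E, alpha, k, tauc = optical_constants(wl, A, 800.0)
print("estimated direct band gap (eV):", round(direct_band_gap(E, tauc), 2))
```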