Region-based association analysis has been proposed to capture the collective behavior of sets of variants by testing the association of each set, rather than individual variants, with the disease. Such an analysis typically involves a list of unphased multiple-locus genotypes with potentially sparse frequencies in cases and controls. To tackle the sparse distribution, a two-stage approach has been proposed in the literature: in the first stage, haplotypes are computationally inferred from genotypes and then co-classified; in the second stage, the association analysis is performed on the inferred haplotype groups. If a haplotype is unevenly distributed between the case and control samples, it is labeled a risk haplotype. Unfortunately, the in silico reconstruction of haplotypes may produce a proportion of false haplotypes that hamper the detection of rare but true haplotypes. To address this issue, we propose an alternative approach: in Stage 1, we cluster genotypes instead of inferred haplotypes and estimate the risk genotypes based on a finite mixture model; in Stage 2, we infer risk haplotypes from the risk genotypes identified in the previous stage. To estimate the finite mixture model, we propose an EM algorithm with a novel data-partition-based initialization. The performance of the proposed procedure is assessed by simulation studies and a real data analysis. Compared with the existing multiple Z-test procedure, we find that the power of genome-wide association studies can be increased by using the proposed procedure.
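The Stage 1 estimation rests on an EM loop for a finite mixture with a partition-based initialization. As a hedged illustration only (the paper's mixture is over multilocus genotype counts, not the one-dimensional Gaussians used here, and the sorted-split initialization below merely stands in for the authors' data-partition scheme), a minimal EM fit looks like:

```python
import math
import random

def em_mixture(data, n_iter=200):
    """Minimal EM for a two-component 1-D Gaussian mixture.

    Illustrative sketch only: the paper's model concerns genotype
    frequencies, and its initialization differs from the simple
    sorted-half split used here.
    """
    data = sorted(data)
    half = len(data) // 2
    # Partition-style initialization: split the sorted sample in two
    # and use each half's mean as a component's starting mean.
    mu = [sum(data[:half]) / half, sum(data[half:]) / (len(data) - half)]
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component per point.
        resp = []
        for x in data:
            w = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                 for k in range(2)]
            s = sum(w)
            resp.append([wk / s for wk in w])
        # M-step: re-estimate weights, means, and variances.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2
                         for r, x in zip(resp, data)) / nk + 1e-6
    return pi, mu, var

random.seed(0)
sample = ([random.gauss(0, 1) for _ in range(300)]
          + [random.gauss(5, 1) for _ in range(300)])
pi, mu, var = em_mixture(sample)
```

With two well-separated synthetic components, the recovered means settle near the true values, which is the behavior the partition-based initialization is meant to encourage.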
In recent years, Bitcoin has become the most widely used blockchain platform in business and finance. The goal of this work is to find a viable prediction model that incorporates, and perhaps improves on, a combination of available models. Among the techniques utilized in this paper are exponential smoothing, ARIMA, artificial neural network (ANN) models, and prediction combination models. The study's clearest finding is that artificial intelligence models improve the results of compound prediction models. The second key finding is that a strong combination forecasting model, one that responds to the multiple fluctuations in the Bitcoin time series and improves on the error, should be used. Based on the results, the prediction …
It must be emphasized that media studies is among the human sciences that fuse older and newer sciences together, and that its disclosures are the physics of the new communication. The theoretical physicist Michio Kaku confirms this in his book "Visions": "As a research physicist, I believe that physicists have been particularly successful at predicting the broad outlines of the future. Professionally, I work in one of the most fundamental areas of physics, the quest to complete Einstein's dream of a 'theory of everything.' As a result, I am constantly reminded of the ways in which quantum physics touches many of the key discoveries that shaped the twentieth century." He then turns to the fact that the physical disclosures …
In this paper we report the microfabrication of three-dimensional structures using two-photon polymerization (2PP) in a mixture of MEH-PPV and an acrylic resin. A femtosecond laser operating at 800 nm was employed for the two-photon polymerization process. As a first step in this project, we determined the best composition for fabricating microstructures of MEH-PPV in the resin via two-photon polymerization. Acknowledgement: This research is supported by the Mazur Group, Harvard University.
This study succeeded in producing a new graphical representation of the James abacus, called the nested chain abacus. The nested chain abacus provides a unique mathematical expression that encodes each tile (image) using partition theory, where each form or shape of tile is associated with exactly one partition. Furthermore, an algorithm for nested chain abacus movement is constructed, which can be applied in tiling theory.
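Since each tile shape is associated with exactly one integer partition, the label set for tiles of a given size is simply the partitions of that size. A generic enumerator (a standard sketch, not the authors' nested-chain encoding itself) makes the correspondence concrete:

```python
def partitions(n, max_part=None):
    """Yield the integer partitions of n as non-increasing tuples.

    Generic enumeration only; the nested chain abacus assigns one
    specific partition to each tile shape, which this sketch does
    not reproduce.
    """
    if max_part is None:
        max_part = n
    if n == 0:
        yield ()
        return
    for first in range(min(n, max_part), 0, -1):
        for rest in partitions(n - first, first):
            yield (first,) + rest

# Each partition of 5 would label one distinct tile shape.
shapes = list(partitions(5))
```

For example, `partitions(5)` yields seven partitions, so seven distinct tile shapes of size 5 receive unique labels.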
In this paper, a WLAN that accesses the Internet through a GPRS network was implemented and tested. The proposed network is managed by a Linux-based server. Because of the limitations of GPRS, such as dynamic IP addressing and limited bandwidth, a number of techniques were implemented to overcome them.
A Dynamic Host Configuration Protocol (DHCP) server was added to provide single central control of all TCP/IP resources. A Squid proxy was added to cache redundantly accessed Web content, reducing Internet bandwidth usage and speeding up clients' download times. The Network Address Translation (NAT) service was configured to share one IP address.
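On a Linux server, the NAT sharing described above is typically a forwarding flag plus a masquerade rule. The following is a hypothetical sketch only: the interface names (`wlan0` for the WLAN side, `ppp0` for the GPRS link) and the `192.168.1.0/24` subnet are assumptions, not details from the paper.

```shell
# Enable IPv4 forwarding between the WLAN and the GPRS link.
sysctl -w net.ipv4.ip_forward=1

# Masquerade all WLAN traffic behind the single, dynamically
# assigned GPRS IP address on ppp0.
iptables -t nat -A POSTROUTING -s 192.168.1.0/24 -o ppp0 -j MASQUERADE

# Allow established replies back in to the WLAN side.
iptables -A FORWARD -i ppp0 -o wlan0 \
         -m state --state ESTABLISHED,RELATED -j ACCEPT
iptables -A FORWARD -i wlan0 -o ppp0 -j ACCEPT
```

MASQUERADE (rather than static SNAT) suits GPRS because the rule keeps working when the provider assigns a new dynamic IP on reconnect.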
Background: Oriental sore occurs mostly in the Mediterranean region, North Africa, and the Middle East. Rodents are the main reservoir for the parasite. The wet type, caused by L. major, is rural, while the dry type, caused by L. tropica, is urban, with humans presumably its only reservoir. Sand fly vectors are involved in all forms.
Objectives: This study aimed to identify the most important bacterial infections concomitant with cutaneous leishmaniasis.
Methods: The study was performed on 75 patients (aged 1-50 years) of both sexes attending the Skin Diseases Department of Ramadi General Hospital during the period from January to June 2000. These patients were clinically diagnosed as patients …
Spelling correction is a challenging task for resource-scarce languages. Arabic is one such language: it lacks a large spelling-correction dataset, so datasets injected with artificial errors are used to overcome this problem. In this paper, we trained the Text-to-Text Transfer Transformer (T5) model on artificial errors to correct Arabic soft spelling mistakes. Our T5 model corrects 97.8% of the artificial errors injected into the test set. Additionally, it achieves a character error rate (CER) of 0.77% on a set containing real soft spelling mistakes. We achieved these results using a 4-layer T5 model trained with a 90% error injection …
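The reported CER metric is conventionally the character-level edit distance between the model's output and the reference, divided by the reference length. A minimal sketch of that standard definition (not the authors' exact evaluation script) is:

```python
def cer(reference, hypothesis):
    """Character error rate: Levenshtein distance between the two
    strings divided by the reference length (standard definition)."""
    m, n = len(reference), len(hypothesis)
    # prev[j] holds the edit distance between the processed prefix of
    # the reference and the first j characters of the hypothesis.
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        curr = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if reference[i - 1] == hypothesis[j - 1] else 1
            curr[j] = min(prev[j] + 1,         # deletion
                          curr[j - 1] + 1,     # insertion
                          prev[j - 1] + cost)  # substitution
        prev = curr
    return prev[n] / m
```

For instance, a correction leaving one wrong character in a 10-character reference scores a CER of 0.1, so the paper's 0.77% corresponds to fewer than one error per hundred characters.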
... Show MoreThe purified prepared compounds were identified through different methods of identification i.e, I.R, UV-vi^ble-spectroscopy in addition to (coloured tests) Calculation of the sum of OH groups. TLC techniques were also used to test the purity and the speed ofthe rate of flow (RF).