Region-based association analysis has been proposed to capture the collective behavior of sets of variants by testing the association of each set, rather than of individual variants, with the disease. Such an analysis typically involves a list of unphased multi-locus genotypes whose frequencies in cases and controls may be sparse. To tackle this sparsity, a two-stage approach has been proposed in the literature: in the first stage, haplotypes are computationally inferred from genotypes and then co-classified; in the second stage, the association analysis is performed on the inferred haplotype groups. If a haplotype is unevenly distributed between the case and control samples, it is labeled a risk haplotype. Unfortunately, the in-silico reconstruction of haplotypes can produce false haplotypes that hamper the detection of rare but true haplotypes. To address this issue, we propose an alternative approach: in Stage 1, we cluster genotypes rather than inferred haplotypes and estimate the risk genotypes with a finite mixture model; in Stage 2, we infer risk haplotypes from the risk genotypes obtained in Stage 1. To estimate the finite mixture model, we propose an EM algorithm with a novel data partition-based initialization. The performance of the proposed procedure is assessed through simulation studies and a real data analysis. Compared with the existing multiple Z-test procedure, the proposed procedure increases the power of genome-wide association studies.
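The abstract names the estimation machinery (an EM algorithm for a finite mixture model over genotypes) but not its exact likelihood or the data partition-based initialization. The following is a minimal sketch of EM for a generic two-component multinomial mixture over genotype category counts; the model choice, variable names, and the random Dirichlet initialization used as a placeholder are all assumptions, not the authors' method.

```python
# Sketch: EM for a two-component multinomial mixture over genotype categories.
# The paper's actual mixture model and partition-based initialization are not
# given in the abstract; this is a generic illustration of the E/M updates.
import numpy as np

def em_mixture(counts, n_iter=100, tol=1e-8, seed=0):
    """counts: (n_samples, n_categories) matrix of genotype category counts."""
    rng = np.random.default_rng(seed)
    n, k = counts.shape
    pi = np.array([0.5, 0.5])                    # mixing weights
    theta = rng.dirichlet(np.ones(k), size=2)    # per-component category probabilities
    ll_old = -np.inf
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each sample
        log_w = np.log(pi) + counts @ np.log(theta).T          # (n, 2)
        log_norm = np.logaddexp(log_w[:, 0], log_w[:, 1])
        resp = np.exp(log_w - log_norm[:, None])
        # M-step: update mixing weights and category probabilities
        pi = resp.mean(axis=0)
        theta = resp.T @ counts + 1e-12
        theta /= theta.sum(axis=1, keepdims=True)
        ll = log_norm.sum()                                     # log-likelihood
        if ll - ll_old < tol:
            break
        ll_old = ll
    return pi, theta, resp
```

In the spirit of Stage 1, samples whose posterior responsibility for one component is high could then be flagged as candidate risk genotypes; how the authors actually threshold the mixture output is not stated in the abstract.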
This research deals with auditing bank credit risk in accordance with international auditing standards. It aims to develop procedures and design a credit risk audit program that complies with those standards, and to demonstrate their impact on the truth and fairness of financial statements and on overall performance and continuity in the banking sector. Its importance lies in relying on international auditing standards to assess and measure bank credit risk and its impact on the financial position, as well as on the ability to predict financial failure. A set of conclusions has been reached, the most important of which is that the bank faces difficulties in measuring credit risk in accordance ...
Big data of different types, such as text and images, is rapidly generated by the internet and other applications. Dealing with such data using traditional methods is impractical because it varies in size, type, and processing-speed requirements. Data analytics has therefore become an important tool, since only the meaningful information is analyzed and extracted, making it essential for big data applications. This paper presents several innovative methods that use data analytics techniques to improve the analysis process and data management. Furthermore, this paper discusses how the revolution of data analytics based on artificial intelligence algorithms might provide ...
Human serum albumin (HSA) nanoparticles have been widely used as versatile drug delivery systems to improve the efficiency and pharmaceutical properties of drugs. The present study aimed to design an HSA nanoparticle encapsulating the hydrophobic anticancer pyridine derivative 2-((2-([1,1'-biphenyl]-4-yl)imidazo[1,2-a]pyrimidin-3-yl)methylene)hydrazine-1-carbothioamide (BIPHC). The HSA-BIPHC nanoparticles were synthesized by a desolvation process. Atomic force microscopy (AFM) analysis showed an average HSA-BIPHC nanoparticle size of 80.21 nm. The entrapment efficiency, loading capacity, and production yield were 98.11%, 9.77%, and 91.29%, respectively. An in vitro release study revealed that HSA-BIPHC nanoparticles ...
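The abstract reports entrapment efficiency, loading capacity, and production yield as percentages without giving the formulas used. The sketch below shows the textbook definitions commonly used for these three quantities; the function names and the assumption that the authors followed these exact definitions are mine, not the paper's.

```python
# Common definitions of the three percentages reported in the abstract.
# The exact formulas used by the authors are not stated; these are assumptions.

def entrapment_efficiency(drug_added_mg, free_drug_mg):
    """Percent of the added drug that ends up entrapped in the nanoparticles."""
    return 100 * (drug_added_mg - free_drug_mg) / drug_added_mg

def loading_capacity(entrapped_drug_mg, nanoparticle_mass_mg):
    """Percent of the nanoparticle mass that is drug."""
    return 100 * entrapped_drug_mg / nanoparticle_mass_mg

def production_yield(nanoparticle_mass_mg, total_solids_added_mg):
    """Percent of the starting solids (protein + drug) recovered as nanoparticles."""
    return 100 * nanoparticle_mass_mg / total_solids_added_mg
```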
A database is an organized collection of data, structured and distributed so that the stored data can be accessed easily and conveniently. In the era of big data, however, traditional data analytics methods may not be able to manage and process such large volumes of data. To develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to process big data distributed on the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r...
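The abstract names Map-Reduce on Hadoop but does not describe the actual EEG pipeline. The following is a minimal Hadoop Streaming sketch of the pattern: a mapper emits (key, value) pairs and a reducer aggregates per key. The record layout (one `channel_id<TAB>sample_value` per line) and the per-channel mean computed here are illustrative assumptions, not the paper's processing steps.

```python
# mapper.py -- Hadoop Streaming mapper (generic sketch, assumed record layout).
import sys

for line in sys.stdin:
    parts = line.strip().split("\t")
    if len(parts) != 2:
        continue
    channel, value = parts
    # Emit (key, value) pairs; Hadoop groups and sorts them by key for the reducer.
    print(f"{channel}\t{value}")
```

```python
# reducer.py -- Hadoop Streaming reducer: per-channel mean of the mapped values.
import sys

current_channel, total, count = None, 0.0, 0

def emit(channel, total, count):
    if channel is not None and count:
        print(f"{channel}\t{total / count:.6f}")

for line in sys.stdin:
    channel, value = line.strip().split("\t")
    if channel != current_channel:
        emit(current_channel, total, count)   # flush the previous key
        current_channel, total, count = channel, 0.0, 0
    total += float(value)
    count += 1
emit(current_channel, total, count)           # flush the last key
```

Such scripts would be submitted through Hadoop Streaming, which handles splitting the input across nodes and shuffling keys to reducers; the cluster configuration used in the study is not given in the abstract.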
A simple, sensitive, accurate, and cost-effective spectrophotometric method has been developed for the determination of tetracycline and doxycycline in pure form and in pharmaceutical formulations. The method is based on the reaction of methyldopa with 4-aminoantipyrine (4-AAP) in the presence of potassium ferricyanide (PFC) in an alkaline medium. Two optimization approaches were applied to determine the optimum conditions of the oxidative coupling reaction variables: univariate optimization and design of experiments (DOE). The conditions affecting the reaction (pH, buffer volume, reagent concentration, oxidant concentration, type of buffer, order of addition, reaction time, and stability) were optimized. Under univariate and design ...
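The abstract contrasts univariate (one-factor-at-a-time) optimization with a design-of-experiments approach but does not report the factor levels. The sketch below only illustrates the difference in how the two strategies enumerate experimental runs over the variables named in the abstract; all numeric levels and units are hypothetical.

```python
# Sketch: univariate screening vs. a two-level full factorial DOE over the
# reaction variables named in the abstract. Levels and units are assumptions.
from itertools import product

factors = {
    "pH":            (9.0, 11.0),
    "buffer_volume": (1.0, 2.0),   # mL, assumed
    "reagent_conc":  (0.1, 0.3),   # % w/v, assumed
    "oxidant_conc":  (0.5, 1.0),   # % w/v, assumed
}

# Full factorial DOE: every low/high combination (2**4 = 16 runs), allowing
# main effects and interactions to be estimated together.
doe_runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]

# Univariate screening: vary one factor at a time, others held at their low level.
baseline = {name: levels[0] for name, levels in factors.items()}
univariate_runs = [dict(baseline, **{name: levels[1]}) for name in factors
                   for levels in [factors[name]]]

print(len(doe_runs), len(univariate_runs))   # 16 vs 4 experimental runs
```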
The objective of the research is to assess the efficiency of risk management, under its various names, at Baghdad International Airport in the face of the various risks (financial, technical, human, natural, etc.) facing the research sample, namely the General Establishment of Civil Aviation and the Iraqi Airways Company. The researcher formulated the hypothesis that there is a significant correlation between risk management and risk review and assessment. The research relied on observation and interviews with the relevant officials in this field, as well as a questionnaire distributed to a sample of 170 employees working in risk management (the SMS Department) in Iraqi Airways ...