In this research, a new bioluminescent assay for pyrophosphate was developed and applied to single-nucleotide polymorphism (SNP) diagnosis using a one-base extension reaction. Four Mycobacterium tuberculosis genes (rpoB, inhA, katG, gyrA) were chosen. Fifty-four specimens were used in this study: fifty-three had been confirmed as drug-resistant specimens by the Iraqi Institute of Chest and Respiratory Diseases in Baghdad, and one specimen was used as a negative control. The procedure of this assay was as follows. In each aliquot, a specific primer whose 3′-OH end terminates one base short of the target site was hybridized to the single-stranded DNA template. Then, (exo-) Klenow DNA polymerase and one of either α-thio-dATP, dTTP, dGTP, or dCTP
In this paper, the transfer function model in time series was estimated using different methods, including a parametric method represented by the conditional likelihood function, as well as two nonparametric methods: local linear regression and the cubic smoothing spline. This research aims to compare these estimators for the nonlinear transfer function model by means of simulation, studying two models as the output variable and one model as the input variable in addition t
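As an illustration of the nonparametric side of the comparison above, a local linear regression estimator fits a weighted straight line around each evaluation point. The following is a minimal generic sketch (the Gaussian kernel, the bandwidth `h`, and the function name are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def local_linear(x, y, x0, h):
    # Local linear regression at point x0 with a Gaussian kernel of
    # bandwidth h: fit a weighted line y = b0 + b1*(x - x0) and return
    # its intercept b0, i.e. the smoothed value at x0.
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    X = np.column_stack([np.ones_like(x), x - x0])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[0]
```

On exactly linear data the estimator is exact for any bandwidth, which makes it easy to sanity-check.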
In this paper, a statistical model of the Saudi financial market was built using GARCH models that take into account price volatility during trading periods. The effect of the distribution of the random error of the time series on the accuracy of the statistical model was also studied, comparing two statistical distributions: the normal distribution and Student's t distribution. Application to measured data found that the best model for the Saudi market is the GARCH(1,1) model with Student's t-distributed random errors.
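The conditional variance recursion that defines the GARCH(1,1) model used above can be sketched directly (a generic illustration; the parameter values in the usage are hypothetical, not the paper's estimates):

```python
def garch11_variance(returns, omega, alpha, beta):
    # GARCH(1,1) conditional variance recursion:
    #   sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]
    # The series is started at the unconditional variance
    # omega / (1 - alpha - beta), which requires alpha + beta < 1.
    sigma2 = [omega / (1.0 - alpha - beta)]
    for r in returns[:-1]:
        sigma2.append(omega + alpha * r * r + beta * sigma2[-1])
    return sigma2
```

In practice the parameters (and the choice of normal versus Student's t errors) are estimated by maximum likelihood; the recursion itself is identical under either error distribution.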
In this paper, a discussion of the principles of stereoscopy is presented, along with the phases of 3D image production, which are based on the Waterfall model. The results are based on one of the 3D technologies, anaglyph, which is known to use two colors (red and cyan).
A 3D anaglyph image appears three-dimensional through the red/cyan color classes, and anaglyph is considered one of the technologies used and implemented for the production of 3D videos (movies). Using a process model to produce software for anaglyph video therefore becomes very important; accordingly, the proposed work implements anaglyph processing within the Waterfall model to produce a 3D image extracted from a video.
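The red/cyan anaglyph composition described above combines the red channel of the left-eye view with the green and blue channels of the right-eye view. A minimal sketch of that channel mixing (a generic illustration; the function name and array layout are assumptions, not the paper's implementation):

```python
import numpy as np

def make_anaglyph(left_rgb, right_rgb):
    # Red/cyan anaglyph: red channel from the left-eye frame,
    # green and blue channels from the right-eye frame.
    # Both inputs are (H, W, 3) arrays of the same shape.
    out = np.empty_like(left_rgb)
    out[..., 0] = left_rgb[..., 0]   # red   <- left eye
    out[..., 1] = right_rgb[..., 1]  # green <- right eye
    out[..., 2] = right_rgb[..., 2]  # blue  <- right eye
    return out
```

Applied frame by frame to a stereo pair extracted from a video, this yields the anaglyph frames viewed through red/cyan glasses.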
This study aimed to apply response surface methodology (RSM) to evaluate the effects of various experimental conditions on the removal of levofloxacin (LVX) from aqueous solution by the electrocoagulation (EC) technique with stainless steel electrodes. The EC process was achieved successfully with an LVX removal efficiency of 90%. The results obtained from the regression analysis showed that the experimental data are better fitted by a second-order polynomial model, with a predicted correlation coefficient (pred. R2) of 0.723, an adjusted correlation coefficient (adj. R2) of 0.907, and a correlation coefficient (R2) of 0.952. This shows that the predicted models and experimental values are in go
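The second-order polynomial model at the heart of the RSM fit above can be sketched with an ordinary least-squares fit over linear, quadratic, and interaction terms for two factors (a generic sketch; the factor layout and function name are illustrative, and the data in the usage are synthetic, not the study's measurements):

```python
import numpy as np

def fit_quadratic_rsm(X, y):
    # Second-order RSM model for two factors x1, x2:
    #   y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
    # Returns the coefficient vector and the R^2 of the fit.
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    y_hat = A @ coef
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return coef, 1.0 - ss_res / ss_tot
```

The adjusted and predicted R2 statistics reported in the study penalize this raw R2 for model size and for out-of-sample prediction, respectively.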
Big data of different types, such as texts and images, are rapidly generated from the internet and other applications. Dealing with this data using traditional methods is not practical since it comes in various sizes and types and with different processing speed requirements. Data analytics has therefore become an important tool, because only meaningful information is analyzed and extracted, which makes it essential for big data applications to analyze and extract useful information. This paper presents several innovative methods that use data analytics techniques to improve the analysis process and data management. Furthermore, this paper discusses how the revolution of data analytics based on artificial intelligence algorithms might provide
The data preprocessing step is an important step in web usage mining because of the nature of log data, which are heterogeneous, unstructured, and noisy. Given the scalability and efficiency requirements of pattern-discovery algorithms, a preprocessing step must be applied. In this study, the sequential methodologies used in preprocessing web server log data are comprehensively evaluated and meticulously examined, with an emphasis on sub-phases such as data cleansing, user identification, and session identification.
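The session identification sub-phase mentioned above is commonly implemented with a time-out heuristic: a user's hits belong to one session until the gap between consecutive hits exceeds a threshold. A minimal sketch of that heuristic (the 30-minute time-out is a conventional default, and the data layout is an assumption for illustration):

```python
from datetime import datetime, timedelta

def sessionize(hits, timeout=timedelta(minutes=30)):
    # hits: list of (user_id, timestamp) pairs in timestamp order.
    # A new session starts for a user when the gap since that user's
    # previous hit exceeds the timeout.
    sessions = {}   # user_id -> list of sessions (each a list of timestamps)
    last_seen = {}  # user_id -> timestamp of that user's previous hit
    for user, ts in hits:
        if user not in sessions or ts - last_seen[user] > timeout:
            sessions.setdefault(user, []).append([])
        sessions[user][-1].append(ts)
        last_seen[user] = ts
    return sessions
```

In a full pipeline this runs after data cleansing (removing robot hits, images, and failed requests) and user identification (e.g. by IP address and user agent).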
A database is an organized and distributed collection of data arranged so that the client can access the stored data in a simple and convenient way. However, in the era of big data, traditional data analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. This approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear enhancement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r
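The MapReduce technique studied above splits work into a map phase that emits key-value pairs and a reduce phase that aggregates them by key; Hadoop distributes both phases across the cluster. A minimal single-machine sketch of the programming model (a generic word-count illustration, not the paper's EEG pipeline):

```python
from collections import defaultdict
from itertools import chain

def map_phase(record):
    # Map: emit a (key, 1) pair for each word in the record.
    return [(word, 1) for word in record.split()]

def reduce_phase(pairs):
    # Shuffle + reduce: group pairs by key and sum the counts.
    grouped = defaultdict(int)
    for key, value in pairs:
        grouped[key] += value
    return dict(grouped)

def map_reduce(records):
    # Run the map phase over all records, then reduce the combined output.
    return reduce_phase(chain.from_iterable(map_phase(r) for r in records))
```

On a Hadoop cluster the same map and reduce functions run in parallel on partitions of the input, which is what yields the response-time reduction for large datasets.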
In this paper, the bowtie method was utilized by a multidisciplinary team in the Federal Board of Supreme Audit (FBSA) for the purpose of managing corruption risks threatening the Iraqi construction sector. Corruption in Iraq is a widespread phenomenon that threatens to degrade society and halt the wheel of economic development, so it must be reduced through appropriate strategies. A total of eleven corruption risks were identified by the parties involved in corruption, analyzed using a probability and impact matrix, and ranked by priority. Bowtie analysis was conducted on the four factors with the highest risk scores for causing corruption in the planning stage. The number and effectiveness of the existing proactive meas
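The probability and impact matrix used above scores each risk as the product of its probability and impact ratings and ranks risks by that score. A minimal sketch (the 1–5 rating scale and the risk names in the usage are illustrative assumptions, not the paper's data):

```python
def rank_risks(risks):
    # risks: dict of name -> (probability rating, impact rating),
    # each typically on a 1-5 scale. Score = probability * impact;
    # return risks sorted from highest to lowest score.
    scored = {name: p * i for name, (p, i) in risks.items()}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)
```

The top-ranked risks are then taken forward into bowtie analysis, which maps their causes, consequences, and the proactive and reactive barriers between them.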
A novel design and implementation of a cognitive methodology for on-line auto-tuning of a robust PID controller in a real heating system is presented in this paper. The aim of the proposed work is to construct a cognitive control methodology that gives an optimal control signal to the heating system and achieves fast, precise search efficiency in finding the on-line optimal PID controller parameters, so as to obtain the optimal output temperature response of the heating system. The cognitive methodology (CM) consists of three engines: a breeding engine based on the Routh-Hurwitz stability criterion, a search engine based on particle swarm optimization (PSO), and an aggregation knowledge engine based on the cultural algorithm (CA)
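The PSO search engine named above explores the (Kp, Ki, Kd) space by moving a swarm of candidate gain vectors toward their personal and global best positions under a cost function. A minimal generic PSO sketch (the inertia and acceleration constants are conventional defaults, and the cost function in the usage is a stand-in, not the paper's temperature-response criterion):

```python
import random

def pso_min(cost, bounds, n_particles=20, iters=60, w=0.7, c1=1.5, c2=1.5):
    # Minimize cost(position) over box bounds [(lo, hi), ...] with a
    # basic particle swarm: each particle tracks its personal best, and
    # the swarm tracks a global best that attracts all particles.
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_cost = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost
```

In the methodology above, candidate gains would first pass the Routh-Hurwitz breeding engine so that only stabilizing (Kp, Ki, Kd) vectors enter the swarm.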