The idea of carrying out research on incomplete data arose from the circumstances of our dear country and the horrors of war, which led to the loss of much important data across economic, environmental, health, scientific, and other domains. The causes of missingness vary: some lie outside the will of those concerned, while others are deliberate, planned because of cost, risk, or a lack of means of inspection. In this study, missing data were handled using Principal Component Analysis and self-organizing map methods through simulation. Variables of child health and variables affecting children's health were taken into account: breastfeeding and maternal health. The maternal health variable contained missing values and was processed in Matlab 2015a using Principal Component Analysis and probabilistic Principal Component Analysis; the methods were then compared using the root mean square error (RMSE). The best method for handling the missing values was the PCA method.
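The abstract's PCA-based imputation can be illustrated with a minimal sketch (not the paper's Matlab code): missing cells are first filled with column means, then repeatedly re-estimated from a rank-1 PCA fit obtained by power iteration. The toy data and the helper name `rank1_pca_impute` are assumptions for illustration only.

```python
import math

def rank1_pca_impute(X, mask, iters=50):
    """Fill missing cells of X by iterating mean-centering, a rank-1 PCA fit
    (via power iteration), and re-imputation of the missing entries only.
    X: list of rows; mask[i][j] is True where the value is observed."""
    n, p = len(X), len(X[0])
    col_mean = [sum(X[i][j] for i in range(n) if mask[i][j]) /
                sum(1 for i in range(n) if mask[i][j]) for j in range(p)]
    # initial fill: column means of the observed values
    Z = [[X[i][j] if mask[i][j] else col_mean[j] for j in range(p)]
         for i in range(n)]
    for _ in range(iters):
        mu = [sum(row[j] for row in Z) / n for j in range(p)]
        C = [[Z[i][j] - mu[j] for j in range(p)] for i in range(n)]
        v = [1.0] * p                      # leading right singular vector
        for _ in range(30):                # power iteration on C^T C
            u = [sum(C[i][j] * v[j] for j in range(p)) for i in range(n)]
            v = [sum(C[i][j] * u[i] for i in range(n)) for j in range(p)]
            norm = math.sqrt(sum(x * x for x in v)) or 1.0
            v = [x / norm for x in v]
        scores = [sum(C[i][j] * v[j] for j in range(p)) for i in range(n)]
        for i in range(n):                 # refill only the missing cells
            for j in range(p):
                if not mask[i][j]:
                    Z[i][j] = mu[j] + scores[i] * v[j]
    return Z

# toy example: second column is exactly twice the first, one value missing
X = [[1.0, 2.0], [2.0, 4.0], [3.0, None], [4.0, 8.0]]
mask = [[v is not None for v in row] for row in X]
X = [[v if v is not None else 0.0 for v in row] for row in X]
Z = rank1_pca_impute(X, mask)
```

In a simulation such as the study's, the RMSE between the imputed values `Z[i][j]` and the known true values over the artificially deleted cells would be the comparison criterion.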
A database is characterized as an arrangement of data organized and distributed in a way that allows the client to access the stored data simply and conveniently. In the era of big data, however, traditional data-analytics methods may be unable to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear enhancement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…
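The MapReduce pattern the abstract relies on can be sketched in miniature (this is a toy illustration of the programming model, not the paper's Hadoop job; the per-channel mean task and record layout are assumptions): a map phase emits key-value pairs, and a reduce phase combines all pairs sharing a key.

```python
from collections import defaultdict

def map_phase(record):
    """Map step: emit (channel, (partial_sum, count)) for one EEG sample."""
    ch, x = record
    yield ch, (x, 1)

def reduce_phase(pairs):
    """Reduce step: combine the partial (sum, count) pairs per channel
    into a per-channel mean."""
    acc = defaultdict(lambda: (0.0, 0))
    for ch, (s, c) in pairs:
        ts, tc = acc[ch]
        acc[ch] = (ts + s, tc + c)
    return {ch: s / c for ch, (s, c) in acc.items()}

# toy EEG stream of (channel, sample) records
records = [("C3", 1.0), ("C3", 3.0), ("C4", 2.0), ("C4", 4.0)]
pairs = [p for r in records for p in map_phase(r)]
means = reduce_phase(pairs)
```

On Hadoop, the map and reduce functions run in parallel across cluster nodes, which is the source of the response-time reduction the study reports.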
A new algorithm is proposed to compress speech signals using the wavelet transform and linear predictive coding (LPC). Compression is based on selecting a small number of approximation coefficients produced by wavelet decomposition (Haar and db4) at a suitably chosen level while ignoring the detail coefficients; the approximation coefficients are then windowed by a rectangular window and fed to the linear predictor. The Levinson-Durbin algorithm is used to compute the LP coefficients, reflection coefficients, and prediction error. The compressed files contain the LP coefficients and the previous sample, and are very small in size compared to the original signals. The compression ratio is calculated from the size of th…
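The two building blocks named above can be sketched as follows (a minimal illustration, not the paper's implementation; the helper names and test signal are assumptions): Haar approximation coefficients with details discarded, and the Levinson-Durbin recursion solving the Yule-Walker equations for the LP and reflection coefficients.

```python
import math

def haar_approx(x, levels=1):
    """Multi-level Haar approximation coefficients; detail coefficients
    are computed implicitly but discarded, as in the abstract's scheme."""
    a = list(x)
    for _ in range(levels):
        a = [(a[i] + a[i + 1]) / math.sqrt(2) for i in range(0, len(a) - 1, 2)]
    return a

def levinson_durbin(r, order):
    """Levinson-Durbin recursion on an autocorrelation sequence r[0..order].
    Returns (lp_coeffs a with a[0]=1, reflection coeffs k, prediction error)."""
    a = [0.0] * (order + 1)
    a[0] = 1.0
    k = [0.0] * (order + 1)
    err = r[0]
    for m in range(1, order + 1):
        acc = sum(a[j] * r[m - j] for j in range(m))
        k[m] = -acc / err
        new_a = a[:]                      # update the predictor polynomial
        for j in range(1, m):
            new_a[j] = a[j] + k[m] * a[m - j]
        new_a[m] = k[m]
        a = new_a
        err *= (1.0 - k[m] * k[m])        # error shrinks at each order
    return a, k[1:], err

# e.g. an AR(1)-like autocorrelation with rho = 0.5
a, k, err = levinson_durbin([1.0, 0.5, 0.25], order=2)
```

For this autocorrelation the recursion yields `a[1] = -0.5`, a second reflection coefficient of zero, and a prediction error of 0.75, consistent with a first-order process.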
Density-based spatial clustering of applications with noise (DBSCAN) is one of the most popular clustering algorithms in data mining, used to identify useful patterns and interesting distributions in the underlying data. Aggregation methods classify nonlinearly aggregated data; in particular, DNA methylation and gene expression data show sites differentially skewed by distance and grouped nonlinearly by cancer diseases and by changes in gene expression. Under these conditions, DBSCAN is expected to have a desirable clustering behaviour that can be used to show the results of the changes. This research reviews DBSCAN and compares its performance with other algorithms, such as the tradit…
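DBSCAN's core idea can be shown with a minimal sketch (a textbook toy, not the paper's experimental setup; the point set and parameter values are assumptions): points with at least `min_pts` neighbours within radius `eps` seed clusters that expand through density-connected neighbours, and everything unreachable is labelled noise.

```python
import math

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns one cluster label per point (-1 = noise)."""
    def neighbors(i):
        return [j for j in range(len(points))
                if math.dist(points[i], points[j]) <= eps]
    labels = [None] * len(points)
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1                # noise (may become a border point)
            continue
        cluster += 1                      # i is a core point: start a cluster
        labels[i] = cluster
        queue = [j for j in nbrs if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster       # reclassify noise as border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nbrs_j = neighbors(j)
            if len(nbrs_j) >= min_pts:    # j is also core: keep expanding
                queue.extend(nbrs_j)
    return labels

pts = [(0, 0), (0, 0.5), (0.5, 0),
       (10, 10), (10, 10.5), (10.5, 10),
       (50, 50)]                          # two dense groups and one outlier
labels = dbscan(pts, eps=1.0, min_pts=2)
```

The outlier at (50, 50) receives the noise label -1 while the two dense groups form separate clusters, which is the property that makes DBSCAN attractive for skewed, nonlinearly grouped data such as methylation sites.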
In this paper, a wireless network is planned; the network is predicated on the IEEE 802.16e standard (WiMAX). The targets of this paper are maximized coverage, good service, and low operational fees. The WiMAX network is planned through three approaches. In the first, network coverage is maximized by extending cell coverage and selecting the best sites (with a bandwidth (BW) of 5 MHz, 20 MHz per sector, and four sectors per cell). In the second, interference is analysed in CNIR mode. In the third, Quality of Service (QoS) is tested and evaluated. ATDI ICS software (Interference Cancellation System) is used to perform the planning; the results show that the planned area covers 90.49% of Baghdad City and uses 1000 mob…
Abstract:
Time-series models often suffer from the presence of outliers that accompany the data-collection process for many reasons, and their existence may have a significant impact on the estimation of the parameters of the studied model. Obtaining highly efficient estimators is one of the most important stages of statistical analysis, and it is therefore important to choose appropriate methods to obtain good estimators. The aim of this research is to compare the ordinary estimators and the robust estimators of the parameters of…
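The contrast between ordinary and robust estimators can be illustrated with a minimal sketch (a generic regression example, not the research's time-series model; the data and helper names are assumptions): a single gross outlier pulls the ordinary least-squares slope far from the truth, while a median-based (Theil-Sen) estimator is barely affected.

```python
from statistics import median

def ols_slope(x, y):
    """Ordinary least-squares slope: efficient, but sensitive to outliers."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) /
            sum((xi - mx) ** 2 for xi in x))

def theil_sen_slope(x, y):
    """Robust slope: the median of all pairwise slopes."""
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i in range(len(x)) for j in range(i + 1, len(x))
              if x[j] != x[i]]
    return median(slopes)

# true relation y = 2x, with one gross outlier in the last observation
x = list(range(10))
y = [2 * xi for xi in x]
y[-1] = 100
ordinary = ols_slope(x, y)       # dragged well above the true slope of 2
robust = theil_sen_slope(x, y)   # stays at the true slope of 2
```

This single contaminated point is enough to more than triple the ordinary estimate, which is exactly the kind of distortion that motivates robust estimation of model parameters.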
Metaphor is one of the most important linguistic phenomena of the artistic text: it expresses the author's emotions and evaluations, results from a deep inner transformation of the semantics of words, and serves as a visual means of reflecting the national culture of each people. This paper examines the concept of linguistic metaphor and analyzes its types in Russian and Arabic linguistics, providing a comparative analysis of metaphors in Russian and Arabic. All this allows the conclusion that metaphorization is characteristic of different parts of speech. In Russian, the stylistic differentiation of metaphors is expressed more than in Arabic, so the translation of many "stylistic" metaphors from Russian into Arabic due to…
This paper presents numerical and experimental stress analyses to evaluate the contact and bending stresses on the teeth of a spiral bevel gear drive. The Finite Element Method has been adopted as the numerical technique, accomplished using the ANSYS software package. The experimental stress analysis was achieved using a gear-tooth model made of Castolite, a material with photoelastic properties. The main goal of this research is to detect the maximum tooth stresses in order to avoid the severe areas that cause tooth failure and to increase the working life of this type of gear drive.
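As background to the bending-stress quantity the paper evaluates by FEM and photoelasticity, the classical Lewis formula gives a first-order analytical estimate of tooth-root bending stress. This sketch is purely illustrative; the numerical values below are assumptions, not data from the paper's gear drive.

```python
def lewis_bending_stress(w_t, face_width, module, form_factor):
    """Classical Lewis estimate of tooth-root bending stress:
    sigma = W_t / (b * m * Y), with W_t in N and b, m in mm -> sigma in MPa."""
    return w_t / (face_width * module * form_factor)

# illustrative values (assumed): 1 kN tangential load, 20 mm face width,
# module 4 mm, Lewis form factor 0.32
sigma = lewis_bending_stress(w_t=1000.0, face_width=20.0,
                             module=4.0, form_factor=0.32)
```

FEM and photoelastic analyses refine such estimates by capturing the true root geometry and stress concentration that the Lewis model ignores.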
Computer analysis of a simple eye model is performed in the present work using the Zemax optical design software 2000E. The most important optical parameters of the eye were calculated, such as the effective focal length (EFL) and the image spot size at the retina, and were found to be in reasonable agreement with the values needed for laser retinal treatment. The present eye model leads to an effective wavelength, and the image spot diagram at the surface of the retina and the wavefront error were obtained at zero field angle. This gives good evidence of the validity of the model on the one hand, and on the other the model can be used to determine the compatibility of any optical design intended for visual applications. By using the pulse fre…
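A first-order check on an eye model's EFL can be done by hand with paraxial optics. The sketch below uses the textbook Emsley reduced eye (a single refracting surface of radius 5.55 mm into a medium of index 4/3); these parameters are standard assumptions for illustration and are not the paper's Zemax model.

```python
def surface_power(n1, n2, radius_m):
    """Refracting power (in diopters) of a single spherical surface:
    P = (n2 - n1) / r, with r in meters."""
    return (n2 - n1) / radius_m

# Emsley reduced eye (textbook assumption, not the paper's model):
# air (n = 1) to aqueous/vitreous (n' = 4/3), r = 5.55 mm
P = surface_power(1.0, 4.0 / 3.0, 5.55e-3)   # close to the nominal 60 D
efl_image_mm = (4.0 / 3.0) / P * 1000        # rear focal length, in mm
```

The resulting power of about 60 D and rear focal length of about 22.2 mm are the paraxial benchmarks against which a Zemax eye model's EFL is usually judged.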