The idea of carrying out research on incomplete data arose from the circumstances of our country and the horrors of war, which led to the loss of much important data across economic, environmental, health, and scientific life. The reasons for missingness differ: some are beyond the control of those concerned, while others are deliberate, planned because of cost, risk, or the lack of means of inspection. The missing data in this study were processed using Principal Component Analysis and self-organizing map methods with simulation. Variables of child health and variables affecting children's health were considered: breastfeeding and maternal health. The maternal health variable contained missing values and was processed in MATLAB 2015a using Principal Component Analysis and probabilistic Principal Component Analysis to impute the missing values; the methods were then compared using the root mean square error. The best method for processing the missing values was the PCA method.
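As a minimal sketch of the kind of comparison described above (not the authors' MATLAB code), the following Python example imputes missing values by iteratively reconstructing the data with PCA and scores the result with root mean square error; the simulated data, the number of components, and the 10% masking rate are all illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Simulated low-rank "health" data (illustrative, not the study's data).
n, p, rank = 200, 10, 3
X_true = rng.normal(size=(n, rank)) @ rng.normal(size=(rank, p))

# Randomly hide 10% of the entries, as in a simulation study.
mask = rng.random(X_true.shape) < 0.10
X = X_true.copy()
X[mask] = np.nan

# Iterative PCA imputation: start from column means, then alternate
# between fitting PCA and refilling the missing cells.
X_hat = np.where(mask, np.nanmean(X, axis=0), X)
for _ in range(50):
    pca = PCA(n_components=rank)
    scores = pca.fit_transform(X_hat)
    recon = pca.inverse_transform(scores)
    X_hat[mask] = recon[mask]

# Root mean square error on the held-out (masked) entries.
rmse = np.sqrt(np.mean((X_hat[mask] - X_true[mask]) ** 2))
print(f"PCA imputation RMSE: {rmse:.4f}")
```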
This research began by explaining its variables and dimensions, especially the digital gap, which the authors treated elaborately: its concept, the reasons behind its emergence, its measurement, and how to address it. The authors proposed the possibility of relying on reinforcing knowledge in general, and among the groups suffering from this gap in particular, especially knowledge targeted at treating the subject.
Reinforcing knowledge usually depends on certain strategies or choices of organizational orientation, among them learning and training on one side and, on the other, communication, which the authors treat as an indicator of organizational effectiveness.
This paper presents two main parts. The first involves manufacturing specimens from composite material for mechanical testing (tensile, flexural, and fatigue tests), then designing a custom foot orthosis (CFO) and manufacturing it from a composite lamination (3 nylglass / 2 carbon fiber / 3 nylglass) for a patient suffering from congenital flexible flat foot and over-pronation. The second part involves designing a model of the custom foot orthosis in SolidWorks 2018 and then analyzing it in the engineering analysis program ANSYS V18.2. The applied pressure in the boundary conditions was adopted from a Force Sensing Resistor (FSR 402) at various regions of the foot after wearing the composite CFO. Used composite materials in engineerin…
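As a rough illustration of how such sensor readings could be turned into pressure boundary conditions (a sketch under assumptions, not the paper's procedure), the snippet below converts an FSR force reading into a pressure over the sensor's active area; the ~12.7 mm active-area diameter is taken as an assumption from the FSR 402 datasheet, and the per-region force values are hypothetical.

```python
import math

# FSR 402 active sensing area: ~12.7 mm diameter disc (an assumption
# based on the Interlink datasheet, not a value from the paper).
DIAMETER_M = 12.7e-3
AREA_M2 = math.pi * (DIAMETER_M / 2) ** 2

def fsr_pressure_pa(force_n: float) -> float:
    """Convert a force reading (N) on the FSR into pressure (Pa)."""
    return force_n / AREA_M2

# Hypothetical force readings at foot regions while wearing the CFO.
readings_n = {"heel": 310.0, "midfoot": 95.0, "metatarsal": 220.0}
for region, force in readings_n.items():
    print(f"{region}: {fsr_pressure_pa(force) / 1e3:.1f} kPa")
```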
In this paper, a wireless network is planned; the network is based on the IEEE 802.16e WiMAX standard. The goals of this paper are maximized coverage, service quality, and low operational fees. The WiMAX network is planned through three approaches. In approach one, the WiMAX network coverage is extended by selecting the best cell sites (with a bandwidth (BW) of 5 MHz, 20 MHz per sector, and four sectors per cell). In approach two, interference is analyzed in CNIR mode. In approach three, Quality of Service (QoS) is tested and evaluated. ATDI ICS software (Interference Cancellation System) was used to perform the planning. The results show that the planned area covers 90.49% of Baghdad City and uses 1000 mob…
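Since approach two evaluates the carrier-to-noise-plus-interference ratio (CNIR), a minimal sketch of that calculation is shown below; the power levels are hypothetical and simply illustrate CNIR = C / (N + I) expressed in dB, not the ATDI ICS computation.

```python
import math

def cnir_db(carrier_dbm: float, noise_dbm: float,
            interferers_dbm: list[float]) -> float:
    """CNIR in dB: carrier power over the sum of noise and interference.

    All inputs are in dBm; the sum N + I is done in linear units (mW).
    """
    to_mw = lambda dbm: 10 ** (dbm / 10)
    n_plus_i_mw = to_mw(noise_dbm) + sum(to_mw(p) for p in interferers_dbm)
    return carrier_dbm - 10 * math.log10(n_plus_i_mw)

# Hypothetical link: -70 dBm carrier, -100 dBm thermal noise,
# two co-channel interferers at -95 and -98 dBm.
print(f"CNIR = {cnir_db(-70.0, -100.0, [-95.0, -98.0]):.1f} dB")
```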
Density-based spatial clustering of applications with noise (DBSCAN) is one of the most popular clustering methods in data mining, used to identify useful patterns and interesting distributions in the underlying data, including the classification of nonlinearly aggregated data. In particular, DNA methylation and gene expression data show sites that are differentially skewed by distance and grouped nonlinearly by cancer disease and by changes in gene expression. Under these conditions, DBSCAN is expected to have desirable clustering features that can be used to show the results of these changes. This research reviews DBSCAN and compares its performance with other algorithms, such as the tradit…
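For readers unfamiliar with the method, a minimal DBSCAN example using scikit-learn is sketched below; the two-moons data and the eps and min_samples values are illustrative assumptions, not parameters from the study.

```python
from sklearn.cluster import DBSCAN
from sklearn.datasets import make_moons

# Two nonlinearly separable clusters with a little noise: the kind of
# shape where density-based clustering outperforms centroid methods.
X, _ = make_moons(n_samples=300, noise=0.06, random_state=0)

# eps: neighborhood radius; min_samples: points needed for a core point.
labels = DBSCAN(eps=0.2, min_samples=5).fit_predict(X)

# DBSCAN labels noise points as -1, so exclude them from the count.
n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
n_noise = list(labels).count(-1)
print(f"clusters found: {n_clusters}, noise points: {n_noise}")
```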
A database is an arrangement of data organized and distributed in a way that allows the client to access the stored data simply and conveniently. In the era of big data, however, traditional data analytics methods may not be able to manage and process such large amounts of data. To develop an efficient way of handling big data, this work studies the use of the MapReduce technique on big data distributed in the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…
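To make the MapReduce pattern concrete, the sketch below computes a per-channel mean over EEG-like records with explicit map, shuffle, and reduce phases; it runs locally in plain Python rather than on Hadoop, and the channel names and sample values are hypothetical.

```python
from collections import defaultdict

# Hypothetical EEG records: (channel, voltage sample in microvolts).
records = [("Fp1", 12.4), ("Fp2", -3.1), ("Fp1", 9.8),
           ("Cz", 4.5), ("Fp2", -1.7), ("Cz", 6.0)]

# Map phase: emit a (key, (sum, count)) pair for each record.
mapped = [(channel, (value, 1)) for channel, value in records]

# Shuffle phase: group intermediate pairs by key.
groups = defaultdict(list)
for key, pair in mapped:
    groups[key].append(pair)

# Reduce phase: combine partial (sum, count) pairs into a mean per channel.
means = {}
for key, pairs in groups.items():
    total = sum(s for s, _ in pairs)
    count = sum(c for _, c in pairs)
    means[key] = total / count

print(means)  # {'Fp1': 11.1, 'Fp2': -2.4, 'Cz': 5.25}
```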
