This research aims to analyze and simulate real biochemical test data in order to uncover the relationships among the tests and how each of them affects the others. The data were acquired from a private Iraqi biochemical laboratory. These data have many dimensions, a high rate of null values, and a large number of patients. Several experiments were applied to the data, beginning with unsupervised techniques such as hierarchical clustering and k-means, but the results were not clear. A preprocessing step was then performed to make the dataset analyzable by supervised techniques such as Linear Discriminant Analysis (LDA), Classification And Regression Tree (CART), Logistic Regression (LR), K-Nearest Neighbor (K-NN), Naïve Bayes (NB), and Support Vector Machine (SVM). CART gave the clearest results, with the highest accuracy among the six supervised algorithms. It is worth noting that the preprocessing steps took remarkable effort to handle this type of data, since the raw dataset had a null-value ratio of 94.8%, which became 0% after the preprocessing steps were completed. Then, in order to apply the CART algorithm, several selected tests were assumed as classes. The decision to select the tests assumed as classes depended on their achieved accuracy. Consequently, this enables physicians to trace and connect the test results with each other, which extends their impact on patients' health.
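The workflow described above (impute the null values, then compare the six supervised algorithms by cross-validated accuracy) can be sketched as follows. This is a minimal illustration on synthetic data, since the laboratory dataset is not public; the 20% null rate, the median-imputation strategy, and the model hyperparameters are assumptions, not values from the study.

```python
# Sketch of the classifier-comparison workflow: impute nulls, then
# cross-validate the six supervised models named in the abstract.
# Synthetic data stands in for the (non-public) laboratory dataset.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.impute import SimpleImputer
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=10, random_state=0)
rng = np.random.default_rng(0)
X[rng.random(X.shape) < 0.2] = np.nan   # simulate a high null-value rate

models = {
    "LDA": LinearDiscriminantAnalysis(),
    "CART": DecisionTreeClassifier(random_state=0),
    "LR": LogisticRegression(max_iter=1000),
    "K-NN": KNeighborsClassifier(),
    "NB": GaussianNB(),
    "SVM": SVC(),
}
scores = {}
for name, model in models.items():
    # imputation must happen inside the pipeline so CV folds stay clean
    pipe = make_pipeline(SimpleImputer(strategy="median"),
                         StandardScaler(), model)
    scores[name] = cross_val_score(pipe, X, y, cv=5).mean()

for name, acc in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {acc:.3f}")
```

Putting the imputer inside the pipeline matters: imputing before the split would leak information from the validation folds into the training statistics.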
The regression analysis process is used to study and predict the surface response, using design of experiments (DOE) as well as roughness calculation, through the development of a mathematical model. In this study, response surface methodology and the particular solution technique are used. Design of experiments uses a series of structured statistical analytic approaches to investigate the relationship between selected parameters and their responses. Surface roughness is one of the important parameters and plays a significant role. It is also found that the cutting speed has only a small effect on surface roughness. This work focuses on all considerations needed to model the interaction between the parameters (position of influenc
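A second-order response-surface fit of the kind used in RSM/DOE can be sketched as below. The factors (cutting speed v, feed f), the run table, and the roughness values are illustrative assumptions, not data from the paper; the point is only the shape of the quadratic model and its least-squares solution.

```python
# Minimal sketch of a second-order response-surface (RSM) fit.
# Hypothetical DOE runs: (cutting speed v, feed f) -> measured roughness Ra.
import numpy as np

v  = np.array([100, 100, 200, 200, 150, 150, 150], dtype=float)
f  = np.array([0.1, 0.3, 0.1, 0.3, 0.2, 0.1, 0.3])
Ra = np.array([1.9, 3.4, 1.8, 3.2, 2.5, 1.7, 3.3])

# Design matrix for Ra ~ b0 + b1*v + b2*f + b3*v^2 + b4*f^2 + b5*v*f
A = np.column_stack([np.ones_like(v), v, f, v**2, f**2, v * f])
coef, *_ = np.linalg.lstsq(A, Ra, rcond=None)

def predict(vv, ff):
    return float(np.array([1, vv, ff, vv**2, ff**2, vv * ff]) @ coef)

print("Predicted Ra at v=150, f=0.2:", round(predict(150, 0.2), 3))
```

With real DOE data, the magnitudes of the fitted coefficients (and their significance tests) are what would show, for instance, that cutting speed contributes only a small effect relative to feed.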
In the last two decades, arid and semi-arid regions of China have undergone rapid Land Use/Cover Change (LUCC) due to the increasing demand for food resulting from a growing population. In this study, we established the land use/cover classification together with its remote sensing characteristics. This was done by analyzing the dynamics of LUCC in the Zhengzhou area for the period 1988-2006. A laminar extraction technique was employed to identify typical attributes of the land use/cover types. A prominent result of the study indicates a gradual development in urbanization, giving a gradual reduction in crop field area, due to the progressive economy in Zhengzhou. The results also reflect degradati
Today, large amounts of geospatial data are available on the web, such as Google Map (GM), OpenStreetMap (OSM), the Flickr service, Wikimapia, and others. All of these services are called open-source geospatial data. Geospatial data from different sources often have variable accuracy due to different data collection methods; therefore, the data accuracy may not meet user requirements in various organizations. This paper aims to develop a tool to assess the quality of GM data by comparing it with formal data, such as spatial data from the Mayoralty of Baghdad (MB). The tool was developed in the Visual Basic language and validated on two different study areas in Baghdad, Iraq (Al-Karada and Al-Kadhumiyah). The positional accuracy was asses
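The core of such a positional-accuracy assessment is comparing coordinates of matched features between the two datasets and reporting an RMSE. A minimal sketch of that computation follows; the coordinate pairs are illustrative, not measurements from the paper.

```python
# Sketch of a positional-accuracy check: RMSE of the 2D distances between
# matched points in open-source data vs. a formal reference dataset.
# The (easting, northing) values below are illustrative only.
import math

gm_pts  = [(445120.3, 3685410.7), (445880.1, 3684990.2), (446310.5, 3685205.9)]
ref_pts = [(445123.0, 3685408.1), (445877.4, 3684993.8), (446314.2, 3685202.6)]

def positional_rmse(a, b):
    """Root-mean-square of the 2D distances between matched point pairs."""
    sq = [(ax - bx) ** 2 + (ay - by) ** 2
          for (ax, ay), (bx, by) in zip(a, b)]
    return math.sqrt(sum(sq) / len(sq))

print(f"Positional RMSE: {positional_rmse(gm_pts, ref_pts):.2f} m")
```

In practice the matched pairs would come from identifying the same well-defined features (e.g. road intersections) in both the GM layer and the MB reference layer.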
In this research, a comparison has been made between the robust M-estimators for the cubic smoothing splines technique, used to avoid the problem of abnormality in the data or contamination of errors, and the traditional estimation method for cubic smoothing splines, using two criteria of differentiation (MADE and WASE) for different sample sizes and disparity levels, in order to estimate the time-varying coefficient functions for balanced longitudinal data. Such data are characterized by observations obtained from (n) independent subjects, each of which is measured repeatedly at a group of specific time points (m), since the repeated measurements within subjects are almost connected an
Deep Learning Techniques For Skull Stripping of Brain MR Images
A new two-way nesting technique is presented for a multiple nested-grid ocean modelling system. The new technique uses explicit centered finite-difference and leapfrog schemes to exchange information between the different subcomponents of the nested-grid system. The performance of the different nesting techniques is compared using two independent nested-grid modelling systems. In this paper, a new nesting algorithm is described and some preliminary results are demonstrated. The validity of the nesting method is shown on several problems for the depth-averaged 2D linear shallow-water equations.
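The centered-difference/leapfrog time stepping underlying such a scheme can be sketched on the 1D linear shallow-water equations (eta_t = -H u_x, u_t = -g eta_x). This is a single-grid illustration of the numerics only, not the two-way nesting itself; the grid size, depth, and periodic boundaries are arbitrary choices.

```python
# Illustrative leapfrog + centered-difference integration of the 1D linear
# shallow-water equations; parameters and boundaries are arbitrary choices.
import numpy as np

g, H = 9.81, 100.0               # gravity, mean depth
nx, dx = 200, 1000.0             # grid points, spacing (m)
dt = 0.5 * dx / np.sqrt(g * H)   # CFL-limited time step

x = np.arange(nx) * dx
eta = np.exp(-((x - x.mean()) / (10 * dx)) ** 2)  # Gaussian surface bump
u = np.zeros(nx)

def ddx(a):                      # centered difference, periodic boundaries
    return (np.roll(a, -1) - np.roll(a, 1)) / (2 * dx)

# Leapfrog needs two time levels: start with one forward-Euler step.
eta_prev, u_prev = eta.copy(), u.copy()
eta = eta_prev - dt * H * ddx(u_prev)
u = u_prev - dt * g * ddx(eta_prev)

for _ in range(200):             # leapfrog: new = old + 2*dt * tendency(now)
    eta_new = eta_prev - 2 * dt * H * ddx(u)
    u_new = u_prev - 2 * dt * g * ddx(eta)
    eta_prev, u_prev, eta, u = eta, u, eta_new, u_new

print("max |eta| after 200 steps:", round(float(np.abs(eta).max()), 3))
```

In a nested-grid system, the fine grid would run the same scheme at a smaller dx/dt, with the two-way exchange supplying boundary values to the fine grid and feeding its solution back to the coarse grid.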
A hiding technique for dynamically encrypted text, using an encoding table and a symmetric encryption method (the AES algorithm), is presented in this paper. The encoding table is generated dynamically from the MSBs of the cover-image points and is used as the first phase of encryption. The Harris corner-point algorithm is applied to the cover image to generate the corner points, which are used to generate a dynamic AES key for the second phase of text encryption. The embedding process uses the LSBs of the image pixels, excluding the Harris corner points, for greater robustness. Experimental results have demonstrated that the proposed scheme achieves high embedding quality, error-free text recovery, and a high PSNR value.
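The LSB-embedding stage alone can be sketched as below. The encoding table, AES encryption, and Harris-corner key generation described in the abstract are omitted here, and a plain list of integers stands in for the grayscale cover pixels, so this shows only why LSB recovery is error-free and low-distortion.

```python
# Minimal sketch of LSB embedding/extraction only; the AES and Harris-corner
# stages of the scheme are omitted. "Pixels" is a plain list standing in for
# a grayscale cover image.
def embed(pixels, payload: bytes):
    # flatten payload into bits, most significant bit first
    bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit      # overwrite least significant bit
    return out

def extract(pixels, n_bytes: int):
    data = bytearray()
    for b in range(n_bytes):
        byte = 0
        for i in range(8):
            byte = (byte << 1) | (pixels[b * 8 + i] & 1)
        data.append(byte)
    return bytes(data)

cover = list(range(64, 192))              # stand-in cover pixels
stego = embed(cover, b"secret")
print(extract(stego, 6))                  # error-free recovery -> b'secret'
```

Each pixel changes by at most 1 gray level, which is why LSB schemes keep PSNR high; in the paper's scheme the Harris corner pixels would additionally be skipped during embedding.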
General Background: Deep image matting is a fundamental task in computer vision, enabling precise foreground extraction from complex backgrounds, with applications in augmented reality, computer graphics, and video processing. Specific Background: Despite advancements in deep learning-based methods, preserving fine details such as hair and transparency remains a challenge. Knowledge Gap: Existing approaches struggle with accuracy and efficiency, necessitating novel techniques to enhance matting precision. Aims: This study integrates deep learning with fusion techniques to improve alpha matte estimation, proposing a lightweight U-Net model incorporating color-space fusion and preprocessing. Results: Experiments using the AdobeComposition-1k
<span>We present the linearization of an ultra-wideband low noise amplifier (UWB-LNA) operating from 2GHz to 11GHz through combining two linearization methods. The used linearization techniques are the combination of post-distortion cancellation and derivative-superposition linearization methods. The linearized UWB-LNA shows an improved linearity (IIP3) of +12dBm, a minimum noise figure (NF<sub>min.</sub>) of 3.6dB, input and output insertion losses (S<sub>11</sub> and S<sub>22</sub>) below -9dB over the entire working bandwidth, midband gain of 6dB at 5.8GHz, and overall circuit power consumption of 24mW supplied from a 1.5V voltage source. Both UWB-LNA and linearized UWB-LNA designs are
Water saturation is the most significant characteristic for reservoir characterization in order to assess oil reserves; this paper reviews the concepts and applications of both classic and new approaches to determining water saturation. This work guides the reader to recognize and distinguish between the various strategies for obtaining an appropriate water saturation value from electrical logging; both resistivity and dielectric logging have been studied, and the best-known models for clean and shaly formations are demonstrated. Nuclear Magnetic Resonance in conventional and unconventional reservoirs has been reviewed, with the estimation of water saturation based on the T2 distribution understood as the major feature of this approach. Artific
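Among the classic resistivity-based approaches such a review covers, the Archie (1942) model for clean formations is the foundational one. A sketch follows; the input values (Rw, Rt, porosity) and the default constants a, m, n are illustrative, not taken from the paper.

```python
# Sketch of the classic Archie model for water saturation in a clean
# formation: Sw = ((a * Rw) / (phi**m * Rt)) ** (1/n).
# Input values and the a, m, n constants are illustrative defaults.
def archie_sw(rw, rt, phi, a=1.0, m=2.0, n=2.0):
    """Water saturation from formation-water resistivity Rw, true
    resistivity Rt, and porosity phi (fractions, ohm-m)."""
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

# Example: Rw = 0.05 ohm-m, Rt = 20 ohm-m, porosity = 20%
sw = archie_sw(rw=0.05, rt=20.0, phi=0.20)
print(f"Water saturation: {sw:.3f}")   # -> 0.250
```

Shaly-sand models (e.g. with clay-conductivity terms) and the dielectric and NMR approaches the review discusses all extend or replace this clean-formation baseline.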