Iron–epoxy composite samples were prepared by adding different weight percentages (0, 5, 10, 15, and 20 wt%) of iron particles with a particle size in the range of 30–40 μm. The contents were mixed carefully and placed in circular dies with a diameter of 2.5 cm. Mechanical tests (Shore D hardness, tensile strength, and impact strength) were carried out on all samples. The samples were then immersed in water for ten weeks; every two weeks the samples were taken out and dried, and all mechanical tests were repeated. The hardness values increased as the iron particle concentration increased, while the impact strength was not affected by increasing the iron particle concentration. The tensile test results reveal that the tensile strength and strain values of the composite samples decrease as the iron particle concentration increases. After the immersion process, the hardness results were reduced, whereas the tensile strength and impact strength increased.
Malware represents one of the most dangerous threats to computer security, and dynamic analysis has difficulty detecting unknown malware. This paper develops an integrated multi-layer detection approach to provide higher accuracy in detecting malware. A user interface integrated with VirusTotal was designed as the first layer, acting as a warning system for malware infection; a malware database containing malware samples served as the second layer, Cuckoo as the third layer, BullGuard as the fourth layer, and IDA Pro as the fifth layer. The results showed that using the five layers together was better than using any single detector without merging. For example, the efficiency of the proposed approach is 100% compared with 18% and 63% for VirusTotal and Bel
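The layered idea described above, passing a sample through successive detectors and flagging it on the first positive verdict, can be sketched in plain Python. The layer names mirror the tools in the abstract, but the verdict functions and the sample hashes here are hypothetical stand-ins, not real VirusTotal, Cuckoo, BullGuard, or IDA Pro integrations.

```python
from typing import Callable, List, Tuple

# Each layer maps a sample (represented here by its hash string) to a verdict.
Layer = Tuple[str, Callable[[str], bool]]

KNOWN_BAD_HASHES = {"deadbeef"}  # hypothetical local malware database (layer 2)

def scan(sample_hash: str, layers: List[Layer]) -> str:
    """Run the sample through every layer; flag it if any layer says malicious."""
    for name, detect in layers:
        if detect(sample_hash):
            return f"malicious (flagged by {name})"
    return "clean"

layers: List[Layer] = [
    ("VirusTotal UI", lambda h: False),                # stand-in: external lookup
    ("malware DB",    lambda h: h in KNOWN_BAD_HASHES),
    ("Cuckoo",        lambda h: False),                # stand-in: sandbox run
    ("BullGuard",     lambda h: False),                # stand-in: AV engine
    ("IDA Pro",       lambda h: False),                # stand-in: static analysis
]

print(scan("deadbeef", layers))   # → malicious (flagged by malware DB)
print(scan("cafebabe", layers))   # → clean
```

Because the layers are tried in order, a cheap lookup can short-circuit before the more expensive sandbox and static-analysis layers run.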
This research aims to analyze and simulate real biochemical test data to uncover the relationships among the tests and how each of them affects the others. The data were acquired from a private Iraqi biochemical laboratory. However, these data have many dimensions, a high rate of null values, and a large number of patients. Several experiments were then applied to these data, beginning with unsupervised techniques such as hierarchical clustering and k-means, but the results were not clear. A preprocessing step was therefore performed to make the dataset analyzable by supervised techniques such as Linear Discriminant Analysis (LDA), Classification and Regression Tree (CART), Logistic Regression (LR), K-Nearest Neighbor (K-NN), and Naïve Bayes (NB
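A comparison of the five supervised techniques named above can be sketched with scikit-learn. The synthetic dataset and cross-validated accuracy scoring here are illustrative assumptions, not the laboratory dataset or evaluation protocol used in the study.

```python
# Compare the five supervised techniques on a synthetic toy dataset.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.tree import DecisionTreeClassifier          # scikit-learn's CART
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

models = {
    "LDA":  LinearDiscriminantAnalysis(),
    "CART": DecisionTreeClassifier(random_state=0),
    "LR":   LogisticRegression(max_iter=1000),
    "K-NN": KNeighborsClassifier(n_neighbors=5),
    "NB":   GaussianNB(),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()  # 5-fold CV accuracy
    print(f"{name}: {acc:.3f}")
```

The same loop works unchanged once `X` and `y` are replaced by a preprocessed real dataset, which is the point of the preprocessing step described above.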
In the United States, the pharmaceutical industry is actively devising strategies to improve the diversity of clinical trial participants. These efforts stem from a plethora of evidence indicating that different ethnic groups respond differently to a given treatment. Thus, increasing the diversity of trial participants would not only provide more robust and representative trial data but also lead to safer and more effective therapies. Diversifying trial participants further appears straightforward, but it is a complex process requiring feedback from multiple stakeholders such as pharmaceutical sponsors, regulators, community leaders, and research sites. Therefore, the objective of this paper is to describe three viable strategies that can p
The black paint laser peening (bPLP) technique is currently applied to many engineering materials, especially aluminum alloys, because of the large improvement it produces in fatigue life and fatigue strength. Constant- and variable-amplitude bending fatigue tests were performed at room temperature and a stress ratio of R = -1. The results of the present work show the significance of the surface work hardening, which generated high negative (compressive) residual stresses in the bPLP specimens. The fatigue life improvement factor (FLIF) for bPLP under constant-amplitude fatigue ranged from 2.543 to 3.3 relative to the untreated condition, and the increase in fatigue strength at 10⁷ cycles was 21%. The bPLP cumulative fatigue life behav
Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data into a graph concept frame (CFG). As is well known, DBSCAN groups data points of the same kind into clusters while treating points outside any cluster as noise or anomalies; it can thus detect abnormal points that lie farther than a set threshold from the clusters (extreme outliers). However, anomalies are not only those cases that are unusual or far from a specific group; there is also a type of data that does not occur repeatedly but is considered abnormal relative to the known group. The analysis showed that DBSCAN using the
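The baseline behavior described above, where DBSCAN labels points outside any dense cluster as noise, can be sketched with scikit-learn, which marks noise points with the label -1. The two synthetic clusters, the outlier position, and the `eps`/`min_samples` values are illustrative assumptions.

```python
# Minimal sketch: points labelled -1 by DBSCAN are treated as anomalies.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
# Two dense clusters plus one far-away outlier.
cluster_a = rng.normal(loc=0.0, scale=0.2, size=(50, 2))
cluster_b = rng.normal(loc=5.0, scale=0.2, size=(50, 2))
outlier = np.array([[20.0, 20.0]])
X = np.vstack([cluster_a, cluster_b, outlier])

# A point needs >= 5 neighbours within radius 0.5 to seed a cluster.
labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(X)
anomalies = X[labels == -1]
print(anomalies)  # the lone far-away point is among the flagged noise
```

This is exactly the case the abstract calls "extreme outliers"; the proposed CFG extension targets the anomalies that this distance-based criterion misses.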
From the series of generalizations of the topic of supra topology comes the generalization of separation axioms. In this paper we introduce (S* - SS*) regular spaces. Most of the properties of both spaces are investigated and reinforced with examples. In the last part we present the notions of supra *- -space ( = 0, 1) and study their relationship with (S* - SS*) regular spaces.
The current research aimed at deducing the psychological contents of Suliman's dialogue with the hoopoe (Hodhod) and stating their applications in school counseling. The researcher followed the Islamic approach, which studies events, phenomena, and practices through a broad understanding of Islamic principles and the limits associated with the general framework of Islam, in addition to the deductive approach, which derives a sub-rule from a general provision.
The research revealed many psychological contents, including: the importance of the counselor's continuing psychological care for learners and field follow-up of their problems and conditions, and listening well to the hoopoe as it defended itself, clarifying the motives of its ac
The nucleon momentum distributions (NMD) for the ground state and the elastic electron scattering form factors have been calculated in the framework of the coherent fluctuation model and expressed in terms of the weight function (fluctuation function). The weight function has been related to the nucleon density distributions of nuclei and determined from theory and experiment. The nucleon density distributions (NDD) are derived from a simple method based on the single-particle wave functions of the harmonic oscillator potential and the occupation numbers of the states. The feature of long-tail behavior in the high-momentum region of the NMD has been obtained using both the theoretical and experimental weight functions. The observed ele
Soft sets have been known since 1999, and because of their wide applications and great flexibility in solving problems, we use these concepts to define new types of soft limit points, which we call soft turning points. Finally, we use these points to define new types of soft separation axioms and study their properties.
In this paper, three techniques for image compression are implemented. The proposed techniques are a three-dimensional (3-D) two-level discrete wavelet transform (DWT), a 3-D two-level discrete multi-wavelet transform (DMWT), and a 3-D two-level hybrid (wavelet–multiwavelet) transform technique. Daubechies and Haar filters are used in the discrete wavelet transform, and critically sampled preprocessing is used in the discrete multi-wavelet transform. The aim is to increase the compression ratio (CR) as the transformation level increases in the 3-D case, so the compression ratio is measured for each level. To obtain good compression, image data properties were measured, such as image entropy (He) and percent r
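The core idea behind the compression measured above, that a multi-level wavelet transform concentrates signal energy in few coefficients so small coefficients can be dropped, can be sketched with a two-level Haar transform. For brevity this sketch is 1-D with a toy signal and an illustrative threshold, whereas the paper uses full 3-D two-level transforms on images.

```python
import numpy as np

def haar_dwt_1d(x):
    """One level of the 1-D Haar transform: pairwise averages and differences."""
    x = x.reshape(-1, 2)
    approx = (x[:, 0] + x[:, 1]) / np.sqrt(2)   # low-pass (approximation)
    detail = (x[:, 0] - x[:, 1]) / np.sqrt(2)   # high-pass (detail)
    return approx, detail

# Smooth toy signal; length 64 is divisible by 4, as two levels require.
signal = np.linspace(0.0, 1.0, 64)

# Level 1, then level 2 applied to the approximation coefficients only.
a1, d1 = haar_dwt_1d(signal)
a2, d2 = haar_dwt_1d(a1)
coeffs = np.concatenate([a2, d2, d1])

# Drop coefficients below an illustrative threshold; on this smooth signal
# all detail coefficients are tiny, so only the 16 approximations survive.
kept = np.abs(coeffs) > 0.04
cr = coeffs.size / kept.sum()
print(f"compression ratio: {cr:.1f}")  # → compression ratio: 4.0
```

Each additional transform level shrinks the approximation band by half, which is why the paper measures CR per level: deeper levels leave fewer significant coefficients to store.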