In this research, argon gas was used to generate atmospheric plasma for the manufacture of platinum nanomaterials, to study the resultant plasma spectrum, and to assess the cellular toxicity of the manufactured nanomaterials. The work focuses on generating nonthermal atmospheric-pressure plasma over aqueous platinum salt (H2PtCl6·6H2O) at different concentrations and plasma exposure times to produce platinum nanoparticles and to ensure reproducible preparation. UV-Visible and X-ray analyses were performed for this purpose; the diameter of the system probe was 1 mm and the argon gas flow was 2.5 L/min. Plasma parameters, including electron temperature, electron density, Debye length, and plasma frequency, were computed using spectral analysis techniques. The effect of the nanoparticles on natural lymphocytes was studied to quantify cytotoxicity; the greatest cytotoxicity, 37.4%, occurred at the 100% platinum nanoparticle concentration. The results reveal that cold atmospheric plasma is a promising technology for producing nanoparticle materials suitable for many industrial and medical applications.
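The plasma parameters named above follow from standard textbook relations: the Debye length depends on electron temperature and density, and the plasma frequency on density alone. A minimal sketch, using CODATA constants and illustrative values (the temperatures and densities here are hypothetical, not the study's measurements):

```python
import math

# Physical constants (SI units, CODATA values)
EPS0 = 8.8541878128e-12   # vacuum permittivity, F/m
E    = 1.602176634e-19    # elementary charge, C
ME   = 9.1093837015e-31   # electron mass, kg

def debye_length(Te_eV: float, ne_m3: float) -> float:
    """Debye length (m) from electron temperature (eV) and density (m^-3)."""
    Te_J = Te_eV * E                      # convert eV to joules
    return math.sqrt(EPS0 * Te_J / (ne_m3 * E**2))

def plasma_frequency(ne_m3: float) -> float:
    """Electron plasma (angular) frequency, rad/s."""
    return math.sqrt(ne_m3 * E**2 / (EPS0 * ME))

# Illustrative values typical of an atmospheric-pressure argon discharge
Te, ne = 1.0, 1e20        # 1 eV, 1e20 m^-3 (hypothetical)
print(f"Debye length:     {debye_length(Te, ne):.3e} m")
print(f"Plasma frequency: {plasma_frequency(ne):.3e} rad/s")
```

For these sample values the Debye length is well below a micrometre, consistent with the collective behaviour expected of a dense atmospheric discharge.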
This research presents a method for calculating the stress ratio to predict the fracture pressure gradient. It also describes a correlation and lists ideas about it. Data were collected from four wells, the deepest in southern Iraqi oil fields (3000 to 6000 m), belonging to four oil fields. These wells pass through the following formations: Y, Su, G, N, Sa, Al, M, Ad, and B. A correlation method was applied to calculate the fracture pressure gradient directly in terms of both the overburden and pore pressure gradients, with accurate results. Based on the results of our previous research, the data were used to calculate and plot the effective stresses. Many equations relating horizontal effective stress and vertical
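A widely used relation of this form is Eaton's fracture gradient equation, shown here as a hedged illustration of how a stress ratio couples the overburden and pore pressure gradients (this is not necessarily the paper's own correlation, and the input values are hypothetical):

```python
def eaton_fracture_gradient(obg: float, ppg: float, k: float) -> float:
    """Eaton-style fracture pressure gradient from the overburden
    gradient (obg), pore pressure gradient (ppg), and matrix stress
    ratio k. All gradients in consistent units, e.g. psi/ft."""
    return k * (obg - ppg) + ppg

# Hypothetical inputs: 1.0 psi/ft overburden, 0.465 psi/ft (normal)
# pore pressure, and a stress ratio of 0.8
fg = eaton_fracture_gradient(1.0, 0.465, 0.8)
print(f"Fracture gradient: {fg:.3f} psi/ft")
```

The stress ratio k is the quantity the research calculates from the effective-stress data; once it is known for a formation, the fracture gradient follows immediately from the two measured gradients.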
This paper presents a new algorithm in an important research field, semantic word similarity estimation. A new feature-based algorithm is proposed for measuring word semantic similarity for the Arabic language, a highly systematic language whose words exhibit elegant and rigorous logic. The semantic similarity score between two Arabic words is calculated as a function of their common and total taxonomical features. An Arabic knowledge source is employed for extracting the taxonomical features as the set of all concepts that subsume the concepts containing the compared words. Previously developed Arabic word benchmark datasets are used for optimizing and evaluating the proposed algorithm. In this paper,
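A score defined over common and total taxonomical features can be sketched with a Dice-style ratio. This is a hypothetical formulation with made-up feature sets; the paper's exact function and knowledge source may differ:

```python
def feature_similarity(features_a: set, features_b: set) -> float:
    """Dice-style similarity: ratio of shared taxonomical features
    to the total features of both words."""
    if not features_a and not features_b:
        return 0.0
    common = features_a & features_b
    return 2 * len(common) / (len(features_a) + len(features_b))

# Toy example: subsumer-concept sets for two words (illustrative only)
horse = {"entity", "living_thing", "animal", "mammal"}
cat   = {"entity", "living_thing", "animal", "mammal", "feline"}
print(feature_similarity(horse, cat))  # high: 4 shared of 9 total features
```

The score rises toward 1.0 as the two words share more of their subsuming concepts, matching the intuition that taxonomically close words are semantically similar.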
This paper describes a new finishing process using newly made magnetic abrasives to effectively finish brass plate, which is very difficult to polish by conventional machining processes. The Taguchi experimental design method was adopted for evaluating the effect of the process parameters on the improvement of surface roughness and hardness by magnetic abrasive polishing. The process parameters are: the applied current to the inductor, the working gap between the workpiece and the inductor, the rotational speed, and the volume of powder. Analysis of variance (ANOVA) was performed using statistical software to identify the optimal conditions for better surface roughness and hardness. Regression models based on statistical m
In this paper, an efficient new procedure is proposed to modify the third-order iterative method obtained by Rostom and Fuad [Saeed, R. K. and Khthr, F. W. New third-order iterative method for solving nonlinear equations. J. Appl. Sci. 7 (2011): 916-921], using three steps based on Newton's equation, the finite difference method, and linear interpolation. A convergence analysis is given to show the efficiency and performance of the new method for solving nonlinear equations. The efficiency of the new method is demonstrated by numerical examples.
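Two of the three building blocks named above, a Newton step with a finite-difference derivative, can be sketched as follows. This illustrates the ingredients only; it is not the modified third-order scheme itself:

```python
def newton_fd(f, x0, h=1e-6, tol=1e-10, max_iter=50):
    """Newton iteration using a central finite-difference derivative
    in place of an analytic f'(x)."""
    x = x0
    for _ in range(max_iter):
        df = (f(x + h) - f(x - h)) / (2 * h)  # central difference
        x_new = x - f(x) / df                 # Newton step
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Classic test problem: x^3 - 2x - 5 = 0, root near 2.0945515
root = newton_fd(lambda x: x**3 - 2*x - 5, 2.0)
print(root)
```

Higher-order schemes such as the one studied here typically add further sub-steps (for example an interpolation-based correction) so that each iteration gains more digits of accuracy than plain Newton at a similar cost per step.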
Face recognition is a crucial biometric technology used in various security and identification applications. Ensuring accuracy and reliability in facial recognition systems requires robust feature extraction and secure processing methods. This study presents an accurate facial recognition model using a feature extraction approach within a cloud environment. First, the facial images undergo preprocessing, including grayscale conversion, histogram equalization, Viola-Jones face detection, and resizing. Then, features are extracted using a hybrid approach that combines Linear Discriminant Analysis (LDA) and Gray-Level Co-occurrence Matrix (GLCM). The extracted features are encrypted using the Data Encryption Standard (DES) for security
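The GLCM half of the hybrid feature extractor counts how often pairs of gray levels co-occur at a fixed pixel offset. A minimal NumPy sketch for a single horizontal offset (the study likely uses a library implementation with multiple offsets and derived statistics; the toy image here is illustrative):

```python
import numpy as np

def glcm(image: np.ndarray, levels: int, dx: int = 1, dy: int = 0) -> np.ndarray:
    """Gray-Level Co-occurrence Matrix for one pixel offset (dx, dy).
    Entry [i, j] counts how often level i has level j at that offset."""
    g = np.zeros((levels, levels), dtype=np.int64)
    h, w = image.shape
    for y in range(h - dy):
        for x in range(w - dx):
            g[image[y, x], image[y + dy, x + dx]] += 1
    return g

# Tiny 4-level toy image
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]], dtype=np.int64)
print(glcm(img, levels=4))
```

Texture statistics such as contrast, energy, and homogeneity are then computed from this matrix and concatenated with the LDA projection to form the final feature vector.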
This paper focuses on developing a self-starting numerical approach that can be used for the direct integration of higher-order initial value problems of ordinary differential equations. The method is derived from a power series approximation, with the resulting equations discretized at selected grid and off-grid points. The method is applied in a block-by-block approach as a numerical integrator of higher-order initial value problems. The basic properties of the block method are investigated to authenticate its performance, and it is then implemented on some test experiments to validate its accuracy and convergence.
Most medical datasets suffer from missing data, due to the expense of some tests or human error while recording them. This issue affects the performance of machine learning models because the values of some features will be missing. Therefore, a specific type of method is needed for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used for generating and imputing the missing values in the Pima Indian Diabetes Disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, which are support vector machine (SVM), K-nearest neighbour (KNN), and Naïve B
In data transmission, a change in a single bit of the received data may lead to misunderstanding or a disaster. Each bit in the sent information has high priority, especially with information such as the address of the receiver. The importance of detecting every single-bit change is a key issue in the data transmission field.
The ordinary single-parity detection method can detect an odd number of errors efficiently but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, showed better results but still failed to cope with an increasing number of errors.
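The failure mode described above is easy to demonstrate: a single parity bit only records whether the count of 1s is odd or even, so two flips cancel out. A minimal sketch with an arbitrary example word:

```python
def parity_bit(bits: list) -> int:
    """Even-parity bit: 1 if the number of 1s is odd, else 0."""
    return sum(bits) % 2

def parity_check(bits: list, parity: int) -> bool:
    """True if the received word passes the single-parity check."""
    return parity_bit(bits) == parity

data = [1, 0, 1, 1, 0, 0, 1, 0]
p = parity_bit(data)

one_error = data.copy();  one_error[3] ^= 1              # flip one bit
two_errors = data.copy(); two_errors[1] ^= 1; two_errors[5] ^= 1

print(parity_check(one_error, p))    # False: odd error count is detected
print(parity_check(two_errors, p))   # True: even error count slips through
```

Any scheme built on parity alone inherits this blind spot, which motivates combining multiple checks, as in the two methods proposed next.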
Two novel methods are suggested to detect binary bit-change errors when transmitting data over a noisy medium. Those methods are: 2D-Checksum me