Most medical datasets suffer from missing data, due to the expense of some tests or human error while recording their results. This issue degrades the performance of machine learning models because the values of some features are missing. Therefore, a specific type of method is needed for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used to generate and impute the missing values in the Pima Indian diabetes disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, namely the support vector machine (SVM), K-nearest neighbour (KNN), and naïve Bayesian classifier (NBC), was enhanced compared with the dataset before applying the proposed method. Moreover, the results indicated that ISSA performed better than statistical imputation techniques such as deleting the samples with missing values or replacing the missing values with zeros, the mean, or random values.
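The abstract gives no implementation details, but a minimal sketch of metaheuristic imputation along these lines, assuming candidate values for the missing cells form the swarm's search space and are scored by the cross-validated accuracy of a downstream classifier, might look like the following; the toy data, the fitness function, and the simplified SSA update rule are illustrative assumptions, not the authors' exact ISSA.

```python
# Hypothetical sketch: each salp is a candidate vector of values for the
# missing cells; fitness is the CV accuracy of a classifier on the completed
# data. Toy data stands in for PIDD; the update rule is a simplified SSA.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Toy dataset with ~10% of entries missing at random (NaN).
X = rng.normal(size=(100, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
mask = rng.random(X.shape) < 0.1
X[mask] = np.nan
dim = int(mask.sum())                     # number of missing cells

def fitness(candidate):
    """Higher is better: 3-fold CV accuracy of KNN on the imputed data."""
    Xc = X.copy()
    Xc[mask] = candidate                  # fill missing cells in row order
    return cross_val_score(KNeighborsClassifier(), Xc, y, cv=3).mean()

n_salps, n_iter = 20, 30
lb, ub = np.nanmin(X), np.nanmax(X)       # search bounds from observed data
salps = rng.uniform(lb, ub, size=(n_salps, dim))
best = salps[0].copy()
best_score = fitness(best)

for t in range(n_iter):
    c1 = 2 * np.exp(-((4 * (t + 1) / n_iter) ** 2))   # SSA leader coefficient
    for i in range(n_salps):
        if i == 0:                        # leader explores around the best
            c2, c3 = rng.random(dim), rng.random(dim)
            step = c1 * ((ub - lb) * c2 + lb)
            salps[i] = np.where(c3 < 0.5, best + step, best - step)
        else:                             # followers average with predecessor
            salps[i] = (salps[i] + salps[i - 1]) / 2
        salps[i] = np.clip(salps[i], lb, ub)
        score = fitness(salps[i])
        if score > best_score:
            best, best_score = salps[i].copy(), score

X[mask] = best                            # impute with the best candidate
print(f"best CV accuracy after imputation: {best_score:.3f}")
```

This sketch only conveys the search-plus-fitness structure; the paper's ISSA may differ in its encoding, bounds, and evaluation protocol.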
Introduction: Carrier-based gutta-percha (GP) is an effective method of root canal obturation, creating a 3-dimensional filling; however, retrieval of the plastic carrier is relatively difficult, particularly with smaller sizes. The purpose of this study was to develop composite carriers consisting of polyethylene (PE), hydroxyapatite (HA), and strontium oxide (SrO) for carrier-based root canal obturation. Methods: Composite fibers of HA, PE, and SrO were fabricated in the shape of a carrier for delivering GP using a melt-extrusion process. The fibers were characterized using infrared spectroscopy, and their thermal properties were determined using differential scanning calorimetry. The elastic modulus and tensile strength tests were dete…
Afamin is a human plasma glycoprotein, a putative multifunctional transporter of hydrophobic molecules, and a marker for metabolic syndrome. Afamin concentration has been proposed to have a significant role as a predictor of metabolic disorders. Since NAFLD is associated with metabolic risk factors, e.g., dyslipidemia, insulin resistance, and visceral obesity, it is considered the hepatic manifestation of the metabolic syndrome. The objective of this study is to determine afamin levels in hypothyroid patients with and without fatty liver disease and compare the results with controls, and also to study the relationship of afamin level with anthropometric and clinical features (age, gender, BMI, and duration of hypothyroidism), serum…
In this work, the performance of the receiver in a quantum cryptography system based on the BB84 protocol is evaluated by calculating the quantum bit error rate (QBER) of the receiver. To apply this performance test, an optical setup was arranged, and a circuit was designed and implemented to calculate the QBER. This electronic circuit counts the detection events per second generated by the avalanche photodiodes in the receiver. The measured counts per second are used to calculate the QBER of the receiver, which gives an indication of its performance. A minimum QBER of 6% was obtained with an avalanche photodiode excess voltage of 2 V and a laser diode power of 3.16 nW at an avalanche photodiode temperature of -10…
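The abstract does not spell out the QBER formula, but the standard definition, erroneous sifted detections divided by total sifted detections, is straightforward to compute from the measured counts per second; the sample numbers below are illustrative, not the paper's data.

```python
# Standard QBER definition: erroneous sifted detections over total sifted
# detections per unit time (counts are illustrative, not the paper's data).
def qber(error_counts_per_s: float, correct_counts_per_s: float) -> float:
    total = error_counts_per_s + correct_counts_per_s
    return error_counts_per_s / total if total else 0.0

# Example: 60 erroneous and 940 correct sifted counts per second -> 6% QBER,
# matching the minimum value reported in the abstract.
print(f"QBER = {qber(60, 940):.1%}")
```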
One of the most popular and legally recognized behavioral biometrics is the individual's signature, which is used for verification and identification in many different industries, including business, law, and finance. The purpose of signature verification is to distinguish genuine from forged signatures, a task complicated by cultural and personal variances. In forensic handwriting analysis, handwriting features are analyzed, compared, and evaluated to establish whether or not the writing was produced by a known writer. In contrast to other languages, Arabic makes use of diacritics, ligatures, and overlaps that are unique to it. Due to the absence of dynamic information in the writing of Arabic signatures,…
Confocal microscope imaging has become popular in biotechnology labs. Confocal imaging technology utilizes fluorescence optics, where laser light is focused onto a specific spot at a defined depth in the sample. A considerable number of images are produced regularly in the course of research, and these images require unbiased quantification methods if the analyses are to be meaningful. Increasing efforts to tie reimbursement to outcomes will likely increase the need for objective data in analyzing confocal microscope images in the coming years. Visual quantification of confocal images with the naked human eye is an essential but often underreported outcome measure, due to the time required for manual counting and e…
The estimation of the reliability function depends on the accuracy of the data used to estimate the parameters of the probability distribution. Because some data are skewed, estimating the parameters and calculating the reliability function in the presence of skewness requires a distribution that is flexible in dealing with such data. This was the case for the data of Diyala Company for Electrical Industries, where positive skewness was observed in the data collected from the Power and Machinery Department; this required a distribution suited to those data and a search for methods that accommodate this problem and lead to accurate estimates of the reliability function,…
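The abstract is cut off before naming the chosen distribution, but the general recipe it describes, fitting a flexible right-skewed distribution and then evaluating the reliability function R(t) = 1 − F(t), can be sketched as follows; the Weibull choice and the failure-time sample are assumptions for illustration only.

```python
# Generic recipe: fit a flexible right-skewed distribution to failure times,
# then evaluate the reliability (survival) function R(t) = 1 - F(t).
# The Weibull choice and the data below are illustrative assumptions.
import numpy as np
from scipy import stats

failure_times = np.array([120, 150, 180, 210, 260, 320, 410, 560, 780, 1100])

# Fix the location at zero so the fit estimates shape and scale only.
shape, loc, scale = stats.weibull_min.fit(failure_times, floc=0)
t = 300.0
reliability = stats.weibull_min.sf(t, shape, loc=loc, scale=scale)  # R(t)
print(f"estimated R({t:g}) = {reliability:.3f}")
```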
Data communication has been growing in the present day. Therefore, data encryption has become essential for secure data transmission and storage, and for protecting data contents from intruders and unauthorized persons. In this paper, a fast technique for text encryption based on a genetic algorithm is presented. The encryption is achieved using the genetic operators crossover and mutation. The proposed technique is based on dividing the plain-text characters into pairs and applying the crossover operation between them, followed by the mutation operation, to obtain the encrypted text. The experimental results show that the proposal provides an important improvement in encryption rate with a comparatively high-speed process…
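The abstract does not specify the crossover point or the mutation rule, so the sketch below assumes a fixed single-point crossover (swapping the low nibbles of each character pair) and a keyed XOR mutation; both steps are self-invertible, which makes decryption the same operations in reverse. These operator choices are illustrative, not the paper's.

```python
# Hypothetical sketch of pairwise crossover + mutation text encryption.
# Crossover swaps the low nibbles of each character pair; mutation XORs a
# key byte. Both steps undo themselves, so decryption replays them.
def crossover(a: int, b: int) -> tuple[int, int]:
    """Single-point crossover at bit 4: exchange the low nibbles."""
    return (a & 0xF0) | (b & 0x0F), (b & 0xF0) | (a & 0x0F)

def encrypt(plain: str, key: bytes) -> bytes:
    data = bytearray(plain.encode("utf-8"))
    if len(data) % 2:                                    # pad to pair up bytes
        data.append(0)
    out = bytearray()
    for i in range(0, len(data), 2):
        x, y = crossover(data[i], data[i + 1])           # crossover step
        out += bytes((x ^ key[i % len(key)],             # mutation step:
                      y ^ key[(i + 1) % len(key)]))      # keyed XOR flip
    return bytes(out)

def decrypt(cipher: bytes, key: bytes) -> str:
    out = bytearray()
    for i in range(0, len(cipher), 2):
        x = cipher[i] ^ key[i % len(key)]                # undo mutation
        y = cipher[i + 1] ^ key[(i + 1) % len(key)]
        a, b = crossover(x, y)                           # crossover is self-inverse
        out += bytes((a, b))
    return out.rstrip(b"\x00").decode("utf-8")

key = b"secret"
ct = encrypt("hello world", key)
assert decrypt(ct, key) == "hello world"
```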
This paper proposes a novel meta-heuristic optimization algorithm called the fine-tuning meta-heuristic algorithm (FTMA) for solving global optimization problems. In this algorithm, the solutions are fine-tuned using the fundamental steps in meta-heuristic optimization, namely exploration, exploitation, and randomization, in such a way that if one step improves the solution, the remaining steps need not be executed. The performance of the proposed FTMA has been compared with that of five other optimization algorithms over ten benchmark test functions. Nine of them are well known and already exist in the literature, while the tenth is proposed by the authors and introduced in this article. One test trial was shown t…
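The abstract defines FTMA by its control flow rather than by concrete update formulas, so the sketch below implements only that flow: exploration, exploitation, and randomization are tried in order, and the remaining steps are skipped as soon as one improves the solution. The three move operators and all parameters are assumptions.

```python
# Hypothetical sketch of the FTMA control flow from the abstract: try
# exploration, exploitation, then randomization, stopping at the first
# step that improves the solution. Operators here are generic stand-ins.
import numpy as np

rng = np.random.default_rng(1)

def sphere(x):                            # benchmark objective (minimize)
    return float(np.sum(x * x))

def ftma(f, dim=5, pop=20, iters=200, lb=-5.0, ub=5.0):
    X = rng.uniform(lb, ub, (pop, dim))
    fit = np.array([f(x) for x in X])
    best = X[fit.argmin()].copy()
    for _ in range(iters):
        for i in range(pop):
            moves = (
                X[i] + rng.normal(0, 1.0, dim),            # exploration
                X[i] + rng.random(dim) * (best - X[i]),    # exploitation
                rng.uniform(lb, ub, dim),                  # randomization
            )
            for cand in moves:            # skip later steps once improved
                cand = np.clip(cand, lb, ub)
                fc = f(cand)
                if fc < fit[i]:
                    X[i], fit[i] = cand, fc
                    break
        best = X[fit.argmin()].copy()
    return best, fit.min()

x_best, f_best = ftma(sphere)
print("best objective:", f_best)
```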
Regression testing, being expensive, calls for optimization. Typically, the optimization of test cases results in selecting a reduced subset of test cases or prioritizing the test cases to detect potential faults at an earlier phase. Many former studies revealed heuristic-dependent mechanisms for attaining optimality while reducing or prioritizing test cases. Nevertheless, those studies lacked systematic procedures to manage the issue of tied test cases. Moreover, evolutionary algorithms such as the genetic algorithm often help in reducing test cases, together with a concurrent decrease in computational runtime. However, when examining the fault detection capacity along with other parameters is required, the method falls sh…
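As an illustration of the prioritization notion the abstract refers to (not the study's own method), a minimal greedy "additional coverage" prioritization over a hypothetical test-to-fault coverage matrix might look like this; note the explicit tie-break, since handling tied test cases is exactly the gap the abstract highlights in prior work.

```python
# Greedy "additional" prioritization: repeatedly pick the test covering the
# most not-yet-covered faults, breaking ties deterministically by test name.
# The coverage matrix is hypothetical and purely illustrative.
def prioritize(coverage: dict[str, set[str]]) -> list[str]:
    order, covered = [], set()
    remaining = dict(coverage)
    while remaining:
        # Pick the test adding the most new faults; tie-break on name.
        name = max(sorted(remaining), key=lambda t: len(remaining[t] - covered))
        order.append(name)
        covered |= remaining.pop(name)
    return order

tests = {
    "t1": {"f1", "f2"},
    "t2": {"f2", "f3", "f4"},
    "t3": {"f4"},
}
print(prioritize(tests))   # ['t2', 't1', 't3']
```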