This study employs evolutionary optimization and artificial intelligence algorithms to estimate an individual’s age from a single facial image. We used the WIKI dataset, widely considered the most comprehensive collection of facial images to date, annotated with age and gender attributes. Although much research has been undertaken on establishing chronological age from facial photographs, the topic remains an active area of study. To this end, retrained artificial neural networks are used for classification after preprocessing and optimization techniques are applied. The difficulty of age estimation can be reduced by using an algorithm that computes a predicted value as an expectation over age classes. The proposed approach incorporates machine learning models trained on massive datasets, strategies for correct face alignment, and expected-value regression formulations. The model’s performance is optimized in this study by employing several distinct classifiers, increasing the reliability of the predictions. We aimed to optimize the selection of classifiers to minimize energy consumption while achieving an average mean absolute error of 2.08 and a power usage of 2700 W.
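The expected-value regression formulation mentioned in this abstract can be sketched as follows: a classifier outputs a probability for each discrete age class, and the predicted age is the probability-weighted mean of the class labels. This is a minimal illustration, not the paper's actual model; the toy logits below are assumptions for demonstration only.

```python
import numpy as np

def expected_age(logits, ages):
    """Turn per-age-class logits into a single age prediction."""
    exp = np.exp(logits - np.max(logits))   # numerically stable softmax
    probs = exp / exp.sum()
    return float(np.dot(probs, ages))       # E[age] = sum_k p_k * age_k

ages = np.arange(0, 101)                    # age classes 0..100
logits = -0.5 * ((ages - 30) / 5.0) ** 2    # toy logits peaked near age 30
print(round(expected_age(logits, ages), 2))  # → 30.0
```

Averaging over the class distribution rather than taking the arg-max class is what makes the output a continuous age estimate suitable for a mean-absolute-error criterion.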
Fatty Acid Methyl Ester (FAME) produced from biomass offers several advantages, such as renewability and sustainability. However, the typical FAME production process is accompanied by various impurities, including alcohol, soap, glycerol, and spent catalyst; therefore, the most challenging part of FAME production is purification. In this work, a novel application of a bulk liquid membrane (BLM), developed from conventional solvent-extraction methods, was investigated for the removal of glycerol from FAME. The extraction and stripping processes are combined into a single system, allowing simultaneous solvent recovery, whereby a low-cost quaternary ammonium salt-glycerol-based deep eutectic solvent (DES) is used as the membrane phase.
Modern civilization increasingly relies on sustainable and eco-friendly data centers as the core hubs of intelligent computing. However, while vital, these data centers also face heightened vulnerability to hacking because they serve as convergence points for numerous network connection nodes. Recognizing and addressing this vulnerability, particularly within green data centers, is a pressing concern. This paper proposes a novel approach to mitigate this threat by leveraging swarm intelligence techniques to detect prospective and hidden compromised devices within the data center environment. The core objective is to ensure sustainable intelligent computing through a colony strategy. The research primarily focuses on the
Statistical learning theory serves as the foundational bedrock of machine learning (ML), which in turn represents the backbone of artificial intelligence, ushering in innovative solutions for real-world challenges. Its origins lie at the intersection of statistics and computing, from which it evolved into a distinct scientific discipline. Machine learning is distinguished by its fundamental branches: supervised learning, unsupervised learning, semi-supervised learning, and reinforcement learning. Within this tapestry, supervised learning takes center stage, divided into two fundamental forms: classification and regression. Regression is tailored for continuous outcomes, while classification specializes in c
Transforming the common normal distribution through the generated Kummer Beta model into the Kummer Beta Generalized Normal Distribution (KBGND) is achieved. The distribution parameters and hazard function are then estimated using the maximum likelihood (MLE) method, and these estimates are improved by employing the genetic algorithm. A simulation study is conducted assuming a number of models and different sample sizes. The main finding was that the common maximum likelihood (MLE) method is the best in estimating the parameters of the KBGND according to the Mean Squared Error (MSE) criterion, and likewise according to the Integrated Mean Squared Error (IMSE) criterion in estimating the hazard function. While the pr
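The pipeline this abstract describes, maximum likelihood estimates refined by a genetic algorithm, can be sketched in miniature. The snippet below uses a plain normal distribution as a stand-in for the KBGND (whose density is considerably more involved); the population size, mutation scale, and generation count are all illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=5.0, scale=2.0, size=200)  # synthetic sample

def neg_log_lik(theta):
    """Negative log-likelihood of a normal model (stand-in for the KBGND)."""
    mu, sigma = theta
    if sigma <= 0:
        return np.inf
    return 0.5 * len(data) * np.log(2 * np.pi * sigma**2) + \
           np.sum((data - mu) ** 2) / (2 * sigma**2)

# The normal MLE has a closed form; it seeds the GA population.
mle = np.array([data.mean(), data.std()])

pop = mle + rng.normal(scale=0.3, size=(40, 2))        # initial population
for _ in range(50):                                    # GA generations
    fitness = np.array([neg_log_lik(t) for t in pop])
    parents = pop[np.argsort(fitness)[:10]]            # selection: best 10
    children = parents[rng.integers(0, 10, 40)] + \
               rng.normal(scale=0.05, size=(40, 2))    # clone + mutate
    pop = np.vstack([parents, children])[:40]          # elitism keeps parents

best = pop[np.argmin([neg_log_lik(t) for t in pop])]
print(best)  # close to the closed-form MLE (mu ≈ 5, sigma ≈ 2)
```

Because the parents survive each generation, the refined estimate can only match or improve the likelihood of the seed, which mirrors the abstract's use of the genetic algorithm to improve on the initial MLE fit.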
The objective of this research was to estimate the dose distribution delivered by radioactive gold nanoparticles (¹⁹⁸AuNPs or ¹⁹⁹AuNPs) to a tumor inside the human prostate, as well as to the normal tissues surrounding the tumor, using the Monte Carlo N-Particle code (MCNP-6.1.1). Background: Radioactive gold nanoparticles are emerging as promising agents for cancer therapy and are being investigated to treat prostate cancer in animals. In order to use them as a new therapeutic modality to treat human prostate cancer, accurate radiation dosimetry simulations are required to estimate the energy deposition in the tumor and surrounding tissue and to establish the course of therapy for the patient. Materials and methods: A simple geometrical
In this research, several estimators of the hazard function are introduced using a nonparametric method, namely the kernel function, for censored data with varying bandwidths and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four types of boundary kernel are used, namely Rectangle, Epanechnikov, Biquadratic, and Triquadratic, and the proposed function was employed with all kernel functions. Two different simulation techniques are also used in two experiments to compare these estimators. In most cases, the results show that the local bandwidth is the best for all types of the kernel boundary func
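The kernel hazard estimation this abstract discusses can be illustrated with a basic version: smoothing the Nelson-Aalen increments d_i/n_i with an Epanechnikov kernel and a single global bandwidth. The bandwidth value and the absence of boundary correction are simplifying assumptions; the paper's local-bandwidth and boundary-kernel variants refine exactly this construction.

```python
import numpy as np

def epanechnikov(u):
    """Epanechnikov kernel, one of the four kernels named above."""
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u**2), 0.0)

def kernel_hazard(t, times, events, bandwidth):
    """Kernel-smoothed hazard estimate at t from censored (time, event) data."""
    order = np.argsort(times)
    times, events = times[order], events[order]
    at_risk = len(times) - np.arange(len(times))   # subjects still at risk
    increments = events / at_risk                  # Nelson-Aalen jump sizes
    u = (t - times) / bandwidth
    return np.sum(epanechnikov(u) * increments) / bandwidth

# Synthetic right-censored sample: exponential event times (true hazard 0.5)
# with independent exponential censoring.
rng = np.random.default_rng(1)
T = rng.exponential(scale=2.0, size=500)   # event times
C = rng.exponential(scale=8.0, size=500)   # censoring times
times = np.minimum(T, C)
events = (T <= C).astype(float)            # 1 = event observed, 0 = censored
print(kernel_hazard(1.0, times, events, bandwidth=0.5))  # near 0.5
```

Near the origin the kernel window spills outside the support of the data, which is precisely the boundary problem the Rectangle, Biquadratic, and Triquadratic boundary kernels are designed to correct.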
Survival analysis is one of the modern methods of analysis; it is based on the fact that the dependent variable represents time until the event of interest in the study. There are many survival models that deal with the impact of explanatory factors on the likelihood of survival, including the model proposed by the scientist David Cox, one of the most important and most common models of survival. It consists of two functions: a parametric function that does not depend on survival time and a nonparametric function that depends on the times of survival, which is why the Cox model is defined as a semi-parametric model. The set of parametric models that depend on the time-to-event distribution parameters, such as
This paper aims to determine the best parameter estimation methods for the parameters of the Gumbel type-I distribution under a type-II censoring scheme. For this purpose, classical and Bayesian parameter estimation procedures are considered. Maximum likelihood estimators are used for the classical procedure, and their asymptotic distributions are also derived. It is not possible to obtain explicit solutions for the Bayesian estimators; therefore, Markov Chain Monte Carlo and Lindley techniques are employed to estimate the unknown parameters. In Bayesian analysis, it is very important to determine an appropriate combination of a prior distribution and a loss function. Therefore, two different
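The classical procedure this abstract mentions, maximum likelihood for the Gumbel type-I distribution, can be sketched for the uncensored case. The type-II censored likelihood in the paper adds a survival term for the unobserved order statistics; this minimal version, with assumed true parameters for the synthetic sample, shows only the complete-sample core.

```python
import numpy as np
from scipy.optimize import minimize

def gumbel_neg_log_lik(theta, x):
    """-log L for the Gumbel (type-I, maximum) density
    f(x) = (1/beta) * exp(-z - exp(-z)),  z = (x - mu) / beta."""
    mu, beta = theta
    if beta <= 0:
        return np.inf
    z = (x - mu) / beta
    return len(x) * np.log(beta) + np.sum(z + np.exp(-z))

rng = np.random.default_rng(2)
x = rng.gumbel(loc=3.0, scale=1.5, size=1000)   # synthetic sample

res = minimize(gumbel_neg_log_lik, x0=[x.mean(), x.std()],
               args=(x,), method="Nelder-Mead")
mu_hat, beta_hat = res.x
print(mu_hat, beta_hat)  # close to the true values 3.0 and 1.5
```

No closed-form MLE exists for both Gumbel parameters jointly, so a numerical optimizer is the standard route; the same log-likelihood would also feed the MCMC and Lindley approximations on the Bayesian side.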