This study employs evolutionary optimization and artificial intelligence algorithms to determine an individual's age from a single facial image. We used the WIKI dataset, widely regarded as one of the most comprehensive collections of facial images to date, annotated with age and gender attributes. Although much research has been undertaken on establishing chronological age from facial photographs, age estimation from facial images remains a relatively recent topic of study. To achieve this goal, retrained artificial neural networks are used for classification after preprocessing and optimization techniques are applied. The difficulty of determining age can be reduced by using an algorithm that calculates the expected (predicted) value. The proposed approach incorporates machine learning models pretrained on massive datasets, strategies for correct face alignment, and expected-value regression formulations. The model's performance is further optimized in this study by using several distinct classifiers, increasing the effectiveness of the explicit expected-value predictions. We aimed to optimize the selection of classifiers to minimize energy consumption while achieving an average mean absolute error of 2.08 and a power usage of 2700 W.
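As a rough illustration of the expected-value regression formulation mentioned above, the sketch below assumes a classifier that scores 101 discrete age bins (0-100) and converts those scores into a predicted age; the bin range, function names, and toy inputs are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def expected_age(logits, ages=np.arange(0, 101)):
    """Expected-value age regression: softmax over discrete age bins,
    then the probability-weighted mean gives the predicted age."""
    z = logits - logits.max()              # shift for numerical stability
    p = np.exp(z) / np.exp(z).sum()        # softmax probability per age bin
    return float((p * ages).sum())         # E[age] under the predicted distribution

def mean_absolute_error(pred_ages, true_ages):
    """MAE, the accuracy metric reported for the age estimator."""
    return float(np.mean(np.abs(np.asarray(pred_ages) - np.asarray(true_ages))))

# Toy usage: random classifier scores standing in for one face image
rng = np.random.default_rng(0)
print(expected_age(rng.normal(size=101)))
```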
An optoelectronic flow-through detector for the determination of active ingredients in pharmaceutical formulations is described. Two consecutive compact photodetector devices were developed based on a light-emitting diode (LED) and solar-cell concept, in which the LEDs act as the light source and the solar cells measure the attenuation of the incident light at 180°. The turbidimetric detector, fabricated from only ten LEDs and five solar cells and integrated with a glass flow cell, was easily adapted to a flow injection analysis manifold system. The developed detector was successfully utilized to develop and validate an analytical method for the determination of warfarin.
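For illustration only, the following sketch shows how raw solar-cell signals from such a detector might be turned into a linear calibration for warfarin; all voltages, concentrations, and the log-ratio response are hypothetical assumptions, not values or processing steps reported for the developed detector.

```python
import numpy as np

# Hypothetical calibration data for an LED/solar-cell flow-through detector:
# solar-cell signal V for warfarin standards, with V0 the blank carrier signal.
conc_ug_ml = np.array([2.0, 4.0, 8.0, 16.0, 32.0])      # standard concentrations (illustrative)
signal_v   = np.array([1.92, 1.85, 1.70, 1.43, 0.98])   # attenuated signals (illustrative)
blank_v    = 2.00                                        # incident-light (blank) signal

# Attenuation expressed as -log10(V/V0), analogous to an absorbance-type response.
response = -np.log10(signal_v / blank_v)

# Linear least-squares calibration: response = slope * C + intercept
slope, intercept = np.polyfit(conc_ug_ml, response, 1)

def warfarin_conc(sample_v):
    """Back-calculate the concentration of an unknown sample from its detector signal."""
    return (-np.log10(sample_v / blank_v) - intercept) / slope

print(round(warfarin_conc(1.60), 2))  # concentration estimate for a hypothetical sample
```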
Convergence speed is the most important feature of the Back-Propagation (BP) algorithm, and many improvements have been proposed since its introduction to speed up the convergence phase. In this paper, a new modified BP algorithm called Speeding up Back-Propagation Learning (SUBPL) is proposed and compared to standard BP. Different data sets were implemented and tested experimentally to verify the improvement achieved by SUBPL.
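The abstract does not detail the SUBPL modification, so the sketch below only implements the standard BP baseline it is compared against: full-batch gradient-descent back-propagation for a one-hidden-layer sigmoid network, verified on the XOR problem (layer sizes, learning rate, and epoch count are illustrative).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def add_bias(a):
    """Append a constant bias column of ones."""
    return np.hstack([a, np.ones((a.shape[0], 1))])

def train_bp(X, y, hidden=4, lr=0.5, epochs=10000, seed=0):
    """Plain gradient-descent back-propagation for a one-hidden-layer network,
    i.e. the standard BP baseline that SUBPL-style modifications aim to accelerate."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1] + 1, hidden))
    W2 = rng.normal(scale=0.5, size=(hidden + 1, 1))
    Xb = add_bias(X)
    for _ in range(epochs):
        h = sigmoid(Xb @ W1)                        # hidden activations
        hb = add_bias(h)
        out = sigmoid(hb @ W2)                      # network output
        d_out = (y - out) * out * (1 - out)         # delta at the output layer
        d_hid = (d_out @ W2[:-1].T) * h * (1 - h)   # delta back-propagated to the hidden layer
        W2 += lr * hb.T @ d_out                     # gradient-descent weight updates
        W1 += lr * Xb.T @ d_hid
    return W1, W2

# XOR toy problem to check that standard BP converges
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
W1, W2 = train_bp(X, y)
print(np.round(sigmoid(add_bias(sigmoid(add_bias(X) @ W1)) @ W2), 2))
```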
Evaluating population data (age and gender structure) is one of the important factors that help any country draw up plans and programs for the future. This study discusses the errors in the population data of the 1997 Iraqi census, which were corrected and revised to serve planning purposes by smoothing the population data with a nonparametric regression estimator (the Nadaraya-Watson estimator). This estimator depends on a bandwidth (h) that can be calculated in two ways using the Bayesian method: the first when the observations follow a lognormal kernel and the second when they follow a normal kernel.
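A minimal sketch of the Nadaraya-Watson estimator used for the smoothing step is given below; the Gaussian kernel, toy age-count data, and fixed bandwidth h = 3.0 are illustrative assumptions, whereas in the study h would be obtained from the Bayesian rules described above.

```python
import numpy as np

def nadaraya_watson(x0, x, y, h):
    """Nadaraya-Watson kernel regression estimate at point(s) x0:
    a locally weighted average of y with Gaussian kernel weights and bandwidth h."""
    x0 = np.atleast_1d(x0)
    u = (x0[:, None] - x[None, :]) / h          # scaled distances to the data points
    w = np.exp(-0.5 * u**2)                     # Gaussian kernel weights
    return (w @ y) / w.sum(axis=1)              # weighted average at each x0

# Toy smoothing of a noisy age-count series (illustrative data, not the census data)
rng = np.random.default_rng(1)
ages = np.arange(0, 80, dtype=float)
counts = 1000 * np.exp(-ages / 40) + rng.normal(scale=30, size=ages.size)
smoothed = nadaraya_watson(ages, ages, counts, h=3.0)   # h would come from the Bayesian rule
print(np.round(smoothed[:5], 1))
```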
Resource estimation is an essential part of reservoir evaluation and development planning and strongly affects the decision-making process. The available conventional logs for 30 wells in the Nasiriyah oilfield were used in this study to model the petrophysical properties of the reservoir and to produce a 3D static geological model that mimics the distribution of petrophysical properties, in order to estimate the stock tank oil originally in place (STOOIP) of the Mishrif reservoir by the volumetric method. Computer-processed porosity and water saturation logs and a 2D structural map were utilized to construct the model, which was discretized into 537,840 grid blocks. These properties were distributed in 3D space using sequential Gaussian simulation, and the variation in …
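As a sketch of the volumetric STOOIP calculation on a gridded model, the code below sums cell pore volumes weighted by oil saturation and converts to stock-tank barrels; the cell dimensions, porosity and saturation realisations, and the formation volume factor Bo are illustrative assumptions, not values from the Nasiriyah model.

```python
import numpy as np

def stooip_stb(cell_volume_m3, porosity, sw, bo=1.2):
    """Volumetric STOOIP over a 3D grid: sum of cell pore volumes times oil
    saturation, converted from reservoir m3 to stock-tank barrels (1 m3 = 6.2898 bbl)."""
    oil_in_place_rm3 = np.sum(cell_volume_m3 * porosity * (1.0 - sw))
    return 6.2898 * oil_in_place_rm3 / bo

# Toy grid standing in for the 537,840-block model (all values are illustrative)
rng = np.random.default_rng(2)
n_cells = 537_840
vol = np.full(n_cells, 50.0 * 50.0 * 2.0)      # 50 m x 50 m x 2 m cells
phi = rng.uniform(0.10, 0.25, n_cells)         # porosity realisation
sw = rng.uniform(0.20, 0.60, n_cells)          # water saturation realisation
print(f"STOOIP ~ {stooip_stb(vol, phi, sw):.3e} STB")
```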
In this paper, we derive an estimator of the reliability function for the two-parameter Laplace distribution using the Bayes method with a squared error loss function, Jeffreys' formula, and the conditional probability of the observed random variable. The main objective of this study is to assess the efficiency of the derived Bayesian estimator relative to the maximum likelihood and moment estimators of this function, using Monte Carlo simulation under different Laplace distribution parameters and sample sizes. The results show that the Bayes estimator is more efficient than the maximum likelihood and moment estimators for all sample sizes.
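The derivation of the Bayes estimator itself is not reproduced here; the sketch below only illustrates the Monte Carlo MSE framework used for such comparisons, with plug-in MLE and moment estimators of the Laplace reliability R(t) = P(X > t), and the paper's Bayes estimator would be added as a third candidate. Sample size, parameters, and the evaluation point t are illustrative.

```python
import numpy as np

def laplace_reliability(t, mu, b):
    """R(t) = P(X > t) for the Laplace(mu, b) distribution."""
    z = (t - mu) / b
    return 0.5 * np.exp(-z) if t >= mu else 1.0 - 0.5 * np.exp(z)

def mc_mse(n=30, mu=0.0, b=1.0, t=1.0, reps=5000, seed=3):
    """Monte Carlo MSE of plug-in reliability estimators (MLE vs. moments).
    The derived Bayes estimator would slot in here as an additional candidate."""
    rng = np.random.default_rng(seed)
    true_R = laplace_reliability(t, mu, b)
    se_mle, se_mom = [], []
    for _ in range(reps):
        x = rng.laplace(mu, b, size=n)
        mu_mle, b_mle = np.median(x), np.mean(np.abs(x - np.median(x)))   # maximum likelihood
        mu_mom, b_mom = np.mean(x), np.sqrt(np.var(x, ddof=1) / 2.0)      # method of moments
        se_mle.append((laplace_reliability(t, mu_mle, b_mle) - true_R) ** 2)
        se_mom.append((laplace_reliability(t, mu_mom, b_mom) - true_R) ** 2)
    return np.mean(se_mle), np.mean(se_mom)

print(mc_mse())
```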
A non-parametric kernel method with bootstrap resampling was used to estimate confidence intervals for the system failure function of data following a log-normal distribution, namely the failure times of the machines of the spinning department of the weaving company in Wasit Governorate. The failure function was also estimated parametrically using the maximum likelihood estimator (MLE). The parametric and non-parametric methods were compared using the mean squared error (MSE) criterion. The results show the efficiency of the bootstrap-based non-parametric methods compared to the parametric method. It was also noted that the estimated curve is more realistic and appropriate for the re…
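A minimal sketch of the bootstrap step is shown below: a Gaussian-kernel smoothed nonparametric estimate of the failure-time survival function with a percentile bootstrap confidence interval. The simulated log-normal failure times, the bandwidth, and the evaluation point are illustrative stand-ins for the spinning-machine data.

```python
import numpy as np
from math import erf

def smooth_survival(t, times, h):
    """Kernel-smoothed nonparametric estimate of S(t) = P(T > t),
    using a Gaussian kernel CDF with bandwidth h."""
    z = (t - times) / h
    Phi = 0.5 * (1.0 + np.vectorize(erf)(z / np.sqrt(2.0)))
    return 1.0 - Phi.mean()

def bootstrap_ci(times, t, h, B=2000, alpha=0.05, seed=4):
    """Percentile bootstrap confidence interval for the kernel estimate at t."""
    rng = np.random.default_rng(seed)
    stats = [smooth_survival(t, rng.choice(times, size=times.size, replace=True), h)
             for _ in range(B)]
    return np.quantile(stats, [alpha / 2, 1 - alpha / 2])

# Illustrative log-normal failure times (standing in for the real machine data)
rng = np.random.default_rng(5)
times = rng.lognormal(mean=2.0, sigma=0.5, size=60)
print(smooth_survival(10.0, times, h=1.5), bootstrap_ci(times, 10.0, h=1.5))
```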