Owing to the technology stemming from in-depth research in this field and the drawbacks of other identification methods, biometrics has attracted considerable attention and established itself as the most reliable alternative for recognition in recent years. Efforts continue toward a user-friendly system that meets security-system requirements and yields more reliable outcomes while safeguarding assets and ensuring privacy. Human age estimation and gender identification are both challenging tasks. Biomarkers and methods for determining biological age and gender have been extensively researched, and each has advantages and disadvantages. Facial-image-based recognition is crucial for many applications, including safety and security systems, border control, human engagement in sophisticated ambient analytics, and biometric identification. Determining a person's age and gender is a complex research problem. The advent of deep learning has transformed the study of facial analysis systems, and estimation accuracy is a crucial parameter for evaluating algorithms and their efficacy in predicting absolute ages. The method was assessed on the UTKFace dataset, which serves as the backbone of the face estimation system. The eyes, cheeks, nose, lips, and forehead provide the foundation of this function. AlexNet achieves a 98% accuracy rate on the system's results.
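As a rough illustration of the kind of pipeline this abstract describes (the paper's exact architecture, training regime, and head design are not given here), the following is a minimal sketch of adapting torchvision's AlexNet to joint gender classification and age regression on UTKFace-style face crops; the head sizes, losses, and sample targets are assumptions.

```python
# Minimal sketch (not the authors' exact setup): an AlexNet backbone with
# two heads, one for gender classification and one for age regression.
import torch
import torch.nn as nn
from torchvision import models

class AgeGenderNet(nn.Module):
    def __init__(self):
        super().__init__()
        backbone = models.alexnet(weights=None)       # assumption: trained from scratch
        self.features = backbone.features
        self.avgpool = backbone.avgpool
        self.flatten = nn.Flatten()
        self.gender_head = nn.Linear(256 * 6 * 6, 2)  # male/female logits
        self.age_head = nn.Linear(256 * 6 * 6, 1)     # scalar age estimate

    def forward(self, x):
        h = self.flatten(self.avgpool(self.features(x)))
        return self.gender_head(h), self.age_head(h)

model = AgeGenderNet()
x = torch.randn(4, 3, 224, 224)                       # batch of face crops
gender_logits, age = model(x)
loss = nn.CrossEntropyLoss()(gender_logits, torch.tensor([0, 1, 0, 1])) \
       + nn.L1Loss()(age.squeeze(1), torch.tensor([25., 40., 63., 8.]))
```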
The denoising of a natural image corrupted by Gaussian noise is a classical problem in signal and image processing. Much work has been done in the field of wavelet thresholding, but most of it has focused on statistical modeling of wavelet coefficients and the optimal choice of thresholds. This paper describes a new method for the suppression of noise in images that fuses stationary wavelet denoising with an adaptive Wiener filter. The Wiener filter is applied to the image reconstructed from the approximation coefficients only, while thresholding is applied to the detail coefficients of the transform; the final denoised image is obtained by combining the two results. The proposed method was applied using …
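One plausible reading of the fusion described above is sketched below using PyWavelets and SciPy: the image is split into an approximation-only reconstruction (Wiener-filtered) and a thresholded-details-only reconstruction, and the two branches are summed. The wavelet, threshold value, window size, and the summation rule are illustrative assumptions, not the paper's settings.

```python
# Sketch of stationary-wavelet thresholding fused with an adaptive Wiener
# filter; parameter values are assumptions.
import numpy as np
import pywt
from scipy.signal import wiener

def denoise(img, wavelet="db4", level=1, thr=20.0):
    # Stationary (undecimated) wavelet transform of the noisy image.
    coeffs = pywt.swt2(img, wavelet, level=level)

    # Branch 1: soft-threshold the detail coefficients, zero the
    # approximations, and reconstruct a details-only image.
    details_only = [
        (np.zeros_like(cA),
         tuple(pywt.threshold(d, thr, mode="soft") for d in details))
        for cA, details in coeffs
    ]
    detail_img = pywt.iswt2(details_only, wavelet)

    # Branch 2: reconstruct from the approximation coefficients only,
    # then apply an adaptive Wiener filter.
    approx_only = [
        (cA, tuple(np.zeros_like(d) for d in details))
        for cA, details in coeffs
    ]
    approx_img = wiener(pywt.iswt2(approx_only, wavelet), mysize=5)

    # The transform is linear, so summing the branches recombines them.
    return approx_img + detail_img

noisy = np.random.rand(256, 256) * 255   # stand-in for a noisy image
clean = denoise(noisy)
```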
Steganography is a technique for concealing secret data within other, ordinary files of the same or a different type. Hiding data has become essential to digital information security. This work aims to design a stego method that can effectively hide a message inside the images of a video file. A video steganography model is proposed in which a convolutional neural network (CNN) is trained to hide a video (or images) within another video. By using a CNN in this approach, a main goal of any steganographic method can be achieved: increased security (difficulty of being observed and broken by steganalysis programs), which was achieved in this work because the weights and architecture are randomized. Thus, …
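The paper's exact network is not reproduced here; the following is a generic sketch of CNN-based frame-in-frame hiding of the kind the abstract describes, with a hiding network that embeds a secret frame into a cover frame and a reveal network that recovers it. Channel widths, depths, and losses are assumptions.

```python
# Generic hide/reveal CNN pair for image-in-image steganography
# (illustrative, not the paper's architecture).
import torch
import torch.nn as nn

def conv_block(cin, cout):
    return nn.Sequential(nn.Conv2d(cin, cout, 3, padding=1), nn.ReLU())

hide_net = nn.Sequential(            # input: cover + secret stacked (6 ch)
    conv_block(6, 32), conv_block(32, 32), nn.Conv2d(32, 3, 3, padding=1)
)
reveal_net = nn.Sequential(          # input: stego frame (3 ch)
    conv_block(3, 32), conv_block(32, 32), nn.Conv2d(32, 3, 3, padding=1)
)

cover = torch.rand(1, 3, 128, 128)   # one cover video frame
secret = torch.rand(1, 3, 128, 128)  # one secret frame
stego = hide_net(torch.cat([cover, secret], dim=1))
recovered = reveal_net(stego)

# Train both nets jointly: the stego frame should look like the cover,
# and the revealed frame should match the secret.
loss = nn.MSELoss()(stego, cover) + nn.MSELoss()(recovered, secret)
```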
This article aims to estimate the stock return rate in the private banking sector, using data from two banks, by adopting a partial linear model based on Arbitrage Pricing Theory (APT), with wavelet and kernel smoothers. The results show that the wavelet method is the best. They also show that the market portfolio and the inflation rate have an adverse effect on the rate of return, while the money supply has a direct effect.
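The abstract does not state the paper's exact specification; the textbook partial linear model, with the nonparametric part estimated by a kernel (e.g. Nadaraya-Watson) smoother, takes the following form.

```latex
% Standard partial linear model (assumed form, not taken from the paper):
\[
  Y_i = X_i^{\top}\beta + g(T_i) + \varepsilon_i, \qquad i = 1,\dots,n,
\]
% where $Y_i$ is the stock return, $X_i$ collects the factors entering
% linearly, and $g(\cdot)$ is an unknown smooth function estimated by a
% wavelet or kernel smoother, e.g. Nadaraya--Watson with bandwidth $h$:
\[
  \hat{g}(t) =
  \frac{\sum_{i=1}^{n} K\!\left(\tfrac{t - T_i}{h}\right)\,(Y_i - X_i^{\top}\hat\beta)}
       {\sum_{i=1}^{n} K\!\left(\tfrac{t - T_i}{h}\right)}.
\]
```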
Regression models are among the most important models used in modern studies, especially in research and health studies, because of the important results they achieve. Two regression models were used: the Poisson regression model and the Conway-Maxwell-Poisson regression model. This study aimed to compare the two models and choose the better one using simulation, at different sample sizes (n = 25, 50, 100) and with r = 1000 replications. MATLAB was used to conduct the simulation experiment. The results showed the superiority of the Poisson model according to both the mean squared error (MSE) criterion and the Akaike information criterion (AIC) for the same distribution.
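The study itself used MATLAB; as a sketch of the simulation design described (replicated fits at several sample sizes, compared by mean MSE and AIC), the Python analogue below fits only the Poisson side, since a Conway-Maxwell-Poisson fit is not available in statsmodels and would require a custom likelihood. The true coefficients are assumptions.

```python
# Simulation sketch: repeated Poisson-GLM fits scored by MSE and AIC.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
r, summary = 1000, []
for n in (25, 50, 100):
    mse, aic = [], []
    for _ in range(r):
        x = rng.normal(size=n)
        mu = np.exp(0.5 + 0.3 * x)          # assumed true mean structure
        y = rng.poisson(mu)
        fit = sm.GLM(y, sm.add_constant(x),
                     family=sm.families.Poisson()).fit()
        mse.append(np.mean((fit.fittedvalues - mu) ** 2))
        aic.append(fit.aic)
    summary.append((n, np.mean(mse), np.mean(aic)))

for n, m, a in summary:
    print(f"n={n}: mean MSE={m:.3f}, mean AIC={a:.1f}")
```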
Cloud point extraction (CPE) is a simple, safe, and environmentally friendly technique for preparing many different kinds of samples. In this review, we discuss the CPE method and how to apply it to environmental samples. We also cover the benefits, problems, and likely developments in CPE. This process has received a great deal of attention for preconcentration and extraction. It has been used as a separation and preconcentration system prior to the analysis of organic compounds (nutrients, polybrominated biphenyl ethers, pesticides, polycyclic aromatic hydrocarbons, polychlorinated compounds, and aromatic amines) and inorganic compounds, including many metals (silver, lead, cadmium, mercury, and so on). We also find …
The ECG, which shows the electrophysiology of the heart, is an important tool for the primary diagnosis of heart disease. In our method, a single maternal abdominal ECG signal is taken as the input signal, and the maternal P-QRS-T complexes of the original signal are averaged and repeated to form a reference signal. The LMS and RLS adaptive filtering algorithms are then applied. The results showed that fetal ECGs were successfully detected. Accuracy on the Daisy database was up to 84% for LMS and 88% for RLS, while on PhysioNet it was up to 98% and 96% for LMS and RLS, respectively.
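A minimal sketch of the LMS side of this adaptive-cancellation scheme is given below: the filter adapts to track the maternal component in the abdominal signal using the reference, and the residual error approximates the fetal ECG. The step size, filter length, and the toy sinusoidal signals are illustrative assumptions.

```python
# LMS adaptive cancellation sketch (illustrative parameters and signals).
import numpy as np

def lms_cancel(abdominal, maternal_ref, taps=32, mu=0.01):
    """Adapt an FIR filter so its output tracks the maternal component;
    the residual error approximates the fetal ECG."""
    w = np.zeros(taps)
    fetal = np.zeros_like(abdominal)
    for n in range(taps, len(abdominal)):
        x = maternal_ref[n - taps:n][::-1]   # most recent reference samples
        y = w @ x                            # maternal-component estimate
        e = abdominal[n] - y                 # residual = fetal + noise
        w += 2 * mu * e * x                  # LMS weight update
        fetal[n] = e
    return fetal

t = np.arange(0, 10, 0.001)
maternal = np.sin(2 * np.pi * 1.2 * t)           # toy maternal ECG
fetal_true = 0.2 * np.sin(2 * np.pi * 2.3 * t)   # toy fetal ECG
abdominal = maternal + fetal_true + 0.01 * np.random.randn(len(t))
fetal_est = lms_cancel(abdominal, maternal)
```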
With the escalation of cybercriminal activities, the demand for forensic investigations into these crimes has grown significantly. However, the concept of systematic pre-preparation for potential forensic examinations during the software design phase, known as forensic readiness, has only recently gained attention. Against the backdrop of surging urban crime rates, this study aims to conduct a rigorous and precise analysis and forecast of crime rates in Los Angeles, employing advanced Artificial Intelligence (AI) technologies. This research amalgamates diverse datasets encompassing crime history, various socio-economic indicators, and geographical locations to attain a comprehensive understanding of how crimes manifest within the city. Lev…
Numeral recognition is considered an essential preliminary step for optical character recognition, document understanding, and other tasks. Although several handwritten numeral recognition algorithms have been proposed so far, achieving adequate recognition accuracy and execution time remains challenging. In particular, recognition accuracy depends on the feature extraction mechanism. As such, a fast and robust numeral recognition method is essential, one that meets the desired accuracy by extracting features efficiently while maintaining fast execution time. Furthermore, to date most existing studies have evaluated their methods in clean environments, thus limiting understanding of their potential a…
Liquid-solid chromatography of bovine serum albumin (BSA) on DEAE-cellulose (diethylaminoethyl-cellulose) adsorbent was studied experimentally, to examine the effect of changing the influent concentration (0.125, 0.25, 0.5, and 1 mg/ml) at a constant volumetric flow rate of Q = 1 ml/min, and the effect of changing the volumetric flow rate (1, 3, 5, and 10 ml/min) at a constant influent concentration of Co = 0.125 mg/ml. A glass column of 1.5 cm I.D. and 50 cm length was used, packed with DEAE-cellulose adsorbent to a height of 7 cm. The influent was introduced into the column using a peristaltic pump, and the effluent concentration was measured using a UV spectrophotometer at 30 °C and a 280 nm wavelength. A spread (steeper) breakthrough curve is obtained …