Various human biometrics are used nowadays, and the face is one of the most important of them. Many techniques have been suggested for face recognition, but they still face a variety of challenges when recognizing faces in images captured in uncontrolled environments and in real-life applications. Some of these challenges are pose variation, occlusion, facial expression, illumination, bad lighting, and image quality. New techniques are continuously being developed. In this paper, the singular value decomposition (SVD) is used to extract the feature matrix for face recognition and classification. The input color image is converted into a grayscale image and then transformed into a local ternary pattern before splitting the image into sixteen main blocks. Each of these sixteen blocks is further divided into thirty sub-blocks. For each sub-block, the SVD transformation is applied and the norm of the diagonal matrix is calculated, which is used to create the 16x30 feature matrix. The sub-block features of two images (thirty elements per main block) are compared with each other using the Euclidean distance, and the minimum value for each main block is selected as one feature input to the neural network. Classification is implemented by a backpropagation neural network, where the 16-element feature vector is used as its input. The performance of the current proposal reached up to 97% on the FEI (Brazilian) database. Moreover, the performance of this study is promising when compared with recent state-of-the-art approaches, and it addresses some of the challenges such as illumination and facial expression.
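A minimal NumPy sketch of the block-and-SVD feature pipeline described above, assuming a 4x4 grid of main blocks and a 5x6 grid of sub-blocks inside each (the grid layouts and the function names are assumptions for illustration, not the paper's implementation):

```python
import numpy as np

def block_svd_features(gray, n_blocks=16, n_sub=30):
    """Split a pre-processed (e.g. local-ternary-pattern) grayscale image
    into 16 main blocks and 30 sub-blocks each; keep the norm of every
    sub-block's singular values, giving a 16 x 30 feature matrix."""
    h, w = gray.shape
    bh, bw = h // 4, w // 4                       # 4 x 4 main blocks (assumed)
    features = np.zeros((n_blocks, n_sub))
    for b in range(n_blocks):
        r, c = divmod(b, 4)
        block = gray[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw]
        sh, sw = block.shape[0] // 5, block.shape[1] // 6   # 5 x 6 sub-blocks (assumed)
        for s in range(n_sub):
            sr, sc = divmod(s, 6)
            sub = block[sr * sh:(sr + 1) * sh, sc * sw:(sc + 1) * sw]
            # singular values = diagonal of the SVD's Sigma matrix
            sv = np.linalg.svd(sub.astype(float), compute_uv=False)
            features[b, s] = np.linalg.norm(sv)
    return features

def match_vector(feat_a, feat_b):
    """Per main block, the minimum distance between the two images'
    30 sub-block features gives one of the 16 classifier inputs."""
    return np.abs(feat_a - feat_b).min(axis=1)
```

The 16-element vector returned by `match_vector` would then serve as the input layer of the backpropagation network.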
This study aims to develop a recommendation-engine methodology to enhance the model's effectiveness and efficiency. The proposed model is used to assign or propose a limited number of developers with the required skills and expertise to address and resolve a bug report. Managing collections within bug repositories when addressing specific defects is the responsibility of software engineers. Identifying the optimal allocation of personnel to activities is challenging when dealing with software defects, which necessitates a substantial workforce of developers. The purpose of this analysis is to examine new scientific methodologies in order to enhance comprehension of the results. Additionally, developer priorities were discussed, especially th…
The research aims to identify the importance of applying resource consumption accounting in the Iraqi industrial environment in general, and in the oil sector in particular, and its role in reducing the costs of activities by excluding and isolating idle capacity costs. The research problem is that the company faces deficiencies and challenges in applying strategic cost tools. The research was based on the hypothesis that applying resource consumption accounting will provide appropriate information for the company by allocating costs properly, and thereby reduce the costs of activities. To prove the hypothesis of the research, the Light Derivatives Authority - Al-Dora Refin…
This study examines the relationship between the increase in the number of tourists coming to Tunisia and GDP during the period 1995-2017, using cointegration methodology, causality testing, and an error correction model. The research found that the time series of the logarithm of the number of tourists coming to Tunisia and the logarithm of output are non-stationary in levels, but become stationary after taking first differences; thus, these series are integrated of order one. Using the Johansen method, we found a cointegration relationship between the logarithm of the number of tourists coming to Tunisia and the logarithm of GDP in Tunisia, and there is a causal relationship in one direc…
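As a pointer to how the steps named above (unit-root tests, Johansen cointegration, Granger causality) are typically run, a minimal statsmodels sketch; the series below are synthetic placeholders, not the paper's data:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller, grangercausalitytests
from statsmodels.tsa.vector_ar.vecm import coint_johansen

# Hypothetical annual series for 1995-2017; replace with the actual data.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "ln_tourists": np.log(rng.uniform(4e6, 8e6, 23)),
    "ln_gdp": np.log(rng.uniform(2e10, 5e10, 23)),
}, index=pd.period_range("1995", "2017", freq="Y"))

# 1) ADF unit-root tests: non-stationary in levels, stationary in first differences?
for col in df:
    print(col, "level p =", adfuller(df[col])[1],
          "1st-diff p =", adfuller(df[col].diff().dropna())[1])

# 2) Johansen cointegration test (constant term, one lagged difference)
jres = coint_johansen(df, det_order=0, k_ar_diff=1)
print("trace statistics:", jres.lr1, "95% critical values:", jres.cvt[:, 1])

# 3) Granger causality: do tourist arrivals help predict GDP?
grangercausalitytests(df[["ln_gdp", "ln_tourists"]], maxlag=2)
```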
In this paper, a Modified Weighted Low Energy Adaptive Clustering Hierarchy (MW-LEACH) protocol is implemented to improve the Quality of Service (QoS) in a Wireless Sensor Network (WSN) with a mobile sink node. The Quality of Service is measured in terms of Throughput Ratio (TR), Packet Loss Ratio (PLR), and Energy Consumption (EC). The protocol is implemented in a Python-based simulation. Simulation results showed that the proposed protocol provides better Quality of Service than the Weighted Low Energy Adaptive Clustering Hierarchy (W-LEACH) protocol by 63%.
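A small sketch of how the three QoS measures named above might be computed from simulation counters (the exact definitions used in the MW-LEACH simulation are assumptions here):

```python
def qos_metrics(packets_sent, packets_received, energy_start, energy_end):
    """Throughput Ratio, Packet Loss Ratio, and Energy Consumption
    computed from end-of-simulation counters (illustrative definitions)."""
    tr = packets_received / packets_sent                     # Throughput Ratio
    plr = (packets_sent - packets_received) / packets_sent   # Packet Loss Ratio
    ec = energy_start - energy_end                           # Energy Consumption (J)
    return tr, plr, ec

# Example: 1000 packets sent, 930 delivered, average node energy 2.0 J -> 1.6 J
print(qos_metrics(1000, 930, 2.0, 1.6))
```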
Optimized Link State Routing Protocol (OLSR) is an efficient routing protocol used for various ad hoc networks. OLSR employs the Multipoint Relay (MPR) technique to reduce network overhead traffic. A mobility model's main goal is to realistically simulate the movement behavior of actual users. However, high mobility and the choice of mobility model are major design issues for an efficient and effective routing protocol for real Mobile Ad hoc Networks (MANETs). Therefore, this paper aims to analyze the performance of the OLSR protocol under various random and group mobility models. Two simulation scenarios were conducted over four mobility models, specifically the Random Waypoint model (RWP), the Random Direction model (RD), the Nomadic Co…
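For readers unfamiliar with the models compared, a minimal Python sketch of the Random Waypoint model; the area, speed range, and pause time are illustrative values, not the paper's scenario parameters:

```python
import random

def random_waypoint(area=(1000.0, 1000.0), speed=(1.0, 20.0),
                    pause=2.0, steps=100, dt=1.0):
    """One node under Random Waypoint: pick a random destination and speed,
    move toward it, pause on arrival, repeat. Returns the position trace."""
    x, y = random.uniform(0, area[0]), random.uniform(0, area[1])
    dest, v, wait, trace = (x, y), 0.0, 0.0, []
    for _ in range(steps):
        if wait > 0:                          # pausing at a waypoint
            wait -= dt
        elif (x, y) == dest:                  # choose a new waypoint and speed
            dest = (random.uniform(0, area[0]), random.uniform(0, area[1]))
            v = random.uniform(*speed)
        else:                                 # move toward the destination
            dx, dy = dest[0] - x, dest[1] - y
            dist = (dx ** 2 + dy ** 2) ** 0.5
            step = min(v * dt, dist)
            x, y = x + dx / dist * step, y + dy / dist * step
            if step == dist:                  # arrived exactly at the waypoint
                x, y = dest
                wait = pause
        trace.append((x, y))
    return trace
```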
In this paper, a simple and fast lossless image compression method is introduced for compressing medical images. It integrates multiresolution coding with a linear-basis polynomial approximation to decompose the image signal, followed by efficient coding. The test results indicate that the suggested method can lead to promising performance due to its flexibility in overcoming the restrictions on model order length and the extra overhead information required by traditional predictive coding techniques.
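A minimal sketch of the predictive-coding idea the comparison refers to, using a simple two-neighbour linear predictor (not the paper's multiresolution scheme); the integer residuals are what an entropy coder would then compress losslessly:

```python
import numpy as np

def linear_predict_residuals(img):
    """Predict each pixel from its west and north neighbours and return
    the integer residuals; reconstruct() inverts this exactly (lossless)."""
    img = img.astype(np.int32)
    pred = np.zeros_like(img)
    pred[1:, 1:] = (img[1:, :-1] + img[:-1, 1:]) // 2   # mean of west and north
    pred[0, 1:] = img[0, :-1]                            # first row: west only
    pred[1:, 0] = img[:-1, 0]                            # first column: north only
    return img - pred

def reconstruct(res):
    """Undo the prediction row by row; output equals the original image."""
    out = np.zeros_like(res)
    h, w = res.shape
    for r in range(h):
        for c in range(w):
            if r == 0 and c == 0:
                p = 0
            elif r == 0:
                p = out[r, c - 1]
            elif c == 0:
                p = out[r - 1, c]
            else:
                p = (out[r, c - 1] + out[r - 1, c]) // 2
            out[r, c] = res[r, c] + p
    return out
```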
Porosity plays an essential role in petroleum engineering. It controls fluid storage in aquifers, while the connectivity of the pore structure controls fluid flow through reservoir formations. To quantify the relationships between porosity, storage, transport, and rock properties, however, the pore structure must be measured and quantitatively described. Porosity estimation from digital images using image processing is essential for reservoir rock analysis, since it concisely describes the sample's 2D porosity. The regular procedure uses a binarization process, in which a pixel-value threshold converts color and grayscale images to binary images. The idea is to associate the blue regions entirely with pores and transform them to white in r…
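A minimal sketch of the binarization step just described; the fixed threshold is an assumption (in practice it could come from Otsu's method or be tuned to isolate the blue, pore-filled regions):

```python
import numpy as np

def porosity_2d(gray, threshold=128):
    """Pixels below the threshold are treated as pore space; the 2D
    porosity is simply the fraction of pore pixels in the binary image."""
    pores = gray < threshold          # boolean binary image (True = pore)
    return pores.mean()
```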
Image classification is the process of finding common features in images from various classes and applying them to categorize and label the images. The abundance of images, the high complexity of the data, and the shortage of labeled data present the key obstacles in image classification. The cornerstone of image classification is evaluating the convolutional features retrieved from deep learning models and training machine learning classifiers on them. This study proposes a new approach of "hybrid learning" by combining deep learning with machine learning for image classification, based on convolutional feature extraction using the VGG-16 deep learning model and seven class…
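A minimal sketch of the hybrid-learning idea, assuming a frozen ImageNet-pretrained VGG-16 as the convolutional feature extractor and an SVM as one representative classical classifier; the arrays shown are placeholders, not the study's dataset:

```python
import numpy as np
from tensorflow.keras.applications.vgg16 import VGG16, preprocess_input
from sklearn.svm import SVC

# Frozen VGG-16 used purely as a fixed feature extractor (512-D pooled output).
extractor = VGG16(weights="imagenet", include_top=False, pooling="avg")

def deep_features(images):
    """images: float array of shape (n, 224, 224, 3) with 0-255 pixel values."""
    return extractor.predict(preprocess_input(images.copy()), verbose=0)

# Placeholder data; in practice these would be the labelled training images.
X_train = np.random.rand(8, 224, 224, 3).astype("float32") * 255
y_train = np.array([0, 1, 0, 1, 0, 1, 0, 1])

# One of several possible classical classifiers trained on the deep features.
clf = SVC(kernel="rbf").fit(deep_features(X_train), y_train)
```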
Today's medical imaging research faces the challenge of detecting brain tumors through Magnetic Resonance Images (MRI). MRI images are normally used by experts to produce images of the soft tissue of the human body and for the analysis of human organs in place of surgery. For brain tumor detection, image segmentation is required: the brain is partitioned into two distinct regions. This is considered one of the most important but difficult parts of the process of detecting brain tumors. Hence, segmentation of the MRI images must be done accurately before the computer is asked to make an exact diagnosis. Earlier, a variety of algorithms were developed for segmentation of MRI images by usin…
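As a toy illustration of partitioning a slice into two regions, a minimal sketch using Otsu thresholding purely as a stand-in for the segmentation algorithms surveyed above:

```python
from skimage.filters import threshold_otsu

def two_region_segmentation(mri_slice):
    """Split an MRI slice into two regions: pixels above the automatically
    chosen Otsu threshold form one region, the remaining pixels the other."""
    t = threshold_otsu(mri_slice)
    return mri_slice > t      # boolean mask: True = region 1, False = region 2
```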