In light of developments in computer science and modern technologies, the rate of impersonation crime has increased. Consequently, face recognition technology and biometric systems have been employed for security purposes in a variety of applications, including human-computer interaction and surveillance systems. Building a sophisticated model to tackle impersonation-related crimes is therefore essential. This study proposes Machine Learning (ML) and Deep Learning (DL) classification models that utilize the Viola-Jones, Linear Discriminant Analysis (LDA), Mutual Information (MI), and Analysis of Variance (ANOVA) techniques. The two proposed facial classification systems are a J48 classifier that takes LDA-extracted features as input, and a one-dimensional Convolutional Neural Network Hybrid Model (1D-CNNHM). The MUCT database was used for training and evaluation. The J48 model reached a classification accuracy of 96.01%, whereas the DL model that merged LDA with MI and ANOVA reached 100% accuracy. Comparison with other works shows that the proposed models perform very well, with high accuracy and low processing time.
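A minimal sketch of a Viola-Jones → LDA → decision-tree pipeline of the kind described above, using OpenCV and scikit-learn. A DecisionTreeClassifier stands in for Weka's J48 (both are C4.5-style trees), and X_train/y_train are hypothetical placeholders for flattened MUCT face crops and their labels; this is an illustration under those assumptions, not the paper's exact implementation.

```python
import cv2
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.tree import DecisionTreeClassifier
from sklearn.pipeline import make_pipeline

# Viola-Jones face detection with OpenCV's bundled Haar cascade
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def crop_face(gray_img, size=(64, 64)):
    """Return the first detected face, resized and flattened, or None."""
    faces = detector.detectMultiScale(gray_img, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    return cv2.resize(gray_img[y:y + h, x:x + w], size).flatten()

# LDA feature extraction feeding a C4.5-style decision tree (stand-in for J48)
model = make_pipeline(LinearDiscriminantAnalysis(), DecisionTreeClassifier())
# model.fit(X_train, y_train); model.score(X_test, y_test)  # hypothetical MUCT splits
```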
This paper discusses effect size measures, which are useful in many procedures for estimating the direct effect and its relation to the indirect and total effects. In addition, an algorithm is proposed to calculate the suggested effect size measure, which represents the ratio of the direct effect to the parameter estimated from the regression of the dependent variable on the mediator variable alone, without including the independent variable in the model. This algorithm shows that such a regression equation can be used in mediation analysis, where the dependent variable is usually regressed on the mediator and the independent variable together. The algorithm also shows how the effect of the …
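A minimal sketch of the standard mediation regressions and the ratio described above, under my reading of the abstract. It uses statsmodels OLS on simulated data; all variable names and coefficients are hypothetical.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)                        # independent variable
m = 0.6 * x + rng.normal(size=n)              # mediator
y = 0.4 * x + 0.5 * m + rng.normal(size=n)    # dependent variable

a = sm.OLS(m, sm.add_constant(x)).fit().params[1]                                  # X -> M
direct, b = sm.OLS(y, sm.add_constant(np.column_stack([x, m]))).fit().params[1:]   # X, M -> Y
total = sm.OLS(y, sm.add_constant(x)).fit().params[1]                              # X -> Y

indirect = a * b                                          # total ≈ direct + indirect
b_alone = sm.OLS(y, sm.add_constant(m)).fit().params[1]   # Y regressed on M alone
suggested_ratio = direct / b_alone                        # ratio described in the abstract
print(total, direct + indirect, suggested_ratio)
```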
This research studies dual (paired) data models with mixed random parameters, which contain two types of parameters, one random and the other fixed. The random parameter arises from differences in the marginal tendencies (slopes) of the cross-sections, while the fixed parameter arises from differences in the fixed terms; the random errors of each cross-section exhibit heteroscedasticity in addition to first-order serial correlation. The main objective of this research is to use efficient methods suited to paired data in the case of small samples, and to achieve this goal, the feasible generalized least squares …
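For illustration only, a minimal sketch of a feasible GLS fit that accommodates first-order serial correlation, using statsmodels' GLSAR on simulated data with heteroscedastic innovations; the estimator actually used for paired data in the research may differ, and all numbers here are synthetic.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
x = rng.normal(size=n)

# AR(1) errors whose innovations also have non-constant variance
e = np.zeros(n)
for t in range(1, n):
    e[t] = 0.7 * e[t - 1] + rng.normal(scale=1 + 0.5 * abs(x[t]))
y = 1.0 + 2.0 * x + e

X = sm.add_constant(x)
fgls = sm.GLSAR(y, X, rho=1)             # rho=1 -> one AR lag, estimated from residuals
result = fgls.iterative_fit(maxiter=5)   # alternate between rho and beta estimates
print(result.params, fgls.rho)
```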
Authors: Mariam Liwa Abdel Fattah, Liqaa Abdullah Ali. Published in: Revista iberoamericana de psicología del ejercicio y el deporte, No. 4, 2023. Journal article indexed in Dialnet.
Abstract
The Rayleigh distribution is one of the important distributions used for analyzing lifetime data and has applications in reliability studies and physical interpretation. This paper introduces four different methods for estimating the scale parameter and the reliability function of the generalized Rayleigh distribution: Maximum Likelihood, Bayes, Modified Bayes, and the Minimax estimator under a squared error loss function. The comparison is carried out through a simulation procedure …
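As a concrete reference point, a minimal sketch of the maximum-likelihood step for the ordinary (one-parameter) Rayleigh scale and the corresponding reliability function; the paper works with the generalized Rayleigh distribution, so this illustrates only the ML idea, not the paper's estimators.

```python
import numpy as np

rng = np.random.default_rng(2)
sigma_true = 2.0
data = rng.rayleigh(scale=sigma_true, size=1000)   # simulated lifetimes

# MLE of the Rayleigh scale: sigma_hat^2 = sum(x_i^2) / (2 n)
sigma_hat = np.sqrt(np.sum(data ** 2) / (2 * len(data)))

def reliability(t, sigma):
    """R(t) = P(T > t) = exp(-t^2 / (2 sigma^2)) for the Rayleigh distribution."""
    return np.exp(-t ** 2 / (2 * sigma ** 2))

print(sigma_hat, reliability(3.0, sigma_hat))
```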
Classifying overlapping objects is one of the main challenges faced by researchers working in object detection and recognition. Most of the available algorithms are only able to classify or recognize objects that are either individually separated from each other or appear as a single object in a scene, but not overlapping kitchen utensil objects. In this project, the Faster R-CNN and YOLOv5 algorithms were proposed to detect and classify overlapping objects in a kitchen area. Both models were applied to overlapping objects, where the filters (kernels) in the dedicated layers of the applied models are expected to be able to separate the overlapping objects. A kitchen utensil benchmark image database and …
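A minimal inference sketch with torchvision's pretrained Faster R-CNN (COCO weights, recent torchvision assumed); the project fine-tunes on kitchen-utensil images, which is not reproduced here, and the image path is hypothetical.

```python
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

img = to_tensor(Image.open("kitchen_scene.jpg").convert("RGB"))  # hypothetical image
with torch.no_grad():
    pred = model([img])[0]

keep = pred["scores"] > 0.5   # keep confident detections, overlapping boxes included
print(pred["boxes"][keep], pred["labels"][keep])
```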
Abstract
The research aimed to prepare an audit program focusing on the activities of municipal institutions related to the environmental dimension, as one of the dimensions of sustainable development, and to apply the program in order to prepare an oversight report assessing the impact of the activities of municipal institutions on the environmental reality, since these institutions are the main channel through which the environment-related requirements of sustainable development are achieved. The proposed program was prepared and applied to the institutions affiliated with the Directorate of Mu…
Region-based association analysis has been proposed to capture the collective behavior of sets of variants by testing the association of each set, rather than of individual variants, with the disease. Such an analysis typically involves a list of unphased multiple-locus genotypes with potentially sparse frequencies in cases and controls. To tackle the problem of the sparse distribution, a two-stage approach was proposed in the literature: in the first stage, haplotypes are computationally inferred from genotypes, followed by a haplotype co-classification; in the second stage, the association analysis is performed on the inferred haplotype groups. If a haplotype is unevenly distributed between the case and control samples, this haplotype is labeled …
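A minimal sketch of a second-stage association test on inferred haplotype groups, using a chi-square test of independence on a hypothetical case/control contingency table; the haplotype inference and co-classification steps described above are assumed to have been done already.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical counts of three inferred haplotype groups (columns) in cases vs. controls (rows)
table = np.array([
    [120, 45, 35],   # cases
    [ 90, 70, 40],   # controls
])

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p_value:.4g}")
```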
This research deals with a very important subject, as it attempts to revise the theoretical and scientific heritage and some of the professional rules adopted in the newsroom. Most media students have difficulty writing press news correctly. The researcher tries to determine how far the material published by local news agencies complies with professional and academic standards.
The research arrives at detailed editorial rules for a number of news formats, which will play an important role in making news writing easier, especially for beginners and newcomers. It also reveals a new finding that refutes the belief of some researchers and writers that news edited according to the inverted pyramid pattern contains no conclusion.
The re…
The traditional centralized network management approach presents severe efficiency and scalability limitations in large-scale networks. The process of data collection and analysis typically involves huge transfers of management data to the manager, which consumes considerable network throughput and creates bottlenecks on the manager side. These problems are addressed using agent technology, which distributes the management functionality over the network elements. The proposed system consists of a server agent that works together with client agents to monitor the logging on and off of the client computers and which user is working on each of them. A file system watcher mechanism is used to detect any change in files. The results were presented …
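A minimal sketch of the file-system-watcher idea on the client side, using the Python watchdog package as a stand-in for whatever watcher mechanism the system actually uses; the monitored path is hypothetical, and reporting to the server agent is only indicated by a comment.

```python
import time
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

class ChangeReporter(FileSystemEventHandler):
    def on_any_event(self, event):
        # In the proposed system, this notification would be forwarded to the server agent.
        print(f"{event.event_type}: {event.src_path}")

observer = Observer()
observer.schedule(ChangeReporter(), path="/tmp/monitored", recursive=True)  # hypothetical path
observer.start()
try:
    time.sleep(10)   # keep watching for a short demo period
finally:
    observer.stop()
    observer.join()
```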