A hand gesture recognition system provides a robust and innovative solution for nonverbal communication through human–computer interaction. Deep learning models have excellent potential for use in recognition applications. To overcome related issues, most previous studies have proposed new model architectures or fine-tuned pre-trained models. Furthermore, these studies relied on a single standard dataset for both training and testing, so their reported accuracies appear reasonable but may not reflect performance under other acquisition conditions. Unlike these works, the current study investigates two deep learning models with intermediate layers for recognizing static hand gesture images. Both models were tested on different datasets, adjusted to suit each dataset, and then trained with different methods. First, the models were initialized with random weights and trained from scratch. Next, the pre-trained models were examined as feature extractors. Finally, the pre-trained models were fine-tuned with intermediate layers, at three levels: the fifth, fourth, and third blocks, respectively. The models were evaluated through recognition experiments on Arabic sign language hand gesture images acquired under different conditions. This study also provides a new hand gesture image dataset used in these experiments, alongside two existing datasets. The experimental results indicated that the proposed models can be used with intermediate layers to recognize hand gesture images, and the analysis showed that fine-tuning the fifth and fourth blocks of the two models achieved the best accuracy. In particular, for the first model the testing accuracies on the three datasets were 96.51%, 72.65%, and 55.62% when fine-tuning the fourth block, and 96.50%, 67.03%, and 61.09% when fine-tuning the fifth block. The second model showed approximately similar testing accuracies.
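The block-level fine-tuning strategy described above amounts to a freeze/unfreeze policy over the backbone's blocks. The following is a minimal, framework-free sketch assuming a VGG-style backbone with five convolutional blocks; the block names and helper function are illustrative, not taken from the paper.

```python
# Minimal sketch of block-level fine-tuning: freeze every block before a
# chosen block, train that block and all later ones. A VGG-style backbone
# with five blocks is assumed; names are hypothetical.

def set_trainable_blocks(blocks, first_trainable):
    """Return a map from block name to whether its weights are updated.

    Blocks numbered below `first_trainable` stay frozen (pre-trained
    weights kept); the rest receive gradient updates during training.
    """
    return {name: i + 1 >= first_trainable for i, name in enumerate(blocks)}

backbone = ["block1", "block2", "block3", "block4", "block5"]

# Feature extraction: every block frozen, only a new classifier head trains.
feature_extractor = set_trainable_blocks(backbone, first_trainable=6)

# The three fine-tuning levels from the study: from the 5th, 4th, 3rd block.
finetune_block5 = set_trainable_blocks(backbone, first_trainable=5)
finetune_block4 = set_trainable_blocks(backbone, first_trainable=4)
finetune_block3 = set_trainable_blocks(backbone, first_trainable=3)

print(finetune_block4)  # block4 and block5 trainable, earlier blocks frozen
```

Freezing earlier blocks preserves the generic low-level features of the pre-trained network, while the unfrozen later blocks adapt to the new gesture dataset.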
Entropy, defined as a measure of uncertainty, has been transformed using the cumulative distribution function and the reliability function of the Burr Type-XII distribution. For data that suffer from volatility, a probability distribution model is built for every failure in a sample, subject to the constraints of a probability function. A formula for the probability distribution of the new entropy transform applied to the continuous Burr Type-XII distribution has been derived, and the new function was tested and found to satisfy the conditions of a probability function. The mean and cumulative distribution function were also derived so that they could be used to generate data for the purpose of running the simulation.
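The standard two-parameter Burr Type-XII functions mentioned above can be checked numerically. This sketch uses the textbook forms, CDF F(x) = 1 - (1 + x^c)^(-k), reliability R(x) = (1 + x^c)^(-k), and density f(x) = c k x^(c-1) (1 + x^c)^(-(k+1)); the parameter values and the entropy computation are illustrative, not the paper's derivation.

```python
import math

# Two-parameter Burr Type-XII distribution: density, CDF, and reliability
# function, plus a midpoint-rule check that the density integrates to one
# (a condition of a probability function) and a numeric differential entropy.

def burr_pdf(x, c, k):
    return c * k * x ** (c - 1) * (1 + x ** c) ** (-(k + 1))

def burr_cdf(x, c, k):
    return 1.0 - (1 + x ** c) ** (-k)

def burr_reliability(x, c, k):
    return (1 + x ** c) ** (-k)

def entropy_numeric(c, k, upper=200.0, n=200_000):
    """Differential entropy -∫ f ln f dx by the midpoint rule on [0, upper]."""
    h = upper / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        f = burr_pdf(x, c, k)
        if f > 0.0:
            total -= f * math.log(f) * h
    return total

c, k = 2.0, 3.0
# The density should integrate to (approximately) one.
area = sum(burr_pdf((i + 0.5) * 0.001, c, k) * 0.001 for i in range(200_000))
print(round(area, 3))                                     # ≈ 1.0
print(burr_cdf(1.0, c, k) + burr_reliability(1.0, c, k))  # 1.0 by definition
print(entropy_numeric(c, k))
```

The identity F(x) + R(x) = 1 holds by construction, which is why the reliability function can stand in for the CDF in the entropy transform.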
To damp the low-frequency oscillations that occur due to disturbances in the electrical power system, generators are equipped with a Power System Stabilizer (PSS) that provides supplementary feedback stabilizing signals. Low-frequency oscillations in a power system are classified as local-mode, intra-area-mode, and inter-area-mode oscillations. Dual-input multiband power system stabilizers (PSSs) were used to damp out these low-frequency oscillations. Among dual-input PSSs, the PSS4B offers superior transient performance. The Power System Simulator for Engineering (PSS/E) software was adopted to test and evaluate the dynamic performance of the PSS4B model on the Iraqi national grid. The results showed
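The damping effect of a supplementary speed-feedback signal can be illustrated with a toy linearized swing equation. This is a schematic sketch with assumed parameter values, not the PSS4B structure or the PSS/E model: the stabilizer is reduced to a single gain on the speed deviation.

```python
# Illustrative one-machine linearized swing equation: a speed-feedback
# stabilizing signal adds damping to a low-frequency electromechanical
# oscillation. Inertia M, damping D, and synchronizing torque K are assumed
# for illustration only.

def simulate(damping_gain, steps=20000, dt=0.001):
    """Return the peak |rotor angle| over the second half of a 20 s run."""
    M, D, K = 5.0, 0.5, 10.0      # inertia, natural damping, sync. torque
    delta, omega = 0.1, 0.0       # initial rotor-angle disturbance (rad)
    peak = 0.0
    for i in range(steps):
        # Stabilizer contributes extra damping proportional to speed deviation.
        accel = (-K * delta - (D + damping_gain) * omega) / M
        omega += accel * dt       # semi-implicit Euler step
        delta += omega * dt
        if i > steps // 2:        # residual swing amplitude after 10 s
            peak = max(peak, abs(delta))
    return peak

late_peak_no_pss = simulate(damping_gain=0.0)
late_peak_pss = simulate(damping_gain=4.0)
print(late_peak_no_pss > late_peak_pss)  # True: the feedback damps the swing
```

With these assumed values the undamped mode swings at roughly 0.2 Hz, and the extra feedback gain makes the residual amplitude after ten seconds orders of magnitude smaller, which is the qualitative role a PSS plays.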
The main challenge is to protect the environment from future deterioration due to pollution and the depletion of natural resources. One of the most important issues to address, and whose negative impact must be eliminated, is solid waste. Solid waste is a double-edged sword depending on how it is handled: neglecting it poses a serious environmental risk through water, air, and soil pollution, while dealing with it properly makes it an important resource for preserving the environment. Accordingly, the proper management of solid waste and its reuse or recycling is the most important factor. Attention has therefore turned to using solid waste in different ways, the most common being its use as an alternative
Data-Driven Requirements Engineering (DDRE) represents a vision for a shift from static, traditional methods of doing requirements engineering to dynamic, data-driven, user-centered methods. Given the data now available and the increasingly complex requirements of software systems, whose functions must adapt to changing needs to gain the trust of their users, such an approach is needed within a continuous software engineering process. This need drives the emergence of new challenges in the discipline of requirements engineering to meet the required changes. The problem in this study was the method's data discrepancies, which hampered the needs elicitation process, so that in the end the developed software contained discrepancies and could not meet the need
... Show MoreE-Learning packages are content and instructional methods delivered on a computer
(whether on the Internet, or an intranet), and designed to build knowledge and skills related to
individual or organizational goals. This definition addresses: The what: Training delivered
in digital form. The how: By content and instructional methods, to help learn the content.
The why: Improve organizational performance by building job-relevant knowledge and
skills in workers.
This paper designs and implements a learning package for the Prolog programming
language. This is done using the Visual Basic.NET 2010 programming language in
conjunction with Microsoft Office Access 2007. The package also introduces several
fac
This study presents a control system for regulating the electron-lens resistance in order to obtain stabilized electron-lens power. It outlines the fundamental challenges, theoretical design arrangements, and development conditions for the Integrable Optics Test Accelerator (IOTA) in progress at Fermilab. An effective automatic gain control (AGC) unit is introduced that prevents fluctuations in the internal resistance of the electron lens, caused by environmental influences, from affecting the system's current and power values, keeping them stable. Using this unit yields a balanced, stable system that is unaffected by environmental variations around the electron lens.
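The role of the AGC unit can be sketched as a simple feedback loop: measure the delivered power, compare it with a setpoint, and nudge the drive gain while the load resistance drifts. This is a hedged toy model with assumed values, not the IOTA electron-lens hardware or its actual controller.

```python
# Minimal AGC sketch: a resistive load whose resistance drifts with the
# environment; the loop adjusts the drive gain so delivered power stays
# near a setpoint. All numbers are illustrative.

def agc_step(gain, resistance, p_target, k_p=0.05):
    """One AGC iteration: measure power, nudge gain toward the setpoint."""
    current = gain / resistance          # drive current for this gain
    power = current ** 2 * resistance    # delivered power = I^2 * R
    error = p_target - power
    return gain + k_p * error, power     # proportional correction

gain, p_target = 1.0, 4.0
history = []
for step in range(200):
    # Resistance drifts slowly, e.g. with ambient temperature.
    resistance = 1.0 + 0.2 * (step / 200)
    gain, power = agc_step(gain, resistance, p_target)
    history.append(power)

print(round(history[-1], 2))  # settles near the 4.0 W setpoint
```

Even though the resistance rises by 20% over the run, the proportional loop keeps the measured power close to the target, which is the stabilizing behavior the abstract attributes to the AGC unit.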
Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.
In this research, the focus was on grouping and analyzing these data: cluster analysis plays an important role in identifying and grouping co-expressed profiles over time, which are then fitted with the nonparametric smoothing cubic B-spline model. This model provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope; it is also more flexible and can capture more complex patterns and fluctuations in the data.
The balanced longitudinal data profiles were compiled into subgroup
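The smoothness property cited above, continuous first and second derivatives, can be verified directly on the uniform cubic B-spline basis function. The knot placement (0..4) and the finite-difference checks below are illustrative, not the study's model.

```python
# Uniform cubic B-spline basis with a numeric check that its value, first
# derivative, and second derivative are continuous across the interior
# knots -- the C2 smoothness the cubic B-spline model relies on.

def bspline3(t):
    """Uniform cubic B-spline basis on knots 0, 1, 2, 3, 4 (zero outside)."""
    if 0.0 <= t < 1.0:
        u = t
        return u ** 3 / 6.0
    if 1.0 <= t < 2.0:
        u = t - 1.0
        return (-3 * u ** 3 + 3 * u ** 2 + 3 * u + 1) / 6.0
    if 2.0 <= t < 3.0:
        u = t - 2.0
        return (3 * u ** 3 - 6 * u ** 2 + 4) / 6.0
    if 3.0 <= t <= 4.0:
        u = t - 3.0
        return (1 - u) ** 3 / 6.0
    return 0.0

def d1_one_sided(f, x, h, s):
    """Second-order one-sided first derivative; s=+1 right, s=-1 left."""
    return s * (-3 * f(x) + 4 * f(x + s * h) - f(x + 2 * s * h)) / (2 * h)

def d2_one_sided(f, x, h, s):
    """Second-order one-sided second derivative; s=+1 right, s=-1 left."""
    return (2 * f(x) - 5 * f(x + s * h) + 4 * f(x + 2 * s * h)
            - f(x + 3 * s * h)) / h ** 2

h = 1e-4
smooth = True
for knot in (1.0, 2.0, 3.0):
    smooth &= abs(bspline3(knot - 1e-9) - bspline3(knot + 1e-9)) < 1e-6
    smooth &= abs(d1_one_sided(bspline3, knot, h, -1)
                  - d1_one_sided(bspline3, knot, h, +1)) < 1e-6
    smooth &= abs(d2_one_sided(bspline3, knot, h, -1)
                  - d2_one_sided(bspline3, knot, h, +1)) < 1e-2
print(smooth)  # True: C2-continuous at every interior knot
```

Because each cubic piece matches its neighbor in value, slope, and curvature at the knots, curves built from these basis functions avoid abrupt changes in slope, which is exactly the behavior described for the smoothing model.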