Many image-processing systems are in daily use and under continuous development. These systems rely on a few basic operations: detecting regions of interest, describing their properties, and matching those regions. These operations play a significant role in the decision making required by subsequent steps of the assigned task. Various algorithms have been introduced over the years to accomplish these tasks, and one of the most popular is the Scale Invariant Feature Transform (SIFT). SIFT performs well in detection and property description because it operates on a large number of key-points; its main drawback is that it is rather time consuming. In the suggested approach, the system deploys SIFT for its basic tasks of description and matching while minimizing the number of key-point comparisons by applying the Fast Approximate Nearest Neighbor algorithm, which reduces redundant matching and thereby speeds up the process. The proposed application was evaluated in terms of two criteria, time and accuracy: it achieved accuracy of up to 100% while speeding up the matching and description processes.
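The matching step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the descriptors are synthetic stand-ins for SIFT's 128-dimensional vectors, and `scipy.spatial.cKDTree` with a nonzero `eps` stands in for a FLANN-style approximate nearest-neighbor index. Lowe's ratio test is then used to discard ambiguous matches.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)
# stand-in 128-D SIFT-like descriptors for two images (synthetic data)
desc1 = rng.normal(size=(50, 128))
desc2 = np.vstack([desc1[:30] + 0.05 * rng.normal(size=(30, 128)),  # true matches
                   rng.normal(size=(40, 128))])                     # distractors

tree = cKDTree(desc2)
# approximate 2-NN query; eps > 0 permits approximate search, as FLANN does
dist, idx = tree.query(desc1, k=2, eps=0.1)

# Lowe's ratio test: keep a match only when the best neighbour is much
# closer than the second best, which prunes ambiguous correspondences
ratio = 0.7
matches = [(i, int(idx[i, 0]))
           for i in range(len(desc1))
           if dist[i, 0] < ratio * dist[i, 1]]
print(len(matches), "confident matches")
```

Only the 30 genuinely corresponding descriptors survive the ratio test here; the distractors are rejected because their nearest and second-nearest neighbors are roughly equidistant.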
In this study, different methods were used to estimate the location and scale parameters of the extreme value distribution: maximum likelihood estimation (MLE), the method of moments (ME), and approximation estimators based on percentiles, known as White's method, since the extreme value distribution belongs to the exponential family. Ordinary least squares (OLS), weighted least squares (WLS), ridge regression (Rig), and adjusted ridge regression (ARig) estimation were also used. Two parameters for the expected value to the percentile as an estimation for the distribution f
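Two of the estimators named above can be sketched in a few lines. This is an illustrative sketch on synthetic data, not the study's own computation: it fits the Gumbel (type-I extreme value) distribution by MLE via `scipy.stats.gumbel_r.fit`, and by the method of moments using the known relations mean = loc + γ·scale and variance = (π²/6)·scale², where γ is the Euler–Mascheroni constant.

```python
import numpy as np
from scipy import stats

# synthetic Gumbel (type-I extreme value) sample; true loc = 10, scale = 2
x = stats.gumbel_r.rvs(loc=10, scale=2, size=2000, random_state=42)

# maximum-likelihood estimation (MLE)
loc_mle, scale_mle = stats.gumbel_r.fit(x)

# method of moments: mean = loc + gamma*scale, var = (pi^2 / 6) * scale^2
gamma = 0.5772156649  # Euler–Mascheroni constant
scale_mm = np.sqrt(6.0) * x.std(ddof=1) / np.pi
loc_mm = x.mean() - gamma * scale_mm

print(f"MLE:     loc={loc_mle:.3f} scale={scale_mle:.3f}")
print(f"Moments: loc={loc_mm:.3f} scale={scale_mm:.3f}")
```

Both estimators recover the true parameters closely at this sample size; the study's comparison concerns how they differ at small samples and under the regression-based variants.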
Heart disease is a significant and impactful health condition and the leading cause of death in many countries. Clinical datasets are available to aid physicians in diagnosing cardiovascular disease. However, with the rise of big data and large medical datasets, accurately predicting heart disease has become increasingly challenging for medical practitioners, because an abundance of unrelated and redundant features inflates computational complexity and hurts accuracy. This study therefore aims to identify the most discriminative features within high-dimensional datasets, minimizing complexity and improving accuracy through an Extra Tree based feature-selection technique. The study assesses the efficac
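The feature-selection idea can be sketched as follows. This is a minimal illustration on a synthetic stand-in for a clinical dataset, not the study's pipeline: an Extra Trees ensemble is fit, its impurity-based feature importances are read off, and only features whose importance exceeds the mean importance are retained.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier

# synthetic stand-in for a clinical dataset: only 5 of 30 features are
# informative, 5 more are redundant copies, the rest are noise
X, y = make_classification(n_samples=500, n_features=30, n_informative=5,
                           n_redundant=5, random_state=0)

et = ExtraTreesClassifier(n_estimators=200, random_state=0).fit(X, y)
importances = et.feature_importances_

# keep features whose importance exceeds the mean importance
selected = np.where(importances > importances.mean())[0]
print(f"selected {len(selected)} of {X.shape[1]} features")
```

A downstream classifier is then trained only on the `selected` columns, which is where the complexity and accuracy gains the abstract mentions come from.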
Time is a significant element in building any film story; although it cannot express itself, it is expressed through the other elements of the cinematic medium's language. The time factor is present and manifest in every detail of the picture, and its presence is even more important in the narration of events. Narration depends entirely on the temporal structure in which it appears, which makes time a dominant element in the development of narrative shapes and patterns. Narrative propositions have taken on new workings, so that time streams appeared that manipulate the temporal structure: reversing it, stopping it, making it fluctuate between the three levels of time, or repeating it, or make
The Internet of Things (IoT) has significantly transformed modern systems through extensive connectivity, but has concurrently introduced considerable cybersecurity risks. Traditional rule-based methods are becoming increasingly insufficient in the face of evolving cyber threats. This study proposes an enhanced methodology utilizing a hybrid machine-learning framework for IoT cyber-attack detection. The framework integrates a Grey Wolf Optimizer (GWO) for optimal feature selection, a customized synthetic minority oversampling technique (SMOTE) for data balancing, and a systematic approach to hyperparameter tuning of ensemble algorithms: Random Forest (RF), XGBoost, and CatBoost. Evaluations on the RT-IoT2022 dataset demonstrat
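The data-balancing step can be illustrated with a minimal SMOTE sketch in plain NumPy. This is not the study's customized variant: it is the core SMOTE idea only, in which each synthetic minority sample is produced by interpolating a randomly chosen minority sample toward one of its k nearest minority-class neighbours.

```python
import numpy as np

def smote(X_min, n_new, k=5, rng=None):
    """Minimal SMOTE sketch: synthesize n_new minority samples by
    interpolating each picked sample toward one of its k nearest
    minority-class neighbours."""
    rng = np.random.default_rng(0) if rng is None else rng
    # pairwise distances within the minority class
    d = np.linalg.norm(X_min[:, None] - X_min[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    nn = np.argsort(d, axis=1)[:, :k]          # k nearest neighbours
    out = np.empty((n_new, X_min.shape[1]))
    for j in range(n_new):
        i = rng.integers(len(X_min))           # random minority sample
        nb = nn[i, rng.integers(k)]            # random neighbour of it
        gap = rng.random()                     # interpolation factor in [0, 1)
        out[j] = X_min[i] + gap * (X_min[nb] - X_min[i])
    return out

rng = np.random.default_rng(0)
X_min = rng.normal(loc=3.0, size=(20, 4))      # stand-in for a rare attack class
X_syn = smote(X_min, n_new=80, rng=rng)
print(X_syn.shape)
```

Because each synthetic point is a convex combination of two real minority samples, the new points stay inside the minority region rather than being duplicated verbatim, which is what distinguishes SMOTE from simple oversampling.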
In this paper, a mathematical model for the oxidative desulfurization of kerosene was developed. Mathematical modelling and simulation are important because they provide a better understanding of the real process. The model was based on experimental results taken from the literature, from which the optimal kinetic parameters were calculated; simulation and optimization were conducted using gPROMS software. The optimal kinetic parameters were an activation energy of 18.63958 kJ/mol, a pre-exponential factor of 2201.34 (wt)^-0.76636 min^-1, and a reaction order of 1.76636. These optimal kinetic parameters were used to find the optimal reaction conditions, which
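The reported kinetic parameters imply an Arrhenius-type rate law of the form -dC/dt = k(T)·C^n with k(T) = A·exp(-Ea/RT). The sketch below evaluates that rate law with the abstract's parameter values; the function name and the assumption of a simple power-law rate expression are illustrative, not taken from the paper's gPROMS model.

```python
import numpy as np

R = 8.314          # universal gas constant, J/(mol K)
Ea = 18639.58      # activation energy, J/mol (18.63958 kJ/mol from the abstract)
A = 2201.34        # pre-exponential factor, (wt)^-0.76636 min^-1
n = 1.76636        # reaction order

def rate(C, T):
    """Sulfur removal rate -dC/dt = k(T) * C**n (illustrative sketch)."""
    k = A * np.exp(-Ea / (R * T))   # Arrhenius temperature dependence
    return k * C ** n

# the rate increases with temperature at a fixed sulfur content
print(rate(1.0, 323.15), rate(1.0, 363.15))
```

Integrating this expression over residence time, as an optimizer such as gPROMS does internally, is what yields the optimal reaction conditions the abstract refers to.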
The software-defined network (SDN) is a new technology that separates the control plane from the data plane in network devices. One of the most significant issues in a video surveillance system is link failure: when a path fails, the monitoring center cannot receive video from the cameras. In this paper, two methods are proposed to solve this problem. The first method uses the Dijkstra algorithm to recompute the path at the source node switch. The second method uses the Dijkstra algorithm to recompute the path at the ingress node switch of the failed link.
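The recomputation step in both methods is a standard shortest-path search. A minimal sketch of Dijkstra's algorithm over a hypothetical switch topology is shown below; the switch names and link costs are invented for illustration, and the failed link is simply omitted from the graph before the search, as the controller would do after detecting the failure.

```python
import heapq

def dijkstra(graph, src, dst):
    """Shortest path by Dijkstra's algorithm; graph maps node -> {nbr: cost}."""
    dist = {src: 0}
    prev = {}
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue                      # stale queue entry
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    path, u = [dst], dst                  # walk predecessors back to src
    while u != src:
        u = prev[u]
        path.append(u)
    return path[::-1], dist[dst]

# hypothetical switch topology; the failed link s2-s4 has been removed
topo = {"s1": {"s2": 1, "s3": 4},
        "s2": {"s1": 1, "s3": 2},
        "s3": {"s1": 4, "s2": 2, "s4": 1},
        "s4": {"s3": 1}}
print(dijkstra(topo, "s1", "s4"))   # recomputed camera-to-monitor path
```

The controller then installs flow rules along the returned path so the video stream resumes over the surviving links.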
A three-stage learning algorithm for a deep multilayer perceptron (DMLP), with effective weight initialisation based on a sparse auto-encoder, is proposed in this paper. It aims to overcome the difficulty of training deep neural networks with limited training data in a high-dimensional feature space. At the first stage, unsupervised learning with a sparse auto-encoder is used to obtain the initial weights of the feature-extraction layers of the DMLP. At the second stage, error back-propagation trains the DMLP while the first-stage weights of the feature-extraction layers are kept fixed. At the third stage, all the weights of the DMLP obtained at the second stage are refined by error back-propagation. Network structures an
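The three stages can be sketched end-to-end in NumPy on a tiny network. This is a simplified illustration under invented settings (one hidden layer, sigmoid units, an L1 activation penalty as the sparsity term, synthetic data), not the paper's architecture: stage 1 pre-trains the hidden layer as a sparse auto-encoder, stage 2 trains only the output layer with the encoder frozen, and stage 3 fine-tunes all weights by back-propagation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# toy data: 200 samples, 20 features, labels from a linear rule (synthetic)
X = rng.normal(size=(200, 20))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

h, lr, lam = 8, 0.1, 1e-3   # hidden units, learning rate, L1 sparsity weight

# ---- Stage 1: unsupervised sparse auto-encoder pre-training ----
W1 = rng.normal(scale=0.1, size=(20, h)); b1 = np.zeros(h)
W2 = rng.normal(scale=0.1, size=(h, 20)); b2 = np.zeros(20)
for _ in range(300):
    H = sigmoid(X @ W1 + b1)
    err = (H @ W2 + b2) - X                  # reconstruction error
    dH = err @ W2.T + lam * np.sign(H)       # includes L1 sparsity gradient
    dZ1 = dH * H * (1 - H)
    W2 -= lr * H.T @ err / len(X); b2 -= lr * err.mean(0)
    W1 -= lr * X.T @ dZ1 / len(X); b1 -= lr * dZ1.mean(0)

# ---- Stage 2: train only the output layer, encoder weights frozen ----
Wo = rng.normal(scale=0.1, size=(h, 1)); bo = np.zeros(1)
H = sigmoid(X @ W1 + b1)                     # fixed pre-trained features
for _ in range(300):
    g = sigmoid(H @ Wo + bo) - y             # cross-entropy gradient
    Wo -= lr * H.T @ g / len(X); bo -= lr * g.mean(0)

# ---- Stage 3: fine-tune all weights with back-propagation ----
for _ in range(300):
    H = sigmoid(X @ W1 + b1)
    p = sigmoid(H @ Wo + bo)
    g = p - y
    dZ1 = (g @ Wo.T) * H * (1 - H)
    Wo -= lr * H.T @ g / len(X); bo -= lr * g.mean(0)
    W1 -= lr * X.T @ dZ1 / len(X); b1 -= lr * dZ1.mean(0)

acc = ((p > 0.5) == y).mean()
print(f"training accuracy after fine-tuning: {acc:.2f}")
```

The staging matters when labeled data is scarce: stage 1 needs no labels, stage 2 fits few parameters, and stage 3 only refines an already sensible initialisation instead of training from random weights.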
Honeywords are fake passwords that accompany the real password, which is called the "sugarword." The honeyword system is an effective password-cracking detection system designed to detect password cracking easily, in order to improve the security of hashed passwords. For every user, the password file of the honeyword system holds one real hashed password accompanied by numerous fake hashed passwords. If an intruder steals the password file, cracks the passwords, and attempts to log in to users' accounts, the honeyword system will detect the attempt through the honeychecker. A honeychecker is an auxiliary server that distinguishes the real password from the fake passwords and t
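The mechanism can be sketched in a few lines. This is an illustrative toy, not the paper's scheme: honeywords are generated here by the classic tweak-the-trailing-digits strategy, the hash is a plain SHA-256 stand-in for a proper salted slow hash, and the honeychecker is reduced to a stored index and a comparison.

```python
import hashlib
import secrets

def h(pw: str) -> str:
    # stand-in hash; a real system would use a salted, slow hash (e.g. bcrypt)
    return hashlib.sha256(pw.encode()).hexdigest()

def make_sweetwords(real_pw: str, k: int = 19):
    """Generate k honeywords by tweaking the trailing digits of the real
    password (one classic strategy), then shuffle all k+1 'sweetwords'."""
    words = {real_pw}
    stem = real_pw.rstrip("0123456789")
    while len(words) < k + 1:
        words.add(stem + str(secrets.randbelow(10000)))
    sweet = list(words)
    secrets.SystemRandom().shuffle(sweet)
    return sweet

real = "winter2024"                            # hypothetical user password
sweet = make_sweetwords(real)
stored_hashes = [h(w) for w in sweet]          # kept in the password file
real_index = sweet.index(real)                 # kept only on the honeychecker

def login(attempt: str) -> str:
    """Honeychecker logic: a hash match at the wrong index means the
    password file was stolen and a honeyword was cracked."""
    hv = h(attempt)
    if hv not in stored_hashes:
        return "reject"
    if stored_hashes.index(hv) == real_index:
        return "accept"
    return "ALARM: honeyword used"

print(login(real))
print(login(next(w for w in sweet if w != real)))
```

Because only the honeychecker knows `real_index`, an attacker who cracks the stolen file cannot tell the sugarword from the honeywords, and any wrong guess raises the alarm.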