An image retrieval system is a computer system for browsing, searching, and retrieving images from a large database of digital images. The objective of Content-Based Image Retrieval (CBIR) methods is essentially to extract, from large image databases, a specified number of images similar in visual and semantic content to a so-called query image. The researchers developed a new retrieval mechanism based on two procedures. The first procedure extracts the statistical features of both the original and the traditional image using the histogram and statistical characteristics (mean, standard deviation). The second procedure relies on the T-test to measure the independence between two or more images (correlation coefficient, T-test, significance level, decision). Through experimental tests, it was found that the proposed retrieval technique is more powerful than the classical retrieval system.
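A minimal illustrative sketch (not the authors' code) of the two procedures described above, assuming grayscale inputs and a paired T-test applied to the image histograms:

```python
# Sketch only: statistical features (histogram, mean, std) and a
# correlation / T-test comparison between two images.
import numpy as np
from scipy import stats

def statistical_features(img, bins=256):
    """Normalised histogram plus mean and standard deviation of a grayscale image."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256), density=True)
    return hist, float(img.mean()), float(img.std())

def compare_images(img_a, img_b, alpha=0.05):
    """Correlation coefficient, T statistic, p-value, and decision (assumed procedure)."""
    hist_a, _, _ = statistical_features(img_a)
    hist_b, _, _ = statistical_features(img_b)
    r, _ = stats.pearsonr(hist_a, hist_b)      # coefficient of correlation
    t, p = stats.ttest_rel(hist_a, hist_b)     # T-test on the histograms
    decision = "dependent (similar)" if p > alpha else "independent"
    return r, t, p, decision
```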
The general trend in Iraqi banks is toward applying international financial reporting standards, especially IFRS 9 "Financial Instruments", in addition to the directives issued in the Central Bank of Iraq's instructions for 2018 regarding the development of expected credit loss models. These instructions do not mandate a specific method for calculating these losses; banks' departments are authorized to adopt the calculation method that suits the nature of the bank's activity, provided it is used consistently over time. The research problem revolves around the different methodologies for calculating
Support Vector Machines (SVMs) are supervised learning models used to examine data sets in order to classify or predict dependent variables. SVM is typically used for classification by determining the best hyperplane between two classes. However, working with huge datasets can lead to a number of problems, including time-consuming and inefficient solutions. This research updates the SVM by employing a stochastic gradient descent method. The new approach, the extended stochastic gradient descent SVM (ESGD-SVM), was tested on two simulation datasets. The proposed method was compared with other classification approaches such as logistic regression, the naive model, K-Nearest Neighbors, and Random Forest. The results show that the ESGD-SVM has a
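A minimal sketch, under assumed hyperparameters, of an SVM trained with stochastic gradient descent on the hinge loss (the general idea behind ESGD-SVM, not the paper's exact algorithm):

```python
# Sketch: linear SVM with L2 regularisation trained by stochastic gradient descent.
import numpy as np

def sgd_svm(X, y, lam=0.01, lr=0.01, epochs=20, seed=0):
    """y must be labelled in {-1, +1}; returns weights w and bias b."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):
            margin = y[i] * (X[i] @ w + b)
            if margin < 1:                     # inside the margin: hinge-loss gradient
                w -= lr * (lam * w - y[i] * X[i])
                b += lr * y[i]
            else:                              # correct side: only the regulariser
                w -= lr * lam * w
    return w, b
```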
Dust is a frequent contributor to health risks and climate change, and is one of the most dangerous issues facing people today. Desertification, drought, agricultural practices, and sand and dust storms from neighboring regions bring on this issue. Deep learning (DL) regression based on long short-term memory (LSTM) is proposed to increase the accuracy of dust forecasting and monitoring. The proposed system has two parts to detect and monitor dust: in the first step, LSTM and dense layers are used to build a system for detecting dust, while in the second step, the proposed Wireless Sensor Network (WSN) and Internet of Things (IoT) model is used for forecasting and monitoring. The experimental DL system
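As an illustration of the first part (LSTM plus dense layers for dust regression), a minimal Keras sketch; the window length, layer sizes, and sensor count are assumptions, not the paper's settings:

```python
# Sketch: LSTM + dense regression over a window of past dust-sensor readings.
from tensorflow import keras

window, n_features = 24, 1            # e.g. 24 past readings from one dust sensor
model = keras.Sequential([
    keras.layers.Input(shape=(window, n_features)),
    keras.layers.LSTM(64),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1),            # next dust-concentration value
])
model.compile(optimizer="adam", loss="mse")
# model.fit(X_train, y_train, epochs=50, batch_size=32)  # training data not shown
```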
Coumarins have been recognized as anticancer candidates. HDAC inhibitors (HDACis) are one of the interesting topics in the field of antitumor research. In order to achieve increased anticancer efficacy, a series of hybrid compounds bearing coumarin scaffolds have been designed and synthesized as novel HDACis. In this review, we present a series of novel HDAC inhibitors comprising coumarin as the core of the cap group, which have been designed, synthesized, and assessed for their enzyme inhibitory activity as well as antiproliferative activity. Most of them exhibited potent HDAC inhibitory activity and significant cytotoxicity.
Localization is an essential issue in pervasive computing applications. FM performs worse in some indoor environments whose structure resembles, to some extent, an outdoor environment, such as shopping malls. Furthermore, compared to Wi-Fi, FM signals vary less over time, consume less power, and are less affected by the presence of humans and small objects. Consequently, this paper focuses on the FM radio signal technique and the characteristics that make it suitable for indoor localization, its benefits, areas of application, and limitations.
Palm vein recognition is one of the biometric systems used for identification and verification, since each person's veins have unique characteristics. In this paper, an improvement to the palm vein recognition system has been made. The system is based on centerline extraction of the veins and employs the Difference-of-Gaussian (DoG) function to construct the feature vector. Test results on our database showed an identification rate of 100% with a minimum error rate of 0.333.
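A hypothetical sketch of building a Difference-of-Gaussian feature vector from a centerline-extracted vein image; the sigma values and the histogram summary are assumptions:

```python
# Sketch: DoG response of a vein image summarised as a normalised histogram.
import numpy as np
from scipy.ndimage import gaussian_filter

def dog_feature_vector(vein_img, sigma_low=1.0, sigma_high=2.0, bins=64):
    """Difference of two Gaussian-blurred versions of the image -> feature histogram."""
    img = vein_img.astype(float)
    dog = gaussian_filter(img, sigma_low) - gaussian_filter(img, sigma_high)
    hist, _ = np.histogram(dog, bins=bins, density=True)
    return hist                        # feature vector used for matching
```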
Watermarking can be defined as a process of embedding special, wanted, and reversible information in important secure files to protect the ownership or information of the cover file, based on the proposed singular value decomposition (SVD) watermark. The proposed digital watermarking method has a very large domain for constructing the final number, which protects the watermark from collisions. The cover file is the important image that needs to be protected. A hidden watermark is a unique number extracted from the cover file by performing the proposed related and successive operations, starting by dividing the original image into four parts of unequal size. Each of these four parts is treated as a separate matrix, applying SVD
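An illustrative sketch of the SVD step on four unequal image parts; how the split points are chosen and how the final watermark number is formed from the singular values are assumptions:

```python
# Sketch: split a grayscale image into four unequal blocks, apply SVD to each,
# and derive a single watermark number from the leading singular values.
import numpy as np

def svd_watermark_number(img):
    h, w = img.shape
    blocks = [img[:h // 3, :w // 3], img[:h // 3, w // 3:],
              img[h // 3:, :w // 3], img[h // 3:, w // 3:]]   # unequal sizes
    digits = []
    for blk in blocks:
        s = np.linalg.svd(blk.astype(float), compute_uv=False)
        digits.append(int(s[0]) % 1000)    # derive part of the final number
    return int("".join(f"{d:03d}" for d in digits))
```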
In the current generation of technology, a robust security system based on a biometric trait such as human gait is required; gait is a soft biometric feature for recognizing humans via their walking pattern. In this paper, a person is recognized based on their gait style, captured from video previously recorded with a digital camera. The video is processed in several phases after splitting it into successive images (called frames), which pass through a preprocessing step before the classification procedure. The preprocessing steps encompass converting each image into a gray image, casting off all undesirable components, ridding it of noise, and discovering differen
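A sketch of the described preprocessing stage (frame splitting, grayscale conversion, noise removal) using OpenCV; the choice of a median filter and its kernel size are assumptions:

```python
# Sketch: split a recorded video into frames, convert to grayscale, remove noise.
import cv2

def preprocess_frames(video_path):
    frames, cap = [], cv2.VideoCapture(video_path)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)   # gray image
        clean = cv2.medianBlur(gray, 5)                  # rid the frame of noise
        frames.append(clean)
    cap.release()
    return frames
```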
This paper proposes a new password generation technique based on mouse motion and special locations recognized by the number of clicks, to protect sensitive data for different companies. Two or three special click-point locations for users are proposed to increase password complexity. Unlike other currently available random password generators, the path and number of clicks are added by the admin, and authorized users have to be trained on them.
This method aims to increase the number of combinations for graphical password generation using mouse motion for a limited number of users. A mathematical model is developed to calculate the performance
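A hypothetical sketch of deriving a password from an admin-defined sequence of mouse click points; the hashing step and the coordinate encoding are assumptions, not the paper's scheme:

```python
# Sketch: turn an ordered sequence of click points and the click count into a password.
import hashlib

def password_from_clicks(click_points):
    """click_points: ordered list of (x, y) screen coordinates set by the admin."""
    material = f"{len(click_points)}|" + "|".join(f"{x},{y}" for x, y in click_points)
    return hashlib.sha256(material.encode()).hexdigest()[:16]

# Example: a three-click secret path
print(password_from_clicks([(120, 340), (560, 90), (300, 480)]))
```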