This paper presents a new algorithm in an important research field: semantic word similarity estimation. A new feature-based algorithm is proposed for measuring word semantic similarity for the Arabic language, a highly systematic language whose words exhibit elegant and rigorous logic. The semantic similarity score between two Arabic words is calculated as a function of their common and total taxonomical features. An Arabic knowledge source is employed to extract the taxonomical features as the set of all concepts that subsume the concepts containing the compared words. Previously developed Arabic word benchmark datasets are used for optimizing and evaluating the proposed algorithm. In this paper, the performance of the new feature-based algorithm is compared against that of seven ontology-based algorithms adapted to Arabic. The evaluation and comparison experiments show that the proposed algorithm outperforms the adapted word similarity algorithms on the Arabic word benchmark dataset. The proposed algorithm will be included in AWN-Similarity, a free open-source software package for Arabic.
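As a rough illustration of the feature-based idea, the sketch below scores two words by a Jaccard-style ratio of common to total taxonomical features. The `hypernym_map` dictionary is a hypothetical stand-in for the Arabic knowledge source's is-a hierarchy, and the exact weighting used in the paper may differ.

```python
def taxonomical_features(concept, hypernym_map):
    """Collect every concept that subsumes `concept` (its ancestors),
    including the concept itself, by walking a hypernym map."""
    features, frontier = set(), [concept]
    while frontier:
        node = frontier.pop()
        if node not in features:
            features.add(node)
            frontier.extend(hypernym_map.get(node, []))
    return features

def feature_similarity(word1_concepts, word2_concepts, hypernym_map):
    """Similarity from common vs. total taxonomical features
    (a Jaccard-style ratio; the paper's scoring function may differ)."""
    f1 = set().union(*(taxonomical_features(c, hypernym_map) for c in word1_concepts))
    f2 = set().union(*(taxonomical_features(c, hypernym_map) for c in word2_concepts))
    common, total = f1 & f2, f1 | f2
    return len(common) / len(total) if total else 0.0
```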
Stock markets move up and down over time, and some companies affect others because of their mutual dependencies. In this work, the stock market is modeled as a complete weighted graph. This paper aims to investigate the Iraqi stock market using graph theory tools. The vertices of this graph correspond to the companies listed on the Iraqi market, and the edge weights are set to the ultrametric distances derived from the minimum spanning tree.
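A minimal sketch of this construction is shown below, assuming a matrix of closing prices (one row per company) and the common Mantegna-style correlation distance; the ultrametric distance between two companies is then the largest edge weight on their unique path in the minimum spanning tree.

```python
import itertools
import networkx as nx
import numpy as np

def ultrametric_distances(price_matrix, labels):
    """Build the complete weighted graph of companies from log returns,
    take its minimum spanning tree, and read off the subdominant
    ultrametric distance (maximum edge weight on the MST path)."""
    returns = np.diff(np.log(price_matrix), axis=1)
    corr = np.corrcoef(returns)
    dist = np.sqrt(2.0 * (1.0 - corr))            # correlation-based metric distance
    G = nx.Graph()
    for i, j in itertools.combinations(range(len(labels)), 2):
        G.add_edge(labels[i], labels[j], weight=dist[i, j])
    mst = nx.minimum_spanning_tree(G)
    ultra = {}
    for u, v in itertools.combinations(labels, 2):
        path = nx.shortest_path(mst, u, v)
        ultra[(u, v)] = max(mst[a][b]["weight"] for a, b in zip(path, path[1:]))
    return mst, ultra
```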
The Honeywords approach has proven to be a significant tool for boosting password security. The suggested system utilizes the Meerkat Clan Algorithm (MCA) in conjunction with WordNet to produce honeywords, thereby enhancing the level of password security. The honeyword generation technique draws on WordNet as a data source, which improves the authenticity and diversity of the honeywords. The method encompasses a series of consecutive stages: tokenization of passwords, formation of alphabet tokens using the Meerkat Clan Algorithm (MCA), handling of digit tokens, creation of unique character tokens, and consolidation of the honeywords. The optimization of t
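The sketch below illustrates the tokenization and consolidation stages, assuming alphabet tokens are replaced with WordNet alternatives and digit tokens are perturbed; random selection here is only a stand-in for the MCA's candidate optimization, which is not reproduced.

```python
import random
import re
from nltk.corpus import wordnet as wn   # requires the NLTK 'wordnet' corpus

def tokenize_password(password):
    """Split a password into alphabet, digit, and special-character tokens."""
    return re.findall(r"[A-Za-z]+|\d+|[^A-Za-z\d]+", password)

def generate_honeywords(password, k=5, max_attempts=200):
    """Produce up to k honeywords by swapping alphabet tokens for WordNet
    alternatives and perturbing digit tokens; random choice stands in for
    the MCA's candidate selection in this sketch."""
    honeywords = set()
    for _ in range(max_attempts):
        if len(honeywords) >= k:
            break
        parts = []
        for tok in tokenize_password(password):
            if tok.isalpha():
                lemmas = {l.name().replace("_", "") for s in wn.synsets(tok.lower())
                          for l in s.lemmas()}
                lemmas.discard(tok.lower())
                parts.append(random.choice(sorted(lemmas)) if lemmas else tok)
            elif tok.isdigit():
                parts.append(str(random.randint(0, 10 ** len(tok) - 1)).zfill(len(tok)))
            else:
                parts.append(tok)        # keep special-character tokens unchanged
        candidate = "".join(parts)
        if candidate != password:
            honeywords.add(candidate)
    return sorted(honeywords)
```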
Facial emotion recognition has many real-world applications in daily life, such as human-robot interaction, e-learning, healthcare, and customer services. The task of facial emotion recognition is not easy because of the difficulty in determining an effective feature set that can accurately recognize the emotion conveyed by a facial expression. Graph mining techniques are exploited in this paper to solve the facial emotion recognition problem. After the positions of facial landmarks in the face region are determined, twelve different graphs are constructed using four facial components to serve as the source for the sub-graph mining stage, which uses the gSpan algorithm. In each group, the discriminative set of sub-graphs is selected and fed to a Deep Belief Network (DBN) f
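For intuition, the sketch below builds one graph per facial component from detected landmarks; the component grouping (dlib-style indices) and the distance-threshold edge rule are assumptions, and the gSpan mining and DBN classification stages are not shown.

```python
import itertools
import networkx as nx
import numpy as np

# Hypothetical component grouping of 68 dlib-style landmark indices.
COMPONENTS = {
    "eyebrows": range(17, 27),
    "eyes": range(36, 48),
    "nose": range(27, 36),
    "mouth": range(48, 68),
}

def build_component_graph(landmarks, indices, threshold=0.15):
    """One graph per component: nodes are landmarks, edges connect pairs
    closer than `threshold` (relative to the overall face size)."""
    pts = np.asarray(landmarks, dtype=float)
    scale = np.linalg.norm(pts.max(axis=0) - pts.min(axis=0))
    g = nx.Graph()
    for i in indices:
        g.add_node(i, pos=tuple(pts[i]))
    for i, j in itertools.combinations(indices, 2):
        d = np.linalg.norm(pts[i] - pts[j]) / scale
        if d < threshold:
            g.add_edge(i, j, weight=round(d, 3))
    return g

def build_face_graphs(landmarks):
    """Graphs per facial component; these would feed sub-graph mining."""
    return {name: build_component_graph(landmarks, idx)
            for name, idx in COMPONENTS.items()}
```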
Internet paths sharing the same congested link can be identified using several shared congestion detection techniques. The new detection technique proposed in this paper builds on a previous technique, delay correlation with wavelet denoising (DCW), with a new denoising method, the Discrete Multiwavelet Transform (DMWT), used to separate the queuing delay caused by network congestion from delay caused by various other sources of delay variation. The new detection technique converges faster (3 to 5 seconds less than the previous technique) while using approximately half as many probe packets, thereby reducing the overhead that probe packets place on the network.
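A minimal sketch of the detection pipeline is given below: denoise each path's one-way delay series, then flag shared congestion when their cross-correlation exceeds a threshold. A standard discrete wavelet transform (PyWavelets) stands in for the DMWT, and the threshold value is an assumption.

```python
import numpy as np
import pywt

def denoise(delays, wavelet="db4", level=4):
    """Wavelet-denoise a one-way delay series (standard DWT here as a
    stand-in for the Discrete Multiwavelet Transform)."""
    coeffs = pywt.wavedec(delays, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate
    thr = sigma * np.sqrt(2 * np.log(len(delays)))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(delays)]

def shared_congestion(delays_a, delays_b, corr_threshold=0.5):
    """Flag two equal-length delay series as sharing a congested link when
    the correlation of their denoised versions exceeds the threshold."""
    a, b = denoise(np.asarray(delays_a)), denoise(np.asarray(delays_b))
    rho = np.corrcoef(a, b)[0, 1]
    return rho > corr_threshold, rho
```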
Color image compression is a good way to encode digital images by decreasing the number of bits needed to represent the image. The main objectives are to reduce storage space, reduce transmission costs, and maintain good quality. In the current work, a simple and effective methodology is proposed for compressing color art digital images and obtaining a low bit rate by compressing the matrix resulting from the scalar quantization process (reducing the number of bits from 24 to 8) using displacement coding, and then compressing the remainder using the Lempel-Ziv-Welch (LZW) algorithm. The proposed methodology maintains the quality of the reconstructed image. Macroscopic and
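The sketch below shows the 24-to-8-bit scalar quantization step (a 3-3-2 bit allocation is one plausible reading) followed by plain LZW coding of the quantized matrix; the intermediate displacement-coding step is not reproduced, since its details are not given here.

```python
import numpy as np

def quantize_24_to_8(rgb):
    """Scalar-quantize a 24-bit RGB image (uint8 channels) to 8 bits per
    pixel using a 3-3-2 bit allocation."""
    r = (rgb[..., 0] >> 5)      # 3 bits
    g = (rgb[..., 1] >> 5)      # 3 bits
    b = (rgb[..., 2] >> 6)      # 2 bits
    return ((r << 5) | (g << 2) | b).astype(np.uint8)

def lzw_compress(data):
    """Plain Lempel-Ziv-Welch coding of a byte sequence into code words."""
    table = {bytes([i]): i for i in range(256)}
    w, out, nxt = b"", [], 256
    for byte in bytes(data):
        wc = w + bytes([byte])
        if wc in table:
            w = wc
        else:
            out.append(table[w])
            table[wc] = nxt
            nxt += 1
            w = bytes([byte])
    if w:
        out.append(table[w])
    return out

# Example: codes = lzw_compress(quantize_24_to_8(image).tobytes())
```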
Generally, the process of sending secret information via a transmission channel or any carrier medium is not secure. For this reason, information hiding techniques are needed, and steganography must therefore take place before transmission. In this paper, a developed particle swarm optimization algorithm (Dev.-PSO) is used to embed a secret message at optimal positions of the cover image in the spatial domain using Least Significant Bit (LSB) substitution. The main aim of the (Dev.-PSO) algorithm is to determine optimal paths to the required goals in the specified search space; using the (Dev.-PSO) algorithm produces the paths to the required goals with most effi
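The LSB-substitution step itself can be sketched as below for a grayscale cover image; the pixel positions are simply supplied as input, standing in for the optimal positions that Dev.-PSO would select.

```python
import numpy as np

def embed_lsb(cover, message_bits, positions):
    """Embed message bits into the least significant bit of the cover
    pixels at the given (row, col) positions; positions play the role of
    the Dev.-PSO-selected embedding locations in this sketch."""
    stego = cover.copy()
    for bit, (r, c) in zip(message_bits, positions):
        stego[r, c] = (stego[r, c] & 0xFE) | bit
    return stego

def extract_lsb(stego, positions, n_bits):
    """Recover the embedded bits by reading the LSB at the same positions."""
    return [int(stego[r, c] & 1) for (r, c) in positions[:n_bits]]
```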
The DFT B3LYP/6-31G method was applied in a hypothetical study of the design of six carbon nanotube materials based on [8]circulene, through the cyclic polymerization of two and three [8]circulene molecules. The optimized structure of [8]circulene is saddle-shaped. The design reactions of the six carbon nanotubes were examined by calculating the thermodynamic quantities (ΔS, ΔG, and ΔH), and the stability of these hypothetical nanotubes was assessed based on the value of the HOMO energy level. The nanotubes obtained have the most efficient energy gap, making them potentially useful for solar-cell applications.
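As an illustration of how a B3LYP/6-31G HOMO-LUMO gap can be computed, the sketch below uses the Psi4 package on a placeholder benzene geometry; the actual [8]circulene-based nanotube structures from the study would be supplied instead.

```python
import psi4

# Placeholder benzene geometry (Angstrom); not the paper's structures.
mol = psi4.geometry("""
0 1
C  1.3915  0.0000 0.0
C  0.6958  1.2051 0.0
C -0.6958  1.2051 0.0
C -1.3915  0.0000 0.0
C -0.6958 -1.2051 0.0
C  0.6958 -1.2051 0.0
H  2.4715  0.0000 0.0
H  1.2358  2.1404 0.0
H -1.2358  2.1404 0.0
H -2.4715  0.0000 0.0
H -1.2358 -2.1404 0.0
H  1.2358 -2.1404 0.0
""")

psi4.set_options({"basis": "6-31G"})
energy, wfn = psi4.energy("b3lyp", return_wfn=True)

eps = wfn.epsilon_a().to_array()          # orbital energies (hartree)
n_occ = wfn.nalpha()
homo, lumo = eps[n_occ - 1], eps[n_occ]
print(f"HOMO = {homo:.4f} Eh, LUMO = {lumo:.4f} Eh, gap = {lumo - homo:.4f} Eh")
```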
Image databases are growing exponentially because of rapid developments in social networking and digital technologies. To search these databases, an efficient search technique is required, and content-based image retrieval (CBIR) is considered one of these techniques. This paper presents a multistage CBIR approach that addresses computational cost while reasonably preserving accuracy. In the presented work, the first stage acts as a filter that passes images to the next stage based on SKTP, which is used for the first time in the CBIR domain. In the second stage, Local Binary Pattern (LBP) and Canny edge detectors are employed to extract texture and shape features from the query image and the images in the newly constructed database. The p
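A minimal sketch of the second-stage feature extraction is given below, using scikit-image's LBP and Canny operators; the LBP parameters, the coarse edge-density descriptor, and the L1 ranking distance are assumptions, and the first-stage SKTP filter is not shown.

```python
import numpy as np
from skimage.color import rgb2gray
from skimage.feature import canny, local_binary_pattern

def texture_shape_features(image_rgb, lbp_points=8, lbp_radius=1):
    """Second-stage features: a uniform-LBP histogram for texture and a
    Canny edge-density value for shape (parameter values are assumptions)."""
    gray = rgb2gray(image_rgb)
    lbp = local_binary_pattern(gray, lbp_points, lbp_radius, method="uniform")
    n_bins = lbp_points + 2
    lbp_hist, _ = np.histogram(lbp, bins=n_bins, range=(0, n_bins), density=True)
    edges = canny(gray, sigma=1.0)
    edge_density = np.array([edges.mean()])   # very coarse shape descriptor
    return np.concatenate([lbp_hist, edge_density])

def l1_distance(f1, f2):
    """Simple dissimilarity between two feature vectors for ranking results."""
    return float(np.abs(f1 - f2).sum())
```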