Estimating the semantic similarity between short texts plays an increasingly prominent role in many text mining and natural language processing applications, especially with the large increase in the volume of textual data produced daily. Traditional approaches that calculate the degree of similarity between two texts from the words they share do not perform well on short texts, because two similar texts may be written with different terms through the use of synonyms. Short texts should therefore be compared semantically. This paper presents a semantic similarity measurement method that combines knowledge-based and corpus-based semantic information to build a semantic network representing the relationship between the compared texts, from which the degree of similarity is extracted. Representing a text as a semantic network is arguably the knowledge representation closest to the human mind's understanding of text, since the network reflects the sentence's semantic, syntactic, and structural knowledge; it is a visual representation of knowledge objects, their attributes, and their relationships. The WordNet lexical database is used as the knowledge-based source, while GloVe pre-trained word embedding vectors are used as the corpus-based source. The proposed method was tested on three datasets, DSCS, SICK, and MOHLER, and obtained good results in terms of RMSE and MAE.
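As a rough illustration of combining the two sources, the sketch below mixes a WordNet path-similarity score with the cosine similarity of GloVe vectors for a pair of words; the GloVe file path, the equal weighting, and the word-level scope are assumptions for demonstration, not the paper's exact formulation.

    # Hypothetical sketch: blend a knowledge-based score (WordNet path similarity)
    # with a corpus-based score (cosine similarity of GloVe vectors).
    # Assumes NLTK's WordNet corpus is downloaded and a GloVe text file is available.
    import numpy as np
    from nltk.corpus import wordnet as wn

    def load_glove(path="glove.6B.50d.txt"):          # file name is an assumption
        vectors = {}
        with open(path, encoding="utf-8") as f:
            for line in f:
                parts = line.rstrip().split(" ")
                vectors[parts[0]] = np.asarray(parts[1:], dtype=float)
        return vectors

    def knowledge_sim(w1, w2):
        # Best WordNet path similarity over all synset pairs (0 if none found).
        scores = [s1.path_similarity(s2) or 0.0
                  for s1 in wn.synsets(w1) for s2 in wn.synsets(w2)]
        return max(scores, default=0.0)

    def corpus_sim(w1, w2, glove):
        # Cosine similarity of GloVe embeddings (0 for out-of-vocabulary words).
        if w1 not in glove or w2 not in glove:
            return 0.0
        v1, v2 = glove[w1], glove[w2]
        return float(np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2)))

    def hybrid_sim(w1, w2, glove, alpha=0.5):         # equal weighting is an assumption
        return alpha * knowledge_sim(w1, w2) + (1 - alpha) * corpus_sim(w1, w2, glove)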
This work covers the design, implementation, and testing of a microcontroller-based spectrum analyzer system. Both hardware and software structures are built to verify the main functions required of such a system. The design uses the permissible and available tools to achieve the system's main functions in a modular way, permitting adaptation to specific changes in the application environment. The analysis technique depends mainly on Fourier-based methods of spectral analysis, together with the necessary preconditioning processes. The software required for waveform analysis has been prepared, the spectrum of the waveform has been displayed, and the instrument's accuracy has been checked.
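A minimal sketch of the Fourier-based analysis step, assuming a Hanning window as the preconditioning stage and a synthetic two-tone test waveform; the sampling rate and signal below are illustrative only, not the instrument's actual parameters.

    # Compute the magnitude spectrum of a sampled waveform with a windowed FFT.
    import numpy as np

    fs = 1000                                   # assumed sampling rate in Hz
    t = np.arange(0, 1.0, 1.0 / fs)
    signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

    window = np.hanning(len(signal))            # preconditioning: Hanning window
    spectrum = np.fft.rfft(signal * window)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    magnitude = np.abs(spectrum) / len(signal)

    peak = freqs[np.argmax(magnitude)]
    print(f"Dominant frequency component: {peak:.1f} Hz")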
Cipher security is becoming an important consideration when transmitting sensitive information over networks, and cryptographic algorithms play a major role in providing security and resisting attacks. In this work, two hybrid cryptosystems are proposed that combine a modification of the symmetric Playfair cipher, called the modified Playfair cipher, with two modifications of the asymmetric RSA cryptosystem, called the square of RSA technique and the square RSA with Chinese remainder theorem technique. The proposed hybrid cryptosystems have two layers of encryption and decryption: in the first layer the plaintext is encrypted using the modified Playfair cipher to obtain a cipher text, and this cipher text is then encrypted using the squared RSA technique.
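The toy sketch below shows only the two-layer structure of such a hybrid scheme: a simple character substitution stands in for the modified Playfair layer, and textbook RSA applied twice stands in for the squared-RSA layer. All keys, parameters, and function names are illustrative assumptions, not the paper's constructions.

    def layer1_substitute(plaintext, shift=7):
        # Placeholder for the modified Playfair layer (simple shift substitution).
        return "".join(chr((ord(c) + shift) % 256) for c in plaintext)

    def rsa_encrypt(message_bytes, e, n):
        # Textbook RSA applied to individual byte values (insecure; illustration only).
        return [pow(b, e, n) for b in message_bytes]

    e, n = 17, 3233                      # toy RSA public key (p = 61, q = 53)
    intermediate = layer1_substitute("HELLO")
    # Second layer: apply RSA twice as a stand-in for the squared-RSA idea.
    cipher = rsa_encrypt(rsa_encrypt(intermediate.encode("latin-1"), e, n), e, n)
    print(cipher)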
Aluminium (Al) has emerged as a plasmonic material in the ultraviolet to visible wavelength range for various on-chip plasmonic applications. In this paper, we demonstrate the effect of using Al on the electromagnetic (EM) field distribution of a compact hybrid plasmonic waveguide (HPW) acting as a polarization rotator. We compare the performance of Al with other metals that are widely used as plasmonic materials, namely Silver (Ag) and Gold (Au). Furthermore, we study the effect of reducing the geometrical dimensions of the used materials on the EM field distributions inside the HPW and, consequently, on the efficiency of the polarization rotation. We perform the study based on …
In the current worldwide health crisis caused by coronavirus disease (COVID-19), researchers and medical specialists began looking for new ways to tackle the epidemic. According to recent studies, Machine Learning (ML) has been effectively deployed in the health sector, and medical imaging sources (radiography and computed tomography) have aided the development of artificial intelligence (AI) strategies to tackle the coronavirus outbreak. As a result, a classical machine learning approach for coronavirus detection from Computerized Tomography (CT) images was developed. In this study, a convolutional neural network (CNN) model is used for feature extraction and a support vector machine (SVM) for the classification of axial CT images.
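A hedged sketch of such a CNN-features-plus-SVM pipeline, assuming a generic pretrained VGG16 backbone and placeholder data; the study's actual architecture, preprocessing, and dataset are not reproduced here.

    import numpy as np
    from tensorflow.keras.applications import VGG16
    from tensorflow.keras.applications.vgg16 import preprocess_input
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    # Placeholder arrays standing in for axial CT slices and their labels.
    ct_images = np.random.randint(0, 256, (40, 224, 224, 3)).astype("float32")
    labels = np.random.randint(0, 2, 40)

    # Pretrained CNN used as a fixed feature extractor (backbone choice is an assumption).
    backbone = VGG16(weights="imagenet", include_top=False, pooling="avg")
    features = backbone.predict(preprocess_input(ct_images))

    X_train, X_test, y_train, y_test = train_test_split(features, labels, test_size=0.25)
    clf = SVC(kernel="rbf")              # kernel choice is an assumption
    clf.fit(X_train, y_train)
    print("Held-out accuracy:", clf.score(X_test, y_test))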
Many financial institutions invest their surplus funds in stocks, either to obtain dividends or for trading purposes, profiting from the difference between the cost and the selling price. Investment in shares represents an important part of the financial position of financial institutions that apply the common accounting system of banks and insurance companies, in addition to its clear impact on the results of these institutions' activities. The aim of the research is to define what shares and their types are, and to indicate the accounting treatments needed to move towards adopting International Financial Reporting Standard No. (9) and its reflection on the financial statements.
Exchange of information through communication channels can be unsafe: communication media are not secure enough for sending sensitive information, so it is necessary to protect information from disclosure to unauthorized persons. This research presents a method for information security in which information is hidden in a cover image using the least significant bit (LSB) technique, after the text file has been encrypted using a secret sharing scheme. The positions used to hide the information in the cover image are then generated in a random manner, which makes the hidden data difficult to detect through image analysis or statistical analyses. The method thus provides two levels of information security: encryption of the text file using the secret sharing scheme, and random hiding of the encrypted data in the cover image.
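The sketch below illustrates only the random-position LSB step, assuming the secret-sharing encryption has already produced a bit stream and that a shared seed plays the role of the stego key; the image, seed, and bit values are placeholder assumptions.

    import numpy as np

    def embed_lsb(cover, secret_bits, seed=1234):
        # Hide one bit per pixel at pseudo-random positions derived from the seed.
        flat = cover.flatten()
        rng = np.random.default_rng(seed)
        positions = rng.choice(flat.size, size=len(secret_bits), replace=False)
        for pos, bit in zip(positions, secret_bits):
            flat[pos] = (flat[pos] & 0xFE) | bit      # overwrite the least significant bit
        return flat.reshape(cover.shape)

    def extract_lsb(stego, n_bits, seed=1234):
        # Regenerate the same positions from the shared seed and read the bits back.
        flat = stego.flatten()
        rng = np.random.default_rng(seed)
        positions = rng.choice(flat.size, size=n_bits, replace=False)
        return [int(flat[pos] & 1) for pos in positions]

    cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)   # placeholder image
    bits = [1, 0, 1, 1, 0, 0, 1, 0]                               # placeholder encrypted bits
    stego = embed_lsb(cover, bits)
    assert extract_lsb(stego, len(bits)) == bits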
Data communication has been growing steadily in the present day; therefore, data encryption has become essential for secure data transmission and storage, and for protecting data contents from intruders and unauthorized persons. In this paper, a fast technique for text encryption based on a genetic algorithm is presented. The encryption is achieved using the genetic operators crossover and mutation: the proposed technique divides the plaintext characters into pairs, applies the crossover operation between them, and then applies the mutation operation to obtain the encrypted text. The experimental results show that the proposal provides an important improvement in encryption rate with comparatively high processing speed.
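A rough sketch of the pairwise crossover-and-mutation idea, assuming a fixed crossover point, a fixed mutation mask applied by XOR, and space padding for odd-length text; these choices are illustrative, not the paper's parameters.

    def encrypt(text, cross_point=4, mask=0b00100101):
        if len(text) % 2:
            text += " "                              # assumed padding for odd length
        keep = (1 << cross_point) - 1                # low-bit mask for the crossover
        out = []
        for a, b in zip(text[::2], text[1::2]):
            x, y = ord(a), ord(b)
            xa = (x & ~keep) | (y & keep)            # crossover: swap low bits of the pair
            yb = (y & ~keep) | (x & keep)
            out += [xa ^ mask, yb ^ mask]            # mutation: flip the masked bits
        return out

    def decrypt(codes, cross_point=4, mask=0b00100101):
        keep = (1 << cross_point) - 1
        text = ""
        for i in range(0, len(codes), 2):
            xa, yb = codes[i] ^ mask, codes[i + 1] ^ mask
            x = (xa & ~keep) | (yb & keep)           # undo the crossover
            y = (yb & ~keep) | (xa & keep)
            text += chr(x) + chr(y)
        return text

    cipher = encrypt("GENETIC")
    assert decrypt(cipher) == "GENETIC "             # note the padding space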
The technological development in the field of information and communication has been accompanied by security challenges related to the transmission of information, and encryption is a good solution. Encryption is one of the traditional methods of protecting plain text by converting it into an unintelligible form, and it can be implemented using substitution techniques, shifting techniques, or mathematical operations. This paper proposes a method with two branches to encrypt text. The first branch is a new mathematical model to create and exchange keys; the proposed key exchange method is a development of Diffie-Hellman, a new mathematical-operations model for exchanging keys based on prime numbers.
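For reference, a minimal sketch of the classic Diffie-Hellman exchange that the proposed method develops; the small prime, generator, and secrets below are textbook toy values used only to show the mechanics, not the paper's scheme.

    import secrets

    p, g = 23, 5                         # toy prime and generator (textbook values)

    a = secrets.randbelow(p - 2) + 1     # one party's private key
    b = secrets.randbelow(p - 2) + 1     # the other party's private key

    A = pow(g, a, p)                     # public value sent by the first party
    B = pow(g, b, p)                     # public value sent by the second party

    shared_1 = pow(B, a, p)              # both sides derive the same shared secret
    shared_2 = pow(A, b, p)
    assert shared_1 == shared_2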
Building a system to identify individuals through their speech recordings finds application in diverse areas, such as telephone shopping, voice mail, and security control. However, building such systems is a tricky task because of the vast range of variation in the human voice, so selecting strong features becomes crucial for the recognition system. Therefore, a speaker recognition system based on new spin-image descriptors (SISR) is proposed in this paper. In the proposed system, circular windows (spins) are extracted from the frequency domain of the spectrogram image of the sound, and a run length matrix is then built for each spin to serve as a base for the feature extraction tasks. Five different descriptors are generated from these run length matrices.
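A loose sketch of the described front end, assuming a synthetic tone in place of a speech recording, a fixed spin radius, and a four-level quantisation; the paper's five descriptors are not reproduced, only a simple run-length summary of one spin.

    import numpy as np
    from scipy.signal import spectrogram

    fs = 8000
    t = np.arange(0, 1.0, 1.0 / fs)
    audio = np.sin(2 * np.pi * 440 * t)                 # placeholder for a speech recording

    freqs, times, spec = spectrogram(audio, fs=fs, nperseg=256)
    spec = np.log1p(spec)                               # compress dynamic range
    edges = np.linspace(spec.min(), spec.max(), 5)[1:-1]
    levels = np.digitize(spec, edges)                   # quantise to 4 grey levels

    def circular_spin(image, center, radius):
        # Extract the pixels inside a circular window ("spin") of the spectrogram.
        rows, cols = np.ogrid[:image.shape[0], :image.shape[1]]
        mask = (rows - center[0]) ** 2 + (cols - center[1]) ** 2 <= radius ** 2
        return image[mask]

    def run_lengths(values):
        # Lengths of consecutive runs of equal grey levels along the flattened spin.
        changes = np.flatnonzero(np.diff(values)) + 1
        return np.diff(np.concatenate(([0], changes, [len(values)])))

    spin = circular_spin(levels, center=(len(freqs) // 2, len(times) // 2), radius=8)
    print("mean run length in spin:", run_lengths(spin).mean())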