Estimating the semantic similarity between short texts plays an increasingly prominent role in many text-mining and natural language processing applications, especially given the large volume of textual data produced daily. Traditional approaches that calculate the degree of similarity between two texts from the words they share do not perform well with short texts, because two similar texts may be written with different terms through the use of synonyms. As a result, short texts should be compared semantically. This paper presents a method for measuring semantic similarity between texts that combines knowledge-based and corpus-based semantic information to build a semantic network representing the relationship between the compared texts, from which the degree of similarity is extracted. Representing a text as a semantic network is a knowledge representation that comes close to the human mind's understanding of text, since the network reflects a sentence's semantic, syntactic, and structural knowledge. The network representation is a visual representation of knowledge objects, their qualities, and their relationships. The WordNet lexical database is used as the knowledge-based source, while pre-trained GloVe word embedding vectors serve as the corpus-based source. The proposed method was tested on three datasets: DSCS, SICK, and MOHLER. Good results were obtained in terms of RMSE and MAE.
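As a rough illustration of how a knowledge-based score and a corpus-based score might be combined at the word level (the weighting scheme and everything beyond the WordNet/GloVe sources are assumptions for illustration, not the paper's exact method):

```python
# Illustrative sketch: fuse WordNet path similarity (knowledge-based)
# with cosine similarity of GloVe vectors (corpus-based).
# `glove` is assumed to be a dict mapping words to numpy vectors,
# e.g. parsed from glove.6B.300d.txt.
import numpy as np
from nltk.corpus import wordnet as wn

def wordnet_sim(w1, w2):
    """Best path similarity over all synset pairs (0 if none found)."""
    scores = [s1.path_similarity(s2) or 0.0
              for s1 in wn.synsets(w1) for s2 in wn.synsets(w2)]
    return max(scores, default=0.0)

def glove_sim(w1, w2, glove):
    """Cosine similarity of pre-trained GloVe embeddings."""
    if w1 not in glove or w2 not in glove:
        return 0.0
    v1, v2 = glove[w1], glove[w2]
    return float(np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2)))

def combined_sim(w1, w2, glove, alpha=0.5):
    # alpha is a hypothetical weight between the two sources
    return alpha * wordnet_sim(w1, w2) + (1 - alpha) * glove_sim(w1, w2, glove)
```

In a full pipeline, such word-pair scores would feed the edges of the semantic network built between the two texts.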
The aim of the current research is to extract the psychometric properties of Philip Carter's tests (for mental agility) according to classical measurement theory. To achieve this goal, the researcher took a number of scientific steps to analyze Philip Carter's tests according to classical measurement theory. The researcher translated the tests from English into Arabic and then back-translated them for verification. For the statistical analysis of the test items and the extraction of their psychometric properties, the tests were administered to a sample of 1000 male and female students selected by cluster sampl…
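Since the abstract does not detail the item statistics used, the following is a minimal sketch of standard classical-test-theory item analysis (difficulty, discrimination, and Cronbach's alpha) on a hypothetical binary response matrix:

```python
# Minimal sketch of classical-test-theory item analysis, assuming a
# binary (0/1) response matrix X of shape (examinees, items); the
# specific statistics reported in the study are assumptions here.
import numpy as np

def item_statistics(X):
    X = np.asarray(X, dtype=float)
    total = X.sum(axis=1)
    difficulty = X.mean(axis=0)  # proportion answering each item correctly
    # point-biserial discrimination: item score vs. rest-of-test score
    discrimination = np.array([
        np.corrcoef(X[:, j], total - X[:, j])[0, 1]
        for j in range(X.shape[1])
    ])
    # Cronbach's alpha for internal-consistency reliability
    k = X.shape[1]
    alpha = (k / (k - 1)) * (1 - X.var(axis=0, ddof=1).sum()
                             / total.var(ddof=1))
    return difficulty, discrimination, alpha
```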
Progression in computer networks and the emergence of new technologies in this field have led to new protocols and frameworks that provide new network-based services. E-government services, a modernized version of conventional government, have emerged through the steady evolution of technology together with societies' growing need for numerous services. Government services are deeply tied to citizens' daily lives; it is therefore important to keep pace with technological developments and to move from traditional methods of managing government work to cutting-edge technical approaches that improve the effectiveness of government systems in providing services to citizens. Blockchain technology is amon…
The present work determines the particle size based only on the number of tracks detected in a cluster created by a hot particle on the CR-39 solid-state nuclear track detector, depending on the exposure time. The mathematical model of the cross section developed here relates the alpha particles emitted from the (n, α) reaction to the number and distribution of tracks created on the surface of the track detector. In the experiment performed in this work, discs of boron compounds (boric acid or sodium tetraborate) of different weights were prepared and exposed to thermal neutrons from the source. Chemical etching is the process of track formation in the detector, during which a suitable etching solut…
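A back-of-the-envelope sketch of the thin-target reaction-rate relation that underlies such track-count estimates (all numerical values below are illustrative assumptions, not the paper's data):

```python
# Expected number of (n, alpha) reactions -- and hence candidate tracks --
# from a boron target in a thermal-neutron flux, using the thin-target
# relation: reactions = flux * sigma * N_atoms * time * efficiency.
N_A = 6.022e23  # Avogadro's number, 1/mol

def expected_tracks(flux, sigma_barn, mass_g, molar_mass,
                    atoms_per_molecule, isotopic_fraction,
                    time_s, efficiency):
    sigma_cm2 = sigma_barn * 1e-24  # barn -> cm^2
    n_atoms = (mass_g / molar_mass) * N_A \
              * atoms_per_molecule * isotopic_fraction
    return flux * sigma_cm2 * n_atoms * time_s * efficiency

# Example with assumed values: 0.1 g boric acid (H3BO3, M = 61.83 g/mol,
# one B per molecule), ~19.9% 10B abundance, sigma(10B(n,alpha)) ~ 3840 b
# at thermal energies, and a hypothetical flux and registration efficiency.
n = expected_tracks(flux=1e4, sigma_barn=3840, mass_g=0.1,
                    molar_mass=61.83, atoms_per_molecule=1,
                    isotopic_fraction=0.199, time_s=3600, efficiency=0.01)
```

In practice, self-absorption in a thick disc and the detector's registration geometry would reduce the count below this idealized figure.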
Radon and its progeny in the indoor environment are recognized as the principal source of radiation dose to humans from natural radioactive material. Radon in the water supply poses radiation-related health risks, since it can be inhaled or ingested. Materials and Methods: The solid-state CR-39 nuclear track detector method was used in this research to measure accumulated radioactivity in the water supply at different locations in the southwestern part of Baghdad, Iraq. In the Baghdad district, 42 samples were collected from 14 regions (3 samples from each region) and placed in dosimeters for 50 days. Results: The mean radon concentration was 49.75 Bq/m3, which is lower than the internationally recognized limit of 1100 Bq/m3. Th…
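For intuition, the standard conversion from track density to radon concentration can be sketched as follows; the calibration factor K and all numbers are assumed for illustration, not taken from the study:

```python
# Hedged sketch of the usual CR-39 dosimetry conversion:
# radon concentration C = track density / (K * exposure time).
def radon_concentration(track_count, detector_area_cm2, exposure_days, K):
    """Return C in Bq/m^3, with K in (tracks/cm^2) per (Bq/m^3 * day)."""
    track_density = track_count / detector_area_cm2
    return track_density / (K * exposure_days)

# Example with assumed values: 250 tracks on a 1 cm^2 detector exposed
# for 50 days with a hypothetical K = 0.1 -> 50 Bq/m^3.
c = radon_concentration(track_count=250, detector_area_cm2=1.0,
                        exposure_days=50, K=0.1)
```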
Deep submicron technologies continue to develop according to Moore's law, allowing hundreds of processing elements and memory modules to be integrated on a single chip, forming multi-/many-processor systems-on-chip (MPSoCs). The network on chip (NoC) arose as the interconnect for this large number of processing modules. However, the aggressive scaling of transistors makes the NoC more vulnerable to both permanent and transient faults. Permanent faults persistently affect circuit functionality from the time of their occurrence. The router is the heart of the NoC; thus, this research focuses on tolerating permanent faults in the router's input buffer component, particularly the virtual channel state fields. These fields track packets f…
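As a generic illustration of one common way small state fields can be protected against a permanent fault (a software model for intuition only; the paper's actual hardware scheme is not reproduced here), consider triple modular redundancy with majority voting:

```python
# Software model of TMR protection for a small state field: the value is
# stored in three copies, and reads vote so that one permanently faulty
# copy is masked.
from collections import Counter

class ProtectedField:
    def __init__(self, value=0):
        self.copies = [value, value, value]

    def write(self, value):
        for i in range(3):
            self.copies[i] = value  # a stuck-at fault would corrupt one copy

    def read(self):
        # Majority vote masks a single faulty copy.
        return Counter(self.copies).most_common(1)[0][0]

vc_state = ProtectedField(value=0b01)  # e.g. a 2-bit VC allocation state
vc_state.copies[1] = 0b11              # inject a permanent fault in copy 1
assert vc_state.read() == 0b01         # the vote still returns the correct state
```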
Today in the digital realm, images constitute a massive share of social media content but suffer from two issues, size and transmission, for which compression is the ideal solution. Pixel-based techniques are modern, spatially optimized modeling techniques with deterministic and probabilistic bases that involve mean, index, and residual components. This paper introduces adaptive pixel-based coding techniques for the probabilistic part of a lossy scheme by incorporating the MMSA of the C321 base, while utilizing the deterministic part losslessly. The tested results achieved higher size-reduction performance than traditional pixel-based techniques and standard JPEG by about 40% and 50%, …
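As a generic sketch of the mean/residual idea behind pixel-based coding (the MMSA and C321 specifics are not reproduced; the block size and quantization step here are assumptions):

```python
# Illustrative block-wise mean/residual decomposition: each block is
# modeled by its mean (kept losslessly) while residuals are quantized
# lossily, roughly mirroring the deterministic/probabilistic split.
import numpy as np

def encode_blocks(image, block=4, q=8):
    h, w = image.shape
    means, residuals = [], []
    for y in range(0, h, block):
        for x in range(0, w, block):
            blk = image[y:y+block, x:x+block].astype(np.int16)
            m = int(round(blk.mean()))
            means.append(m)                    # deterministic part (lossless)
            residuals.append((blk - m) // q)   # probabilistic part, quantized
    return means, residuals

def decode_blocks(means, residuals, shape, block=4, q=8):
    out = np.zeros(shape, dtype=np.int16)
    i = 0
    for y in range(0, shape[0], block):
        for x in range(0, shape[1], block):
            out[y:y+block, x:x+block] = means[i] + residuals[i] * q
            i += 1
    return np.clip(out, 0, 255).astype(np.uint8)
```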
This study investigates the feasibility of a mobile robot navigating and localizing itself in unknown environments, followed by the creation of maps of the navigated environments for future use. First, a real mobile robot, the TurtleBot3 Burger, was used to perform simultaneous localization and mapping (SLAM) in a complex environment with 12 obstacles of different sizes, based on the Rviz library built on the Robot Operating System (ROS) running on Linux. The robot can be controlled, and this process performed remotely, by using an Amazon Elastic Compute Cloud (Amazon EC2) instance service. Then, the map was uploaded to the Amazon Simple Storage Service (Amazon S3) cloud. This provides a database…
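A minimal sketch of the map-upload step using boto3, assuming the map was saved by ROS map_saver as a .pgm/.yaml pair; the bucket and key names are hypothetical:

```python
# Upload a saved ROS occupancy-grid map to an S3 bucket.
import boto3

s3 = boto3.client("s3")
# map_saver typically produces a .pgm image plus a .yaml metadata file.
for f in ("map.pgm", "map.yaml"):
    s3.upload_file(f, "turtlebot3-maps", f"slam/{f}")
```

Running this on the EC2 instance that drives the robot keeps the map store and the control loop in the same cloud environment.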