Estimating the semantic similarity between short texts plays an increasingly prominent role in text mining and natural language processing, especially given the large volume of textual data produced daily. Traditional approaches that measure the similarity of two texts through the words they share perform poorly on short texts, because two similar texts may express the same meaning in different terms, such as synonyms. Short texts should therefore be compared semantically. This paper presents a method for measuring semantic similarity between texts that combines knowledge-based and corpus-based semantic information to build a semantic network representing the relationship between the compared texts, from which the degree of similarity is extracted. Representing a text as a semantic network comes close to how the human mind understands text: the network reflects the sentence's semantic, syntactic, and structural knowledge, and provides a visual representation of knowledge objects, their attributes, and their relationships. The WordNet lexical database is used as the knowledge-based source, while pre-trained GloVe word-embedding vectors serve as the corpus-based source. The proposed method was evaluated on three datasets: DSCS, SICK, and MOHLER. Good results were obtained in terms of RMSE and MAE.
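As a minimal sketch of the corpus-based component, the following code shows how pre-trained GloVe vectors can yield a word-overlap-free similarity score between two short texts via cosine similarity and greedy word alignment. The file format, the alignment strategy, and the helper names are illustrative assumptions, not the paper's exact method (which additionally builds a semantic network with WordNet).

```python
import numpy as np

def load_glove(path):
    """Parse a GloVe text file (word followed by floats) into a dict."""
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            vectors[parts[0]] = np.array(parts[1:], dtype=float)
    return vectors

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def text_similarity(t1, t2, vectors):
    """Greedy alignment: for each word of t1, take its best cosine
    match in t2, then average.  Words without vectors score 0."""
    scores = []
    for w1 in t1.lower().split():
        best = max((cosine(vectors[w1], vectors[w2])
                    for w2 in t2.lower().split()
                    if w1 in vectors and w2 in vectors), default=0.0)
        scores.append(best)
    return sum(scores) / len(scores) if scores else 0.0
```

With real GloVe vectors, "cat" and "feline" score high despite sharing no surface form, which is exactly the failure mode of word-overlap measures that the abstract describes.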
This paper compares the accuracy of HF propagation prediction programs for HF circuit links between Iraq and different points worldwide during August 2018, when solar cycle 24 (which began in 2009 and ended in 2020) was at minimum activity, and also determines the best communication mode to use. The prediction programs Voice of America Coverage Analysis Program (VOACAP) and ITU Recommendation 533 (REC533) were used to generate HF circuit-link parameters such as the Maximum Usable Frequency (MUF) and the Frequency of Optimum Transmission (FOT). Based on the predicted parameters, real radio contacts were made using an Icom IC-7100 transceiver with 100 W of RF power.
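For intuition about the predicted parameters, the classic secant law relates the MUF to the ionospheric critical frequency and the angle of incidence, and the FOT is commonly approximated as about 85% of the MUF. This is a textbook relation sketched here for illustration; it is not how VOACAP or REC533 compute their predictions internally.

```python
import math

def muf(critical_freq_mhz, incidence_deg):
    """Secant law: MUF = f_c / cos(theta), with theta the angle of
    incidence on the ionospheric layer."""
    return critical_freq_mhz / math.cos(math.radians(incidence_deg))

def fot(muf_mhz, factor=0.85):
    """Rule-of-thumb optimum transmission frequency (~85% of MUF)."""
    return factor * muf_mhz
```

For example, a 7 MHz critical frequency at a 60° angle of incidence gives a 14 MHz MUF and an FOT near 11.9 MHz.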
The research aims to define the theoretical framework of corporate governance and its mechanisms, shed light on corporate governance in Iraq, and outline the theoretical framework for the quality of financial reports, their relationship to governance, and the role of corporate governance in improving them. A commercial bank was selected as the research sample, and a survey checklist was prepared to show the extent to which the sampled banks are committed to applying the internal governance mechanisms imposed on them by the local environment, which leads to improving the quality of these banks' financial reports.
A true random TTL pulse generator was implemented and investigated for quantum key distribution systems. The random TTL signals are generated with low-cost components available in local markets. They are derived from true random binary sequences based on the differences in photon arrival times registered in coincidence windows between two single-photon detectors. The generator's performance was tested using time-to-digital converters, which give accurate readings of photon arrival times. The proposed true random TTL pulse generator can be used in any quantum key distribution system for random operation of its transmitters.
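The bit-extraction idea can be sketched as follows: compare successive photon arrival-time differences and emit a 0 or 1 depending on which interval is larger, discarding ties to avoid bias. The exact coincidence-window scheme in the paper may differ, and the exponential inter-arrival model used below to simulate a detector is an assumption.

```python
import random

def bits_from_intervals(intervals):
    """Pair up consecutive inter-arrival times; shorter-first -> 0,
    longer-first -> 1; equal pairs are discarded (no bias)."""
    bits = []
    for a, b in zip(intervals[::2], intervals[1::2]):
        if a < b:
            bits.append(0)
        elif a > b:
            bits.append(1)
    return bits

# Simulated single-photon detector: Poissonian arrivals give
# exponentially distributed gaps between detection events.
rng = random.Random(42)
gaps = [rng.expovariate(1.0) for _ in range(2000)]
stream = bits_from_intervals(gaps)
```

The resulting bit stream is balanced by construction, since for independent identically distributed gaps the two orderings are equally likely.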
Summary: The research examines the workings of semantic encoding performed by the actor on stage to deliver intellectual, social, and aesthetic meaning to the recipient. A sign functions through a relationship operated with prior intent, to deliver the implications of a particular purpose; encoding is thus the primary means of transferring messages in the theatrical system. The actor must be able to create new sensory or intellectual images in human consciousness by transforming impressions gathered from reality and re-forming them technically on stage, allowing the recipient to arrange these signs and obtain the meaning that lies within them.
Deep submicron technologies continue to develop according to Moore's law, allowing hundreds of processing elements and memory modules to be integrated on a single chip, forming multi/many-processor systems-on-chip (MPSoCs). The network on chip (NoC) arose as the interconnect for this large number of processing modules. However, the aggressive scaling of transistors makes the NoC more vulnerable to both permanent and transient faults. Permanent faults persistently affect circuit functionality from the time of their occurrence. The router is the heart of the NoC; thus, this research focuses on tolerating permanent faults in the router's input buffer, particularly the virtual channel state fields, which track the state of packets passing through the router.
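One standard way to tolerate a permanent fault in a small state field is triple modular redundancy: store three copies and take a bitwise majority vote on every read, so a single faulty copy is masked. This sketch illustrates the general technique, not the paper's specific router design.

```python
def tmr_write(value):
    """Store three redundant copies of a state field."""
    return [value, value, value]

def tmr_read(copies):
    """Bitwise majority vote over the three copies: a bit is 1 in the
    result iff it is 1 in at least two copies."""
    a, b, c = copies
    return (a & b) | (a & c) | (b & c)

state = tmr_write(0b1011)   # e.g. a 4-bit virtual channel state field
state[1] ^= 0b0100          # inject a permanent stuck-bit fault in one copy
assert tmr_read(state) == 0b1011   # the fault is masked
```

The area cost is roughly 3x per protected field plus the voter, which is why such protection is usually reserved for small, critical control state rather than whole buffers.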
Progress in computer networks and the emergence of new technologies in this field help to produce new protocols and frameworks that provide new network-based services. E-government services, a modernized version of conventional government, are created through the steady evolution of technology together with societies' growing need for numerous services. Government services are deeply tied to citizens' daily lives; therefore, it is important to evolve with technological developments and move from traditional methods of managing government work to cutting-edge technical approaches that improve the effectiveness of government systems in providing services to citizens. Blockchain technology is among these approaches.
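The core property blockchain brings to government records is tamper evidence: each block stores the hash of its predecessor, so altering any record invalidates every later link. The field names and record layout below are illustrative, not taken from any specific e-government framework.

```python
import hashlib
import json

def block(prev_hash, record):
    """Build a block whose hash covers both the record and the
    predecessor's hash, chaining the blocks together."""
    payload = json.dumps({"prev": prev_hash, "record": record},
                         sort_keys=True).encode()
    return {"prev": prev_hash, "record": record,
            "hash": hashlib.sha256(payload).hexdigest()}

def verify(chain):
    """Recompute every hash and check the prev-links; any tampering
    with a record or a link makes verification fail."""
    prev = "0" * 64
    for b in chain:
        payload = json.dumps({"prev": b["prev"], "record": b["record"]},
                             sort_keys=True).encode()
        if b["prev"] != prev or hashlib.sha256(payload).hexdigest() != b["hash"]:
            return False
        prev = b["hash"]
    return True

genesis = block("0" * 64, {"citizen": "A", "service": "license"})
chain = [genesis, block(genesis["hash"], {"citizen": "B", "service": "permit"})]
```

Editing any earlier record, even one character, breaks verification of the whole chain, which is what makes the ledger auditable.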
The worldwide spread of the Internet, together with the huge number of users exchanging important information over it, highlights the need for new methods to protect this information from corruption or modification by intruders. This paper proposes a method to ensure that the text of a given document cannot be modified without detection. The method consists of three steps. The first step borrows some concepts of the Quran's security system to detect certain types of change in a given text: a key for each paragraph is extracted from the letters in that paragraph whose positions are multiples of a given prime number. This step alone cannot detect every kind of change.
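A hedged sketch of the stated first step: sample the letters whose 1-based positions are multiples of a chosen prime to form a per-paragraph key, so edits that shift or change those positions alter the key. The choice of prime and the key format here are illustrative; the paper's exact construction (and its two further steps) are not reproduced.

```python
def paragraph_key(paragraph, prime=7):
    """Key = the letters at positions prime, 2*prime, 3*prime, ...
    (1-based, counting alphabetic characters only)."""
    letters = [c for c in paragraph if c.isalpha()]
    return "".join(letters[i - 1] for i in range(prime, len(letters) + 1, prime))

def is_modified(paragraph, stored_key, prime=7):
    """Recompute the key and compare with the stored one."""
    return paragraph_key(paragraph, prime) != stored_key
```

Note that an in-place substitution of equal length that misses every sampled position would go undetected, which is presumably why the paper combines this step with two others.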
An automatic text summarization system mimics how humans summarize by picking the most significant sentences in a source text. However, the complexities of the Arabic language make it challenging to obtain information quickly and effectively. The main disadvantage of traditional approaches is that they are strictly constrained (especially for Arabic) by the accuracy of sentence feature functions, weighting schemes, and similarity calculations. Meta-heuristic search approaches, on the other hand, offer an alternative to these constraints.
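As a point of reference for what "picking the most significant sentences" means, here is a simple frequency-based extractive baseline: score each sentence by the average document-wide frequency of its words and keep the top scorers in original order. This is a generic, language-agnostic baseline for illustration, not the paper's meta-heuristic method, and the naive whitespace tokenization would need replacing for Arabic.

```python
import re
from collections import Counter

def summarize(text, n_sentences=2):
    """Extractive summary: rank sentences by average word frequency."""
    sentences = [s.strip()
                 for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    freq = Counter(w for s in sentences for w in s.lower().split())

    def score(s):
        words = s.lower().split()
        return sum(freq[w] for w in words) / len(words)

    top = sorted(sentences, key=score, reverse=True)[:n_sentences]
    return [s for s in sentences if s in top]   # preserve source order
```

Traditional systems refine exactly these ingredients (features, weights, similarity), which is the constraint the abstract points at.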
The word "text" has many connotations in the Arabic language, such as vowel pointing, designation, and completion; its original meaning is "to show." In Western usage, the text has its own independent semantic unit. The biblical texts are a mixture of what was reported by the Prophet Moses (peace be upon him) and what authors set down over many centuries. The meaning of the text is guidance and direction, a natural connotation. Religious texts for Muslims are divided into categories including peremptory texts that constitute definitive proof. The evidence for the meaning of a text is established through language, and the researcher is not required to be a jurist. The approach is a factual survey conducted by the researcher according to a specific method.