The spread of the internet all over the world, together with the huge and growing number of users exchanging important information over it, highlights the need for new methods to protect that information from corruption or modification by intruders. This paper suggests a new method to ensure that the text of a given document cannot be modified by intruders. The method consists of three steps. The first step borrows some concepts from the "Quran" security system to detect certain types of change in a given text: a key for each paragraph is extracted from a group of letters in that paragraph that occur as a multiple of a given prime number. This step cannot detect changes that merely reorder the letters or words of a paragraph without changing the letters themselves, so the second step uses an error-detection method, Hamming codes, to find the locations of the changes in the received text. In the third step, RSA is used as the primary encryption method to encrypt the keys produced by the first and second steps, so that intruders cannot break the security of the method.
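As a minimal illustration of the first step, the sketch below derives a paragraph key from the letters whose occurrence count is a multiple of a chosen prime. This reading of the rule, the prime value, and the function name are assumptions made for the sketch; the abstract does not define them precisely.

```python
from collections import Counter

# Hedged sketch of the paragraph-key idea: "letters which occur as a multiple of a given
# prime number" is read here as letters whose occurrence count in the paragraph is a
# multiple of a chosen prime p. This reading matches the stated limitation that pure
# reordering of letters is not detected, but the exact rule is an assumption.
def paragraph_key(paragraph: str, prime: int = 3) -> str:
    """Collect, in alphabetical order, the letters whose count is a multiple of `prime`."""
    counts = Counter(ch.lower() for ch in paragraph if ch.isalpha())
    return "".join(sorted(ch for ch, n in counts.items() if n % prime == 0))

if __name__ == "__main__":
    text = "Integrity of this paragraph is checked by comparing extracted keys."
    print(paragraph_key(text))   # substituting letters changes counts, hence the key
```

Because such a key depends only on letter counts, reordering letters or words leaves it unchanged, which is exactly the gap the Hamming-code step is meant to close.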
Reading paragraph (First) of Article 25 of the Penal Trial Principles Law of the Internal Security Forces leads us to say that there are personal and objective conditions governing the jurisdiction of the internal security forces courts. The personal condition, embodied in the perpetrator's status as a policeman, is not by itself sufficient to determine the jurisdiction of these courts, since the law is in essence a law of persons. It must be linked with the objective condition, which completes the jurisdiction of these courts by focusing on the essence of the crime and the interest protected by the criminal text, with respect to the link between the crime and the perpetrator's capacity as a policeman.
The huge number of documents on the internet has led to a rapidly growing need for text classification (TC), which is used to organize these text documents. In this research paper, a new model based on the Extreme Learning Machine (ELM) is used. The proposed model consists of several phases: preprocessing, feature extraction, Multiple Linear Regression (MLR), and ELM. The basic idea of the proposed model is built upon the calculation of feature weights using MLR. These feature weights, together with the extracted features, are introduced as input to the ELM, producing a weighted Extreme Learning Machine (WELM). The results showed the strong competence of the proposed WELM compared with the standard ELM.
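A minimal sketch of the feature-weighting idea follows: a multiple linear regression is fitted from document features to labels, and its absolute coefficients are used to weight the features before they reach the classifier. The toy data, variable names, and the use of least squares are assumptions; the paper's preprocessing, feature extraction, and ELM training are not shown.

```python
import numpy as np

# Hedged sketch: fit MLR from features to labels, use |coefficients| as feature weights.
rng = np.random.default_rng(0)
X = rng.random((20, 5))                               # 20 documents x 5 extracted features
y = (X[:, 0] + 0.5 * X[:, 2] > 0.8).astype(float)     # toy binary labels

X_aug = np.hstack([X, np.ones((X.shape[0], 1))])      # add an intercept column
beta, *_ = np.linalg.lstsq(X_aug, y, rcond=None)      # least-squares MLR fit
weights = np.abs(beta[:-1])                           # one weight per feature

X_weighted = X * weights                              # weighted features fed to the ELM
print(np.round(weights, 3))
```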
With the great diversity of contemporary critical approaches and visions, and the developments that have swept the field of graphic design, it has become imperative for those working in contemporary design to research and investigate in accordance with intellectual treatises and modern methods of criticism, because the design work requires both the designer and the recipient to know the mechanics of typographic text analysis in a world crowded with texts and images of varied vocabulary and graphics. The designer, before anyone else, manages the process of analysis in order to know what visual charges are being offered to others and what is intended behind them. In the midst of this world, the semiotic approaches directly overlap with such a diverse…
Building a system to identify individuals through their speech recordings can find application in diverse areas, such as telephone shopping, voice mail, and security control. However, building such systems is a tricky task because of the vast range of variation in the human voice, so selecting strong features becomes crucial for the recognition system. Therefore, a speaker recognition system based on new spin-image descriptors (SISR) is proposed in this paper. In the proposed system, circular windows (spins) are extracted from the frequency domain of the spectrogram image of the sound, and a run-length matrix is then built for each spin to serve as the base for the feature extraction tasks. Five different descriptors are generated from…
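The sketch below illustrates the general pipeline the abstract describes: build a magnitude spectrogram, cut out a circular window (a "spin"), and summarize it with a simple horizontal run-length count. The window size, gray-level count, and toy signal are assumptions; the actual SISR descriptors are not specified in the abstract.

```python
import numpy as np

# Hedged sketch: spectrogram -> circular window ("spin") -> toy run-length summary.
def spectrogram(signal, frame=128, hop=64):
    """Magnitude spectrogram from overlapping, Hann-windowed FFT frames."""
    frames = [signal[i:i + frame] * np.hanning(frame)
              for i in range(0, len(signal) - frame, hop)]
    return np.abs(np.fft.rfft(np.array(frames), axis=1)).T   # freq x time

def circular_window(img, center, radius):
    """Keep the values of `img` inside a disk (the 'spin'); NaN elsewhere."""
    rows, cols = np.ogrid[:img.shape[0], :img.shape[1]]
    mask = (rows - center[0]) ** 2 + (cols - center[1]) ** 2 <= radius ** 2
    return np.where(mask, img, np.nan)

def horizontal_run_lengths(spin, levels=8):
    """Count run lengths of quantized values along each row (toy run-length matrix)."""
    q = np.floor(np.nan_to_num(spin / (np.nanmax(spin) + 1e-12)) * (levels - 1)).astype(int)
    runs = []
    for row in q:
        length = 1
        for a, b in zip(row[:-1], row[1:]):
            if a == b:
                length += 1
            else:
                runs.append(length)
                length = 1
        runs.append(length)
    return np.bincount(runs)

audio = np.random.default_rng(1).standard_normal(8000)   # stand-in for a speech recording
spec = spectrogram(audio)
spin = circular_window(spec, center=(32, 20), radius=10)
print(horizontal_run_lengths(spin)[:10])
```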
The technological development in the field of information and communication has been accompanied by the emergence of security challenges related to the transmission of information, and encryption is a good solution. Encryption is one of the traditional methods of protecting plain text by converting it into an unintelligible form; it can be implemented using substitution techniques, shifting techniques, or mathematical operations. This paper proposes a method with two branches to encrypt text. The first branch is a new mathematical model to create and exchange keys; the proposed key exchange method is a development of Diffie-Hellman, a new mathematical model for exchanging keys based on prime numbers…
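Since the paper's variant is not described beyond being a development of Diffie-Hellman, the sketch below shows only the classic Diffie-Hellman exchange it builds on; the small prime and generator are toy values chosen for illustration.

```python
import secrets

# Classic Diffie-Hellman key exchange (not the paper's variant, which is not specified).
p = 0xFFFFFFFB                       # toy prime modulus; real systems use much larger primes
g = 5                                # public generator

a = secrets.randbelow(p - 2) + 1     # Alice's private key
b = secrets.randbelow(p - 2) + 1     # Bob's private key

A = pow(g, a, p)                     # Alice publishes A = g^a mod p
B = pow(g, b, p)                     # Bob publishes B = g^b mod p

shared_alice = pow(B, a, p)          # both sides derive the same shared secret
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob
```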
This research attempted to take advantage of modern techniques in studying the superstructural (suprasegmental) phonetic features of spoken text, using phonetic analysis programs to achieve more accurate and objective results, rather than relying only on self-perception and personal judgment, which vary from person to person.
It should be noted that these phonological features (stress "nabr", pause "waqf", and intonation) are performance controls that determine the fate of the meaning of a word or sentence; yet in the modern era they have received little attention, and what little attention some of them did receive came in studies of issues related to composition or style. We therefore recommend that more attention be given to the study of…
Exchange of information through communication channels can be unsafe; communication media are not secure for sending sensitive information, so it is necessary to protect information from disclosure to unauthorized persons. This research presents a method in which information security is achieved by hiding information in a cover image using the least significant bit (LSB) technique, after a text file has been encrypted using a secret sharing scheme. The positions for hiding the information in the cover image are then generated in a random manner, which makes the hidden data difficult to detect by image analysis or statistical analyses. This provides two levels of information security: encryption of the text file using the secret sharing…
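A minimal sketch of the LSB-embedding part is shown below; the secret-sharing encryption and the paper's random position generator are not specified, so a fixed-seed permutation stands in for the random positions and the message is embedded as raw bits.

```python
import numpy as np

def embed_lsb(cover: np.ndarray, message: bytes, seed: int = 42) -> np.ndarray:
    """Hide `message` in the least significant bits of randomly chosen pixels."""
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    flat = cover.flatten().copy()
    if bits.size > flat.size:
        raise ValueError("message too long for this cover image")
    positions = np.random.default_rng(seed).permutation(flat.size)[:bits.size]
    flat[positions] = (flat[positions] & 0xFE) | bits      # overwrite only the lowest bit
    return flat.reshape(cover.shape)

def extract_lsb(stego: np.ndarray, n_bytes: int, seed: int = 42) -> bytes:
    """Recover `n_bytes` by regenerating the same positions from the shared seed."""
    positions = np.random.default_rng(seed).permutation(stego.size)[:n_bytes * 8]
    return np.packbits(stego.flatten()[positions] & 1).tobytes()

cover = np.random.default_rng(0).integers(0, 256, size=(64, 64), dtype=np.uint8)
stego = embed_lsb(cover, b"secret share")
print(extract_lsb(stego, len(b"secret share")))            # b'secret share'
```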
Despite the multiplicity of institutions contributing to the decision-making process in the United States of America, these institutions interact to crystallize positions regarding international and strategic situations. The formulation of national security policy depends on a number of institutions that complement one another in order to achieve an advanced security posture; thus, the decision reflects the interaction of the existing regulatory institutions. This is because the essence of national security and the achievement of its requirements also stem from the existence of a coherent system of shared beliefs and principles in American society. Moreover, these elements are the bases for achieving…
Magnetized iron oxide nanoparticles (NPs) were prepared using Eucalyptus leaf extract and then coated with CTAB (cetrimonium bromide) to increase their efficiency. The prepared and modified NPs were characterized using AFM, FTIR, and X-ray techniques, and the adsorption of the reactive blue dye RB 238 on the coated NPs was investigated. The effect of various experimental factors, such as the initial dye concentration, the adsorbent dose, pH, and temperature, on the removal of RB 238 was studied. The best conditions for dye removal were found to be 298 K in an acidic medium of pH = 3 with an adsorbent dose of 0.15 g per 25 mg/L of dye, achieving the best color removal of 90% within 60 minutes. The pseudo-second-order…
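For reference, the pseudo-second-order kinetic model referred to here is conventionally written, in its linearized form, as

$$\frac{t}{q_t} = \frac{1}{k_2 q_e^{2}} + \frac{t}{q_e}$$

where $q_t$ and $q_e$ are the amounts adsorbed (mg/g) at time $t$ and at equilibrium, and $k_2$ is the pseudo-second-order rate constant (g/(mg·min)); the symbols follow standard adsorption-kinetics usage and are not taken from the paper itself.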