Data security is an important component of data communication and transmission systems. Its main role is to keep sensitive information confidential and intact from the sender to the receiver. The proposed system aims to secure text messages through two security principles: encryption and steganography. The system introduces a novel encryption method based on graph theory properties: it forms a graph from a password to generate an encryption key as the weight matrix of that graph, and it employs the Least Significant Bit (LSB) method to hide the encrypted message in the green component of a color image. Practical experiments on imperceptibility, capacity, and robustness were evaluated using similarity measures such as PSNR, MSE, and SSIM. These measures demonstrated the efficiency of the system in preserving image quality while hiding messages, with a PSNR above 85 dB, MSE ranging from 4.537e-05 to 5.27546e-04, and SSIM = 1.0, for cover files ranging from 256×300 to 1200×760 pixels and messages of 16 to 300 characters.
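A minimal sketch of the LSB hiding step described in this abstract, embedding message bits in the least significant bit of the green channel of a color image with Pillow and NumPy. The graph-based encryption stage is not shown, and the 16-bit length header and function names are illustrative assumptions rather than the paper's actual scheme.

```python
# Sketch: hide a (pre-encrypted) message in the LSBs of the green channel.
# The graph-theory encryption stage of the proposed system is omitted;
# the 16-bit length header is an illustrative convention, not the paper's.
import numpy as np
from PIL import Image

def embed_green_lsb(cover_path, message_bytes, stego_path):
    img = np.array(Image.open(cover_path).convert("RGB"))
    # Prefix a 16-bit length so the extractor knows how many bits to read.
    payload = len(message_bytes).to_bytes(2, "big") + message_bytes
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    green = img[:, :, 1].flatten()
    if bits.size > green.size:
        raise ValueError("message too long for this cover image")
    green[:bits.size] = (green[:bits.size] & 0xFE) | bits   # overwrite LSBs
    img[:, :, 1] = green.reshape(img.shape[:2])
    # Save in a lossless format (e.g. PNG) so the LSBs survive.
    Image.fromarray(img).save(stego_path)

def extract_green_lsb(stego_path):
    green = np.array(Image.open(stego_path).convert("RGB"))[:, :, 1].flatten()
    length = int.from_bytes(np.packbits(green[:16] & 1).tobytes(), "big")
    bits = green[16:16 + 8 * length] & 1
    return np.packbits(bits).tobytes()
```

Quality metrics such as PSNR, MSE, and SSIM can then be computed between the cover and stego images to check imperceptibility.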
The objective of all planning research is to plan for human comfort and safety, and one of the most significant natural hazards to which humans are exposed is earthquake risk; therefore, earthquake risks must be anticipated, and with the advancement of global technology it has become possible to obtain information on earthquake hazards. GIS has been used extensively in environmental assessment research because of its high potential, and it is a crucial application in seismic risk assessment. This paper examines the methodologies used in recent GIS-based seismic risk studies, their primary environmental impacts on urban areas, and the complexity of the relationship between the applied methodological approaches and the resulting env…
The huge amount of documents on the Internet has led to a rapid need for text classification (TC). TC is used to organize these text documents. In this research paper, a new model based on the Extreme Learning Machine (ELM) is used. The proposed model consists of several phases, including preprocessing, feature extraction, Multiple Linear Regression (MLR), and ELM. The basic idea of the proposed model is built upon the calculation of feature weights using MLR. These feature weights, together with the extracted features, are introduced as input to the ELM, producing a weighted Extreme Learning Machine (WELM). The results showed a great competence of the proposed WELM compared to the ELM.
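A minimal sketch of the weighted-ELM idea described above: multiple linear regression supplies per-feature weights that scale the features before a basic ELM (random hidden layer plus least-squares output weights) is trained. The weighting rule, dimensions, and toy data are illustrative assumptions, not the paper's exact scheme.

```python
# Sketch of a weighted ELM (WELM): MLR coefficients act as feature weights
# applied before a standard ELM. Toy data; the weighting rule is illustrative.
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((200, 20))                          # extracted text features (toy)
y = (X @ rng.random(20) > 5.0).astype(float).reshape(-1, 1)

# 1) Multiple linear regression: least-squares coefficients as feature weights.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
w = np.abs(coef).ravel()
Xw = X * w                                         # weighted features

# 2) Basic ELM: random input weights, sigmoid hidden layer, analytic output layer.
n_hidden = 50
W_in = rng.normal(size=(X.shape[1], n_hidden))
b = rng.normal(size=n_hidden)
H = 1.0 / (1.0 + np.exp(-(Xw @ W_in + b)))         # hidden-layer activations
beta = np.linalg.pinv(H) @ y                       # output weights via pseudo-inverse

pred = (H @ beta > 0.5).astype(float)
print("training accuracy:", (pred == y).mean())
```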
This research consists of an introduction, three topics, and a conclusion, as follows:
The first topic: the concept of Islamic banks and their emergence and development, which includes three requirements:
The first requirement: the concept and types of Islamic banks, which includes two points:
* Definition of Islamic banks linguistically and terminologically.
* Types of Islamic banks.
The second requirement: the emergence and development of Islamic banks.
The third requirement: the importance of Islamic banks and their objectives.
We learned about the concept of Islamic banks, their origins, how they developed, and the most important types of Islamic banks.
The second topic: Formulas and sources of financing in Islamic banks and
An automatic text summarization system mimics how humans summarize by picking the most significant sentences in a source text. However, the complexities of the Arabic language make it challenging to obtain information quickly and effectively. The main disadvantage of the traditional approaches is that they are strictly constrained (especially for the Arabic language) by the accuracy of sentence feature functions, weighting schemes, and similarity calculations. On the other hand, meta-heuristic search approaches have a feature tha…
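As a rough illustration of the extractive idea described above (picking the most significant sentences), a minimal word-frequency scoring sketch follows; the scoring scheme is an illustrative assumption and does not reproduce the paper's sentence feature functions or its meta-heuristic search.

```python
# Minimal extractive-summarization sketch: score sentences by normalized
# word frequency and keep the top-k, preserving source order. Illustrative only.
from collections import Counter

def summarize(text, k=2):
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    freq = Counter(text.lower().split())
    max_f = max(freq.values())

    def score(sent):
        tokens = sent.lower().split()
        return sum(freq[w] / max_f for w in tokens) / (len(tokens) or 1)

    top = set(sorted(sentences, key=score, reverse=True)[:k])
    return ". ".join(s for s in sentences if s in top) + "."

print(summarize("Text summarization picks key sentences. It keeps the source order. "
                "Redundant details are dropped. The goal is a shorter faithful text."))
```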
A content-based image retrieval (CBIR) system is a technique used to retrieve images from an image database. However, the CBIR process suffers from low retrieval accuracy on extensive image databases and does not ensure the privacy of images. This paper aims to address the accuracy issue using deep learning techniques, namely the CNN method. It also provides the necessary privacy for images using the fully homomorphic encryption scheme of Cheon, Kim, Kim, and Song (CKKS). To achieve these aims, a system named RCNN_CKKS has been proposed, which includes two parts. The first part (offline processing) extracts automated high-level features based on a flatten layer in a convolutional neural network (CNN) and then stores these features in a…
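A minimal sketch of the privacy step described above, encrypting a feature vector with CKKS. It assumes the TenSEAL library (the abstract does not name a specific implementation), and the random vector stands in for the CNN flatten-layer output; all parameter choices are illustrative, not the RCNN_CKKS system's actual settings.

```python
# Sketch: CKKS-encrypt a CNN feature vector before storing or comparing it.
# Assumes the TenSEAL library (pip install tenseal); parameters are illustrative.
import numpy as np
import tenseal as ts

# CKKS context: polynomial degree and coefficient moduli control precision/depth.
context = ts.context(ts.SCHEME_TYPE.CKKS,
                     poly_modulus_degree=8192,
                     coeff_mod_bit_sizes=[60, 40, 40, 60])
context.global_scale = 2 ** 40
context.generate_galois_keys()

# Stand-in for the flatten-layer output of a CNN.
features = np.random.rand(128).tolist()
enc_features = ts.ckks_vector(context, features)     # encrypted feature vector

# Encrypted similarity-style computation, e.g. inner product with a query vector.
query = np.random.rand(128).tolist()
enc_score = enc_features.dot(query)                   # ciphertext-plaintext dot product
print(enc_score.decrypt()[0])                         # decrypted here for verification only
```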
A non-stationary series is always a problem for statistical analysis, as some theoretical work has explained: the properties of statistical regression analysis are lost when non-stationary series are used, and the estimated slope describes a spurious relationship between the variables under consideration. A non-stationary series can become stationary by adding a time variable to the multivariate analysis to remove the general trend, as well as seasonal dummy variables to remove the seasonal effect, by converting the data to an exponential or logarithmic form, or by applying the difference d repeated times, in which case the series is said to be integrated of order d. The theoretical side of the research is presented in parts; in the first part, the research methodology ha…
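A minimal sketch of the differencing idea described above (applying the difference operator d times to obtain a series integrated of order d), together with the trend-plus-seasonal-dummies alternative; the simulated data and variable names are illustrative.

```python
# Sketch: a trending, seasonal (hence non-stationary) series made stationary
# by applying the difference operator d times. Data and choices are illustrative.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n, d = 120, 1
t = np.arange(n)
y = 0.5 * t + 10 * np.sin(2 * np.pi * t / 12) + np.cumsum(rng.normal(size=n))
series = pd.Series(y)

diffed = series.copy()
for _ in range(d):
    diffed = diffed.diff().dropna()      # one application of the difference operator

print("variance before:", series.var(), "after differencing:", diffed.var())

# Alternative mentioned above: remove trend and seasonality with a time variable
# and seasonal dummy variables instead of differencing.
X = pd.get_dummies(pd.Series(t % 12, name="season"), prefix="s", dtype=float)
X["time"] = t
```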
The word "text" has many connotations in the Arabic language, such as vowel points, designation, and completion, and the original meaning of the text is to show. The Western text has its own independent semantic unit. The biblical texts are a mixture of what was reported by the Prophet Moses (peace be upon him) and what the authors wrote as texts over many centuries. The meaning of the text is guidance and impulsion, and it is a natural connotation. The religious text for Muslims is divided into peremptory texts that constitute conclusive proof. The evidence for the meaning of the text is established by language, and it is not required that the researcher be a jurist. The approach is a factual questionnaire conducted by the researcher according to a speci…
Effect of Using Computers on the Acquisition and Retention of Information among First-Stage Students in the Biology Subject, MIAAD NATHIM RASHEED, Lecturer. Abstract: The aim of the present research is to determine the influence of computer use on the acquisition and retention of information for first-class students in the biology subject. To achieve this, the researcher formulated several null hypotheses, as follows: there are no statistically significant differences at the (0.05) level between the average marks of students who studied using the computer and the average marks of students who studied by the classical method, in acquisition and retention. The researcher chose the purposive sample from the medical technical institute, which included two branches; the first class (A…
The reciprocal relationship between the text and the mask in the printed product is one of the most important relationships framing the level of communication between the appearance and the interior, although some designers and publishing institutions pay it little attention. Therefore, the research problem is determined by the following question: What is the dialectical relationship between the text and the mask in the design of literary book covers? The research aims to shed light on this problematic relationship at the level of reception and aesthetics at the same moment. The theoretical framework included two sections: the first (mask and text ... the concept and the mutual relationship), while the second section (trends in the…
In recent years, social media has grown widely and visibly as a medium for users to express their emotions and feelings through thousands of posts and comments related to tourism companies. As a consequence, it has become difficult for tourists to read all the comments and determine whether these opinions are positive or negative in order to assess the success of a tourism company. In this paper, a modest model is proposed to assess e-tourism companies using Iraqi-dialect reviews collected from Facebook. The reviews are analyzed using text mining techniques for sentiment classification. The generated sentiment words are classified into positive, negative, and neutral comments by utilizing Rough Set Theory, Naïve Bayes, and K-Nearest Neighbor…
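A minimal sketch of the sentiment-classification step described above, using bag-of-words features with Naive Bayes and K-Nearest Neighbor classifiers from scikit-learn. The toy English reviews stand in for the Iraqi-dialect Facebook comments, and the Rough Set Theory classifier used in the paper is not reproduced here.

```python
# Sketch: bag-of-words sentiment classification with Naive Bayes and KNN.
# Toy English data stands in for the Iraqi-dialect reviews; illustrative only.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.neighbors import KNeighborsClassifier

reviews = ["excellent trip and great service", "terrible hotel, never again",
           "average tour, nothing special", "amazing guide, wonderful experience"]
labels = ["positive", "negative", "neutral", "positive"]

vec = CountVectorizer()
X = vec.fit_transform(reviews)

nb = MultinomialNB().fit(X, labels)
knn = KNeighborsClassifier(n_neighbors=3).fit(X, labels)

test = vec.transform(["great service and wonderful staff"])
print(nb.predict(test), knn.predict(test))
```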