This paper presents a fast and robust approach to English text encryption and decryption based on the Pascal matrix. The technique encrypts Arabic text, English text, or both, and the results show what happens when the method is applied to plain text (the original message): the intelligible plain text is transformed into unintelligible ciphertext, securing the information against unauthorized access and theft. The encryption scheme uses a pseudo-random encryption key generated by an algorithm, and all of this is done using the Pascal matrix. Encryption and decryption are implemented in MATLAB, with Notepad++ used to write the input text.
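As a rough illustration of the idea, the sketch below (in Python rather than the paper's MATLAB, and not the paper's exact algorithm) packs character codes into blocks, multiplies them by a symmetric Pascal matrix acting as the key, and inverts the product to decrypt. Because the Pascal matrix has determinant 1, its inverse is also an integer matrix and the round trip is exact.

```python
# A minimal sketch of Pascal-matrix block encryption (assumed scheme,
# not the paper's exact algorithm).
import numpy as np
from scipy.linalg import pascal, invpascal

def encrypt(text, n=4):
    # map characters (Arabic or English) to integer codes and pad to blocks
    codes = [ord(c) for c in text]
    codes += [0] * (-len(codes) % n)
    blocks = np.array(codes).reshape(-1, n)
    P = pascal(n, kind='symmetric').astype(np.int64)   # the key matrix
    return blocks @ P                                  # ciphertext blocks

def decrypt(cipher, n=4):
    # det(P) = 1, so the inverse Pascal matrix is exactly integer
    Pinv = invpascal(n, kind='symmetric').astype(np.int64)
    codes = (cipher @ Pinv).ravel()
    return ''.join(chr(int(c)) for c in codes if c != 0)

c = encrypt("Hello")
print(c)            # unintelligible integer blocks
print(decrypt(c))   # -> "Hello"
```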
This research aims at studying the websites of Iraqi ministries to determine the extent to which electronic communication is used in the practice of public relations activities through these sites, which represent a formal means of communication between each ministry and its public.
The research consists of three chapters: chapter one presents the methodological framework of the research; chapter two includes three units: unit one studies technologies of electronic communication, including their concept, features, and types; unit two studies electronic publications, i.e., their concept and features; and unit three deals with designing electronic websites. It ends with chapter three, which is divided into two sections: section one studies the
The purpose of this paper is to apply robustness in linear programming (LP) to overcome the problem of uncertainty in constraint parameters and to find the robust optimal solution that maximizes the profits of the General Productive Company of Vegetable Oils for the year 2019. This is done by modifying a mathematical linear programming model in which some parameters have uncertain values, and processing it using the robust counterpart of linear programming to obtain results that are immune to the random changes in the uncertain values of the problem, assuming these values belong to the uncertainty set and selecting the values that cause the worst results and to depend buil
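As a toy illustration of the robust-counterpart idea (with invented numbers, not the company's actual model), the sketch below maximizes profit under a resource constraint whose coefficients are only known to lie in a box; with non-negative variables, the worst case is attained at the upper end of the box, so the robust counterpart simply tightens the constraint coefficients.

```python
# A toy robust-counterpart LP under box uncertainty (illustrative data).
from scipy.optimize import linprog

c     = [-40, -30]          # linprog minimizes, so profits are negated
a_nom = [2.0, 1.0]          # nominal resource use per unit product
delta = [0.3, 0.2]          # uncertainty radius on each coefficient
b     = 100.0               # resource available

# with x >= 0, the worst case of a@x <= b over the box is a_nom + delta
a_worst = [an + d for an, d in zip(a_nom, delta)]

nominal = linprog(c, A_ub=[a_nom],   b_ub=[b], bounds=[(0, None)] * 2)
robust  = linprog(c, A_ub=[a_worst], b_ub=[b], bounds=[(0, None)] * 2)

print("nominal profit:", -nominal.fun)   # optimistic, may become infeasible
print("robust profit: ", -robust.fun)    # guaranteed for any a in the box
```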
For many problems in physics and computational fluid dynamics (CFD), providing an accurate approximation of derivatives is a challenging task. This paper presents a class of high-order numerical schemes for approximating the first derivative. These approximations are derived by solving a special system of equations with some unknown coefficients. The construction method provides numerous types of schemes with different orders of accuracy. The accuracy of each scheme is analyzed using Fourier analysis, which illustrates its dispersion and dissipation. The polynomial technique is used to verify the order of accuracy of the proposed schemes by obtaining the error terms. Dispersion and dissipation errors are calculated
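One standard way to set up such a system of equations (not necessarily the paper's exact construction) is to match Taylor expansions on a stencil, as sketched below: the weighted combination is required to reproduce the first derivative and annihilate the other Taylor terms, which yields the classic fourth-order central-difference weights.

```python
# Deriving first-derivative weights on the stencil s = (-2,-1,0,1,2):
# require sum_j w_j * s_j^k / k! = 1 for k = 1 and 0 for other k < 5.
import numpy as np
from math import factorial

offsets = np.array([-2, -1, 0, 1, 2], dtype=float)
n = len(offsets)
A = np.array([offsets**k / factorial(k) for k in range(n)])
rhs = np.zeros(n); rhs[1] = 1.0            # pick out the 1st derivative
w = np.linalg.solve(A, rhs)
print(w)   # ~ [ 1/12, -2/3, 0, 2/3, -1/12 ]

# quick check on f(x) = sin(x) at x = 0 with grid spacing h
h = 0.1
approx = np.dot(w, np.sin(offsets * h)) / h
print(approx)   # ~ cos(0) = 1, with O(h^4) error
```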
Integrated project delivery (IPD) collaboratively applies the skills and knowledge of all participants to optimize the project's results, increase owner value, decrease waste, and maximize efficiency during the design, fabrication, and construction processes. This study aims to determine the IPD criteria positively impacting value engineering. To do this, the study considered 9 main criteria, according to the PMP classification that already covers all project phases, and 183 sub-criteria obtained from the theoretical study and expert interviews (fieldwork). In this study, the SPSS (V26) program was used to analyze the priorities of the main criteria and sub-criteria from top to bottom according to their values of the Relative Importance Index
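The abstract does not spell out the ranking formula, but the Relative Importance Index commonly used with such survey data is RII = ΣW / (A × N), where W are the respondents' ratings, A is the highest point on the rating scale, and N is the number of respondents. A minimal sketch, assuming a 1-5 Likert scale:

```python
# Relative Importance Index (assumed standard formula, not quoted
# from the paper): RII = sum(W) / (A * N).
def rii(ratings, scale_max=5):
    return sum(ratings) / (scale_max * len(ratings))

# e.g. 10 experts rate one sub-criterion on a 1-5 Likert scale
print(rii([5, 4, 4, 5, 3, 4, 5, 4, 4, 5]))   # -> 0.86
```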
The limited capability of the two-circle diffractometer, which can collect intensity measurements only down to the monoclinic system, has been extended in a novel procedure to collect intensities for the triclinic system. The procedure involves deriving matrix elements from a graphical representation of the reciprocal lattice. The offset of the origins of the upper layers from that of the zero layer, characteristic of the triclinic system, is determined, and the 3 × 3 matrix elements are evaluated accordingly. Details of crystal alignment by X-rays for the triclinic system, utilizing the intensities of equivalent reflections, are described.
In data mining, classification is a form of data analysis that can be used to extract models describing important data classes. Two of the well-known algorithms used in data mining classification are the Backpropagation Neural Network (BNN) and Naïve Bayesian (NB). This paper investigates the performance of these two classification methods on the Car Evaluation dataset. A model was built for each algorithm and the results were compared. Our experimental results indicate that the BNN classifier yields higher accuracy than the NB classifier, but it is less efficient because it is time-consuming and difficult to analyze due to its black-box implementation.
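A comparable experiment is easy to reproduce in scikit-learn (an illustrative setup, not the paper's exact models): the UCI Car Evaluation data has six categorical features, which are ordinally encoded below and fed to a backpropagation-trained network (MLPClassifier) and a naïve Bayes classifier (CategoricalNB).

```python
# Illustrative BNN-vs-NB comparison on the UCI Car Evaluation dataset.
import pandas as pd
from sklearn.preprocessing import OrdinalEncoder
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import CategoricalNB
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

url = "https://archive.ics.uci.edu/ml/machine-learning-databases/car/car.data"
cols = ["buying", "maint", "doors", "persons", "lug_boot", "safety", "class"]
df = pd.read_csv(url, names=cols)

X = OrdinalEncoder().fit_transform(df[cols[:-1]]).astype(int)
y = df["class"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for model in (CategoricalNB(), MLPClassifier(max_iter=1000, random_state=0)):
    model.fit(X_tr, y_tr)
    print(type(model).__name__, accuracy_score(y_te, model.predict(X_te)))
```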
In recent years, the number of applications utilizing mobile wireless sensor networks (WSNs) has increased, with the intent of localization for the purposes of monitoring and obtaining data from hazardous areas. The location of an event is critical in a WSN, as sensed data are almost meaningless without location information. In this paper, two Monte Carlo based localization schemes, termed MCL and MSL*, are studied. MCL obtains its location through anchor nodes, whereas MSL* uses both anchor nodes and normal nodes. The use of normal nodes increases accuracy and reduces dependency on anchor nodes, but increases communication costs. For this reason, we introduce a new approach, called low communication cost schemes, to reduce communication costs
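A minimal sketch of the Monte Carlo localization idea underlying MCL (simplified; the actual schemes add resampling loops and two-hop anchor filtering): a node keeps a set of candidate positions, predicts them forward under a maximum speed, and filters out candidates that are inconsistent with the anchors it currently hears.

```python
# Simplified one-step Monte Carlo localization (illustrative only).
import random, math

def mcl_step(particles, heard_anchors, v_max, r, n=50):
    # prediction: each candidate may have moved up to v_max since last step
    moved = []
    for (x, y) in particles:
        ang = random.uniform(0, 2 * math.pi)
        d = random.uniform(0, v_max)
        moved.append((x + d * math.cos(ang), y + d * math.sin(ang)))
    # filtering: keep candidates within radio range r of every heard anchor
    ok = [p for p in moved
          if all(math.dist(p, a) <= r for a in heard_anchors)]
    # repopulate the candidate set (real MCL resamples until it is full)
    while ok and len(ok) < n:
        ok.append(random.choice(ok))
    return ok or moved   # if everything was filtered out, keep the prediction

particles = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(50)]
est = mcl_step(particles, heard_anchors=[(40, 60), (55, 45)], v_max=5, r=25)
print(len(est), est[:2])
```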
Sentiment analysis is one of the major fields in natural language processing, whose main task is to extract sentiments, opinions, attitudes, and emotions from subjective text. Because of its importance for decision making and for people's trust in reviews on websites, there is a large body of academic research addressing sentiment analysis problems. Deep learning (DL) is a powerful machine learning (ML) technique that has emerged with its ability to represent features and differentiate data, leading to state-of-the-art prediction results. In recent years, DL has been widely used in sentiment analysis; however, its implementation in the Arabic language field is scarce. Most previous research addresses other languages
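A minimal sketch of such a DL sentiment classifier (illustrative only: an Arabic model would need an Arabic corpus and tokenizer, whereas this uses the English IMDB reviews bundled with Keras): an embedding layer feeding an LSTM with a sigmoid output.

```python
# Tiny embedding + LSTM sentiment classifier (illustrative setup).
import tensorflow as tf

num_words, max_len = 10000, 200
(x_tr, y_tr), (x_te, y_te) = tf.keras.datasets.imdb.load_data(num_words=num_words)
x_tr = tf.keras.preprocessing.sequence.pad_sequences(x_tr, maxlen=max_len)
x_te = tf.keras.preprocessing.sequence.pad_sequences(x_te, maxlen=max_len)

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(num_words, 64),       # learned word vectors
    tf.keras.layers.LSTM(32),                       # sequence features
    tf.keras.layers.Dense(1, activation="sigmoid")  # positive / negative
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x_tr, y_tr, epochs=2, batch_size=128, validation_data=(x_te, y_te))
```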