This paper presents a fast and robust approach to English text encryption and decryption based on the Pascal matrix. It describes the technique for encrypting Arabic or English text, or both, shows the result of applying the method to plain text (the original message), and explains how intelligible plain text is transformed into unintelligible ciphertext in order to protect information from unauthorized access and theft. The encryption scheme uses a pseudo-random encryption key generated by an algorithm, and all of this is done using a Pascal matrix. Encryption and decryption are implemented in MATLAB as the programming language, with Notepad++ used to write the input text.
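As a rough illustration of the general idea only (not the paper's exact algorithm, and in Python rather than the MATLAB implementation the abstract mentions), the sketch below encrypts blocks of character codes by multiplying them with a lower-triangular Pascal matrix and decrypts with its integer-valued inverse; the function names and block size are hypothetical.

```python
import numpy as np
from math import comb

def pascal_matrix(n):
    # Lower-triangular Pascal matrix of binomial coefficients C(i, j).
    return np.array([[comb(i, j) for j in range(n)] for i in range(n)], dtype=np.int64)

def encrypt(text, n=4):
    codes = [ord(c) for c in text]
    codes += [0] * (-len(codes) % n)          # pad to a multiple of the block size
    P = pascal_matrix(n)
    blocks = np.array(codes, dtype=np.int64).reshape(-1, n)
    return blocks @ P.T                        # each block becomes an unintelligible vector

def decrypt(cipher, n=4):
    # The inverse of a lower-triangular Pascal matrix has integer entries,
    # so the original character codes are recovered exactly.
    P_inv = np.linalg.inv(pascal_matrix(n))
    codes = np.rint(cipher @ P_inv.T).astype(int).flatten()
    return ''.join(chr(c) for c in codes if c != 0)

cipher = encrypt("secret message")
print(decrypt(cipher))                         # -> "secret message"
```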
Dust storms occur in barren and dry regions all over the world. They may be caused by intense ground winds that lift dust and sand from soft, arid land surfaces, making it rise into the air. These phenomena can have harmful effects on health, climate, infrastructure, and transportation. GIS and remote sensing have played a key role in studying dust detection. This study was conducted in Iraq with the objective of validating dust detection. These techniques were used to derive dust indices, the Normalized Difference Dust Index (NDDI) and the Middle East Dust Index (MEDI), which are based on MODIS images, together with in-situ observations based on hourly wind data.
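For context, the NDDI is commonly computed as a normalized difference between a shortwave-infrared and a blue MODIS reflectance band; the snippet below is a minimal Python sketch of that calculation (the study reports results from MODIS imagery and does not provide code), assuming the two bands are already loaded as arrays.

```python
import numpy as np

def nddi(swir_2130, blue_469):
    """Normalized Difference Dust Index from MODIS reflectances.

    NDDI = (rho_2.13um - rho_0.469um) / (rho_2.13um + rho_0.469um);
    higher values typically indicate dust-laden pixels.
    """
    swir = np.asarray(swir_2130, dtype=float)
    blue = np.asarray(blue_469, dtype=float)
    return (swir - blue) / (swir + blue + 1e-12)   # epsilon avoids division by zero

# Hypothetical reflectance values for two pixels (dusty vs. clear).
print(nddi([0.45, 0.20], [0.15, 0.18]))
```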
This paper presents an attempt to model the rate of penetration (ROP) for an Iraqi oil field with the aid of mud logging data. Data from the Umm Radhuma formation were selected for this modeling; they include weight on bit, rotary speed, flow rate, and mud density. A statistical approach was applied to these data to improve rate of penetration modeling. As a result, an empirical linear ROP model was developed with good fitness when compared with actual data. A nonlinear regression analysis of different functional forms was also attempted, and the results showed that the power model has good predictive capability relative to the other forms.
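A power-law ROP model of the kind referred to above can be fitted by ordinary least squares in log space; the sketch below is a hedged Python illustration with made-up sample data (not the field data used in the paper), using weight on bit (WOB), rotary speed (RPM), flow rate (Q), and mud weight (MW) as predictors.

```python
import numpy as np

# Hypothetical mud-logging records: WOB (klb), RPM, flow rate (gpm), mud weight (ppg), ROP (ft/hr).
wob = np.array([10.0, 12.5, 15.0, 18.0, 20.0, 22.0])
rpm = np.array([90.0, 100.0, 110.0, 120.0, 130.0, 140.0])
q   = np.array([600.0, 620.0, 650.0, 680.0, 700.0, 720.0])
mw  = np.array([9.5, 9.6, 9.8, 10.0, 10.2, 10.4])
rop = np.array([25.0, 30.0, 36.0, 44.0, 50.0, 57.0])

# The power model ROP = a * WOB^b1 * RPM^b2 * Q^b3 * MW^b4 becomes linear after taking logs:
# ln(ROP) = ln(a) + b1*ln(WOB) + b2*ln(RPM) + b3*ln(Q) + b4*ln(MW)
X = np.column_stack([np.ones_like(wob), np.log(wob), np.log(rpm), np.log(q), np.log(mw)])
coeffs, *_ = np.linalg.lstsq(X, np.log(rop), rcond=None)

a = np.exp(coeffs[0])
print("a =", a, "exponents =", coeffs[1:])
print("predicted ROP =", a * wob**coeffs[1] * rpm**coeffs[2] * q**coeffs[3] * mw**coeffs[4])
```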
This research includes a structural interpretation of the Yamama Formation (Lower Cretaceous) and the Naokelekan Formation (Jurassic) using 2D seismic reflection data from the Tuba oil field region, Basrah, southern Iraq. The two reflectors (Yamama and Naokelekan) were defined and picked as peak and trough in the 2D seismic reflection interpretation process, based on the synthetic seismogram and well log data. To obtain the structural setting, these horizons were traced across the whole region. Two-way travel-time maps, depth maps, and velocity maps were produced for the top Yamama and top Naokelekan formations. The study concluded that certain longitudinal closures reflect anticlines in the east and west of the study area.
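One routine step behind the depth maps mentioned above is converting two-way travel time (TWT) to depth with a velocity model; the sketch below is a simplified Python illustration (not the interpretation software actually used) of the standard relation depth = velocity × TWT / 2, applied to a small grid of hypothetical values.

```python
import numpy as np

def twt_to_depth(twt_ms, avg_velocity_ms):
    """Convert two-way travel time (ms) to depth (m) using average velocity (m/s).

    The wave travels down and back, so depth = v * t / 2.
    """
    twt_s = np.asarray(twt_ms, dtype=float) / 1000.0
    return np.asarray(avg_velocity_ms, dtype=float) * twt_s / 2.0

# Hypothetical TWT picks (ms) for one horizon and an average-velocity grid (m/s).
twt = np.array([[2100.0, 2150.0], [2200.0, 2250.0]])
vel = np.array([[3500.0, 3520.0], [3550.0, 3580.0]])
print(twt_to_depth(twt, vel))   # depth map in metres
```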
A band ratioing method is applied to calculate the Salinity Index (SI) and the Normalized Multi-Band Drought Index (NMDI) as pre-processing for agricultural decision-making in these areas. To separate the land from the other features in the scene, a classical classification method (maximum likelihood classification) is used, classifying the study area into several classes: healthy vegetation (HV), grasslands (GL), water (W), urban (U), and bare soil (BS). A Landsat 8 satellite image of an area in the south of Iraq is used, where the land cover is classified according to the indicator ranges of the SI and NMDI.
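The two indices can be computed per pixel from Landsat 8 bands; the sketch below is a minimal Python illustration, assuming one common form of the salinity index (the square root of the blue-red band product; several variants exist and the paper may use a different one) and the usual NMDI definition based on NIR and the two SWIR bands.

```python
import numpy as np

def salinity_index(blue, red):
    # One common form of the salinity index: SI = sqrt(blue * red).
    return np.sqrt(np.asarray(blue, float) * np.asarray(red, float))

def nmdi(nir, swir1, swir2):
    # Normalized Multi-Band Drought Index:
    # NMDI = (NIR - (SWIR1 - SWIR2)) / (NIR + (SWIR1 - SWIR2))
    nir, swir1, swir2 = (np.asarray(b, float) for b in (nir, swir1, swir2))
    diff = swir1 - swir2
    return (nir - diff) / (nir + diff + 1e-12)

# Hypothetical Landsat 8 surface reflectances for two pixels
# (bands 2, 4, 5, 6, 7 = blue, red, NIR, SWIR1, SWIR2).
blue, red = [0.12, 0.10], [0.20, 0.15]
nir   = [0.30, 0.35]
swir1 = [0.28, 0.22]
swir2 = [0.18, 0.16]
print("SI  :", salinity_index(blue, red))
print("NMDI:", nmdi(nir, swir1, swir2))
```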
The performance quality and search speed of a Block Matching (BM) algorithm are affected by the shapes and sizes of the search patterns it uses. In this paper, Kite Cross Hexagonal Search (KCHS) is proposed. This algorithm uses different search patterns (kite, cross, and hexagonal) to search for the best Motion Vector (MV). In the first step, KCHS uses a cross search pattern. In the second step, it uses one of the kite search patterns (up, down, left, or right, depending on the first step). In subsequent steps, it uses large/small Hexagonal Search (HS) patterns. The new algorithm is compared with several known fast block matching algorithms. Comparisons are based on the number of search points and the Peak Signal to Noise Ratio (PSNR). According to resul
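To make the building blocks concrete, the sketch below (plain Python/NumPy, not the KCHS implementation from the paper) shows a single cross-pattern search step around a candidate motion vector using the sum of absolute differences (SAD) as the matching cost, together with the PSNR metric used for comparison; block size and frame contents are hypothetical.

```python
import numpy as np

def sad(block, ref, x, y):
    # Sum of absolute differences between a block and a reference patch at (x, y).
    h, w = block.shape
    return np.abs(block.astype(int) - ref[y:y+h, x:x+w].astype(int)).sum()

def cross_search_step(block, ref, x, y, step=1):
    """Evaluate the centre and the four cross-pattern neighbours; return the best (x, y)."""
    h, w = block.shape
    candidates = [(x, y), (x + step, y), (x - step, y), (x, y + step), (x, y - step)]
    candidates = [(cx, cy) for cx, cy in candidates
                  if 0 <= cx <= ref.shape[1] - w and 0 <= cy <= ref.shape[0] - h]
    return min(candidates, key=lambda c: sad(block, ref, c[0], c[1]))

def psnr(original, reconstructed):
    mse = np.mean((original.astype(float) - reconstructed.astype(float)) ** 2)
    return float('inf') if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

# Hypothetical 8x8 block cut from a random reference frame.
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, (64, 64), dtype=np.uint8)
block = ref[10:18, 12:20]                     # best match is at (12, 10)
print(cross_search_step(block, ref, 11, 10))  # one cross step finds (12, 10)
print(psnr(block, ref[10:18, 12:20]))         # identical blocks -> inf
```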
This research aims at studying the websites of Iraqi ministries to determine the extent to which electronic communication is used to practise public relations activities through these sites, which represent a formal means of communication between each ministry and its public.
The research consists of three chapters. Chapter one presents the methodological framework of the research. Chapter two includes three units: unit one studies the technologies of electronic communication, including its concept, features, and types; unit two studies electronic publishing, i.e. its concept and features; and unit three deals with the design of electronic websites. Chapter three is divided into two sections: section one studies the
In data mining, classification is a form of data analysis that can be used to extract models describing important data classes. Two well-known algorithms used in data mining classification are the Backpropagation Neural Network (BNN) and Naïve Bayesian (NB) classifiers. This paper investigates the performance of these two classification methods using the Car Evaluation dataset. A model was built for each algorithm and the results were compared. Our experimental results indicate that the BNN classifier yields higher accuracy than the NB classifier, but it is less efficient because it is time-consuming and difficult to analyze due to its black-box nature.
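A comparison of this kind can be reproduced with scikit-learn; the sketch below is a hedged Python illustration (not the authors' setup) that trains a small multilayer perceptron as the backpropagation network and a Naïve Bayes classifier on one-hot-encoded categorical features, then compares their accuracy. The Car Evaluation dataset would need to be downloaded separately; a tiny synthetic categorical dataset stands in for it here.

```python
import numpy as np
from sklearn.preprocessing import OneHotEncoder
from sklearn.neural_network import MLPClassifier
from sklearn.naive_bayes import BernoulliNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Tiny synthetic stand-in for a categorical dataset such as Car Evaluation.
rng = np.random.default_rng(42)
X_raw = rng.choice(["low", "med", "high"], size=(300, 4))
y = (X_raw == "high").sum(axis=1) >= 2            # label derived from the categorical features

X = OneHotEncoder().fit_transform(X_raw).toarray()
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

bnn = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X_tr, y_tr)
nb = BernoulliNB().fit(X_tr, y_tr)

print("BNN accuracy:", accuracy_score(y_te, bnn.predict(X_te)))
print("NB accuracy :", accuracy_score(y_te, nb.predict(X_te)))
```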
Integrated project delivery (IPD) collaboratively applies the skills and knowledge of all participants to optimize the project's results, increase owner value, decrease waste, and maximize efficiency during the design, fabrication, and construction processes. This study aims to determine the IPD criteria that positively impact value engineering. To do this, the study considered 9 main criteria, according to the PMP classification, that already cover all project phases, and 183 sub-criteria obtained from the theoretical study and expert interviews (fieldwork). The SPSS (V26) program was used to analyze the main-criteria and sub-criteria priorities, ranked from top to bottom according to their values of the Relative Importance Index (RII).
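Ranking criteria in this way typically relies on the Relative Importance Index; the sketch below (Python, with hypothetical survey responses on a 1-5 Likert scale rather than the study's data) shows the standard computation RII = ΣW / (A × N), where W are the respondents' ratings, A is the highest possible rating, and N is the number of respondents.

```python
import numpy as np

def relative_importance_index(ratings, highest=5):
    """RII = sum(W) / (A * N) for one criterion, on a 1..A Likert scale."""
    ratings = np.asarray(ratings, dtype=float)
    return ratings.sum() / (highest * ratings.size)

# Hypothetical responses from 10 experts for three criteria.
criteria = {
    "cost":     [5, 4, 5, 4, 5, 5, 4, 3, 5, 4],
    "schedule": [4, 4, 3, 4, 5, 3, 4, 4, 3, 4],
    "quality":  [3, 3, 4, 2, 3, 4, 3, 3, 2, 3],
}
ranked = sorted(((name, relative_importance_index(r)) for name, r in criteria.items()),
                key=lambda item: item[1], reverse=True)
for name, rii in ranked:
    print(f"{name}: RII = {rii:.3f}")
```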