Today, large amounts of geospatial data are available on the web from services such as Google Map (GM), OpenStreetMap (OSM), the Flickr service, Wikimapia and others. These services are collectively referred to as open-source geospatial data. Geospatial data from different sources often varies in accuracy because of differing data-collection methods; consequently, the accuracy may not meet the requirements of every organization. This paper aims to develop a tool to assess the quality of GM data by comparing it with formal data, such as the spatial data held by the Mayoralty of Baghdad (MB). The tool was developed in the Visual Basic language and validated on two different study areas in Baghdad, Iraq (Al-Karada and Al-Kadhumiyah). The positional accuracy was assessed.
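A common measure for the positional-accuracy comparison described above is the root-mean-square error (RMSE) of offsets between matched points in the two datasets. The sketch below assumes hypothetical matched coordinates; the point values are illustrative, not the study's data.

```python
import math

# Hypothetical matched control points: (x, y) from the GM dataset versus the
# formal MB dataset. Coordinates are illustrative placeholders only.
gm_points = [(445120.0, 3686540.0), (445980.5, 3687012.3), (446733.1, 3685890.7)]
mb_points = [(445118.4, 3686542.1), (445979.0, 3687010.0), (446735.0, 3685893.2)]

def positional_rmse(a, b):
    """Root-mean-square error of planimetric offsets between matched points."""
    sq = [(ax - bx) ** 2 + (ay - by) ** 2 for (ax, ay), (bx, by) in zip(a, b)]
    return math.sqrt(sum(sq) / len(sq))

print(round(positional_rmse(gm_points, mb_points), 3))
```

A lower RMSE indicates closer agreement between the open-source and formal datasets.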
The aim of this study is to design a proposed model for a document that insures against errors of the medical profession when estimating compensation for medical mistakes. The medical profession is an honest profession aimed primarily at serving people, yet the doctor may still be subject to error. The research adopted the descriptive approach and reached several conclusions, the most prominent of which are: no single party bears the responsibility for a medical error, although the responsibility is shared and the doctor contributes to it; doctors do not deal with patients according to their educational and cultural level; some doctors do not inform patients; and there is an absence of a document to insure against such errors.
Within this paper, we developed a new series of organic chromophores based on triphenylamine (TPA) (AL1, AL-2, AL-11 and AL-22) by engineering the structure of the electron-donor (D) unit, either replacing a phenyl ring or inserting thiophene as a π-linkage. To scrutinize the impact of the TPA donating ability and of the spacer on the photovoltaic, absorption, energetic and geometrical characteristics of these sensitizers, density functional theory (DFT) and time-dependent DFT (TD-DFT) were utilized. According to the structural characteristics, incorporating the acceptor, the π-bridge and TPA does not result in a perfectly coplanar conformation in AL-22. We computed EHOMO, ELUMO and band-gap (Eg) energies by performing frequency analysis.
This study examined the problem of identifying the vocabulary of the methodology of teaching the Arabic language in faculties of Media. The researcher noticed an overlap between the syllabuses of the general specialization of the Arabic language and its media sections in the universities, on the one hand, and the specialized professional vocabulary that suits the media student, on the other. This study is therefore a genuine attempt to present a methodological model of media language concerned with fulfilling students' linguistic and knowledge needs, relying on measuring their benefit from the methodological Arabic curriculum. Keywords: problem, teaching methodology of Arabic language, media language.
The aim of this paper is to present a new methodology for finding the private key of RSA. A new initial value, generated from a new equation, is selected to speed up the process. After this value is found, a brute-force attack is used to discover the private key. In addition, in the proposed equation the multiplier of the Euler totient function used to find both the public key and the private key is set to 1, which implies that the equation estimating the new initial value is suitable for a small multiplier. The experimental results show that if all prime factors of the modulus are larger than 3 and the multiplier is 1, the distance between the initial value and the private key is small.
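The brute-force step described above can be sketched on a toy modulus. The starting value below is a plain guess, not the paper's proposed initial-value equation (which is not given in full here).

```python
# Sketch of brute-force recovery of an RSA private exponent d, scanning
# upward from an initial value on a small toy modulus.
def brute_force_private_key(e, p, q, start):
    phi = (p - 1) * (q - 1)        # Euler totient of n = p * q
    for d in range(start, phi):
        if (e * d) % phi == 1:     # d is the modular inverse of e mod phi
            return d
    return None

# Toy textbook example: p = 61, q = 53 (n = 3233), e = 17 -> d = 2753.
print(brute_force_private_key(17, 61, 53, 2))  # → 2753
```

The closer the initial value is to the true private exponent, the fewer candidates the loop must test, which is the motivation for the paper's initial-value equation.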
This study applied response surface methodology (RSM) to evaluate the effects of various experimental conditions on the removal of levofloxacin (LVX) from aqueous solution by the electrocoagulation (EC) technique with stainless-steel electrodes. The EC process achieved an LVX removal efficiency of 90%. The regression analysis showed that the experimental data are well fitted by a second-order polynomial model, with a predicted correlation coefficient (pred. R2) of 0.723, an adjusted correlation coefficient (Adj. R2) of 0.907 and a correlation coefficient (R2) of 0.952, indicating that the predicted model and the experimental values are in good agreement.
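Fitting the second-order RSM polynomial and computing R2 can be sketched as an ordinary least-squares fit. The two factors and response values below are synthetic placeholders, not the paper's EC data.

```python
import numpy as np

# Second-order (quadratic) response-surface model for two coded factors:
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
def fit_quadratic_surface(x1, x2, y):
    X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    y_hat = X @ beta
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return beta, 1.0 - ss_res / ss_tot      # coefficients and R^2

rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 30)                 # coded factor 1 (illustrative)
x2 = rng.uniform(-1, 1, 30)                 # coded factor 2 (illustrative)
y = 90 + 5 * x1 - 3 * x2 + 2 * x1**2 + rng.normal(0, 0.1, 30)
beta, r2 = fit_quadratic_surface(x1, x2, y)
print(round(r2, 3))
```

An R2 near 1 indicates, as in the study, that the quadratic model reproduces the experimental response well.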
In this research, the Artificial Neural Network (ANN) technique was applied to study the filtration process in water treatment. Eight models were developed and tested using data from a pilot filtration plant working under different process design criteria: influent turbidity, bed depth, grain size, filtration rate and running time (length of the filtration run), with effluent turbidity and head losses recorded. The ANN models were constructed to predict different performance criteria of the filtration process: effluent turbidity, head losses and running time. The results indicate that it is quite possible to use artificial neural networks to predict effluent turbidity, head losses and running time in the filtration process.
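The structure of such a model can be sketched as a forward pass through a small feed-forward network mapping the five process inputs to one predicted output. The weights below are random placeholders, not the trained models from the study.

```python
import numpy as np

# Illustrative forward pass: 5 inputs (influent turbidity, bed depth, grain
# size, filtration rate, run time) -> 8 hidden neurons -> 1 output
# (e.g. effluent turbidity). Weights are random stand-ins, not trained values.
rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(5, 8)), np.zeros(8)   # input -> hidden layer
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)   # hidden -> output layer

def predict(x):
    h = np.tanh(x @ W1 + b1)    # hidden layer with tanh activation
    return h @ W2 + b2          # linear output layer

x = np.array([12.0, 0.7, 0.5, 10.0, 6.0])  # one hypothetical operating point
print(predict(x).shape)
```

In practice the weights would be fitted to the pilot-plant data, typically after normalizing each input to a common range.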
In this paper we describe several different training algorithms for feed-forward neural networks (FFNN). All of these algorithms use the gradient of the performance (energy) function to determine how to adjust the weights so that the performance function is minimized, where the back-propagation algorithm has been used to increase the speed of training. The algorithms differ in their computations, and hence in the form of their search directions and in their storage requirements; however, none of them has global properties suited to all problems.
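The simplest member of this family, steepest (gradient) descent with gradients computed by back-propagation, can be sketched as follows. The toy target function and network size are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Steepest-descent training of a one-hidden-layer FFNN; the gradient of the
# mean-squared-error performance function is obtained by back-propagation.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, (64, 1))
Y = X ** 2                                      # toy function to learn
W1, b1 = rng.normal(0, 0.5, (1, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 0.5, (8, 1)), np.zeros(1)

def mse():
    return float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - Y) ** 2))

loss_before = mse()
lr = 0.1                                        # step length along -gradient
for _ in range(2000):
    H = np.tanh(X @ W1 + b1)                    # forward pass
    P = H @ W2 + b2
    dP = 2.0 * (P - Y) / len(X)                 # d(MSE)/dP
    dW2, db2 = H.T @ dP, dP.sum(axis=0)         # back-propagate: output layer
    dH = (dP @ W2.T) * (1.0 - H ** 2)           # back through tanh activation
    dW1, db1 = X.T @ dH, dH.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1              # steepest-descent update
    W2 -= lr * dW2; b2 -= lr * db2
print(loss_before > mse())                      # loss decreased during training
```

Other algorithms in the family keep the same gradient but change how it is turned into a search direction (e.g. conjugate directions or quasi-Newton steps), trading extra computation and storage for faster convergence on some problems.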