In recent years, the development of Spatial Data Infrastructures (SDIs) for governments and companies has gained considerable attention. Different categories of geospatial data, such as digital maps, coordinates, web maps, and aerial and satellite images, are required to realize the geospatial data components of Spatial Data Infrastructures. In general, two distinct types of geospatial data sources exist on the Internet: formal and informal data sources. Despite the growth of informal geospatial data sources, the integration of different free sources has not been achieved effectively, and addressing this gap is the main contribution of this research. This article addresses the research question of how the integration of free geospatial data can be beneficial within domains such as Spatial Data Infrastructures. This was carried out by proposing a common methodology that uses road network information such as lengths, centroids, start and end points, numbers of nodes, and directions to integrate free and open-source geospatial datasets. The methodology was applied to a particular case study: the use of geospatial data from OpenStreetMap and Google Earth as examples of free data sources. The results revealed possible matching between the roads of the OpenStreetMap and Google Earth datasets to serve the development of Spatial Data Infrastructures.
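As a hedged illustration of the kind of attribute-based road matching the abstract describes, the sketch below compares candidate road pairs by length, centroid distance, and endpoint distance. The thresholds, weights, and sample coordinates are illustrative assumptions, not values taken from the study.

```python
from math import hypot

def road_attributes(coords):
    """Derive matching attributes from a road polyline given as (x, y) pairs
    (length, centroid, start/end points, number of nodes)."""
    length = sum(hypot(x2 - x1, y2 - y1)
                 for (x1, y1), (x2, y2) in zip(coords, coords[1:]))
    cx = sum(x for x, _ in coords) / len(coords)
    cy = sum(y for _, y in coords) / len(coords)
    return {"length": length, "centroid": (cx, cy),
            "start": coords[0], "end": coords[-1], "nodes": len(coords)}

def match_score(a, b):
    """Smaller is better: length difference plus centroid and endpoint distances.
    Equal weighting is an assumption for illustration."""
    d_len = abs(a["length"] - b["length"])
    d_cen = hypot(a["centroid"][0] - b["centroid"][0],
                  a["centroid"][1] - b["centroid"][1])
    d_end = (hypot(a["start"][0] - b["start"][0], a["start"][1] - b["start"][1]) +
             hypot(a["end"][0] - b["end"][0], a["end"][1] - b["end"][1]))
    return d_len + d_cen + d_end

def match_roads(osm_roads, ge_roads, threshold=25.0):
    """Greedy matching of OpenStreetMap roads against Google Earth roads."""
    matches = []
    for i, r1 in enumerate(osm_roads):
        a = road_attributes(r1)
        best = min(range(len(ge_roads)),
                   key=lambda j: match_score(a, road_attributes(ge_roads[j])),
                   default=None)
        if best is not None and match_score(a, road_attributes(ge_roads[best])) < threshold:
            matches.append((i, best))
    return matches

# Tiny example with made-up projected coordinates.
osm = [[(0, 0), (50, 0), (100, 0)]]
ge = [[(1, 1), (99, 2)]]
print(match_roads(osm, ge))  # -> [(0, 0)]
```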
Image compression is a serious issue in computer storage and transmission; it makes efficient use of the redundancy embedded within an image itself and may also exploit human vision or perception limitations to reduce imperceivable information. Polynomial coding is a modern image compression technique based on a modelling concept that effectively removes the spatial redundancy embedded within the image; it is composed of two parts, the mathematical model and the residual. In this paper, a two-stage technique is proposed: the first stage utilizes the lossy predictor model along with a multiresolution base and thresholding techniques, and the latter stage incorporates the near-lossless compression …
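As context for the modelling-plus-residual idea, the sketch below fits a first-order polynomial model to an image block and quantizes the residual with a uniform step, a common near-lossless scheme. The first-order model, block-based fitting, and tolerance value are assumptions for illustration, not the paper's exact method.

```python
import numpy as np

def polynomial_model(block):
    """Fit I(x, y) ~ a0 + a1*x + a2*y over a block; return coefficients and residual."""
    h, w = block.shape
    ys, xs = np.mgrid[0:h, 0:w]
    A = np.column_stack([np.ones(h * w), xs.ravel(), ys.ravel()])
    coeffs, *_ = np.linalg.lstsq(A, block.ravel().astype(float), rcond=None)
    model = (A @ coeffs).reshape(h, w)
    return coeffs, block.astype(float) - model

def near_lossless_quantize(residual, tolerance=2):
    """Uniform residual quantization with step 2*tolerance + 1, so the
    reconstruction error stays within roughly `tolerance` grey levels."""
    step = 2 * tolerance + 1
    return np.round(residual / step).astype(int)

def reconstruct(coeffs, q_residual, shape, tolerance=2):
    """Rebuild the block from the model coefficients and quantized residual."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    A = np.column_stack([np.ones(h * w), xs.ravel(), ys.ravel()])
    model = (A @ coeffs).reshape(h, w)
    return model + q_residual * (2 * tolerance + 1)
```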
A new modified differential evolution algorithm, DE-BEA, is proposed to improve the reliability of the standard DE/current-to-rand/1/bin by implementing a new mutation scheme inspired by the bacterial evolutionary algorithm (BEA). The crossover and selection schemes of the DE method are also modified to fit the new DE-BEA mechanism. The new scheme diversifies the population by applying to all individuals a segment-based scheme that generates multiple copies (clones) of each individual one by one and applies the BEA segment-wise mechanism. These new steps are embedded in the DE/current-to-rand/bin scheme. The performance of the new algorithm has been compared with several DE variants over eighteen benchmark functions including sever…
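For orientation, the sketch below shows the standard DE/current-to-rand/1 mutation together with a possible segment-wise clone step of the kind the abstract attributes to BEA. The clone step (number of clones, segment length, perturbation scale) is an assumption based only on the description above, not the authors' exact operator.

```python
import numpy as np

rng = np.random.default_rng(0)

def de_current_to_rand_1(pop, i, K=0.5, F=0.8):
    """Standard DE/current-to-rand/1 mutation:
    v_i = x_i + K*(x_r1 - x_i) + F*(x_r2 - x_r3)."""
    n = len(pop)
    r1, r2, r3 = rng.choice([j for j in range(n) if j != i], size=3, replace=False)
    return pop[i] + K * (pop[r1] - pop[i]) + F * (pop[r2] - pop[r3])

def bea_segment_step(individual, fitness, n_clones=4, seg_len=2):
    """Illustrative BEA-style operation: clone the individual, perturb one
    random segment of each clone, and keep the best candidate found."""
    best, best_fit = individual.copy(), fitness(individual)
    d = len(individual)
    for _ in range(n_clones):
        clone = best.copy()
        start = rng.integers(0, max(1, d - seg_len + 1))
        clone[start:start + seg_len] += rng.normal(0, 0.1, size=min(seg_len, d - start))
        f = fitness(clone)
        if f < best_fit:
            best, best_fit = clone, f
    return best

# Example: one combined step on a small population for the sphere function.
sphere = lambda x: float(np.sum(x ** 2))
pop = rng.normal(size=(6, 4))
trial = de_current_to_rand_1(pop, i=0)
print(bea_segment_step(trial, sphere))
```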
This paper proposes improving the structure of the neural controller based on an identification model for nonlinear systems. The goal of this work is to embed the structure of the Modified Elman Neural Network (MENN) model into the NARMA-L2 structure, instead of the Multi-Layer Perceptron (MLP) model, in order to construct a new hybrid neural structure that can be used as an identifier model and a nonlinear controller for SISO linear or nonlinear systems. Two learning algorithms are used to adjust the weights of the hybrid neural structure with its serial-parallel configuration; the first is a supervised learning algorithm based on the Back Propagation Algorithm (BPA) and the second is an intelligent algorithm n…
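For context, the textbook NARMA-L2 control law (not quoted from this paper) approximates the plant as y(k+1) ~ f[past outputs, past inputs] + g[past outputs, past inputs]*u(k) and inverts it for the control input. The sketch below states that law; the callables f_net and g_net stand for the two trained sub-networks, which in this paper's setting would come from the MENN-based identifier.

```python
import numpy as np

def narma_l2_control(f_net, g_net, y_hist, u_hist, y_ref):
    """Textbook NARMA-L2 control law:
    u(k) = (y_ref - f[y(k),...,u(k-1),...]) / g[y(k),...,u(k-1),...].
    `f_net` and `g_net` are placeholders for the trained sub-networks
    (assumptions for illustration, not the paper's exact structure)."""
    regressor = np.concatenate([y_hist, u_hist])
    f_hat = f_net(regressor)
    g_hat = g_net(regressor)
    return (y_ref - f_hat) / (g_hat + 1e-9)  # epsilon guards against division by zero
```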
The synthesis of a new substituted cobalt phthalocyanine (CoPc) was carried out using naphthalene-1,4,5,8-tetracarboxylic acid dianhydride (NDI) as the starting material, employing a dry-process method. A metal oxide (MO) alloy of (60% Ni3O4 - 40% Co3O4) was functionalized with multiwall carbon nanotubes (F-MWCNTs) to produce the (F-MWCNTs/MO) nanocomposite (E2) and mixed with CoPc to yield (F-MWCNT/CoPc/MO) (E3). These composites were investigated using different analytical and spectrophotometric methods such as 1H-NMR (0-18 ppm), FTIR spectroscopy in the range of 400-4000 cm-1, powder X-ray diffraction (PXRD, 2θ = 10-80°), Raman spectroscopy (0-4000 cm-1), and UV-Visible …
In low-latitude areas, with latitude angles of less than 10°, the solar radiation entering the solar still increases as the cover slope approaches the latitude angle. However, the amount of water that condenses and then falls back toward the solar-still basin is also increased in this case. Consequently, the yield of the solar still is significantly decreased, and the accuracy of the prediction method is affected. This reduction in the yield and in the accuracy of the prediction method is inversely proportional to the time during which the condensed water stays on the inner side of the condensing cover without collection, because more drops will fall back into the basin of the solar still. Different numbers of scraper motions per hour (NSM), that is…
Graphite coated electrodes (GCE) based on molecularly imprinted polymers were fabricated for the selective potentiometric determination of risperidone (Ris). The molecularly imprinted (MIP) and non-imprinted (NIP) polymers were synthesized by bulk polymerization using Ris as a template, acrylic acid (AA) and acrylamide (AAm) as monomers, ethylene glycol dimethacrylate (EGDMA) as a cross-linker, and benzoyl peroxide (BPO) as an initiator. The imprinted and non-imprinted membranes were prepared using dioctyl phthalate (DOP) and dibutyl phthalate (DBP) as plasticizers in a PVC matrix. The membranes were coated on graphite electrodes. The MIP electrodes using…
In regression testing, test case prioritization (TCP) is a technique for arranging all the available test cases. TCP techniques can improve fault detection performance, which is measured by the average percentage of faults detected (APFD). History-based TCP is one of the TCP techniques that considers the history of past data to prioritize test cases. The issue of equal priority allocation to test cases is a common problem for most TCP techniques; however, this problem has not been explored in history-based TCP techniques. To solve this problem in regression testing, most researchers resort to random ordering of test cases. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement…
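For reference, the APFD metric mentioned above is conventionally computed as APFD = 1 - (TF_1 + ... + TF_m)/(n*m) + 1/(2n), where TF_i is the position of the first test case revealing fault i, n is the number of tests, and m the number of faults. The sketch below implements that textbook formula; the test ordering and fault matrix are invented for illustration, not data from the study.

```python
def apfd(ordering, fault_matrix):
    """APFD for a given test ordering. Assumes every fault is detected by
    at least one test case in the ordering."""
    n = len(ordering)
    faults = {f for detected in fault_matrix.values() for f in detected}
    m = len(faults)
    tf_sum = 0
    for fault in faults:
        # 1-based position of the first test that detects this fault
        tf_sum += next(pos for pos, test in enumerate(ordering, start=1)
                       if fault in fault_matrix.get(test, set()))
    return 1 - tf_sum / (n * m) + 1 / (2 * n)

# Illustrative data: 4 test cases, 3 faults.
fault_matrix = {"t1": {"f1"}, "t2": {"f2", "f3"}, "t3": set(), "t4": {"f1", "f3"}}
print(apfd(["t2", "t1", "t3", "t4"], fault_matrix))  # ~0.79: faults found early
```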
A new simple, sensitive, and inexpensive method has been developed for the spectrophotometric determination of cisapride in pharmaceutical formulations. The turbidimetric method is based on the formation of an ion-pair complex between the drug and bromophenol blue (BPB) in the presence of potassium chloride at pH 2.6, with a maximum absorbance at 520 nm. The calibration graph is linear in the concentration range 5-50 µg.ml-1, with a good correlation coefficient (r = 0.9989). The limit of detection was found to be 1.14 µg.ml-1, and no interference was observed from common excipients in pharmaceutical preparations that contain cisapride, with good accuracy and precision.
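As a hedged illustration of how such a linear calibration is typically evaluated, the sketch below fits a line to absorbance versus concentration and reports the correlation coefficient. The concentration and absorbance values are invented placeholders within the reported 5-50 µg.ml-1 range, and the LOD formula shown is a common convention, not necessarily the method used in the paper.

```python
import numpy as np

# Invented placeholder readings at 520 nm (not the study's measurements).
conc = np.array([5, 10, 20, 30, 40, 50], dtype=float)        # µg/ml
absorbance = np.array([0.08, 0.16, 0.31, 0.47, 0.62, 0.78])  # arbitrary units

slope, intercept = np.polyfit(conc, absorbance, 1)
r = np.corrcoef(conc, absorbance)[0, 1]
print(f"A = {slope:.4f}*C + {intercept:.4f}, r = {r:.4f}")

# One common detection-limit estimate (an assumption): LOD = 3.3 * s_blank / slope.
s_blank = 0.005  # hypothetical standard deviation of blank readings
print(f"LOD ~ {3.3 * s_blank / slope:.2f} µg/ml")
```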
The introduction of concrete damage plasticity material models has significantly improved the accuracy with which the structural response of concrete elements can be predicted. Research into this method's accuracy in analyzing complex concrete forms has been limited. A damage model combined with a plasticity model, based on continuum damage mechanics, is recommended for effectively predicting and simulating concrete behaviour. The damage parameters, such as compressive and tensile damage, can be defined to accurately simulate concrete behaviour in a damaged-plasticity model. This research aims to propose an analytical model for assessing concrete compressive damage based on stiffness deterioration. The proposed …
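For context, the stiffness-deterioration definition of the compressive damage parameter commonly used with concrete damaged-plasticity models is d_c = 1 - E_d/E_0, where E_0 is the initial elastic modulus and E_d the degraded unloading modulus. The sketch below states that textbook relation with hypothetical values; it is offered as background, not as the analytical model proposed in the paper.

```python
def compressive_damage(E0, E_unloading):
    """Textbook stiffness-deterioration damage parameter: d_c = 1 - E_d / E_0.
    Offered as background, not the paper's proposed model."""
    if not 0 < E_unloading <= E0:
        raise ValueError("expected 0 < E_unloading <= E0")
    return 1.0 - E_unloading / E0

# Hypothetical values: E0 = 30 GPa, degraded unloading modulus = 18 GPa.
print(compressive_damage(30e9, 18e9))  # d_c = 0.4
```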