Development of Spatial Data Infrastructure based on Free Data Integration

In recent years, building Spatial Data Infrastructures for governments and companies has gained considerable attention. Different categories of geospatial data, such as digital maps, coordinates, web maps, and aerial and satellite images, are required to realize the geospatial data components of Spatial Data Infrastructures. In general, two distinct types of geospatial data sources exist on the Internet: formal and informal. Despite the growth of informal geospatial data sources, integration between different free sources has not been achieved effectively, and addressing this gap is the main contribution of this research. This article addresses the research question of how the integration of free geospatial data can benefit domains such as Spatial Data Infrastructures. A common methodology is suggested that uses road network information, such as lengths, centroids, start and end points, number of nodes, and directions, to integrate free and open-source geospatial datasets. The methodology is demonstrated on a particular case study: geospatial data from OpenStreetMap and Google Earth as examples of free data sources. The results revealed possible matches between the roads of the OpenStreetMap and Google Earth datasets, which can serve the development of Spatial Data Infrastructures.
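As a rough illustration of the attribute-based matching the methodology describes, the sketch below compares two road records by length, centroid, and endpoint proximity. The field names, distance tolerance, and length-ratio threshold are assumptions for illustration, not values from the paper.

```python
# Minimal sketch: matching roads from two free datasets by simple geometric
# attributes. Tolerances and thresholds are assumed, not the paper's values.
from dataclasses import dataclass
from math import hypot

@dataclass
class Road:
    name: str
    length_m: float        # total polyline length
    centroid: tuple        # (x, y) centroid of the geometry
    start: tuple           # (x, y) of the first node
    end: tuple             # (x, y) of the last node
    n_nodes: int           # number of vertices

def dist(p, q):
    return hypot(p[0] - q[0], p[1] - q[1])

def roads_match(a: Road, b: Road, tol_m=25.0, max_len_ratio=0.10):
    """Return True if the two records plausibly describe the same road."""
    if dist(a.centroid, b.centroid) > tol_m:
        return False
    if abs(a.length_m - b.length_m) / max(a.length_m, b.length_m) > max_len_ratio:
        return False
    # Accept either the same or the reversed digitising direction.
    same = dist(a.start, b.start) < tol_m and dist(a.end, b.end) < tol_m
    rev = dist(a.start, b.end) < tol_m and dist(a.end, b.start) < tol_m
    return same or rev

def match_datasets(osm_roads, ge_roads):
    """Pair each OpenStreetMap road with any matching Google Earth road."""
    return [(a.name, b.name) for a in osm_roads for b in ge_roads if roads_match(a, b)]
```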

 

Publication Date: Sat Aug 01 2015
Journal Name: Journal Of Engineering
A Real-Time Fuzzy Load Flow and Contingency Analysis Based on Gaussian Distribution System

Fuzzy logic is used to solve the load flow and contingency analysis problems, decreasing computing time and offering a better alternative to the traditional methods. The proposed method is very accurate with outstanding computation time, which makes the fuzzy load flow (FLF) suitable for real-time application in small- as well as large-scale power systems. In addition, the FLF is able to efficiently solve the load flow problem of ill-conditioned power systems and to perform contingency analysis. The FLF method using a Gaussian membership function requires fewer iterations and less computing time than the FLF method using a triangular membership function. Using a sparsity technique for the input Ybus sparse matrix data …
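The abstract's central comparison is between Gaussian and triangular membership functions; the sketch below shows the two standard shapes. The centre and width values are assumed example numbers, not parameters from the paper.

```python
# Minimal sketch of the two membership-function shapes compared in the abstract.
import numpy as np

def gaussian_mf(x, centre, sigma):
    """Gaussian membership: smooth and nonzero everywhere."""
    return np.exp(-0.5 * ((x - centre) / sigma) ** 2)

def triangular_mf(x, left, centre, right):
    """Triangular membership: piecewise linear, zero outside [left, right]."""
    rising = (x - left) / (centre - left)
    falling = (right - x) / (right - centre)
    return np.clip(np.minimum(rising, falling), 0.0, 1.0)

# Example: membership of a per-unit power mismatch in a "near zero" fuzzy set.
mismatch = np.linspace(-0.1, 0.1, 5)
print(gaussian_mf(mismatch, centre=0.0, sigma=0.03))
print(triangular_mf(mismatch, left=-0.05, centre=0.0, right=0.05))
```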

Publication Date: Sat Jun 27 2020
Journal Name: Iraqi Journal Of Science
Hartha Formation divisions Based on Well Logs Analysis in Majnoon Oil Field, Southern Iraq

This study aims to evaluate the reservoir characteristics of the Hartha Formation in the Majnoon oil field based on well log data for three wells (Mj-1, Mj-3 and Mj-11). Log interpretation was carried out using a full set of logs to calculate the main petrophysical properties, such as effective porosity and water saturation, and to estimate the volume of shale. The evaluation of the formation included a computer processed interpretation (CPI) using Interactive Petrophysics (IP) software. Based on the CPI results, the Hartha Formation is divided into five reservoir units (A1, A2, A3, B1, B2), deposited in a ramp setting. Facies associations were added to the well log interpretation of the Hartha Formation and were inferred from a microfacies analysis of …
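For readers unfamiliar with the quantities named above, the sketch below shows the standard textbook relations behind shale volume, effective porosity, and water saturation; the log readings and Archie constants are assumed example values, and the paper's IP workflow is not reproduced.

```python
# Minimal sketch of standard petrophysical relations used in a CPI; values assumed.
def shale_volume(gr, gr_clean, gr_shale):
    """Linear gamma-ray index as a first-pass shale-volume estimate."""
    return max(0.0, min(1.0, (gr - gr_clean) / (gr_shale - gr_clean)))

def effective_porosity(phi_total, v_shale):
    """Total porosity corrected for the shale fraction."""
    return phi_total * (1.0 - v_shale)

def water_saturation_archie(rw, rt, phi, a=1.0, m=2.0, n=2.0):
    """Archie's equation: Sw = ((a * Rw) / (phi**m * Rt)) ** (1/n)."""
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

# Example for a single depth sample (assumed readings).
vsh = shale_volume(gr=45.0, gr_clean=20.0, gr_shale=120.0)
phie = effective_porosity(phi_total=0.22, v_shale=vsh)
sw = water_saturation_archie(rw=0.03, rt=8.0, phi=phie)
print(f"Vsh={vsh:.2f}, PHIE={phie:.2f}, Sw={sw:.2f}")
```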

Publication Date: Sun Dec 03 2017
Journal Name: Baghdad Science Journal
Network Self-Fault Management Based on Multi-Intelligent Agents and Windows Management Instrumentation (WMI)

This paper proposes a new method for network self-fault management (NSFM) based on two technologies: intelligent agents to automate fault management tasks, and Windows Management Instrumentation (WMI) to identify faults faster when resources are heterogeneous (different types of devices). The proposed network self-fault management reduces the network traffic load by reducing the requests and responses exchanged between server and client, which achieves less downtime for each node when a fault occurs on the client. The performance of the proposed system is measured by three metrics: efficiency, availability, and reliability. A high average efficiency is obtained depending on the faults occurring in the system, which reaches …
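The sketch below shows how the three evaluation metrics can be computed. Availability and reliability follow the usual MTBF/MTTR relations; the efficiency definition (faults handled automatically divided by total faults) and all numbers are assumptions for illustration, not the paper's exact formulas.

```python
# Minimal sketch of efficiency, availability, and reliability metrics (values assumed).
import math

def availability(mtbf_hours, mttr_hours):
    """Steady-state availability = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def reliability(mtbf_hours, mission_hours):
    """Probability of running fault-free for the mission time, assuming exponential failures."""
    return math.exp(-mission_hours / mtbf_hours)

def efficiency(faults_resolved_by_agent, total_faults):
    """Assumed definition: share of faults the agents resolved without operator action."""
    return faults_resolved_by_agent / total_faults if total_faults else 1.0

print(availability(mtbf_hours=500, mttr_hours=2))      # ~0.996
print(reliability(mtbf_hours=500, mission_hours=24))   # ~0.953
print(efficiency(faults_resolved_by_agent=47, total_faults=50))
```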

Publication Date: Thu Nov 30 2023
Journal Name: Iraqi Journal Of Science
A Lightweight Image Encryption Algorithm Based on Elliptic Curves and a 5D Logistic Map

Cryptography can be thought of as a toolbox, where potential attackers gain access to various computing resources and technologies to try to compute key values. In modern cryptography, the strength of an encryption algorithm is determined only by the size of its key. Therefore, our goal is to create a strong key value with a minimal bit length that is useful for lightweight encryption. Using elliptic curve cryptography (ECC) with a Rubik's cube transformation and image density, the image colors are combined and distorted, and by using the chaotic logistic map and image density with a secret key, the Rubik's cubes of the image are encrypted, obtaining an image that is secure against attacks. ECC itself is a powerful algorithm that generates a pair of …
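To make the chaotic component concrete, the sketch below scrambles an image with a permutation and keystream derived from the classic 1D logistic map. The paper uses a 5D logistic map with ECC-derived keys; the 1D map, seed, and parameter here are assumptions chosen only to show the mechanism.

```python
# Minimal sketch: logistic-map-driven permutation and masking of image pixels.
# The 1D map and the seed are stand-ins for the paper's 5D map and ECC key.
import numpy as np

def logistic_keystream(x0, r, n):
    """Iterate x_{k+1} = r * x_k * (1 - x_k) and return n chaotic samples in (0, 1)."""
    xs, x = np.empty(n), x0
    for i in range(n):
        x = r * x * (1.0 - x)
        xs[i] = x
    return xs

def scramble(image, x0=0.3141, r=3.9999):
    """Permute pixels with a chaos-derived order and mask them with chaos-derived bytes."""
    flat = image.flatten()
    stream = logistic_keystream(x0, r, flat.size)
    order = np.argsort(stream)                   # chaotic permutation
    keystream = (stream * 255).astype(np.uint8)  # chaotic byte mask
    return flat[order] ^ keystream

def unscramble(cipher, shape, x0=0.3141, r=3.9999):
    stream = logistic_keystream(x0, r, cipher.size)
    order = np.argsort(stream)
    keystream = (stream * 255).astype(np.uint8)
    flat = np.empty_like(cipher)
    flat[order] = cipher ^ keystream
    return flat.reshape(shape)

img = np.random.randint(0, 256, (8, 8), dtype=np.uint8)
assert np.array_equal(unscramble(scramble(img), img.shape), img)
```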

Publication Date: Tue Sep 25 2018
Journal Name: Iraqi Journal Of Science
Refractive Index Sensor Based on Micro-Structured Optical Fibers with Using Finite Element Method

In this paper, a refractive index sensor based on a micro-structured optical fiber is proposed using the Finite Element Method (FEM). The designed fiber has a hexagonal cladding structure with six rings of air holes running around its solid core. The air holes of the fiber are infiltrated with different liquids, such as water, ethanol, methanol, and toluene; then sensor characteristics such as the effective refractive index, confinement loss, fundamental-mode beam profile, and sensor resolution are investigated using the FEM. The designed sensor is characterized by low confinement loss and high resolution, so a small change in the analyte refractive index can be detected, which could be useful to detect the change of …
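Two of the quantities named above are typically obtained by post-processing the FEM mode solution, as in the sketch below. The confinement-loss expression is the standard one based on the imaginary part of the effective index; the resolution formula and all numbers are assumed example values, not results from the paper.

```python
# Minimal sketch of standard post-processing formulas for a fiber-sensor FEM solve.
import math

def confinement_loss_db_per_m(n_eff_imag, wavelength_m):
    """Confinement loss = 8.686 * (2*pi / lambda) * Im(n_eff), in dB/m."""
    k0 = 2.0 * math.pi / wavelength_m
    return 8.686 * k0 * n_eff_imag

def sensor_resolution(delta_n_analyte, peak_shift_m, min_detectable_shift_m=0.1e-9):
    """Wavelength-interrogation resolution: R = dn_a * dlambda_min / dlambda_peak (RIU)."""
    return delta_n_analyte * min_detectable_shift_m / peak_shift_m

# Example with assumed values: Im(n_eff) = 1e-9 at 1.55 um, 5 nm peak shift per 0.01 RIU.
print(confinement_loss_db_per_m(n_eff_imag=1e-9, wavelength_m=1.55e-6))
print(sensor_resolution(delta_n_analyte=0.01, peak_shift_m=5e-9))
```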

Publication Date: Thu Nov 30 2023
Journal Name: Iraqi Journal Of Science
Attention Mechanism Based on a Pre-trained Model for Improving Arabic Fake News Predictions

Social media and news agencies are major sources for tracking news and events. With these sources' massive amounts of data, it is easy to spread false or misleading information. Given the great dangers of fake news to societies, previous studies have paid great attention to detecting it and limiting its impact. As such, this work aims to use modern deep learning techniques to detect Arabic fake news. In the proposed system, an attention model is combined with bidirectional long short-term memory (Bi-LSTM) to identify the most informative words in the sentence. Then, a multi-layer perceptron (MLP) is applied to classify news articles as fake or real. The experiments are conducted on a newly launched Arabic dataset called the Ara…
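A minimal PyTorch sketch of the architecture the abstract describes (Bi-LSTM, word-level attention, MLP classifier) is given below. Layer sizes are assumed, and the pre-trained model mentioned in the title is replaced by a plain embedding layer, so this is not the authors' exact configuration.

```python
# Minimal sketch: Bi-LSTM with word-level attention and an MLP classification head.
import torch
import torch.nn as nn

class AttnBiLSTMClassifier(nn.Module):
    def __init__(self, vocab_size=30000, emb_dim=128, hidden=128, n_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.bilstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.attn_score = nn.Linear(2 * hidden, 1)   # one score per word position
        self.mlp = nn.Sequential(
            nn.Linear(2 * hidden, 64), nn.ReLU(), nn.Linear(64, n_classes))

    def forward(self, token_ids):
        h, _ = self.bilstm(self.embed(token_ids))           # (B, T, 2H)
        weights = torch.softmax(self.attn_score(h), dim=1)  # attention over words
        context = (weights * h).sum(dim=1)                  # weighted sentence vector
        return self.mlp(context)                            # fake / real logits

model = AttnBiLSTMClassifier()
logits = model(torch.randint(1, 30000, (4, 50)))  # batch of 4 sentences, 50 tokens each
print(logits.shape)  # torch.Size([4, 2])
```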

Publication Date: Mon Feb 20 2017
Journal Name: Ibn Al-haitham Journal For Pure And Applied Sciences
A Hybrid Algorithm to Protect Computer Networks Based on Human Biometrics and Computer Attributes

The objective of this work is to combine human biometric characteristics with unique attributes of the computer in order to protect computer networks and resource environments through the development of authentication and authorization techniques. On the biometric side, the best available methods and algorithms were studied, and the conclusion is that the fingerprint is the best choice, although it has some flaws. The fingerprint algorithm has therefore been improved to enhance the clarity of the ridge and valley structures in fingerprint images, taking into account the estimated orientation and frequency of the neighbouring ridges. On the computer side, a computer and its components, like a human, have unique …
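One simple way to realize the combination described above is to derive a single credential from both the biometric template and machine-specific attributes, as in the sketch below. The chosen attributes, the placeholder template, and the hashing construction are assumptions for illustration, not the paper's scheme.

```python
# Minimal sketch: fuse a fingerprint template with machine attributes into one credential.
import hashlib
import platform
import uuid

def machine_attributes():
    """Collect a few relatively stable, machine-specific identifiers."""
    return {
        "mac": f"{uuid.getnode():012x}",  # network adapter address
        "arch": platform.machine(),       # CPU architecture
        "host": platform.node(),          # host name
    }

def fused_credential(fingerprint_template: bytes) -> str:
    """Hash the fingerprint template together with the machine attributes, so the
    resulting credential is bound to this user *and* this computer."""
    h = hashlib.sha256(fingerprint_template)
    for key, value in sorted(machine_attributes().items()):
        h.update(f"{key}={value}".encode())
    return h.hexdigest()

# Example with a placeholder template (a real system would use minutiae data).
print(fused_credential(b"example-fingerprint-template"))
```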

Publication Date: Sat Sep 21 2013
Journal Name: Nonlinear Dynamics
BER performance enhancement for secure wireless optical communication systems based on chaotic MIMO techniques

Publication Date: Sun Apr 26 2020
Journal Name: Iraqi Journal Of Science
Selective Image Encryption Based on DCT, Hybrid Shift Coding and Randomly Generated Secret Key

Most of today's techniques encrypt all of the image data, which consumes a tremendous amount of time and computational effort. This work introduces a selective image encryption technique that encrypts predetermined blocks of the original image data in order to reduce the encryption/decryption time and the computational complexity of processing the huge image data. The technique applies a compression algorithm based on the Discrete Cosine Transform (DCT). Two approaches are implemented based on color space conversion as a preprocessing step for the compression phases, YCbCr and RGB, where the resulting compressed sequence is selectively encrypted using a randomly generated combined secret key. The results showed a significant reduction …
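The sketch below illustrates the general idea of selective encryption on DCT coefficients: only the DC term and a few low-frequency coefficients of each 8x8 block are masked, leaving the rest untouched. The block size, number of selected coefficients, and key-stream construction are assumptions; the paper's hybrid shift coding is not reproduced.

```python
# Minimal sketch: mask only the low-frequency DCT coefficients of an 8x8 block.
import numpy as np
from scipy.fft import dctn, idctn

def encrypt_block(block, rng, n_selected=6):
    """2D DCT the block and additively mask its first coefficients with key-stream values."""
    coeffs = dctn(block.astype(float), norm="ortho")
    flat = coeffs.flatten()
    mask = rng.uniform(-128, 128, n_selected)  # key-stream values from the secret seed
    flat[:n_selected] += mask
    return flat.reshape(coeffs.shape), mask

def decrypt_block(coeffs, mask):
    """Remove the mask and invert the DCT to recover the block."""
    flat = coeffs.flatten()
    flat[:mask.size] -= mask
    return idctn(flat.reshape(coeffs.shape), norm="ortho")

rng = np.random.default_rng(seed=42)          # the seed plays the role of the secret key
block = np.arange(64, dtype=np.uint8).reshape(8, 8)
enc, mask = encrypt_block(block, rng)
assert np.allclose(decrypt_block(enc, mask), block, atol=1e-6)
```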

Publication Date: Wed Nov 01 2017
Journal Name: Journal Of Computational And Theoretical Nanoscience
Solution for Multi-Objective Optimisation Master Production Scheduling Problems Based on Swarm Intelligence Algorithms

The emphasis of Master Production Scheduling (MPS), or tactical planning, is on the temporal and spatial disaggregation of the aggregate planning targets and forecasts, along with the provision and forecasting of the required resources. This procedure becomes considerably difficult and slow as the number of resources, products, and periods considered increases. A number of studies have been carried out to understand these impediments and to formulate algorithms that optimise the production planning problem, or more specifically the master production scheduling (MPS) problem. These algorithms include an evolutionary algorithm, the Genetic Algorithm, and Swarm Intelligence methods such as the Gravitational Search Algorithm (GSA), the Bat Algorithm (BAT), and T…
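Whatever the metaheuristic, the optimiser needs a fitness function to score a candidate production plan. The sketch below shows one simple such function (holding, shortage, and capacity-violation costs over the planning periods); the demands, capacities, and cost rates are assumed example values, not the paper's model.

```python
# Minimal sketch of an MPS fitness function a swarm optimiser (GSA, BAT, ...) could minimise.
def mps_cost(production, demand, capacity,
             hold_cost=2.0, shortage_cost=10.0, overtime_cost=50.0):
    """Score a candidate plan: inventory carrying, backlog, and capacity-violation costs."""
    inventory, total = 0.0, 0.0
    for produced, needed, cap in zip(production, demand, capacity):
        total += overtime_cost * max(0.0, produced - cap)  # exceeding period capacity
        inventory += produced - needed
        if inventory >= 0:
            total += hold_cost * inventory                  # carrying cost
        else:
            total += shortage_cost * (-inventory)           # backlog penalty
    return total

demand    = [100, 120, 80, 150]
capacity  = [110, 110, 110, 110]
candidate = [110, 110, 110, 120]   # one particle / agent position
print(mps_cost(candidate, demand, capacity))
```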
