Dust storms occur in barren, dry regions all over the world. They are caused by intense surface winds that lift dust and sand from soft, arid land surfaces into the air. These events can harm health, climate, infrastructure, and transportation. GIS and remote sensing have played a key role in dust detection studies. This study was conducted in Iraq with the objective of validating dust detection. Two dust indices, the Normalized Difference Dust Index (NDDI) and the Middle East Dust Index (MEDI), were derived from MODIS images and compared with in-situ observations of hourly wind speed and visibility during 4-5 May and 25-26 June 2022. The suitability of the two MODIS-based techniques for detecting dust at 13 stations in Iraq was examined. The results suggest that NDDI is the more appropriate index for identifying dust storms across Iraq, whereas MEDI fails to detect dust reliably over the multiple land-cover types found in Iraq and is therefore an ineffective index for dust-storm detection across all kinds of land cover over the country.
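In the literature, NDDI is typically computed from MODIS band 7 (shortwave infrared, ~2.13 µm) and band 3 (blue, ~0.469 µm) reflectances; a minimal sketch, assuming that band pairing and single-pixel toy reflectance values:

```python
def nddi(b7_swir, b3_blue):
    """Normalized Difference Dust Index from MODIS reflectances.

    b7_swir: band 7 (~2.13 um) reflectance; b3_blue: band 3 (~0.469 um).
    """
    denom = b7_swir + b3_blue
    if denom == 0:
        return 0.0
    return (b7_swir - b3_blue) / denom

# Toy pixel: SWIR reflectance raised relative to blue suggests airborne dust.
print(round(nddi(0.45, 0.15), 3))  # 0.5
```

Airborne dust tends to raise SWIR reflectance relative to blue, so strongly positive NDDI values flag candidate dust pixels; the detection threshold is tuned per region.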
Iris research focuses on developing techniques for identifying and locating relevant biometric features, accurate segmentation, and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system when used as a real-time system. This paper introduces a novel parameterized technique for iris segmentation. The method proceeds in several steps: converting the grayscale eye image to a bit-plane representation, selecting the most significant bit planes, and parameterizing the iris location, resulting in an accurate segmentation of the iris from the original image.
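The bit-plane conversion step can be sketched in plain Python; `bit_plane` is a hypothetical helper and the 2x2 pixel grid is toy data, not the paper's actual pipeline:

```python
def bit_plane(image, k):
    """Return bit plane k (0 = LSB, 7 = MSB) of an 8-bit grayscale image,
    given as a list of rows of pixel intensities in 0..255."""
    return [[(pixel >> k) & 1 for pixel in row] for row in image]

# Toy 2x2 image: the MSB plane separates bright pixels (>= 128) from dark ones.
img = [[200, 35], [130, 64]]
print(bit_plane(img, 7))  # [[1, 0], [1, 0]]
```

Selecting only the most significant planes keeps the coarse intensity structure (useful for locating the dark pupil and iris boundary) while discarding low-order noise.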
Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important part of cryptography with secret-key algorithms, is the key: it plays the central role in achieving a higher level of secure communication. To increase the level of security in any communication, both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard (3DES) algorithm is weakened by its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong; a stronger encryption key enhances the security of 3DES. This paper proposes a combination of two efficient encryption algorithms to …
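One common way to harden a weak 3DES key, sketched here with only the standard library, is to derive the 24 key bytes from a passphrase with PBKDF2 and then force the odd-parity convention DES expects in each key byte. The passphrase, salt, and iteration count are illustrative; this is not the paper's proposed scheme:

```python
import hashlib

def set_odd_parity(key):
    """Force each DES key byte to odd parity (the low bit is the parity bit)."""
    out = bytearray()
    for b in key:
        if bin(b >> 1).count("1") % 2 == 0:
            out.append((b & 0xFE) | 1)  # set parity bit to make popcount odd
        else:
            out.append(b & 0xFE)        # clear it; popcount is already odd
    return bytes(out)

def derive_3des_key(passphrase, salt, iterations=100_000):
    """Derive a 24-byte (three-key) 3DES key from a passphrase via PBKDF2."""
    raw = hashlib.pbkdf2_hmac("sha256", passphrase, salt, iterations, dklen=24)
    return set_odd_parity(raw)

key = derive_3des_key(b"secret passphrase", b"per-session-salt")
print(len(key))  # 24
```

A slow key-derivation function makes brute-forcing the passphrase expensive, and the parity fix-up keeps the derived key well-formed for DES implementations that check it.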
Spatial data observed on a group of areal units is common in scientific applications. The usual hierarchical approach for modeling this kind of dataset is to introduce a spatial random effect with an autoregressive prior. However, the usual Markov chain Monte Carlo scheme for this hierarchical framework requires the spatial effects to be sampled from their full conditional posteriors one by one, resulting in poor mixing. More importantly, it makes the model computationally inefficient for datasets with a large number of units. In this article, we propose a Bayesian approach that uses the spectral structure of the adjacency matrix to construct a low-rank expansion for modeling spatial dependence. We propose a pair of computationally efficient estimation …
Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summary for efficient and effective analysis. This research extends our previous work on creating, organizing, accessing, and maintaining summaries of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms, such as …
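Entropy discretization of a numeric attribute can be sketched as choosing the cut point that minimizes the weighted class-label entropy of the two resulting bins. The values and labels below are toy data, and this binary split is only the core step of a full discretizer:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def best_split(values, labels):
    """Return the cut point minimizing the weighted entropy of the two bins."""
    pairs = sorted(zip(values, labels))
    best = (float("inf"), None)
    for i in range(1, len(pairs)):
        left = [lab for _, lab in pairs[:i]]
        right = [lab for _, lab in pairs[i:]]
        w = (len(left) * entropy(left) + len(right) * entropy(right)) / len(pairs)
        cut = (pairs[i - 1][0] + pairs[i][0]) / 2  # midpoint between neighbors
        if w < best[0]:
            best = (w, cut)
    return best[1]

# Perfectly separable toy data: the cut lands between the two classes.
print(best_split([1, 2, 3, 10, 11, 12], ["a", "a", "a", "b", "b", "b"]))  # 6.5
```

In practice (e.g., Fayyad-Irani style discretization) this split is applied recursively with a stopping criterion; here only the single best cut is shown.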
Malicious software (malware) performs a malicious function that compromises a computer system's security. Many methods have been developed to improve the security of computer system resources, among them firewalls, encryption, and Intrusion Detection Systems (IDS). An IDS can detect a newly unrecognized attack attempt and raise an early alarm to inform the system about the suspicious intrusion. This paper proposes a hybrid IDS for detecting intrusions, especially malware, that considers both network-packet and host features. The hybrid IDS is designed using Data Mining (DM) classification methods, chosen for their ability to detect new, previously unseen intrusions accurately and automatically. It uses both anomaly and misuse detection …
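The combination of misuse and anomaly detection can be illustrated with a toy sketch: a signature match flags known attacks, and a statistical outlier test flags unknown ones. The signature set, event fields, and z-score threshold are illustrative assumptions, not the paper's actual DM classifiers:

```python
from statistics import mean, stdev

SIGNATURES = {"sql_injection", "port_scan"}  # hypothetical misuse signatures

def hybrid_ids(event, baseline_sizes, z_threshold=3.0):
    """Toy hybrid check: misuse (signature match), then anomaly (size z-score)."""
    if event["signature"] in SIGNATURES:
        return "misuse"
    mu, sigma = mean(baseline_sizes), stdev(baseline_sizes)
    if sigma and abs(event["size"] - mu) / sigma > z_threshold:
        return "anomaly"
    return "normal"

baseline = [500, 510, 495, 505, 498, 502]  # normal packet sizes (toy data)
print(hybrid_ids({"signature": "http_get", "size": 50_000}, baseline))  # anomaly
print(hybrid_ids({"signature": "port_scan", "size": 500}, baseline))    # misuse
```

Misuse detection alone misses novel attacks, while anomaly detection alone produces false alarms on unusual-but-benign traffic; running both is what motivates the hybrid design.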
Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data to a graph concept frame (CFG). DBSCAN groups points that belong to the same cluster, while points lying outside the clusters' behavior are treated as noise or anomalies; in this way DBSCAN can detect abnormal points that lie beyond a set distance threshold (extreme values). However, not all anomalies are of that kind: some data do not recur, yet are abnormal relative to the known groups without being distant outliers. The analysis showed that DBSCAN, using the …
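A minimal, standard-library DBSCAN shows how points outside any dense region end up labeled as noise, which is the baseline behavior the paper sets out to strengthen. The `eps`/`min_pts` values and toy points are illustrative:

```python
from math import dist

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns a cluster label per point; -1 marks noise."""
    labels = [None] * len(points)
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        neighbors = [j for j in range(len(points)) if dist(points[i], points[j]) <= eps]
        if len(neighbors) < min_pts:
            labels[i] = -1  # not a core point: provisionally noise
            continue
        cluster += 1
        labels[i] = cluster
        queue = [j for j in neighbors if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster  # border point reclaimed from noise
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nbrs = [k for k in range(len(points)) if dist(points[j], points[k]) <= eps]
            if len(nbrs) >= min_pts:
                queue.extend(nbrs)  # j is a core point: expand the cluster
    return labels

# Three nearby points form one cluster; the far point is flagged as noise.
pts = [(0, 0), (0, 1), (1, 0), (10, 10)]
print(dbscan(pts, eps=1.5, min_pts=2))  # [0, 0, 0, -1]
```

The limitation the abstract points out is visible here: only *distance-based* outliers get the -1 label, so a rare point sitting inside a dense region would never be flagged.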
This article describes how to predict different types of multiple reflections in pre-stack seismic data. The characteristics of multiple reflections can be expressed as a combination of the characteristics of primary reflections. Multiples always have lower velocities than the primaries, which is the basis for separating them during Normal Move-Out (NMO) correction. The muting procedure is applied in the time-velocity analysis domain; a semblance plot is used to diagnose the presence of multiples and to judge the muting dimensions. This processing procedure is used to eliminate internal multiples from real 2D seismic data from southern Iraq in two stages. The first is conventional NMO correction and velocity auto-picking, and …
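The velocity contrast that NMO exploits can be illustrated with the standard hyperbolic travel-time equation t(x) = sqrt(t0^2 + x^2 / v^2): at the same zero-offset time, a slower multiple keeps more residual moveout than a faster primary. The offsets and velocities below are illustrative:

```python
from math import sqrt

def nmo_traveltime(t0, offset, velocity):
    """Hyperbolic travel time t(x) = sqrt(t0^2 + x^2 / v^2) in seconds."""
    return sqrt(t0**2 + (offset / velocity) ** 2)

def nmo_shift(t0, offset, velocity):
    """NMO correction: the time shift removed to flatten the event."""
    return nmo_traveltime(t0, offset, velocity) - t0

# Same t0, same offset: the slower multiple has the larger moveout, so after
# correcting with the primary velocity it stays curved and can be muted.
print(nmo_shift(1.0, 1000.0, 2500.0))  # primary (faster, smaller shift)
print(nmo_shift(1.0, 1000.0, 1800.0))  # multiple (slower, larger shift)
```

Because the multiple is under-corrected by the primary's stacking velocity, it maps to a distinct low-velocity region of the semblance plot, which is where the mute is drawn.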
Highly plastic soils exhibit unfavorable properties upon saturation, which produce various defects in engineering structures. Researchers have attempted to remedy these defects through practical experimentation, testing various materials that could improve the soil's engineering properties and reduce environmental hazards. This paper investigates the strength behavior of highly plastic clay stabilized with brick dust at contents of 10%, 20%, and 30% by dry weight of soil. A series of linear shrinkage and unconfined compression tests were carried out to study the effect of brick dust on the amount of shrinkage experienced by highly plastic clay and the undrained …