Analysis of Mosul and Haditha Dam Flow Data

The expansion of water projects in Turkey and Syria is of great concern to workers in the field of water resources management in Iraq. In the absence of an agreement among the three riparian countries of the Tigris and Euphrates Rivers (Turkey, Syria, and Iraq), this expansion is expected to lead to a substantial reduction of the water inflow into Iraqi territory. Accordingly, this study consists of two parts. The first part examines the changes in the water inflow to Iraqi territory at the Turkish and Syrian borders from 1953 to 2009; the results indicate that the mean annual inflow of the Tigris River decreased from 677 m³/s to 526 m³/s after the Turkish reservoirs began operating, while the mean annual inflow of the Euphrates River decreased from 1006 m³/s to 627 m³/s after the Syrian and Turkish reservoirs began operating. The second part forecasts the monthly inflow and the water demand under the reduced inflow. The results show that the future inflow of the Tigris River is expected to fall to 57% of its present value, reaching 301 m³/s, so the Mosul reservoir will be able to supply only 64% of the downstream water requirements. Iraq's share of the Euphrates River inflow is expected to be 58%, so the future inflow will reach 290 m³/s and the Haditha reservoir will be able to supply only 46% of the downstream water requirements, owing to the reduced future inflow at the Iraqi border.
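The percentage figures above follow directly from the reported mean flows. A minimal Python sketch (values copied from the abstract; rounding is assumed to explain the one-unit discrepancy in the forecast figure) reproduces them:

```python
# Hedged sketch: recompute the percentage reductions quoted in the abstract
# from the reported mean annual inflows (all values in m^3/s).

def percent_of_baseline(reduced, baseline):
    """Return the reduced flow as a percentage of the pre-regulation flow."""
    return 100.0 * reduced / baseline

# Tigris at the Turkish border: 677 -> 526 m^3/s after upstream reservoirs.
print(f"Tigris:    {percent_of_baseline(526, 677):.0f}% of pre-regulation flow")
# Euphrates at the Syrian border: 1006 -> 627 m^3/s.
print(f"Euphrates: {percent_of_baseline(627, 1006):.0f}% of pre-regulation flow")
# Forecast scenario from the abstract: future Tigris inflow at 57% of current.
print(f"Forecast Tigris inflow: {0.57 * 526:.0f} m^3/s")  # ~300; abstract reports 301
```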

Publication Date
Fri Apr 01 2022
Journal Name
Baghdad Science Journal
Improved Firefly Algorithm with Variable Neighborhood Search for Data Clustering

Among metaheuristic algorithms, population-based algorithms are explorative search algorithms that outperform local search algorithms in exploring the search space for globally optimal solutions. However, the primary downside of such algorithms is their low exploitative capability, which limits how thoroughly the neighborhood of a current solution is searched for better solutions. The firefly algorithm (FA) is a population-based algorithm that has been widely used in clustering problems. However, FA suffers from premature convergence when no neighborhood search strategies are employed to improve the quality of clustering solutions in the neighborhood region while still exploring the global regions of the search space. On the …
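For readers unfamiliar with FA, the abstract refers to the standard population-based firefly update, in which each firefly drifts toward brighter ones. A minimal sketch of that base update is given below, assuming the conventional parameters beta0, gamma, and alpha; the variable neighborhood search refinement proposed in the paper is not reproduced here.

```python
import numpy as np

# Hedged sketch of the standard firefly movement rule that FA-based clustering
# builds on; parameter names (beta0, gamma, alpha) are conventional defaults,
# not values taken from the paper.

rng = np.random.default_rng(0)

def firefly_step(positions, brightness, beta0=1.0, gamma=1.0, alpha=0.2):
    """One iteration: each firefly moves toward every brighter firefly."""
    n, d = positions.shape
    new_pos = positions.copy()
    for i in range(n):
        for j in range(n):
            if brightness[j] > brightness[i]:          # j is brighter, so it attracts i
                r2 = np.sum((positions[i] - positions[j]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)     # attractiveness decays with distance
                new_pos[i] += beta * (positions[j] - positions[i]) \
                              + alpha * (rng.random(d) - 0.5)
    return new_pos

# Toy use: 10 candidate cluster-centre sets in 2-D; in clustering, brightness
# would be supplied by the objective (e.g. negative within-cluster SSE).
pos = rng.random((10, 2))
bright = -np.sum((pos - 0.5) ** 2, axis=1)             # stand-in objective
pos = firefly_step(pos, bright)
```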

Publication Date
Tue Dec 01 2020
Journal Name
Baghdad Science Journal
Combining Several Substitution Cipher Algorithms using Circular Queue Data Structure

With the revolutionary expansion of the Internet, the worldwide growth of information and of communication technology, together with the rapid growth in data volumes, increases the need for secure, robust, and reliable techniques built on effective algorithms. Many algorithms and techniques are available for data security. This paper presents a cryptosystem that combines several substitution cipher algorithms with a circular queue data structure. The two substitution techniques, the Homophonic Substitution Cipher and the Polyalphabetic Substitution Cipher, are merged in a single circular queue with four different keys for each of them, which produces eight different outputs for …
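As a toy illustration of the combining idea, the sketch below cycles successive plaintext characters through substitution keys held in a circular queue; the two example alphabets and the key count are placeholders, not the homophonic and polyalphabetic key tables used in the paper.

```python
from collections import deque
import string

# Toy illustration only: successive plaintext characters are enciphered with
# different substitution keys drawn from a circular queue. The keys below are
# arbitrary examples, not the paper's actual key schedule.

ALPHABET = string.ascii_uppercase
KEYS = deque([
    ALPHABET[3:] + ALPHABET[:3],    # Caesar-style shift by 3
    ALPHABET[::-1],                 # Atbash-style reversed alphabet
])

def encipher(plaintext, keys):
    out = []
    for ch in plaintext.upper():
        if ch in ALPHABET:
            key = keys[0]
            out.append(key[ALPHABET.index(ch)])
            keys.rotate(-1)          # advance the circular queue of keys
        else:
            out.append(ch)
    return "".join(out)

print(encipher("DATA SECURITY", KEYS))
```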

Publication Date
Mon Aug 01 2016
Journal Name
Journal of Economics and Administrative Sciences
User (K-Means) for clustering in Data Mining with application

Great scientific progress has led to widespread information, and as information accumulates in large databases it becomes important to revise and organize this vast amount of data in order to extract the hidden information, or to classify the data according to the relations among them, so that it can be exploited for technical purposes.

Data mining (DM) is appropriate for this purpose, and the research applies the (K-Means) clustering algorithm to real data, where the effect on the variables can be observed by changing the sample size (n) and the number of clusters (K).
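A minimal sketch of the kind of experiment described above, assuming synthetic data and scikit-learn's KMeans rather than the article's data set, varies the sample size n and the cluster count K and reports the within-cluster sum of squares:

```python
import numpy as np
from sklearn.cluster import KMeans

# Hedged sketch: run K-Means while varying the sample size n and the number of
# clusters K, and inspect the resulting inertia (within-cluster sum of squares).
# The data are synthetic placeholders.

rng = np.random.default_rng(42)

for n in (100, 500, 1000):
    X = rng.normal(size=(n, 2))
    for k in (2, 3, 5):
        km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
        print(f"n={n:5d}  K={k}  inertia={km.inertia_:8.2f}")
```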

Publication Date
Sun Jan 01 2023
Journal Name
International Journal of Data and Network Science
The effects of big data, artificial intelligence, and business intelligence on e-learning and business performance: Evidence from Jordanian telecommunication firms

This study sought to investigate the impacts of big data, artificial intelligence (AI), and business intelligence (BI) on firms' e-learning and business performance in the Jordanian telecommunications industry. After the samples were checked, a total of 269 were collected. All of the information gathered throughout the investigation was analyzed using the PLS software. The results show that a network of interconnections can improve both e-learning and corporate effectiveness. The research concluded that the integration of big data, AI, and BI has a positive impact on e-learning infrastructure development and organizational efficiency. The findings indicate that big data has a positive and direct impact on business performance, including Big …

Publication Date
Fri Oct 01 2010
Journal Name
2010 IEEE Symposium on Industrial Electronics and Applications (ISIEA)
Distributed t-way test suite data generation using exhaustive search method with map and reduce framework

Publication Date
Wed Feb 08 2023
Journal Name
Iraqi Journal of Science
Subsurface 3D Prediction Porosity Model from Converted Seismic and Well Data Using Model Based Inversion Technique

A seismic inversion technique is applied to 3D seismic data to predict the porosity of the carbonate Yamama Formation (Early Cretaceous) in an area located in southern Iraq. A workflow is designed to guide the manual inversion procedure. The first step is a model-based inversion that converts the 3D seismic data into a 3D acoustic impedance volume using a low-frequency model and well data, with statistical control at each inversion stage. The 3D acoustic impedance volume, the seismic data, and the porosity well data are then trained with multi-attribute transforms to find the statistical attribute best suited to extrapolate the direct point measurements of porosity at the wells into a 3D porosity distribution …
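The final calibration step amounts to regressing well-measured porosity on attributes derived from the inverted impedance volume. The sketch below uses a single linear attribute and synthetic numbers purely for illustration; the paper's multi-attribute transforms and actual Yamama Formation data are not reproduced.

```python
import numpy as np

# Stand-in for the calibration step: fit a relation between inverted acoustic
# impedance (AI) at the wells and well-log porosity, then apply it to the 3D
# impedance volume. A single linear attribute is assumed; values are synthetic.

ai_at_wells = np.array([9500., 10200., 11000., 11800., 12500.])   # impedance samples
phi_at_wells = np.array([0.22, 0.19, 0.15, 0.12, 0.09])           # measured porosity

slope, intercept = np.polyfit(ai_at_wells, phi_at_wells, deg=1)   # least-squares fit

ai_volume = np.full((10, 10, 50), 11000.0)                        # dummy 3D AI volume
porosity_volume = slope * ai_volume + intercept                   # predicted 3D porosity
print(porosity_volume.mean())
```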

Publication Date
Wed Apr 25 2018
Journal Name
Ibn Al-Haitham Journal for Pure and Applied Sciences
Using Approximation Non-Bayesian Computation with Fuzzy Data to Estimation Inverse Weibull Parameters and Reliability Function

In real situations, observations and measurements are often not exact numbers but more or less imprecise, that is, fuzzy. In this paper, we therefore use approximate non-Bayesian computational methods to estimate the inverse Weibull parameters and the reliability function from fuzzy data. The maximum likelihood and moment estimates are obtained as non-Bayesian estimates. The maximum likelihood estimators are derived numerically using two iterative techniques, namely the Newton-Raphson and the Expectation-Maximization (EM) algorithms. In addition, a Monte-Carlo simulation study numerically compares the resulting estimates of the parameters and the reliability function …
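As a crisp-data baseline for the estimation problem described above, the sketch below fits an inverse Weibull by maximum likelihood with scipy and evaluates the implied reliability function; the fuzzy-data likelihood and the paper's Newton-Raphson/EM derivations are not reproduced, and the simulated sample is a placeholder.

```python
import numpy as np
from scipy.stats import invweibull

# Hedged baseline: MLE for a crisp-data inverse Weibull via scipy's generic
# fitter, plus the implied reliability R(t) = 1 - F(t). Data are simulated.

rng = np.random.default_rng(1)
true_shape, true_scale = 2.5, 3.0
sample = invweibull.rvs(true_shape, scale=true_scale, size=200, random_state=rng)

shape_hat, loc_hat, scale_hat = invweibull.fit(sample, floc=0)   # MLE with loc fixed at 0
print(f"shape = {shape_hat:.2f}, scale = {scale_hat:.2f}")

t = 4.0
reliability = invweibull.sf(t, shape_hat, loc=loc_hat, scale=scale_hat)  # survival function
print(f"R({t}) = {reliability:.3f}")
```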

Publication Date
Wed Oct 17 2018
Journal Name
Journal of Economics and Administrative Sciences
New Robust Estimation in Compound Exponential Weibull-Poisson Distribution for both contaminated and non-contaminated Data

Abstract

The research compares two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases are considered, the first assuming the original (uncontaminated) data and the second assuming contaminated data. Simulation experiments were conducted for different sample sizes, different initial parameter values, and different levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
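To illustrate the Downhill Simplex (Nelder-Mead) approach, the sketch below minimises a negative log-likelihood without derivatives; a two-parameter Weibull and simulated data stand in for the four-parameter compound exponential Weibull-Poisson model of the study.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

# Hedged illustration: derivative-free MLE via the Downhill Simplex
# (Nelder-Mead) method. A plain Weibull and simulated data are placeholders
# for the paper's compound model and study data.

rng = np.random.default_rng(7)
data = weibull_min.rvs(1.8, scale=2.0, size=300, random_state=rng)

def neg_loglik(theta):
    shape, scale = theta
    if shape <= 0 or scale <= 0:            # keep the simplex in the valid region
        return np.inf
    return -np.sum(weibull_min.logpdf(data, shape, scale=scale))

result = minimize(neg_loglik, x0=[1.0, 1.0], method="Nelder-Mead")
print(result.x)                             # estimated (shape, scale)
```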

 

Publication Date
Fri Apr 14 2023
Journal Name
Journal of Big Data
A survey on deep learning tools dealing with data scarcity: definitions, challenges, solutions, tips, and applications
Abstract

Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance. Unfortunately, many applications have small or inadequate data to train DL frameworks. Usually, manual labeling is needed to provide labeled data, which typically involves human annotators with a vast background of knowledge. This annotation process is costly, time-consuming, and error-prone. Usually, every DL framework is fed by a significant amount of labeled data to automatically learn representations. Ultimately, a larger amount of data would generate a better DL model and its performance is also application dependent. This issue is the main barrier for …
Publication Date
Mon Feb 01 2021
Journal Name
Meta Gene
Association analysis of FTO gene polymorphisms rs9939609 and obesity risk among the adults: A systematic review and meta-analysis” Meta Gene (2020) 7–7/100832

Background: Obesity typically results from a variety of contributing causes and factors, including genetics and lifestyle choices. It is described as an excessive accumulation of body fat and is a chronic disorder that combines pathogenic environmental and genetic factors. The objective of the current study was therefore to investigate the association between the FTO gene rs9939609 polymorphism and obesity risk, explaining the relationship between the fat mass and obesity-associated gene (FTO) rs9939609 polymorphism and obesity in adults. Methods: We identified research exploring the association between obesity risk and the rs9939609 polymorphism of the FTO gene, and combined the modified odds ratios (OR) for the total groups and subgroups …
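The pooling step of such a meta-analysis is typically a fixed-effect, inverse-variance combination of log odds ratios. The sketch below shows that computation with made-up per-study values; it does not use the FTO rs9939609 data analysed in the paper.

```python
import numpy as np

# Hedged sketch of fixed-effect, inverse-variance pooling of odds ratios on the
# log-OR scale. The per-study ORs and CI limits are hypothetical placeholders.

odds_ratios = np.array([1.30, 1.18, 1.42])       # hypothetical per-study ORs
ci_upper    = np.array([1.55, 1.40, 1.80])       # hypothetical upper 95% CI limits

log_or  = np.log(odds_ratios)
se      = (np.log(ci_upper) - log_or) / 1.96      # standard errors recovered from the CIs
weights = 1.0 / se**2                             # inverse-variance weights

pooled_log_or = np.sum(weights * log_or) / np.sum(weights)
pooled_se     = np.sqrt(1.0 / np.sum(weights))
print(f"pooled OR = {np.exp(pooled_log_or):.2f} "
      f"(95% CI {np.exp(pooled_log_or - 1.96*pooled_se):.2f}"
      f"-{np.exp(pooled_log_or + 1.96*pooled_se):.2f})")
```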
