Assessing the accuracy of 'crowdsourced' data and its integration with official spatial data sets

Scopus
Publication Date
February 1, 2018
Journal Name
Journal of Economics and Administrative Sciences
Comparison of Slice inverse regression with the principal components in reducing high-dimensions data by using simulation

This research studies dimension-reduction methods that overcome the curse of dimensionality, which arises when traditional methods fail to provide good parameter estimates, so the problem must be addressed directly. Two approaches to high-dimensional data were used: the non-classical sliced inverse regression (SIR) method, together with the proposed weighted standard SIR (WSIR) method, and principal component analysis (PCA), the standard technique for reducing dimensions. Both SIR and PCA work by forming linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear …
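For readers unfamiliar with the two reductions being compared, a minimal numpy sketch contrasts unsupervised PCA with supervised SIR. The synthetic data, slice count, and single-direction model below are illustrative assumptions, not the paper's simulation design (and the WSIR weighting is not shown):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic high-dimensional data: y depends on a single linear
# combination of X, so one direction suffices (illustrative setup).
n, p = 500, 10
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[0] = 1.0
y = X @ beta + 0.1 * rng.normal(size=n)

def pca_directions(X, k):
    """Leading k principal directions (unsupervised: never looks at y)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:k].T                                   # p x k

def sir_directions(X, y, k, n_slices=10):
    """Sliced inverse regression: eigenvectors of the between-slice
    covariance of standardized X, a supervised reduction."""
    n, p = X.shape
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    slices = np.array_split(np.argsort(y), n_slices)  # slice on sorted y
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    vals, vecs = np.linalg.eigh(M)
    return vecs[:, np.argsort(vals)[::-1][:k]]        # p x k

d_sir = sir_directions(X, y, k=1)
d_pca = pca_directions(X, k=1)
# SIR's leading direction recovers beta (up to sign); PCA's need not.
alignment = abs(float(d_sir[:, 0] @ beta))
```

Because SIR uses the response to form its slices, its leading direction tracks the regression direction, while PCA only tracks the variance of X.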

Crossref
Publication Date
February 1, 2024
Journal Name
Baghdad Science Journal
Estimating the Parameters of Exponential-Rayleigh Distribution for Progressively Censoring Data with S-Function about COVID-19

The two parameters of the Exponential-Rayleigh (ER) distribution were estimated using the maximum likelihood estimation (MLE) method for progressively censored data. Estimates of the two scale parameters were obtained from real COVID-19 data taken from the Iraqi Ministry of Health and Environment, AL-Karkh General Hospital. A Chi-square test was then applied to check whether the sample corresponded to the ER distribution. The nonlinear membership function (s-function) was employed to obtain fuzzy numbers for the parameter estimators, and a ranking function was used to transform the fuzzy numbers into crisp numbers. Finally, mean square error (MSE) was used to compare the outcomes of the survival …
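The likelihood step can be illustrated on complete (uncensored) synthetic data. The parameterization S(x) = exp(-λx − (θ/2)x²) and the coarse grid search below are assumptions for illustration, not the paper's progressive-censoring procedure:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical ER parameterization: S(x) = exp(-lam*x - (theta/2)*x**2),
# so the density is f(x) = (lam + theta*x) * S(x).
lam_true, theta_true = 0.5, 2.0

# Sample by inverting the cumulative hazard lam*x + (theta/2)*x**2 = E,
# where E ~ Exp(1)  (complete data; no censoring here).
E = rng.exponential(size=2000)
x = (-lam_true + np.sqrt(lam_true**2 + 2 * theta_true * E)) / theta_true

def neg_loglik(lam, theta, x):
    return -np.sum(np.log(lam + theta * x) - lam * x - 0.5 * theta * x**2)

# Coarse grid search over the likelihood surface, a stand-in for the
# iterative MLE used in the paper.
lams = np.linspace(0.05, 2.0, 80)
thetas = np.linspace(0.2, 5.0, 80)
nll = np.array([[neg_loglik(l, t, x) for t in thetas] for l in lams])
i, j = np.unravel_index(nll.argmin(), nll.shape)
lam_hat, theta_hat = lams[i], thetas[j]
```

With a few thousand observations, the grid minimum lands close to the generating parameters; a real analysis would refine this with an iterative optimizer and the censored likelihood.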

Scopus Crossref
Publication Date
January 1, 2023
Journal Name
Journal of Engineering
State-of-the-Art in Data Integrity and Privacy-Preserving in Cloud Computing

Cloud computing (CC) is a fast-growing technology that offers computing, networking, and storage services accessed and used over the internet. Cloud services save users money because they are pay-per-use, and save time because they are on-demand and elastic, a distinctive aspect of cloud computing. However, several security issues must be addressed before users store data in the cloud. Because users have no direct control over data outsourced to the cloud, particularly personal and sensitive data (health, finance, military, etc.), and do not know where the data is stored, they must ensure that the cloud stores and maintains the outsourced data appropriately. The study's primary goals are to mak…

Crossref (3)
Publication Date
January 1, 2023
Journal Name
International Journal of Economics and Finance Studies
INTEGRATION BETWEEN COBIT AND COSO FOR INTERNAL CONTROL AND ITS REFLECTION ON AUDITING RISK WITH CORPORATE GOVERNANCE AS THE MEDIATING VARIABLE

Scopus (19)
Publication Date
March 1, 2016
Journal Name
Journal of Engineering
Analysis of Recorded Inflow Data of Ataturk Reservoir

Since the beginning of the last century, the competition for water resources has intensified dramatically, especially between countries that have no agreements in place for the water resources they share. Such is the situation with the Euphrates River, which flows through three countries (Turkey, Syria, and Iraq) and represents the main water resource for them. Therefore, the comprehensive hydrologic investigation needed to derive optimal operating policies requires reliable forecasts. This study aims to analyze the recorded inflow data of Ataturk reservoir for the period (Oct. 1961 - Sep. 2009) and create a forecasting model for data generation from Turkey's perspective. Based on 49 years of real inflow data …
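A simple autoregressive fit illustrates the kind of forecasting model such an inflow record supports. The series below is synthetic (the persistence, mean, and noise scale are illustrative assumptions, not the Ataturk data):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic monthly inflow series standing in for a long gauged record.
n, phi_true, mu = 480, 0.7, 100.0
inflow = np.empty(n)
inflow[0] = mu
for t in range(1, n):
    inflow[t] = mu + phi_true * (inflow[t - 1] - mu) + rng.normal(scale=10.0)

# Fit AR(1) by least squares on lagged, mean-centered pairs.
x = inflow[:-1] - inflow.mean()
y = inflow[1:] - inflow.mean()
phi_hat = float(x @ y) / float(x @ x)

# One-step-ahead forecast from the last observation.
forecast = inflow.mean() + phi_hat * (inflow[-1] - inflow.mean())
```

An operational model would add seasonality (monthly means and variances) before fitting the autoregression, but the lag-one persistence estimate is the core ingredient for synthetic data generation.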

Publication Date
October 1, 2010
Journal Name
2010 IEEE Symposium on Industrial Electronics and Applications (ISIEA)
Distributed t-way test suite data generation using exhaustive search method with map and reduce framework

Scopus (2)
Crossref (2)
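The title refers to t-way test suite generation by exhaustive search. A minimal sequential sketch of pairwise (t = 2) selection over the exhaustive candidate space, without the paper's map/reduce distribution, might look like:

```python
from itertools import combinations, product

def pairwise_suite(domains):
    """Greedy pairwise (t = 2) test selection over the exhaustive
    candidate space; a plain sequential sketch of the idea."""
    k = len(domains)
    uncovered = set()
    for i, j in combinations(range(k), 2):
        for a, b in product(domains[i], domains[j]):
            uncovered.add((i, a, j, b))
    suite = []
    for test in product(*domains):            # exhaustive enumeration
        gain = {(i, test[i], j, test[j])
                for i, j in combinations(range(k), 2)} & uncovered
        if gain:
            suite.append(test)
            uncovered -= gain
        if not uncovered:
            break
    return suite

# Three parameters with 2, 2, and 3 values: 12 exhaustive tests exist,
# but fewer suffice to cover every value pair.
domains = [[0, 1], [0, 1], [0, 1, 2]]
suite = pairwise_suite(domains)
```

The exhaustive enumeration is what map/reduce parallelizes in the paper: candidate tests are scored for coverage gain in the map phase and the best are merged in the reduce phase.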
Publication Date
April 14, 2023
Journal Name
Journal of Big Data
A survey on deep learning tools dealing with data scarcity: definitions, challenges, solutions, tips, and applications
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a large amount of data to achieve exceptional performance. Unfortunately, many applications have small or inadequate data to train DL frameworks. Usually, manual labeling is needed to provide labeled data, which typically involves human annotators with a vast background of knowledge. This annotation process is costly, time-consuming, and error-prone. Usually, every DL framework is fed a significant amount of labeled data to automatically learn representations. Ultimately, a larger amount of data would generate a better DL model, and its performance is also application-dependent. This issue is the main barrier for …
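Data augmentation is one of the common remedies for scarcity that surveys in this area cover. A hedged numpy sketch (toy arrays standing in for images; the transforms and counts are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

def augment(images, n_noisy=3, noise=0.05):
    """Expand a small labeled image set with horizontal flips and
    additive Gaussian noise, two simple augmentation transforms."""
    out = [images, images[:, :, ::-1]]        # originals + flips
    for _ in range(n_noisy):
        out.append(images + rng.normal(scale=noise, size=images.shape))
    return np.concatenate(out, axis=0)

small_set = rng.random(size=(10, 8, 8))       # 10 tiny toy "images"
big_set = augment(small_set)                  # 5x as many samples
```

Label-preserving transforms like these multiply the effective training set without any new annotation effort, which is exactly the cost the abstract highlights.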
Scopus (322)
Crossref (326)
Scopus Clarivate Crossref
Publication Date
April 25, 2018
Journal Name
Ibn Al-Haitham Journal for Pure and Applied Sciences
Using Approximation Non-Bayesian Computation with Fuzzy Data to Estimation Inverse Weibull Parameters and Reliability Function

In real situations all observations and measurements are not exact numbers but more or less non-exact, also called fuzzy. In this paper, we therefore use approximate non-Bayesian computational methods to estimate the inverse Weibull parameters and reliability function from fuzzy data. The maximum likelihood and moment estimators are obtained as non-Bayesian estimators. The maximum likelihood estimators are derived numerically using two iterative techniques, namely the Newton-Raphson and Expectation-Maximization techniques. In addition, the estimates of the parameters and reliability function are compared numerically through a Monte-Carlo simulation study …
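The Newton-Raphson step for crisp (non-fuzzy) data can be sketched on the profile likelihood. The parameterization F(x) = exp(-αx^(-β)) and the numerical derivatives below are assumptions for illustration, not the paper's fuzzy-data derivation:

```python
import numpy as np

rng = np.random.default_rng(4)

# Assumed inverse Weibull form (one common parameterization):
#   F(x) = exp(-alpha * x**(-beta)),  alpha, beta > 0
alpha_true, beta_true = 1.5, 2.0
u = rng.uniform(size=3000)
x = (-np.log(u) / alpha_true) ** (-1.0 / beta_true)   # inverse-CDF sampling

def profile_loglik(beta, x):
    """Log-likelihood with alpha profiled out: alpha_hat = n / sum(x^-beta)."""
    n = len(x)
    s = np.sum(x ** (-beta))
    alpha = n / s
    return (n * np.log(beta) + n * np.log(alpha)
            - (beta + 1) * np.sum(np.log(x)) - alpha * s)

# Newton-Raphson on the profile likelihood with numerical derivatives.
beta, h = 1.0, 1e-4
for _ in range(50):
    g = (profile_loglik(beta + h, x) - profile_loglik(beta - h, x)) / (2 * h)
    gg = (profile_loglik(beta + h, x) - 2 * profile_loglik(beta, x)
          + profile_loglik(beta - h, x)) / h ** 2
    step = np.clip(g / gg, -0.5, 0.5)    # damp large steps for safety
    beta -= step
    if abs(step) < 1e-6:
        break

alpha = len(x) / np.sum(x ** (-beta))
```

Profiling out α reduces the two-parameter search to a one-dimensional Newton iteration in β, which is why the iterative techniques the abstract names are practical here.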

Crossref (1)
Publication Date
October 1, 2018
Journal Name
International Journal of Electrical and Computer Engineering
Load balance in data center SDN networks

In the last two decades, networks have changed rapidly in line with their requirements. Current Data Center Networks (DCNs) contain large numbers of hosts (tens of thousands) with special bandwidth needs as cloud networking and multimedia content computing grow. Conventional DCNs are strained by the increasing numbers of users and bandwidth requirements, which in turn impose many implementation limitations. Current networking devices, with their coupled control and forwarding planes, result in network architectures that are unsuitable for dynamic computing and storage needs. Software Defined Networking (SDN) was introduced to change this notion of traditional networks by decoupling control and …
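Centralized control is what makes load balancing straightforward in SDN: the controller sees all loads and places each new flow accordingly. A simplified least-connections sketch (server names and policy are illustrative, not the paper's scheme):

```python
class LoadBalancer:
    """Controller-side least-connections balancing, a simplified stand-in
    for SDN flow placement: the controller tracks per-server load and
    installs each new flow on the least-loaded server."""

    def __init__(self, servers):
        self.load = {s: 0 for s in servers}   # active flows per server

    def assign(self, flow_id):
        server = min(self.load, key=self.load.get)   # least-loaded first
        self.load[server] += 1
        return server

    def release(self, server):
        self.load[server] -= 1

lb = LoadBalancer(["s1", "s2", "s3"])
placements = [lb.assign(i) for i in range(6)]   # 6 flows spread evenly
```

In a traditional network each switch would decide independently with only local state; the global view is the decoupling benefit the abstract describes.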

Publication Date
April 1, 2018
Journal Name
Journal of Engineering and Applied Sciences
New Data Security Method Based on Biometrics

Merging biometrics with cryptography has become more familiar, and a great scientific field has opened for researchers. Biometrics adds a distinctive property to security systems, because biometric features are unique and individual for every person. In this study, a new method is presented for ciphering data based on fingerprint features. The plaintext message is mapped, based on the positions of minutiae extracted from a fingerprint, into a generated random text file, regardless of the size of the data. The proposed method can be explained in three scenarios. In the first scenario, the message is placed inside the random text directly at the positions of the minutiae; in the second scenario, the message is encrypted with a chosen word before ciphering …
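The first scenario (message characters placed directly at minutiae positions inside random text) can be sketched as follows; the positions here are hypothetical stand-ins, not real fingerprint minutiae, and the cover-generation details are assumptions:

```python
import random

def embed(message, positions, length=500, seed=7):
    """Place message characters at minutiae-derived positions inside a
    generated random cover text (positions are hypothetical stand-ins)."""
    rng = random.Random(seed)
    cover = [rng.choice("abcdefghijklmnopqrstuvwxyz") for _ in range(length)]
    for ch, pos in zip(message, positions):
        cover[pos] = ch
    return "".join(cover)

def extract(cover, positions, n):
    """Read the message back out of the cover text."""
    return "".join(cover[p] for p in positions[:n])

positions = [12, 57, 103, 160, 222]   # stand-in minutiae coordinates
cover = embed("hello", positions)
recovered = extract(cover, positions, 5)
```

Only a party who can reproduce the same minutiae positions from the enrolled fingerprint can locate the message characters in the cover text, which is what ties the scheme's secrecy to the biometric.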
