Some Nonparametric Estimators for Right Censored Survival Data

The use of parametric models and their associated estimation methods requires several preliminary conditions to be met before those models can represent the population under study adequately. This has prompted researchers to search for more flexible models, namely nonparametric ones, and many researchers have taken an interest in the survival (permanence) function and its nonparametric estimation methods.

To draw statistical inferences about the parameters of the lifetime distribution from censored data, the experimental section of this thesis compares nonparametric estimators of the survival function under Type I censoring by means of simulation, using the Kaplan-Meier, Kernel, Nelson-Aalen, Thompson-Type, and Pandey estimation methods. These methods offer great flexibility in data analysis because they require no knowledge of the distribution that generated the data. To identify the best estimator of the survival function, the simulation uses two statistical measures, the Integrated Mean Squared Error (IMSE) and the Mean Absolute Percentage Error (MAPE), for sample sizes n = 15, 30, 50, and 100. The Kaplan-Meier method proved preferable to the remaining nonparametric methods, and the results show that the estimated survival function values decrease as time increases.
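
As an illustration of the preferred estimator, here is a minimal Python sketch of the Kaplan-Meier survival curve on a small right-censored sample; the data and function names are hypothetical and are not taken from the thesis's simulation study.

```python
# A minimal sketch of the Kaplan-Meier estimator (hypothetical data):
# survival steps down only at observed event times, while censored
# observations merely shrink the risk set.
def kaplan_meier(times, events):
    # Sort by time; at ties, process events before censorings.
    order = sorted(range(len(times)), key=lambda i: (times[i], -events[i]))
    n_at_risk = len(times)
    s = 1.0
    curve = []                        # (time, S(t)) at each event time
    for i in order:
        if events[i] == 1:            # event observed: survival steps down
            s *= (n_at_risk - 1) / n_at_risk
            curve.append((times[i], s))
        n_at_risk -= 1                # event or censoring leaves the risk set
    return curve

times  = [3, 5, 5, 8, 10, 12, 15]     # hypothetical lifetimes
events = [1, 1, 0, 1, 0, 1, 1]        # 1 = failure observed, 0 = censored
for t, s in kaplan_meier(times, events):
    print(f"S({t}) = {s:.3f}")
```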

Publication Date: Mon May 15 2017
Journal Name: Journal Of Theoretical And Applied Information Technology
Anomaly detection in text data that represented as a graph using dbscan algorithm

Anomaly detection remains a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all of the data to a graph concept frame (CFG). As is well known, DBSCAN groups points of the same kind into clusters, while points that fall outside every cluster are treated as noise or anomalies. DBSCAN can thus detect abnormal points that lie farther than a chosen threshold from any cluster (extreme values). However, not all anomalies are of that kind, i.e. unusual points far from a specific group; there is also a type of data that does not occur repeatedly yet is considered abnormal with respect to the known group. The analysis showed that DBSCAN using the…
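
A hedged sketch of the underlying idea, flagging scikit-learn DBSCAN's noise points as anomalies; the paper's graph/CFG conversion is not reproduced here, and the synthetic feature vectors stand in for embedded text items.

```python
# Sketch: points DBSCAN leaves unclustered (label -1) are flagged as anomalies.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal(0.0, 0.3, size=(50, 2)),   # dense cluster
    rng.normal(5.0, 0.3, size=(50, 2)),   # second dense cluster
    [[2.5, 9.0], [-4.0, -4.0]],           # isolated points
])

labels = DBSCAN(eps=0.8, min_samples=5).fit_predict(X)
anomalies = X[labels == -1]               # DBSCAN marks noise with -1
print(f"{len(anomalies)} points flagged as anomalies")
```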

Publication Date: Wed Jan 01 2020
Journal Name: Advances In Science, Technology And Engineering Systems Journal
Bayes Classification and Entropy Discretization of Large Datasets using Multi-Resolution Data Aggregation

Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms such a…
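
As a minimal sketch of one of the two building blocks, the snippet below performs a single entropy-based cut on one numeric attribute; the data are hypothetical and the paper's multi-resolution summarization structure is not shown.

```python
# Illustrative entropy discretization: choose the cut point that minimizes
# the weighted class entropy of the two resulting halves.
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_cut(values, labels):
    pairs = sorted(zip(values, labels))
    n = len(pairs)
    best = (float("inf"), None)
    for i in range(1, n):
        left = [l for _, l in pairs[:i]]
        right = [l for _, l in pairs[i:]]
        w = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
        cut = (pairs[i - 1][0] + pairs[i][0]) / 2
        best = min(best, (w, cut))
    return best                        # (weighted entropy, cut point)

values = [1.0, 1.2, 1.4, 3.9, 4.1, 4.3]
labels = ["a", "a", "a", "b", "b", "b"]
print(best_cut(values, labels))        # expect a cut near 2.65, entropy 0.0
```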

Publication Date: Tue Dec 01 2015
Journal Name: Journal Of Engineering
Ten Years of OpenStreetMap Project: Have We Addressed Data Quality Appropriately? – Review Paper

It has increasingly been recognised that future developments in geospatial data handling will centre on geospatial data on the web: Volunteered Geographic Information (VGI). The evaluation of VGI data quality, including positional and shape similarity, has become a recurrent subject in the scientific literature over the last ten years. The OpenStreetMap (OSM) project is the most popular of the leading VGI platforms. It is an online geospatial database that produces and supplies free, editable geospatial datasets worldwide. The goal of this paper is to present a comprehensive overview of the quality assurance of OSM data. In addition, the credibility of open source geospatial data is discussed, highlight…

Publication Date: Mon Jan 03 2022
Journal Name: Iraqi Journal Of Science
Accuracy Assessment of 3D Model Based on Laser Scan and Photogrammetry Data: Introduction

A three-dimensional (3D) model extraction represents the best way to reflect reality in all its details. This explains the tendency of many scientific disciplines towards making measurements, calculations, and monitoring in various fields using such models. Although there are many ways to produce a 3D model, such as imagery, integration techniques, and laser scanning, the quality of their products is not the same in terms of accuracy and detail. This article aims to assess the accuracy of 3D point cloud models derived from close range images and laser scan data, based on Agisoft PhotoScan and CloudCompare software, to determine the compatibility of both datasets for several applications. College of Scien…
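
A small sketch of the kind of cloud-to-cloud accuracy check the article relies on, similar in spirit to CloudCompare's C2C distance; both clouds below are synthetic stand-ins for the article's datasets.

```python
# For every point in the photogrammetric cloud, find the distance to its
# nearest neighbour in the laser-scan cloud and summarize the error.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)
laser_cloud = rng.uniform(0, 10, size=(2000, 3))                  # reference
photo_cloud = laser_cloud[:500] + rng.normal(0, 0.02, (500, 3))   # noisy copy

tree = cKDTree(laser_cloud)
dists, _ = tree.query(photo_cloud)     # nearest-neighbour distances
print(f"mean C2C distance: {dists.mean():.4f} m, max: {dists.max():.4f} m")
```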

Publication Date: Sun Jan 01 2023
Journal Name: Journal Of Engineering
State-of-the-Art in Data Integrity and Privacy-Preserving in Cloud Computing

Cloud computing (CC) is a fast-growing technology that offers computing, networking, and storage services that can be accessed and used over the internet. Cloud services save users money because they are pay-per-use, and they save time because they are on-demand and elastic, a unique aspect of cloud computing. However, several security issues must be addressed before users store data in the cloud. Because the user has no direct control over data that has been outsourced to the cloud, particularly personal and sensitive data (health, finance, military, etc.), and does not know where it is stored, the user must ensure that the cloud stores and maintains the outsourced data appropriately. The study's primary goals are to mak…
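
One integrity idea the abstract alludes to can be sketched very simply: the owner keeps a digest of the data before outsourcing and re-hashes whatever the cloud returns. This is an illustrative assumption, not the study's protocol, and the record contents are hypothetical.

```python
# Keep a local SHA-256 digest before outsourcing; verify on retrieval.
import hashlib

def sha256_digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

original = b"patient-record-42: blood pressure 120/80"   # hypothetical record
stored_digest = sha256_digest(original)                  # kept by the owner

retrieved = original                                     # cloud's response
assert sha256_digest(retrieved) == stored_digest, "cloud data was altered"
print("integrity check passed")
```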

Publication Date: Mon Dec 05 2022
Journal Name: Baghdad Science Journal
A Security and Privacy Aware Computing Approach on Data Sharing in Cloud Environment

Today, the role of cloud computing in our day-to-day lives is very prominent. The cloud computing paradigm makes it possible to provide resources on demand. Cloud computing has changed the way organizations manage resources owing to its robustness, low cost, and pervasive nature. Data security is usually achieved using methods such as encryption. However, data privacy is another important challenge that should be considered when transporting, storing, and analyzing data in the public cloud. In this paper, a new method is proposed to track malicious users who use their private keys to decrypt data in a system, share it with others, and cause system information leakage. Security policies are also considered to be int…
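
A hedged sketch of one way such tracing can work, not necessarily the paper's scheme: envelope encryption where the data key is wrapped separately per user, so each decryption capability is tied to an identity. The Fernet primitives and user names are illustrative.

```python
# Envelope encryption with per-user key wrapping (illustrative sketch).
from cryptography.fernet import Fernet

data_key = Fernet.generate_key()               # one key protects the dataset
ciphertext = Fernet(data_key).encrypt(b"shared cloud record")

# Wrap the data key once per user; the wrap records who received it.
user_keys = {u: Fernet.generate_key() for u in ("alice", "bob")}
wrapped = {u: Fernet(k).encrypt(data_key) for u, k in user_keys.items()}

# Alice unwraps her copy of the data key and decrypts; an audit log can
# attribute this access to "alice" because the wrap was user-specific.
alices_data_key = Fernet(user_keys["alice"]).decrypt(wrapped["alice"])
print(Fernet(alices_data_key).decrypt(ciphertext))
```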

Publication Date: Wed Dec 30 2020
Journal Name: Iraqi Journal Of Science
A Comparison of Different Estimation Methods to Handle Missing Data in Explanatory Variables

Missing data is one of the problems that may occur in regression models. It is usually handled by the deletion mechanism available in statistical software, but deletion reduces the sample size and thereby weakens statistical inference. In this paper, the Expectation Maximization algorithm (EM), the Multicycle Expectation-Conditional Maximization algorithm (MC-ECM), Expectation-Conditional Maximization Either (ECME), and Recurrent Neural Networks (RNN) are used to estimate multiple regression models when explanatory variables have some missing values. Experimental datasets were generated using the Visual Basic programming language, with missing values in the explanatory variables following a missing-at-random general pattern and s…
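
A toy EM-style loop for a missing explanatory variable gives the flavour of these estimators; it is not the paper's exact EM/MC-ECM/ECME implementation, and all data below are simulated.

```python
# E-step: impute missing x's from the current fit; M-step: refit the line.
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(5, 2, 100)
y = 3 + 2 * x + rng.normal(0, 1, 100)
x[::10] = np.nan                        # knock out every tenth x value

x_filled = np.where(np.isnan(x), np.nanmean(x), x)   # initial imputation
for _ in range(20):
    b1, b0 = np.polyfit(x_filled, y, 1)              # M-step: refit line
    # E-step: replace missing x's with the value implied by the fit
    x_filled = np.where(np.isnan(x), (y - b0) / b1, x_filled)
print(f"estimated slope {b1:.3f}, intercept {b0:.3f}")
```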

Publication Date: Tue Nov 30 2021
Journal Name: Iraqi Journal Of Science
Strong Triple Data Encryption Standard Algorithm using Nth Degree Truncated Polynomial Ring Unit

Cryptography is the process of transforming a message to prevent unauthorized access to data. One of the main problems, and an important part of any cryptosystem built on secret-key algorithms, is the key: it plays the central role in achieving a high level of secure communication. To increase the level of security in any communication, both parties must hold a copy of the secret key, which, unfortunately, is not easy to achieve. The Triple Data Encryption Standard algorithm is weak due to its weak key generation, so the key must be reconfigured to make the algorithm more secure, effective, and strong; an enhanced encryption key strengthens Triple Data Encryption Standard security. This paper proposes a combination of two efficient encryption algorithms…
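
A minimal sketch of the premise, assuming pycryptodome: 3DES is only as strong as its key, so the key is derived from high-entropy material before use. The second algorithm of the proposed combination (an NTRU-style step, per the title) is replaced here by a SHA-256 placeholder.

```python
# Derive a parity-adjusted 3DES key from a shared secret, then encrypt.
import hashlib
from Crypto.Cipher import DES3
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

shared_secret = get_random_bytes(32)            # stand-in for NTRU output
key = DES3.adjust_key_parity(hashlib.sha256(shared_secret).digest()[:24])

iv = get_random_bytes(8)
cipher = DES3.new(key, DES3.MODE_CBC, iv)
ct = cipher.encrypt(pad(b"secret message", DES3.block_size))

plain = unpad(DES3.new(key, DES3.MODE_CBC, iv).decrypt(ct), DES3.block_size)
print(plain)
```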

Publication Date: Sun Jun 30 2024
Journal Name: International Journal Of Intelligent Engineering And Systems
Eco-friendly and Secure Data Center to Detect Compromised Devices Utilizing Swarm Approach

Modern civilization increasingly relies on sustainable and eco-friendly data centers as the core hubs of intelligent computing. However, these data centers, while vital, also face heightened vulnerability to hacking due to their role as convergence points for numerous network connection nodes. Recognizing and addressing this vulnerability, particularly within the confines of green data centers, is a pressing concern. This paper proposes a novel approach to mitigate this threat by leveraging swarm intelligence techniques to detect prospective and hidden compromised devices within the data center environment. The core objective is to ensure sustainable intelligent computing through a colony strategy. The research primarily focuses on the…
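
A highly simplified, hedged colony-style sweep, not the paper's algorithm, shows the general idea: scout "ants" sample devices, deposit pheromone on readings that deviate from the fleet baseline, and evaporation ensures only consistently deviant devices stay suspicious. All device names and readings are hypothetical.

```python
# Colony-style suspicion scoring over a fleet of devices (illustrative).
import random

random.seed(3)
baseline = 100.0
devices = {f"dev{i}": baseline + random.gauss(0, 5) for i in range(20)}
devices["dev7"] = 160.0                   # hypothetical compromised device

pheromone = {d: 0.0 for d in devices}
for _ in range(200):                      # each iteration = one scout ant
    d = random.choice(list(devices))
    deviation = abs(devices[d] - baseline)
    if deviation > 15:                    # ant marks a suspicious reading
        pheromone[d] += deviation
    pheromone = {k: 0.95 * v for k, v in pheromone.items()}  # evaporation

suspects = sorted(pheromone, key=pheromone.get, reverse=True)[:3]
print("most suspicious devices:", suspects)
```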

Publication Date: Fri Dec 01 2017
Journal Name: Journal Of Economics And Administrative Sciences
Comparing the Bayes Estimator with the Maximum Likelihood Method for Estimating the Two Parameters of the Generalized Inverted Exponential Distribution in the Case of Fuzzy Data

In this paper, the generalized inverted exponential distribution is considered one of the most important distributions for studying failure times. The shape and scale parameters of the distribution are estimated after removing the fuzziness from the data, which consists of triangular fuzzy numbers; the centroid method is used to convert the fuzzy data to crisp data. Because the distribution has two parameters that are difficult to separate and estimate directly by the MLE method, the Newton-Raphson method has been used.
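
A minimal sketch of the two steps the abstract describes, with hypothetical data: centroid defuzzification of triangular fuzzy numbers, crisp x = (a + b + c) / 3, followed by Newton-Raphson on the profile score of the generalized inverted exponential likelihood (given lambda, alpha has a closed form, so one score equation in lambda remains).

```python
# Centroid defuzzification + Newton-Raphson MLE for the generalized
# inverted exponential (GIE) distribution. Data values are hypothetical.
import math

def centroid(tri):                       # triangular fuzzy number (a, b, c)
    a, b, c = tri
    return (a + b + c) / 3

fuzzy_data = [(0.8, 1.0, 1.3), (1.5, 2.0, 2.4), (2.7, 3.0, 3.5),
              (0.5, 0.9, 1.1), (3.1, 4.0, 4.6), (1.8, 2.5, 2.9)]
x = [centroid(t) for t in fuzzy_data]
n = len(x)

def alpha_hat(lam):                      # closed-form alpha given lambda
    return -n / sum(math.log(1 - math.exp(-lam / xi)) for xi in x)

def score(lam):                          # profile log-likelihood derivative
    a = alpha_hat(lam)
    s = n / lam - sum(1 / xi for xi in x)
    s += (a - 1) * sum((1 / xi) * math.exp(-lam / xi) /
                       (1 - math.exp(-lam / xi)) for xi in x)
    return s

lam = 1.0                                # Newton-Raphson with numeric slope
for _ in range(50):
    h = 1e-6 * lam
    slope = (score(lam + h) - score(lam - h)) / (2 * h)
    step = score(lam) / slope
    lam = max(lam - step, 1e-8)          # keep lambda in its valid range
    if abs(step) < 1e-10:
        break
print(f"lambda = {lam:.4f}, alpha = {alpha_hat(lam):.4f}")
```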
