Proposing Robust LAD-Atan Penalty of Regression Model Estimation for High Dimensional Data

Penalized regression models have received considerable attention for variable selection, and they play an essential role in handling high-dimensional data. The arctangent (Atan) penalty has recently proved an efficient tool for both estimation and variable selection. However, the Atan penalty is very sensitive to outliers in the response variable and to heavy-tailed error distributions, whereas least absolute deviation (LAD) regression is a well-established route to robust estimation. The specific objective of this research is to propose a robust Atan estimator that combines these two ideas. Simulation experiments and real-data applications show that the proposed LAD-Atan estimator outperforms competing estimators.
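The combination the abstract describes can be sketched numerically. The block below is a minimal illustration, not the authors' algorithm: it pairs the LAD loss with the Atan penalty p_λ(β) = λ(γ + 2/π)·arctan(|β|/γ) and minimizes their sum with a derivative-free optimizer; the simulated data and the values of λ and γ are made-up choices.

```python
import numpy as np
from scipy.optimize import minimize

def atan_penalty(beta, lam=0.1, gamma=0.05):
    # Atan penalty: lam * (gamma + 2/pi) * arctan(|beta| / gamma), summed over coefficients
    return lam * (gamma + 2.0 / np.pi) * np.sum(np.arctan(np.abs(beta) / gamma))

def lad_atan_objective(beta, X, y, lam=0.1, gamma=0.05):
    # LAD loss (robust to outliers / heavy-tailed errors) plus the Atan penalty
    return np.mean(np.abs(y - X @ beta)) + atan_penalty(beta, lam, gamma)

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 2))
beta_true = np.array([2.0, 0.0])                     # second coefficient is inactive
y = X @ beta_true + rng.laplace(scale=0.5, size=n)   # heavy-tailed errors

# Derivative-free minimization, since the objective is nonsmooth
res = minimize(lad_atan_objective, x0=np.zeros(2), args=(X, y), method="Powell")
beta_hat = res.x
print(beta_hat)
```

In this toy setting the active coefficient is recovered close to 2 and the inactive one is shrunk toward zero, which is the qualitative behavior the abstract claims for the LAD-Atan combination.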

Publication Date
Wed Feb 01 2023
Journal Name
International Journal of Revolution in Science and Humanity
Nonparametric Estimation of Failure Periods for Log-Normal Distribution Using Bootstrap

A non-parametric kernel method combined with the Bootstrap technique was used to estimate confidence intervals for the system failure function of log-normally distributed data: the failure times of the machines in the spinning department of the weaving company in Wasit Governorate. The failure function was also estimated parametrically by the maximum likelihood estimator (MLE). The parametric and non-parametric methods were compared using the mean squared error (MSE) criterion. The Bootstrap-based non-parametric methods proved more efficient than the parametric method, and their curve estimate is more realistic and appropriate for the re…
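As a rough illustration of the bootstrap side of this approach (on simulated log-normal failure times, not the Wasit plant data), a percentile-bootstrap confidence interval for the empirical reliability R(t) = P(T > t) can be built like so:

```python
import numpy as np

rng = np.random.default_rng(1)
times = rng.lognormal(mean=1.0, sigma=0.5, size=300)  # simulated failure times
t0 = 3.0                                              # evaluation time

def survival_at(sample, t):
    # Empirical reliability R(t) = P(T > t)
    return np.mean(sample > t)

point = survival_at(times, t0)

# Percentile bootstrap: resample with replacement, re-estimate R(t0) each time
B = 2000
boot = np.array([survival_at(rng.choice(times, size=times.size, replace=True), t0)
                 for _ in range(B)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"R({t0}) = {point:.3f}, 95% bootstrap CI = ({lo:.3f}, {hi:.3f})")
```

The paper's kernel smoothing step is omitted here; this sketch only shows how bootstrap resampling turns a point estimate of the failure/reliability function into an interval estimate.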

Publication Date
Sun Mar 01 2009
Journal Name
Journal of Economics and Administrative Sciences
Simulation of Five Methods for Estimating the Parameter and Reliability Function of the Exponential Distribution
The estimation process, together with hypothesis testing, is one of the pillars of statistical inference: information is collected from sample results and conclusions are drawn about a population parameter, or parameters, on that basis…
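For the exponential case in the title, the key quantities have simple closed forms: the MLE of the rate is λ̂ = 1/x̄ and the reliability function is R(t) = e^(−λt). A minimal sketch on simulated data (illustrative values only, not one of the paper's five compared methods in particular):

```python
import numpy as np

rng = np.random.default_rng(2)
lam_true = 0.5
x = rng.exponential(scale=1.0 / lam_true, size=5000)  # simulated failure times

lam_hat = 1.0 / x.mean()            # MLE of the rate parameter

def reliability(t, lam):
    # Exponential reliability: R(t) = P(T > t) = exp(-lam * t)
    return np.exp(-lam * t)

t0 = 2.0
print(lam_hat, reliability(t0, lam_hat))
```

Plugging λ̂ into R(t) gives the estimated reliability curve; the paper's comparison would repeat this over many simulated samples for each of the five estimation methods.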
Publication Date
Thu Jan 16 2020
Journal Name
Periodicals of Engineering and Natural Sciences
Comparison of some reliability estimation methods for Laplace distribution using simulations

In this paper, we derive an estimator of the reliability function for the two-parameter Laplace distribution using the Bayes method with a squared-error loss function, Jeffreys' prior, and the conditional probability of the observed random variable. The main objective of this study is to assess, via Monte Carlo simulation under different Laplace distribution parameters and sample sizes, the efficiency of the derived Bayesian estimator relative to the maximum likelihood and moment estimators of this function. The results show that the Bayes estimator is more efficient than the maximum likelihood and moment estimators at all sample sizes.
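For context, the maximum-likelihood baseline against which the Bayes estimator is compared has a closed form for the Laplace(μ, b) distribution: μ̂ is the sample median and b̂ is the mean absolute deviation about it. A minimal sketch on simulated data (all values illustrative, not the paper's settings):

```python
import numpy as np

rng = np.random.default_rng(3)
mu_true, b_true = 2.0, 1.0
x = rng.laplace(loc=mu_true, scale=b_true, size=10000)

mu_hat = np.median(x)                 # MLE of the location parameter
b_hat = np.mean(np.abs(x - mu_hat))   # MLE of the scale parameter

def reliability(t, mu, b):
    # Laplace reliability R(t) = P(T > t), piecewise around the location mu
    return np.where(t < mu,
                    1.0 - 0.5 * np.exp((t - mu) / b),
                    0.5 * np.exp(-(t - mu) / b))

print(mu_hat, b_hat, float(reliability(3.0, mu_hat, b_hat)))
```

The paper's Monte Carlo comparison would wrap this estimation step in a loop over simulated samples and score each estimator of R(t), for example by mean squared error.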

Publication Date
Tue Jun 20 2023
Journal Name
Baghdad Science Journal
Detection of Autism Spectrum Disorder Using A 1-Dimensional Convolutional Neural Network

Autism Spectrum Disorder (ASD) is a neurodevelopmental disorder that impairs speech, social interaction, and behavior. Machine learning, a field of artificial intelligence, focuses on creating algorithms that learn patterns from input data and classify ASD accordingly. Results from machine-learning classifiers for ASD have been inconsistent, and more research is needed to improve classification accuracy. To address this, a deep-learning approach, a one-dimensional convolutional neural network (1D CNN), is proposed for ASD detection. The proposed techniques are evaluated on three publicly available ASD datasets (children, adults, and adolescents). Results strongly suggest that 1D…
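The core operation of a 1D CNN can be sketched in a few lines of NumPy. This is a toy illustration of a single convolution-plus-ReLU layer, not the paper's architecture; the input vector and filter weights are made up:

```python
import numpy as np

def conv1d(signal, kernel, stride=1):
    # Valid 1-D cross-correlation: slide the kernel along the signal
    k = kernel.size
    out_len = (signal.size - k) // stride + 1
    return np.array([np.dot(signal[i * stride:i * stride + k], kernel)
                     for i in range(out_len)])

def relu(x):
    # Standard CNN nonlinearity
    return np.maximum(x, 0.0)

x = np.array([0., 1., 2., 3., 4., 5., 6., 7.])  # e.g. one feature vector per subject
w = np.array([-1., 0., 1.])                      # a learned filter (here fixed for illustration)
feature_map = relu(conv1d(x, w))
print(feature_map)
```

A full 1D CNN classifier stacks several such layers, followed by pooling and a dense softmax/sigmoid layer; frameworks like Keras or PyTorch handle the training loop.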

Publication Date
Wed May 31 2017
Journal Name
Journal of Engineering
Evaluating the Performance of High Modulus Asphalt Concrete Mixture for Base Course in Iraq

In the 1980s, the French central roads laboratory (LCPC) developed high modulus mixtures (EME) using a hard binder. This type of mixture showed good resistance to moisture damage and improved mechanical properties for asphalt mixtures, including high modulus, good fatigue behaviour, and excellent resistance to rutting. In Iraq, this type of mixture has not yet been used. The main objective of this research is to evaluate the performance of high modulus mixtures and compare them with the conventional mixture. To achieve this objective, asphalt concrete mixes were prepared and then tested to evaluate their engineering properties, including moisture damage, resilient modulus, permanent deformation, and fatigue characteristics. These prope…

Publication Date
Fri Apr 26 2019
Journal Name
Journal of Contemporary Medical Sciences
Breast Cancer Decisive Parameters for Iraqi Women via Data Mining Techniques

Objective: This research investigates real breast cancer data for Iraqi women, acquired manually from several Iraqi hospitals for the early detection of breast cancer. Data mining techniques are used to discover hidden knowledge, unexpected patterns, and new rules in the dataset, which comprises a large number of attributes. Methods: Data mining techniques prune redundant or simply irrelevant attributes so that interesting patterns can be discovered. The dataset is processed with the Weka (Waikato Environment for Knowledge Analysis) platform, and the OneR technique is used as a machine learning classifier to evaluate each attribute's worth with respect to the class value. Results: The evaluation is performed using…
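The OneR idea is simple enough to sketch directly: for each attribute, build a rule mapping each attribute value to its majority class, then keep the attribute whose rule makes the fewest training errors. A toy illustration (hypothetical data, not the hospital dataset):

```python
from collections import Counter, defaultdict

def one_r(rows, labels):
    """Pick the single attribute whose value -> majority-class rule errs least."""
    n_attrs = len(rows[0])
    best = None
    for a in range(n_attrs):
        # Tally class counts for each value of attribute a
        by_value = defaultdict(Counter)
        for row, y in zip(rows, labels):
            by_value[row[a]][y] += 1
        # Rule: map each value to its majority class
        rule = {v: c.most_common(1)[0][0] for v, c in by_value.items()}
        errors = sum(rule[row[a]] != y for row, y in zip(rows, labels))
        if best is None or errors < best[2]:
            best = (a, rule, errors)
    return best  # (attribute index, value -> class rule, training errors)

# Toy data: attribute 0 predicts the class perfectly, attribute 1 does not
rows = [("high", "a"), ("high", "b"), ("low", "a"), ("low", "b")]
labels = ["pos", "pos", "neg", "neg"]
attr, rule, errors = one_r(rows, labels)
print(attr, rule, errors)
```

Weka's OneR implementation additionally discretizes numeric attributes; this sketch covers only the categorical case to show why OneR is a natural yardstick for attribute worth.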

Publication Date
Tue Jun 04 2024
Journal Name
Computation
High-Performance Krawtchouk Polynomials of High Order Based on Multithreading

Orthogonal polynomials and their moments serve as pivotal elements across various fields. Discrete Krawtchouk polynomials (DKraPs) are a versatile family of orthogonal polynomials widely used in probability theory, signal processing, digital communications, and image processing. Various recurrence algorithms have been proposed to address the numerical instability that arises at large orders and signal sizes. DKraP coefficients, however, have typically been computed with sequential algorithms, which are computationally expensive for large orders and polynomial sizes. To this end, this paper introduces a computationally efficient solution that utilizes the parall…
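The three-term recurrence in the order n, with the columns for different x-values distributed across a thread pool, gives a rough feel for the multithreaded idea; this sketch uses the textbook recurrence for K_n(x; p, N) and is not the paper's algorithm or its stabilized variant:

```python
import numpy as np
from math import comb
from concurrent.futures import ThreadPoolExecutor

def krawtchouk_column(x, n_max, p, N):
    """K_0..K_{n_max}(x; p, N) via the standard three-term recurrence."""
    K = np.empty(n_max + 1)
    K[0] = 1.0
    K[1] = 1.0 - x / (p * N)
    for n in range(1, n_max):
        # p(N-n) K_{n+1} = (p(N-n) + n(1-p) - x) K_n - n(1-p) K_{n-1}
        K[n + 1] = ((p * (N - n) + n * (1 - p) - x) * K[n]
                    - n * (1 - p) * K[n - 1]) / (p * (N - n))
    return K

p, N, n_max = 0.5, 8, 3
xs = range(N + 1)
with ThreadPoolExecutor() as pool:   # one column of the matrix per task
    cols = list(pool.map(lambda x: krawtchouk_column(x, n_max, p, N), xs))
K = np.column_stack(cols)            # K[n, x]

# Sanity check: orthogonality under the binomial weight
w = np.array([comb(N, x) * p**x * (1 - p)**(N - x) for x in xs])
print(np.dot(w * K[1], K[2]))        # should be ~0 up to rounding
```

Because each x-column is independent, the recurrence parallelizes naturally across x; the paper's contribution concerns doing this efficiently and stably at much larger N and n than this toy example.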

Publication Date
Sat Aug 01 2015
Journal Name
2015 IEEE Conference on Computational Intelligence in Bioinformatics and Computational Biology (CIBCB)
Granular computing approach for the design of medical data classification systems

Publication Date
Mon Oct 09 2023
Journal Name
2023 IEEE 34th International Symposium on Software Reliability Engineering Workshops (ISSREW)
Semantics-Based, Automated Preparation of Exploratory Data Analysis for Complex Systems

Publication Date
Tue Nov 01 2016
Journal Name
IOSR Journal of Computer Engineering
Implementation of new Secure Mechanism for Data Deduplication in Hybrid Cloud

Cloud computing provides a huge amount of storage space, but as the number of users and the size of their data grow, cloud storage environments face serious problems: saving storage space, managing this large volume of data, and preserving its security and privacy. One important way to save space in cloud storage is data deduplication, a compression technique that stores only one copy of the data and eliminates the extra copies. To offer security and privacy for sensitive data while still supporting deduplication, this work identifies attacks that exploit hybrid-cloud deduplication and allow an attacker to gain access to the files of other users based on very small hash signatures of…
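The deduplication mechanism itself can be sketched as a content-addressed store keyed by a cryptographic hash. The class below is a hypothetical toy, not the paper's mechanism:

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: identical blobs are kept only once."""
    def __init__(self):
        self.blobs = {}    # digest -> data (one physical copy per unique blob)
        self.owners = {}   # digest -> set of user ids referencing that blob

    def upload(self, user, data: bytes):
        digest = hashlib.sha256(data).hexdigest()
        if digest not in self.blobs:        # first copy: actually store it
            self.blobs[digest] = data
        self.owners.setdefault(digest, set()).add(user)
        return digest

store = DedupStore()
d1 = store.upload("alice", b"quarterly-report")
d2 = store.upload("bob", b"quarterly-report")  # deduplicated, not stored again
print(d1 == d2, len(store.blobs))              # True 1
```

The attack surface the abstract alludes to arises when mere knowledge of a short digest is accepted as proof of possessing the file: an attacker who learns another user's hash could then claim, and later retrieve, that user's blob.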
