Dynamic Modeling for Discrete Survival Data by Using Artificial Neural Networks and Iteratively Weighted Kalman Filter Smoothing with Comparison

Survival analysis is widely applied to data describing the lifetime of an item until the occurrence of an event of interest, such as death or another event under study. The purpose of this paper is to use a dynamic approach within the deep-learning neural-network method: a dynamic neural network suited to the nature of discrete survival data with time-varying effects. This network is trained with the Levenberg-Marquardt (L-M) algorithm, and the method is called the Proposed Dynamic Artificial Neural Network (PDANN). A comparison was then made with another method, which relies entirely on the Bayesian methodology and is called the Maximum A Posteriori (MAP) method. It was carried out using numerical algorithms, namely the Iteratively Weighted Kalman Filter Smoothing (IWKFS) algorithm in combination with the Expectation Maximization (EM) algorithm. Average Mean Square Error (AMSE) and Cross Entropy Error (CEE) were used as comparison criteria. The methods and procedures were applied to simulated data generated with different combinations of sample sizes and numbers of intervals.
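As a sketch of the discrete-time survival setting the paper builds on (not the PDANN or IWKFS procedures themselves), the cross-entropy criterion for a logistic hazard on person-period data can be written as follows; all variable names and the toy data are illustrative assumptions:

```python
import numpy as np

def discrete_hazard_nll(beta, X, events):
    """Cross-entropy (negative log-likelihood) for a discrete-time
    survival model with a logistic hazard, on person-period data.

    X      : (n, p) covariates, one row per subject-interval
    events : (n,) indicator, 1 if the event occurred in that interval
    """
    h = 1.0 / (1.0 + np.exp(-(X @ beta)))    # hazard in each interval
    eps = 1e-12                               # guard against log(0)
    return -np.mean(events * np.log(h + eps)
                    + (1 - events) * np.log(1.0 - h + eps))

# toy person-period data: intercept plus one covariate
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
true_beta = np.array([-2.0, 0.8])
events = rng.binomial(1, 1.0 / (1.0 + np.exp(-(X @ true_beta))))

print(discrete_hazard_nll(true_beta, X, events))
```

Minimizing this criterion over the weights of a network (rather than a linear predictor) is the basic idea behind a neural hazard model.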

Publication Date: Sat Jan 01 2022
Journal Name: Geotechnical Engineering And Sustainable Construction
Numerical Modeling of Under Reamed Piles Behavior Under Dynamic Loading in Sandy Soil

Under-reamed piles, defined by having one or more bulbs, have the potential for sizeable advantages over conventional straight-sided piles. Most studies on under-reamed piles have been conducted experimentally, while theoretical studies, such as those using the finite element method, have mainly been confined to conventional straight-sided piles. Moreover, although several laboratory and experimental studies have examined the behavior of under-reamed piles, few numerical studies have simulated the piles' performance, and no research has compared and evaluated the behavior of these piles under dynamic loading. Therefore, this study aimed to numerically investigate bearing capacity

Publication Date: Sun Aug 01 2021
Journal Name: Ibn Al-haitham Journal For Pure And Applied Sciences
Robust Tests for the Mean Difference in Paired Data by Using Bootstrap Resampling Technique

The paired-sample t-test for testing the difference between two means in paired data is not robust against violation of the normality assumption. In this paper, some alternative robust tests are suggested using the bootstrap method, in addition to combining the bootstrap method with the W.M test. Monte Carlo simulation experiments were employed to study the performance of the test statistics of each of these three tests in terms of type I error rates and power rates. The three tests were applied to different sample sizes generated from three distributions: the bivariate normal distribution, the bivariate contaminated normal distribution, and the bivariate exponential distribution.
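A minimal sketch of the bootstrap idea for paired data, resampling the centered differences under the null hypothesis (the paper's combination with the W.M test is not reproduced here); the sample values are invented for illustration:

```python
import numpy as np

def bootstrap_paired_test(x, y, n_boot=10_000, seed=0):
    """Two-sided bootstrap test for the mean of paired differences.

    The differences d = x - y are shifted to satisfy the null (mean 0)
    and resampled with replacement; the p-value is the fraction of
    resampled means at least as extreme as the observed mean difference.
    """
    rng = np.random.default_rng(seed)
    d = np.asarray(x) - np.asarray(y)
    obs = d.mean()
    d0 = d - obs                                # enforce the null
    boots = rng.choice(d0, size=(n_boot, d.size), replace=True).mean(axis=1)
    return float((np.abs(boots) >= abs(obs)).mean())

# hypothetical paired measurements (before/after on the same subjects)
x = np.array([5.1, 4.8, 6.0, 5.5, 5.9, 4.7, 5.2, 5.8])
y = np.array([4.9, 4.5, 5.2, 5.0, 5.4, 4.6, 4.8, 5.3])
p = bootstrap_paired_test(x, y)
print(f"p-value ≈ {p:.4f}")
```

Because the resampling happens under the shifted (null) distribution, the p-value is directly interpretable without any normality assumption on the differences.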

Publication Date: Sat Jun 06 2020
Journal Name: Journal Of The College Of Education For Women
Image classification with Deep Convolutional Neural Network Using Tensorflow and Transfer of Learning

The deep learning algorithm has recently achieved considerable success, especially in the field of computer vision. This research describes a classification method applied to a dataset of multiple types of images (Synthetic Aperture Radar (SAR) images and non-SAR images). For this classification, transfer learning was used, followed by fine-tuning, with architectures pre-trained on the well-known ImageNet database. The VGG16 model was used as a feature extractor, and a new classifier was trained on the extracted features. The input dataset consists of five classes: the SAR image class (houses) and the non-SAR image classes (Cats, Dogs, Horses, and Humans). The Conv

Publication Date: Sun Apr 06 2014
Journal Name: Journal Of Economics And Administrative Sciences
Modeling Absolute Deviations Method by using Numerical Methods to measure the dispersion of the proposal for error

This research reviews the least absolute deviations method, based on linear programming, for estimating the parameters of a simple linear regression model, and gives an overview of this model. The absolute deviations method was modeled using a proposed measure of dispersion, and a simple linear regression model was constructed on the basis of the proposed measure. The object of the work is to obtain estimates that are not affected by abnormal values, by using a numerical method with the fewest possible iterations.
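A compact sketch of least-absolute-deviations estimation on a toy dataset with one abnormal value; note that the paper formulates the problem as a linear program, whereas this illustration uses iteratively reweighted least squares as a stand-in:

```python
import numpy as np

def lad_fit(X, y, n_iter=100, eps=1e-8):
    """Least-absolute-deviations fit of y ≈ X @ beta via iteratively
    reweighted least squares (IRLS). Large residuals are downweighted
    each pass, which is what makes the fit robust to abnormal values."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # ordinary LS start
    for _ in range(n_iter):
        r = np.abs(y - X @ beta)
        w = 1.0 / np.maximum(r, eps)              # weight ~ 1/|residual|
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta

# line y = 1 + 2x with one gross outlier at x = 7
x = np.arange(10.0)
y = 1.0 + 2.0 * x
y[7] += 50.0                                      # abnormal value
X = np.column_stack([np.ones_like(x), x])
print("LAD :", lad_fit(X, y))
print("OLS :", np.linalg.lstsq(X, y, rcond=None)[0])
```

Running the two fits side by side shows the L1 estimate staying near the true coefficients (1, 2) while the ordinary least-squares estimate is dragged toward the outlier.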

 

Publication Date: Tue Feb 01 2022
Journal Name: Baghdad Science Journal
An Enhanced Approach of Image Steganographic Using Discrete Shearlet Transform and Secret Sharing

Recently, the internet has enabled users to transmit digital media in the easiest manner. Despite this convenience, it can expose the transferred media content to several threats concerning confidentiality, such as the need for media authentication and integrity verification. For these reasons, data hiding methods and cryptography are used to protect the contents of digital media. In this paper, an enhanced method of image steganography combined with visual cryptography is proposed. A secret logo (a binary image of size 128x128) is encrypted by applying (2-out-of-2) visual cryptography to it, generating two secret shares. During the embedding process, a cover red, green, and blue (RGB) image of size (512
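A stand-in sketch of the two building blocks, assuming XOR-based (2-out-of-2) sharing and plain LSB embedding in raw bytes; the paper embeds in the discrete shearlet domain of an RGB cover, which is not reproduced here:

```python
import secrets

def make_shares(logo: bytes):
    """(2-out-of-2) sharing: a random pad and its XOR with the logo.
    Either share alone is uniformly random; XOR of both recovers the logo."""
    share1 = secrets.token_bytes(len(logo))
    share2 = bytes(a ^ b for a, b in zip(logo, share1))
    return share1, share2

def embed_lsb(cover: bytearray, payload: bytes) -> bytearray:
    """Hide payload bits (MSB first) in the least significant bit of
    each cover byte; cover must hold 8 bytes per payload byte."""
    stego = bytearray(cover)
    for i, byte in enumerate(payload):
        for bit in range(8):
            j = i * 8 + bit
            stego[j] = (stego[j] & 0xFE) | ((byte >> (7 - bit)) & 1)
    return stego

def extract_lsb(stego: bytes, n_bytes: int) -> bytes:
    out = bytearray(n_bytes)
    for i in range(n_bytes):
        for bit in range(8):
            out[i] = (out[i] << 1) | (stego[i * 8 + bit] & 1)
    return bytes(out)

logo = b"\x0f\xf0\xaa"                  # tiny stand-in for the binary logo
s1, s2 = make_shares(logo)
cover = bytearray(secrets.token_bytes(len(s1) * 8))
stego = embed_lsb(cover, s1)
recovered = bytes(a ^ b for a, b in zip(extract_lsb(stego, len(s1)), s2))
print(recovered == logo)
```

The point of the construction is that the stego image carries only one share, which is useless without the second share held by the recipient.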

Publication Date: Fri Mar 01 2024
Journal Name: Baghdad Science Journal
A Comparison between Ericson's Formulae Results and Experimental Data Using New Formulae of Single Particle Level Density

The partial level density (PLD) of pre-equilibrium reactions described by Ericson's formula has been studied using different formulae for the single-particle level density. The parameter was taken from the equidistant spacing model (ESM) and the non-equidistant spacing model (non-ESM), and other formulae were derived from the relation between the single-particle level density and the level density parameter. The formulae used in the derivation are the Roher formula, the Egidy formula, the Yukawa formula, and the Thomas-Fermi formula. The partial level density results based on the Thomas-Fermi formula show good agreement with the experimental data.

Publication Date: Wed Feb 29 2012
Journal Name: Al-khwarizmi Engineering Journal
Color Image Denoising Using Stationary Wavelet Transform and Adaptive Wiener Filter

The denoising of a natural image corrupted by Gaussian noise is a classical problem in signal and image processing. Much work has been done in the field of wavelet thresholding, but most of it has focused on statistical modeling of wavelet coefficients and the optimal choice of thresholds. This paper describes a new method for suppressing noise in images by fusing the stationary wavelet denoising technique with an adaptive Wiener filter. The Wiener filter is applied to the approximation coefficients of the reconstructed image only, while the thresholding technique is applied to the detail coefficients of the transform; the final denoised image is obtained by combining the two results. The proposed method was applied by using
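A one-dimensional numpy sketch of the stationary-wavelet thresholding step; the paper works on 2-D images and additionally Wiener-filters the approximation band, and the Haar filters and threshold value here are illustrative choices:

```python
import numpy as np

def soft_threshold(c, t):
    """Soft thresholding applied to the detail coefficients."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

def haar_swt_denoise_1d(x, t):
    """One level of an undecimated (stationary) Haar transform:
    threshold the details, keep the approximation, then invert.
    With t = 0 this reconstructs x exactly (perfect reconstruction)."""
    xr = np.roll(x, -1)
    approx = (x + xr) / 2.0          # redundant lowpass band
    detail = (x - xr) / 2.0          # redundant highpass band
    detail = soft_threshold(detail, t)
    # inverse of the redundant Haar pair above
    y = approx + detail
    y_shift = np.roll(approx - detail, 1)
    return (y + y_shift) / 2.0

rng = np.random.default_rng(1)
clean = np.sin(np.linspace(0, 2 * np.pi, 256))
noisy = clean + 0.3 * rng.normal(size=256)
denoised = haar_swt_denoise_1d(noisy, t=0.3)
print(np.mean((noisy - clean) ** 2), np.mean((denoised - clean) ** 2))
```

Because the stationary transform is redundant (no downsampling), the denoised result is shift-invariant, which is the main reason it is preferred over the decimated wavelet transform for this task.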

Publication Date: Thu Oct 31 2024
Journal Name: Iraqi Geological Journal
Artificial Neural Network Application to Permeability Prediction from Nuclear Magnetic Resonance Log

Reservoir permeability plays a crucial role in characterizing reservoirs and predicting the present and future production of hydrocarbon reservoirs. Well logging is a good tool for assessing a continuous permeability curve over the entire oil well section. Nuclear magnetic resonance logging measurements are minimally influenced by lithology and offer significant benefits in interpreting permeability. The Schlumberger-Doll-Research model, which utilizes nuclear magnetic resonance logging, estimates permeability values accurately. The approach of this investigation is to apply artificial neural networks and core data to predict permeability in wells without a nuclear magnetic resonance log. The Schlumberger-Doll-Research permeability is used

Publication Date: Wed Sep 30 2015
Journal Name: Iraqi Journal Of Chemical And Petroleum Engineering
Correlation of Penetration Rate with Drilling Parameters For an Iraqi Field Using Mud Logging Data

This paper presents an attempt to model the rate of penetration (ROP) for an Iraqi oil field with the aid of mud logging data. Data from the Umm Radhuma formation were selected for this modeling; they include weight on bit, rotary speed, flow rate, and mud density. A statistical approach was applied to these data to improve the rate of penetration modeling. As a result, an empirical linear ROP model was developed with good fitness when compared with actual data. A nonlinear regression analysis of different forms was also attempted, and the results showed that the power model has good predictive capability relative to the other forms.
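A sketch of the two regression forms mentioned, a linear model and a power model fitted as a line in log-log space; the WOB/ROP pairs below are invented stand-ins for the Umm Radhuma mud-logging data:

```python
import numpy as np

# hypothetical weight-on-bit (WOB) vs rate-of-penetration (ROP) pairs
wob = np.array([5.0, 8.0, 10.0, 12.0, 15.0, 18.0])
rop = np.array([12.0, 17.5, 21.0, 24.0, 29.0, 33.5])

# linear model: ROP = a + b * WOB, by ordinary least squares
A = np.column_stack([np.ones_like(wob), wob])
a, b = np.linalg.lstsq(A, rop, rcond=None)[0]

# power model: ROP = c * WOB**k, fitted as log(ROP) = log(c) + k*log(WOB)
logA = np.column_stack([np.ones_like(wob), np.log(wob)])
log_c, k = np.linalg.lstsq(logA, np.log(rop), rcond=None)[0]
c = np.exp(log_c)

print(f"linear: ROP = {a:.2f} + {b:.2f}*WOB")
print(f"power : ROP = {c:.2f} * WOB^{k:.2f}")
```

In practice each additional drilling parameter (rotary speed, flow rate, mud density) enters the design matrix as an extra column, and the two fitted forms are compared on their residuals against the actual mud-log data.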

Publication Date: Fri Jan 01 2016
Journal Name: Statistics And Its Interface
Search for risk haplotype segments with GWAS data by use of finite mixture models

Region-based association analysis has been proposed to capture the collective behavior of sets of variants by testing the association of each set, instead of individual variants, with the disease. Such an analysis typically involves a list of unphased multiple-locus genotypes with potentially sparse frequencies in cases and controls. To tackle the problem of sparse distributions, a two-stage approach was proposed in the literature: in the first stage, haplotypes are computationally inferred from genotypes, followed by a haplotype coclassification; in the second stage, the association analysis is performed on the inferred haplotype groups. If a haplotype is unevenly distributed between the case and control samples, this haplotype is labeled
