Fuzzy Entropy in Adaptive Fuzzy Weighted Linear Regression Analysis with Application to Estimate Infant Mortality Rate

An adaptive fuzzy weighted linear regression model, in which the output is based on the position and entropy of quadruple fuzzy numbers, is dealt with. The solution of the adaptive model is established in terms of iterative fuzzy least squares by introducing a new, suitable metric that takes into account the types of influence of the different imprecisions. Furthermore, the applicability of the model is demonstrated by estimating the fuzzy infant mortality rate in Iraq using a selected set of inputs.
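
A minimal sketch of the iterative (re)weighted least-squares idea, assuming each fuzzy output is summarized by a position (defuzzified centre) and a fuzzy-entropy value and that a simple squared distance combining the two stands in for the paper's metric; the quadruple-fuzzy-number arithmetic itself is not reproduced.

```python
# Minimal sketch of an iteratively reweighted fuzzy least-squares fit.
# Assumptions (not from the paper): each fuzzy output is summarized by a
# "position" (defuzzified centre) and a fuzzy-entropy value, and the metric
# d^2 = (Δposition)^2 + λ(Δentropy)^2 stands in for the paper's metric.
import numpy as np

def fuzzy_iwls(X, pos, ent, lam=0.5, n_iter=20, eps=1e-8):
    """Fit pos ≈ X @ beta, reweighting by a position/entropy distance."""
    X = np.column_stack([np.ones(len(X)), X])      # add intercept
    w = np.ones(len(pos))                          # start with equal weights
    for _ in range(n_iter):
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ pos)   # weighted LS step
        fit = X @ beta
        # distance between observed and fitted fuzzy outputs; the fitted
        # entropy is taken as the mean observed entropy for illustration only
        d2 = (pos - fit) ** 2 + lam * (ent - ent.mean()) ** 2
        w = 1.0 / (d2 + eps)                       # down-weight distant points
    return beta

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
pos = 1.0 + X @ np.array([2.0, -1.0]) + rng.normal(scale=0.3, size=50)
ent = np.abs(rng.normal(scale=0.2, size=50))       # toy fuzzy-entropy values
print(fuzzy_iwls(X, pos, ent))
```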

Publication Date
Thu Dec 01 2022
Journal Name
Baghdad Science Journal
The Approximation of Weighted Hölder Functions by Fourier-Jacobi Polynomials to the Singular Sturm-Liouville Operator

In this work, a weighted Hölder function that approximates a Jacobi polynomial which solves the second-order singular Sturm-Liouville equation is discussed. This is generally equivalent to the Jacobi translations and the moduli of smoothness. This paper aims to focus on improving methods of approximation and finding the upper and lower estimates for the degree of approximation in weighted Hölder spaces by modifying the moduli of continuity and smoothness. Moreover, some properties of the moduli of smoothness, with direct and inverse results, are considered.
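
For orientation, textbook-style definitions of the Jacobi weight, the weighted modulus of smoothness and the weighted Hölder class; the paper's exact weight and modulus may differ in detail.

```latex
% Standard definitions (for orientation only; the paper's weight and
% modulus may differ in detail).
% Jacobi weight and weighted modulus of smoothness of order k:
\[
  w^{(\alpha,\beta)}(x) = (1-x)^{\alpha}(1+x)^{\beta}, \qquad \alpha,\beta > -1,
\]
\[
  \omega_k(f,\delta)_{w,p} = \sup_{0<h\le\delta}
      \bigl\| w^{(\alpha,\beta)}\,\Delta_h^{k} f \bigr\|_{p},
\]
% and the weighted H\"older (Lipschitz) class of order 0 < \gamma \le k:
\[
  H^{\gamma}_{w,p} = \bigl\{ f :\; \omega_k(f,\delta)_{w,p} = O(\delta^{\gamma}) \bigr\}.
\]
```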

Publication Date
Tue May 30 2023
Journal Name
Iraqi Journal Of Science
Entropy-Based Feature Selection using Extra Tree Classifier for IoT Security

The Internet of Things (IoT) is a network of devices used for interconnection and data transfer. There has been a dramatic increase in IoT attacks due to the lack of security mechanisms, and these mechanisms can be enhanced through the analysis and classification of the attacks. The multi-class classification of IoT botnet attacks (IBA) applied here uses a high-dimensional data set, which is a challenge in the classification process because it demands a large amount of computational resources. Dimensionality reduction (DR) discards irrelevant information while retaining the imperative bits of this high-dimensional data set. The DR technique proposed here is a classifier-based feature selection …
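
A short sketch of entropy-based feature selection with an Extra Trees classifier, assuming scikit-learn and a synthetic data set in place of the IoT botnet traffic.

```python
# Sketch of entropy-based feature selection with an Extra Trees classifier,
# assuming scikit-learn; the data here is synthetic, not the IoT botnet set.
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=60, n_informative=15,
                           n_classes=4, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# criterion="entropy" makes the split criterion information gain
selector = ExtraTreesClassifier(n_estimators=200, criterion="entropy",
                                random_state=0).fit(X_tr, y_tr)
reducer = SelectFromModel(selector, prefit=True, threshold="median")
X_tr_red, X_te_red = reducer.transform(X_tr), reducer.transform(X_te)

clf = ExtraTreesClassifier(n_estimators=200, random_state=0).fit(X_tr_red, y_tr)
print("features kept:", X_tr_red.shape[1], "accuracy:", clf.score(X_te_red, y_te))
```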

Publication Date
Wed Aug 01 2018
Journal Name
Journal Of Economics And Administrative Sciences
Proposed Measure for Effect Size in Mediation Analysis with Solution to Some Mediation Process Problems

In this paper, effect size measures are discussed, which are useful in many estimation processes for the direct effect and its relation with the indirect and total effects. In addition, an algorithm is suggested for calculating the proposed measure of effect size, which represents the ratio of the direct effect to the parameter estimated from the regression equation of the dependent variable on the mediator variable, without using the independent variable in the model. This algorithm clarifies the possibility of using this regression equation in mediation analysis, where the mediator and the independent variable are usually used together when the dependent variable is regressed on them. The algorithm also shows how the effect of the …
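
A small sketch of the ratio described above (the direct effect of the independent variable from the full model divided by the mediator's coefficient when the dependent variable is regressed on the mediator alone), with simulated data and illustrative variable names.

```python
# Sketch of the ratio described in the abstract: direct effect of X on Y
# (from Y ~ X + M) divided by the coefficient of M from Y ~ M alone.
# Variable names and the simulated data are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)                                   # independent variable
m = 0.6 * x + rng.normal(scale=0.8, size=n)              # mediator
y = 0.4 * x + 0.5 * m + rng.normal(scale=1.0, size=n)    # dependent variable

# Y ~ X + M gives the direct effect (coefficient of X)
direct = np.linalg.lstsq(np.column_stack([np.ones(n), x, m]), y, rcond=None)[0][1]
# Y ~ M alone gives the denominator of the suggested measure
b_m_only = np.linalg.lstsq(np.column_stack([np.ones(n), m]), y, rcond=None)[0][1]

effect_size_ratio = direct / b_m_only
print(f"direct effect {direct:.3f}, Y~M slope {b_m_only:.3f}, "
      f"ratio {effect_size_ratio:.3f}")
```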

Publication Date
Sun Aug 30 2020
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of the performance of some r-(k,d) class estimators with the (PCTP) estimator used in estimating the general linear regression model in the presence of autocorrelation and multicollinearity problems at the same time

In the analysis of multiple linear regression, the problems of multicollinearity and autocorrelation have drawn the attention of many researchers, and since these two problems can appear together and have a bad effect on estimation, some researchers have developed new methods to address them at the same time. In this research, the performance of the Principal Components Two Parameter (PCTP) estimator, the (r-k) class estimator and the r-(k,d) class estimator is compared by conducting a simulation study, using the mean square error (MSE) criterion to find the best way to address the two problems together. The results showed that the r-(k,d) class estimator is the best estimator …
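
A hedged sketch of the kind of Monte Carlo comparison under the MSE criterion described above, using plain OLS and an ordinary ridge (k) estimator as stand-ins; the paper's PCTP, (r-k) and r-(k,d) estimators are not reproduced here.

```python
# Hedged sketch of an MSE-criterion Monte Carlo comparison under joint
# multicollinearity and autocorrelation, with OLS and ridge as stand-ins;
# the paper's PCTP, (r-k) and r-(k,d) estimators are NOT reproduced.
import numpy as np

rng = np.random.default_rng(2)
n, p, rho_x, rho_e, k = 100, 4, 0.95, 0.7, 0.5
beta = np.ones(p)
reps, mse = 1000, {"ols": 0.0, "ridge": 0.0}

for _ in range(reps):
    # collinear regressors: common factor plus small independent noise
    f = rng.normal(size=(n, 1))
    X = rho_x * f + (1 - rho_x) * rng.normal(size=(n, p))
    # AR(1) errors to mimic autocorrelation
    e = np.zeros(n)
    for t in range(1, n):
        e[t] = rho_e * e[t - 1] + rng.normal(scale=0.5)
    y = X @ beta + e
    b_ols = np.linalg.solve(X.T @ X, X.T @ y)
    b_ridge = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)
    mse["ols"] += np.sum((b_ols - beta) ** 2) / reps
    mse["ridge"] += np.sum((b_ridge - beta) ** 2) / reps

print(mse)   # smaller value = better under the MSE criterion
```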

Publication Date
Tue Oct 01 2019
Journal Name
International Journal Of Drug Delivery Technology
Simple and Rapid Method For Estimate of Propranolol With Bi (III) Via Long-Distance Chasing Photometer (NAG-ADF-300-2) Utilization Continuous Flow Injection Analysis

A simple, sensitive and rapid method was used for the estimation of propranolol with Bi(III) to prove the efficiency, reliability and repeatability of the long-distance chasing photometer (NAG-ADF-300-2) using continuous flow injection analysis. The method is based on a reaction between propranolol and Bi(III) in an aqueous medium to obtain a yellow precipitate. Optimum parameters were studied to increase the sensitivity of the developed method. The linear range of the calibration graph was 0.1-25 mmol/L for cell A and 1-40 mmol/L for cell B, with LOD of 51.8698 ng/200 µL and 363.0886 ng/200 µL for cell A and cell B, respectively, correlation coefficients (r) of 0.9975 for cell A and 0.9966 for cell B, and RSD% lower than 1% (n = 8) for the …
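
A small sketch of the calibration step: a linear fit of response against concentration with the correlation coefficient and an LOD estimate. The concentrations, responses, blank standard deviation and the 3.3·s_blank/slope convention are illustrative assumptions, not the paper's data.

```python
# Sketch of a flow-injection calibration: linear fit of response vs
# concentration and an LOD estimate (3.3·s_blank/slope is one common
# convention; the paper's own calculation may differ).
import numpy as np

conc = np.array([0.1, 0.5, 1, 2.5, 5, 10, 15, 20, 25])   # mmol/L (illustrative)
resp = np.array([0.8, 3.9, 7.7, 19.1, 38.6, 77.0, 114.9, 153.2, 191.0])  # mV (illustrative)

slope, intercept = np.polyfit(conc, resp, 1)
r = np.corrcoef(conc, resp)[0, 1]                 # correlation coefficient
s_blank = 0.4                                     # std. dev. of blank signal (assumed)
lod = 3.3 * s_blank / slope

print(f"slope={slope:.3f}, intercept={intercept:.3f}, r={r:.4f}, LOD≈{lod:.3f} mmol/L")
```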

Publication Date
Thu Oct 01 2015
Journal Name
Journal Of Economics And Administrative Sciences
Estimation Multivariate data points in spatial statistics with application

This paper deals with how to estimate non-measured spatial points when the number of terms in the spatial sample is small, which is not preferred for the estimation process, because it is known that the larger the data set, the better the estimation of the non-measured points and the smaller the estimation variance. The idea of this paper is therefore to take advantage of secondary (auxiliary) data that have a strong correlation with the primary (basic) data in order to estimate single non-measured points, as well as to measure the estimation variance. The co-kriging technique has been used in this field to build spatial predictions for the estimation process, and this idea is then applied to real data in the …
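
A compact simple co-kriging sketch in one dimension, assuming an intrinsic coregionalization model (a shared exponential correlation with cross-correlation rho); it illustrates how dense auxiliary data can support the estimation of a sparsely sampled primary variable, and is not the paper's implementation.

```python
# Very small simple co-kriging sketch under an assumed intrinsic
# coregionalization model (shared exponential correlation, cross-correlation
# rho); an illustration of the idea, not the paper's implementation.
import numpy as np

def corr(h, a=1.0):
    return np.exp(-h / a)                 # exponential correlation model

def simple_cokriging(x0, x1, z1, x2, z2, m1, m2, s1=1.0, s2=1.0, rho=0.8, a=1.0):
    """Estimate the primary variable Z1 at x0 from primary (x1, z1) and
    secondary (x2, z2) data; all coordinates are 1-D for simplicity."""
    x = np.concatenate([x1, x2])
    n1 = len(x1)
    # block covariance matrix of all data (primary block, cross blocks, secondary block)
    H = np.abs(x[:, None] - x[None, :])
    C = corr(H, a)
    C[:n1, :n1] *= s1 * s1
    C[:n1, n1:] *= rho * s1 * s2
    C[n1:, :n1] *= rho * s1 * s2
    C[n1:, n1:] *= s2 * s2
    # covariances between the data and Z1(x0)
    c0 = corr(np.abs(x - x0), a)
    c0[:n1] *= s1 * s1
    c0[n1:] *= rho * s1 * s2
    w = np.linalg.solve(C, c0)            # co-kriging weights
    resid = np.concatenate([z1 - m1, z2 - m2])
    estimate = m1 + w @ resid
    variance = s1 * s1 - w @ c0           # simple co-kriging variance
    return estimate, variance

x1 = np.array([0.0, 1.0, 3.0]); z1 = np.array([2.1, 2.6, 1.9])   # sparse primary data
x2 = np.linspace(0, 4, 9);      z2 = 1.5 + 0.5 * np.sin(x2)      # dense auxiliary data
print(simple_cokriging(2.0, x1, z1, x2, z2, m1=2.2, m2=1.5))
```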

Publication Date
Mon May 28 2018
Journal Name
Iraqi Journal Of Science
Alternative Approach to Extract the Bulk Etching Rate of PADC Nuclear Detector

The paper aims to propose the maximum track length (Lmax) measurement as an alternative approach to evaluate and extract the bulk etch rate (Vb) of the nuclear detector PADC CR-39, and to compare it with the results obtained by the removed-layer thickness measurement of the etched detector. The alternative Lmax method mainly relies on measuring the length of the etched tracks and their maximum values and saturation times from the obtained track-profile images. The detectors were irradiated with alpha particles of different energies emitted from a 241Am source and then etched in a 6.5 N NaOH solution at 70±1 °C for different successive time intervals. In …
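
For comparison, a tiny sketch of the conventional removed-layer route to Vb that the Lmax method is measured against, assuming the detector is etched on both faces so the thickness reduction grows as roughly 2·Vb·t; the data are invented and the Lmax-based extraction itself is not reproduced.

```python
# Hedged sketch of the conventional removed-layer route to the bulk etch
# rate Vb: if the detector is etched on both faces, the thickness reduction
# after time t is ~2·Vb·t, so Vb is half the slope of a linear fit.
# The measurements below are invented, for illustration only.
import numpy as np

t = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])                   # etching time, h
thickness_loss = np.array([2.7, 5.2, 7.9, 10.6, 13.1, 15.9])   # µm (illustrative)

slope, _ = np.polyfit(t, thickness_loss, 1)
vb = slope / 2.0
print(f"bulk etch rate Vb ≈ {vb:.2f} µm/h")
```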

Publication Date
Wed Oct 28 2020
Journal Name
Iraqi Journal Of Science
Community Detection under Stochastic Block Model Likelihood Optimization via Tabu Search – Fuzzy C-Mean Method for Social Network Data

The structure of a network, which is known as community detection in networks, has received great attention in diverse topics, including the social sciences, biological studies, politics, etc. A large number of studies and practical approaches have been designed to solve the problem of finding the structure of a network. Defining a complex network model based on clustering is an NP-hard (non-deterministic polynomial-time hard) problem, and there are no ideal techniques to define the clustering. Here, we present a statistical approach based on the likelihood function of a Stochastic Block Model (SBM). The objective is to define the general model and select the best model with high quality. Therefore, …
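
A sketch of the Bernoulli SBM log-likelihood for a fixed community assignment, i.e. the objective that such a Tabu Search / Fuzzy C-Means scheme would maximize; the optimization itself is not reproduced here.

```python
# Sketch of the Bernoulli SBM log-likelihood for a given community
# assignment; the Tabu-Search / Fuzzy-C-Means optimisation over
# assignments is not reproduced.
import numpy as np

def sbm_log_likelihood(A, z, K):
    """A: symmetric 0/1 adjacency (no self loops); z: block labels 0..K-1."""
    ll = 0.0
    for r in range(K):
        for s in range(r, K):
            rows, cols = np.where(z == r)[0], np.where(z == s)[0]
            block = A[np.ix_(rows, cols)]
            if r == s:
                m = block.sum() / 2.0                     # undirected edges inside block
                n = len(rows) * (len(rows) - 1) / 2.0     # possible pairs inside block
            else:
                m = block.sum()
                n = len(rows) * len(cols)
            if n == 0:
                continue
            p = m / n                                     # MLE of the block probability
            if 0.0 < p < 1.0:                             # 0*log(0) terms vanish
                ll += m * np.log(p) + (n - m) * np.log(1.0 - p)
    return ll

# toy graph with two clear communities
A = np.zeros((6, 6), dtype=int)
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1
print(sbm_log_likelihood(A, np.array([0, 0, 0, 1, 1, 1]), K=2))
```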

Publication Date
Wed Jan 01 2020
Journal Name
Journal Of Southwest Jiaotong University
An Improved Diffie-Hellman Protocol Security Using Video Entropy

Diffie-Hellman is a key exchange protocol that provides a way to transfer shared secret keys between two parties, even though those parties might never have communicated before. This paper suggests a new way to transfer keys through public or non-secure channels, depending on video files sent over the channel from which the keys are then extracted. The proposed key generation method depends on the video file content, using the entropy values of the video frames. The proposed system addresses the weaknesses of the Diffie-Hellman key exchange algorithm, namely the man-in-the-middle attack and the discrete logarithm attack (DLA). When the method used high-definition videos with a vast amount of data, the generated keys reached a large number, up to 5…
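
A hedged sketch of the idea: derive the Diffie-Hellman private exponents from the entropy of shared video frames. The frames here are synthetic arrays and the key-derivation details are assumptions, not the paper's exact construction.

```python
# Hedged sketch: derive Diffie-Hellman private exponents from the entropy
# of (shared) video frames. Frames are synthetic arrays; the paper's exact
# key-derivation steps are not reproduced.
import hashlib
import numpy as np

def frame_entropy(frame):
    """Shannon entropy of an 8-bit frame's grey-level histogram."""
    hist = np.bincount(frame.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def private_key_from_video(frames, q):
    ent = ",".join(f"{frame_entropy(f):.6f}" for f in frames)
    digest = hashlib.sha256(ent.encode()).digest()
    return int.from_bytes(digest, "big") % q        # private exponent

# toy public parameters (a real deployment would use a much larger prime)
p, g = 0xFFFFFFFFFFFFFFC5, 5

rng = np.random.default_rng(3)
frames = [rng.integers(0, 256, size=(120, 160), dtype=np.uint8) for _ in range(10)]

a = private_key_from_video(frames[:5], p - 1)        # one party's exponent
b = private_key_from_video(frames[5:], p - 1)        # the other party's exponent
A, B = pow(g, a, p), pow(g, b, p)                    # exchanged public values
assert pow(B, a, p) == pow(A, b, p)                  # shared secret agrees
print("shared secret:", pow(B, a, p))
```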

Publication Date
Sat Oct 01 2011
Journal Name
Journal Of Engineering
IMPROVED IMAGE COMPRESSION BASED WAVELET TRANSFORM AND THRESHOLD ENTROPY

In this paper, a method is proposed to increase the compression ratio for colour images by dividing the image into non-overlapping blocks and applying a different compression ratio to each block depending on the importance of the information it contains. In regions that contain important information, the compression ratio is reduced to prevent loss of information, while in smooth regions that contain no important information, a high compression ratio is used. The proposed method shows better results when compared with the classical methods (wavelet and DCT).
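
A hedged sketch of the block-wise scheme described above, using per-block entropy as the importance measure and stronger wavelet-coefficient thresholding in smooth blocks (PyWavelets assumed); the paper's exact importance criterion and coder are not reproduced.

```python
# Hedged sketch: per-block wavelet thresholding whose strength depends on
# block importance (entropy used here as the importance measure).
# Requires PyWavelets; the image is a random stand-in.
import numpy as np
import pywt

def block_entropy(block):
    hist = np.bincount(block.astype(np.uint8).ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def compress_block(block, aggressive):
    coeffs = pywt.wavedec2(block.astype(float), "haar", level=2)
    approx, details = coeffs[0], coeffs[1:]
    scale = 0.5 if aggressive else 0.05            # illustrative threshold factors
    new_details = []
    for (cH, cV, cD) in details:                   # threshold only detail coefficients
        thr = scale * max(np.abs(cH).max(), np.abs(cV).max(), np.abs(cD).max())
        new_details.append(tuple(pywt.threshold(c, thr, mode="hard")
                                 for c in (cH, cV, cD)))
    return pywt.waverec2([approx] + new_details, "haar")

rng = np.random.default_rng(4)
img = rng.integers(0, 256, size=(128, 128)).astype(np.uint8)   # stand-in image
out = np.zeros_like(img, dtype=float)
B = 16
for i in range(0, img.shape[0], B):
    for j in range(0, img.shape[1], B):
        blk = img[i:i + B, j:j + B]
        smooth = block_entropy(blk) < 6.0          # low entropy => less important block
        out[i:i + B, j:j + B] = compress_block(blk, aggressive=smooth)
print("mean reconstruction error:", np.abs(out - img).mean())
```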
