Comparison Robust M Estimate With Cubic Smoothing Splines For Time-Varying Coefficient Model For Balance Longitudinal Data

In this research, a comparison is made between robust M estimators for the cubic smoothing splines technique, used to avoid the problem of non-normality or contamination in the errors, and the traditional estimation method for cubic smoothing splines. Two comparison criteria, MADE and WASE, are used across different sample sizes and contamination levels to estimate the time-varying coefficient functions of balanced longitudinal data. Such data consist of observations obtained from (n) independent subjects, each measured repeatedly at a set of (m) specific time points, so that repeated measurements within a subject are correlated while measurements from different subjects are independent.
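As a hedged illustration of the robust alternative described above (not the paper's exact procedure), a cubic smoothing spline can be combined with Huber M-type weights through iterative reweighting. The smoothing target, the tuning constant `c = 1.345`, and the simulated sine curve with contaminated errors are all assumptions made for the sketch:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def robust_spline(x, y, c=1.345, iters=8):
    """Cubic smoothing spline with Huber M-type weights via iterative
    reweighting (an IRLS-style sketch, not the paper's implementation)."""
    n = x.size
    w = np.ones(n)
    scale = 1.4826 * np.median(np.abs(y - np.median(y)))  # crude initial MAD scale
    for _ in range(iters):
        # weights enter as 1/sigma, so s = n is the usual smoothing target
        spl = UnivariateSpline(x, y, w=w / scale, k=3, s=float(n))
        r = y - spl(x)
        scale = max(1.4826 * np.median(np.abs(r)), 1e-8)  # robust residual scale
        u = np.abs(r) / scale
        w = np.where(u <= c, 1.0, c / u)                  # Huber psi-weights
    return spl

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 100)
truth = np.sin(2 * np.pi * x)
y = truth + rng.normal(0.0, 0.1, x.size)
y[::10] += 3.0                                 # 10% contamination in the errors
fit = robust_spline(x, y)
made = float(np.mean(np.abs(fit(x) - truth)))  # MADE against the true curve
```

The contaminated points are progressively downweighted, so the fitted curve tracks the underlying function rather than the outliers.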

Publication Date: Thu Aug 01 2019
Journal Name: Journal Of Economics And Administrative Sciences
Some NONPARAMETRIC ESTIMATORS FOR RIGHT CENSORED SURVIVAL DATA

The use of parametric models and their associated estimation methods requires many primary conditions to be met for the model to represent the population under study adequately. This has prompted researchers to search for more flexible alternatives, namely nonparametric models. Many researchers are interested in the survival (permanence) function and its estimation methods, one of which is nonparametric.

For the purpose of statistical inference about the distribution of censored lifetime data, the experimental section of this thesis compares nonparametric estimators of the survival function, the existence…
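The canonical nonparametric estimator of the survival (permanence) function under right censoring is the Kaplan-Meier product-limit estimator; since the abstract does not name the specific estimators compared, the following is purely an illustrative sketch:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimate of the survival function S(t)
    for right-censored data (events[j] = 1 for death, 0 for censoring)."""
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    s, curve, j = 1.0, [], 0
    while j < len(pairs):
        t = pairs[j][0]
        deaths = removed = 0
        while j < len(pairs) and pairs[j][0] == t:   # group ties at time t
            deaths += pairs[j][1]
            removed += 1
            j += 1
        if deaths:
            s *= 1.0 - deaths / n_at_risk            # product-limit step
            curve.append((t, s))
        n_at_risk -= removed                         # censored subjects leave silently
    return curve

surv = kaplan_meier([1, 2, 3, 4], [1, 1, 0, 1])
# survival steps at t = 1, 2 and 4; the censored time 3 adds no step
```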

Publication Date: Sat Sep 08 2018
Journal Name: Proceedings Of The 2018 International Conference On Computing And Big Data
3D Parallel Coordinates for Multidimensional Data Cube Exploration

Visual analytics has become an important approach for discovering patterns in big data. As visualization struggles with the high dimensionality of data, issues like concept hierarchies on each dimension add more difficulty and make visualization a prohibitive task. The data cube offers multi-perspective aggregated views of large data sets and has important applications in business and many other areas. It has high dimensionality, concept hierarchies, a vast number of cells, and comes with special exploration operations such as roll-up, drill-down, slicing, and dicing. All these issues make data cubes very difficult to explore visually. Most existing approaches visualize a data cube in 2D space and require preprocessing steps. In this paper, we propose a visualization…
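As a sketch of the aggregation structure described above (the dimension names and fact table below are invented for illustration), a full data cube over a small fact table can be materialized by summing the measure over every subset of dimensions, with `'*'` standing for a rolled-up dimension:

```python
from itertools import product
from collections import defaultdict

def build_cube(rows, dims, measure):
    """Aggregate a fact table into all 2^len(dims) group-bys (the full cube).
    '*' in a key position marks a rolled-up (aggregated-away) dimension."""
    cube = defaultdict(float)
    for row in rows:
        for mask in product([True, False], repeat=len(dims)):
            key = tuple(row[d] if keep else '*' for d, keep in zip(dims, mask))
            cube[key] += row[measure]
    return dict(cube)

sales = [
    {"year": 2020, "region": "EU", "amount": 10.0},
    {"year": 2020, "region": "US", "amount": 20.0},
    {"year": 2021, "region": "EU", "amount": 5.0},
]
cube = build_cube(sales, ["year", "region"], "amount")
# cube[(2020, '*')] is the roll-up over region; cube[('*', '*')] the grand total
```

Drill-down is simply moving from a key with `'*'` back to the keys that retain that dimension; slicing fixes one dimension's value and reads off the matching keys.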

Publication Date: Wed Dec 01 2021
Journal Name: Journal Of Economics And Administrative Sciences
Comparison of Some Non-Parametric Quality Control Methods

Multivariate nonparametric control charts were used to monitor simulated data and determine whether they fall within the control limits, since nonparametric methods do not require any assumptions about the distribution of the data. This research aims to apply multivariate nonparametric quality-control methods, namely the multivariate Wilcoxon signed-rank chart, kernel principal component analysis (KPCA), and k-nearest neighbor…

Publication Date: Mon Aug 01 2022
Journal Name: Baghdad Science Journal
Perceptually Important Points-Based Data Aggregation Method for Wireless Sensor Networks

The transmission and reception of data consume the most resources in Wireless Sensor Networks (WSNs). The energy supplied by the battery is the most important resource impacting a WSN's lifespan at the sensor node; because sensor nodes run on limited batteries, energy saving is necessary. Data aggregation can be defined as a procedure for eliminating redundant transmissions; it provides fused information to the base stations, which in turn improves energy effectiveness and increases the lifespan of energy-constrained WSNs. In this paper, a Perceptually Important Points Based Data Aggregation (PIP-DA) method for wireless sensor networks is suggested to reduce redundant data before sending them to the…
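A common formulation of PIP selection (used here as an illustrative sketch; the paper's PIP-DA method itself is not shown in the listing) starts from the two endpoints of a series and repeatedly adds the sample farthest, in vertical distance, from the line joining its neighbouring PIPs:

```python
def pip_select(series, k):
    """Select k Perceptually Important Points by repeatedly adding the
    sample with maximum vertical distance to the line joining its
    neighbouring PIPs (one common choice of PIP distance)."""
    n = len(series)
    pips = [0, n - 1]
    while len(pips) < k:
        best_i, best_d = None, -1.0
        for a, b in zip(pips, pips[1:]):
            for i in range(a + 1, b):
                # line through (a, series[a]) and (b, series[b]), evaluated at i
                y_line = series[a] + (series[b] - series[a]) * (i - a) / (b - a)
                d = abs(series[i] - y_line)
                if d > best_d:
                    best_i, best_d = i, d
        if best_i is None:      # no interior points left to add
            break
        pips.append(best_i)
        pips.sort()
    return pips

# A flat series with one spike: the spike is the first interior PIP picked
sig = [0.0] * 5 + [10.0] + [0.0] * 5
print(pip_select(sig, 3))   # → [0, 5, 10]
```

For aggregation, a node would transmit only the samples at the selected indices, letting the sink reconstruct the rest by interpolation.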

Publication Date: Sun Jun 15 2025
Journal Name: Journal Of Engineering
Calculation Of Volumetric And Thermodynamic Properties For Pure Hydrocarbons And Their Mixtures Using Cubic Equations Of State
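The abstract is not included in the listing. As an illustration of the cubic-equation-of-state approach named in the title, the Peng-Robinson equation gives compressibility factors for a pure component as the real roots of a cubic in Z; the methane critical constants below are textbook values used only for the example:

```python
import numpy as np

R = 8.314  # universal gas constant, J/(mol·K)

def pr_z_factors(T, P, Tc, Pc, omega):
    """Compressibility factors Z from the Peng-Robinson cubic EOS for a
    pure component (real positive roots of the cubic in Z, ascending)."""
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1 + kappa * (1 - np.sqrt(T / Tc)))**2
    a = 0.45724 * R**2 * Tc**2 / Pc * alpha
    b = 0.07780 * R * Tc / Pc
    A = a * P / (R * T)**2
    B = b * P / (R * T)
    # Z^3 - (1-B) Z^2 + (A - 3B^2 - 2B) Z - (AB - B^2 - B^3) = 0
    coeffs = [1.0, -(1 - B), A - 3 * B**2 - 2 * B, -(A * B - B**2 - B**3)]
    roots = np.roots(coeffs)
    return sorted(r.real for r in roots if abs(r.imag) < 1e-10 and r.real > 0)

# Methane at 300 K and 1 bar (Tc = 190.6 K, Pc = 45.99e5 Pa, omega = 0.011):
# far above Tc at low pressure, so Z should be close to ideal (Z ≈ 1)
zs = pr_z_factors(300.0, 1e5, 190.6, 45.99e5, 0.011)
```

The smallest real root corresponds to the liquid phase and the largest to the vapour phase when the cubic has three real roots; volumetric properties then follow from V = ZRT/P.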
Publication Date: Thu Dec 31 2015
Journal Name: Al-khwarizmi Engineering Journal
Prediction of Performance Equations for Household Compressors Depending on Manufacturing Data for Refrigerators and Freezers

Abstract

A surface-fitting model is developed based on calorimeter data for two well-known brands of household compressors. Correlation equations in the form of ten-coefficient polynomials were found, as functions of the refrigerant saturation and evaporating temperatures in the range of (-35 ℃ to -10 ℃), using MATLAB software, for cooling capacity, power consumption, and refrigerant mass flow rate.

Additional correlation equations for these variables were developed as a quick selection guide for choosing a proper compressor at the ASHRAE standard condition, covering a swept volume range of (2.24-11.15) cm³.

The results indicated that these surface-fitting models are accurate to within ±15% for 72 compressor models of cooling capacity…
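The ten-coefficient polynomial mentioned above is a full cubic in two temperature variables. The sketch below fits such a surface by least squares on synthetic data; the second temperature axis, its range, and the "true" coefficients are invented purely for illustration:

```python
import numpy as np

def cubic_terms(t1, t2):
    """The 10 monomials of a full cubic in two temperatures
    (the usual ten-coefficient compressor-map polynomial form)."""
    return np.column_stack([np.ones_like(t1), t1, t2,
                            t1**2, t1 * t2, t2**2,
                            t1**3, t1**2 * t2, t1 * t2**2, t2**3])

rng = np.random.default_rng(1)
te = rng.uniform(-35.0, -10.0, 80)   # evaporating temperature, °C
tc = rng.uniform(35.0, 55.0, 80)     # second temperature axis, invented range
true_c = np.array([200, 8, -2, 0.1, -0.05, 0.02, 0.001, 0.0, 0.0, -0.0002])
q = cubic_terms(te, tc) @ true_c + rng.normal(0.0, 1.0, te.size)  # synthetic capacity, W

# least-squares fit of the 10 coefficients from the calorimeter-style data
coef, *_ = np.linalg.lstsq(cubic_terms(te, tc), q, rcond=None)
rms = float(np.sqrt(np.mean((q - cubic_terms(te, tc) @ coef) ** 2)))
```

Once fitted, the same `cubic_terms` row evaluated at any operating point gives the predicted capacity, power, or mass flow from the corresponding coefficient set.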

Publication Date: Wed Dec 01 2021
Journal Name: International Journal Bioautomation
Model for Prediction of the Weight and Height Measurements of Patients with Disabilities for Diagnosis and Therapy

Background: Accurate measurement of a patient's height and weight is an essential part of diagnosis and therapy, but there is some controversy as to how to calculate the height and weight of patients with disabilities. Objective: This study aims to use anthropometric measurements (arm span, leg length, chest circumference, and waist circumference) to find a model (alternatives) for calculating the height and body weight of patients with disabilities. Additionally, a model for the prediction of the weight and height measurements of patients with disabilities was established. Method: Four hundred patients aged 20-80 years were enrolled in this study and divided into two groups, 210 (52.5%) male and 190 (47.5%) female…

Publication Date: Wed Jun 30 2021
Journal Name: Journal Of Economics And Administrative Sciences
Comparison of Bennett's Inequality and Regression in Determining the Optimum Sample Size for Estimating the Net Reclassification Index (NRI) Using Simulation

Researchers have shown increased interest in recent years in determining the optimum sample size needed to obtain sufficient estimation accuracy and high-precision parameters, in order to evaluate a large number of diagnostic tests at the same time. In this research, two methods were used to determine the optimum sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. The nonlinear logistic regression model is then estimated at each method's sample size in high-dimensional data using an artificial neural network (ANN), as it gives a high-precision estimate commensurate with the data…
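A minimal sketch of the Bennett-inequality route to a sample size, assuming i.i.d. observations with a known variance bound `var` and range bound `b` (this is an illustration of the inequality, not the paper's exact derivation):

```python
import math

def bennett_sample_size(eps, delta, var, b):
    """Smallest n with P(|sample mean - mu| >= eps) <= delta under Bennett's
    inequality, for i.i.d. variables with variance var and |X - mu| <= b."""
    def h(u):
        return (1.0 + u) * math.log(1.0 + u) - u
    # one-sided tail: exp(-n * (var / b^2) * h(b * eps / var));
    # a union bound over both tails gives the factor 2 in log(2/delta)
    rate = (var / b**2) * h(b * eps / var)
    return math.ceil(math.log(2.0 / delta) / rate)

# e.g. var = 0.25, b = 1: n needed for ±0.1 accuracy with 95% confidence
n = bennett_sample_size(0.1, 0.05, 0.25, 1.0)
```

Tightening the tolerance `eps` or the risk `delta` increases the required n, while a smaller variance bound decreases it, which mirrors the trade-offs the abstract describes.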

Publication Date: Thu Dec 29 2016
Journal Name: Ibn Al-haitham Journal For Pure And Applied Sciences
Comparison Between Two Approaches (MLE & DLS) to Estimate the Compound Frechet Poisson Lindley Distribution by Using Simulation

In this paper, simulation plays a vital role in comparing two approaches, the maximum likelihood method and a developed least squares method, for estimating the parameters of the compound Frechet Poisson Lindley distribution, coded in MATLAB and evaluated under different sample sizes via mean squared error. The results show that the maximum likelihood estimation method is better than the developed least squares method for estimating the parameters of the proposed distribution.
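The simulation workflow above (estimate by each method on many replicated samples, then compare mean squared errors) can be sketched on a simpler stand-in distribution, since the compound Frechet Poisson Lindley likelihood is involved. Here an exponential rate parameter is used, with an empirical-CDF least-squares fit standing in, purely for illustration, for the developed least squares method:

```python
import numpy as np

rng = np.random.default_rng(7)
theta, n, reps = 2.0, 50, 2000   # true rate, sample size, Monte Carlo replicates

def mle(x):
    """Exponential rate MLE: 1 / sample mean."""
    return 1.0 / x.mean()

def cdf_lsq(x):
    """Least-squares slope of -log(1 - F_hat) against the sorted sample
    (a simple empirical-CDF least-squares estimator, for illustration)."""
    m = x.size
    xs = np.sort(x)
    F = (np.arange(1, m + 1) - 0.5) / m    # plotting positions
    y = -np.log1p(-F)                      # model: -log(1 - F) = theta * x
    return float(xs @ y) / float(xs @ xs)  # slope through the origin

est_mle = np.array([mle(rng.exponential(1 / theta, n)) for _ in range(reps)])
est_lsq = np.array([cdf_lsq(rng.exponential(1 / theta, n)) for _ in range(reps)])
mse_mle = float(np.mean((est_mle - theta) ** 2))
mse_lsq = float(np.mean((est_lsq - theta) ** 2))
```

Comparing `mse_mle` and `mse_lsq` across sample sizes reproduces the kind of MSE table the abstract refers to, with the distribution swapped for tractability.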

Publication Date: Wed Mar 11 2020
Journal Name: Journal Of Cardiovascular Disease Research
Resistive Index is an Early Indicator for Flow Deterioration in Comparison with Intima Media Thickness for Patients with Hypertension and Diabetes in Relation to Age

Background: Atherosclerosis is well known to be related to age and certain cardiovascular diseases. Aging is one cause of deterioration in arterial function, leading to loss of compliance and plaque accumulation; this effect increases in the presence of diseases such as hypertension and diabetes. Aim: To investigate the reduction of blood supply to the brain in patients with diabetes and hypertension with age, and the role of the resistive index in the diagnosis of reduced blood flow. Method: Patients with both diabetes and hypertension were classified according to their age to identify the progression of the disease and the factors influencing carotid artery blood flow. Using ultrasound and standard Doppler technique…
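The resistive index referred to above is computed from Doppler velocities as (peak systolic − end diastolic) / peak systolic; a trivial sketch with example velocities chosen for illustration:

```python
def resistive_index(psv, edv):
    """Doppler resistive index: (peak systolic - end diastolic) / peak systolic.
    Velocities may be in any common unit (e.g. cm/s) as long as both match."""
    if psv <= 0:
        raise ValueError("peak systolic velocity must be positive")
    return (psv - edv) / psv

# e.g. PSV = 80 cm/s, EDV = 20 cm/s → RI = 0.75
ri = resistive_index(80.0, 20.0)
```

Higher downstream resistance lowers diastolic flow relative to systolic flow, so the index rises; that monotone relation is what makes it usable as an early indicator of flow deterioration.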
