Hazard Rate Estimation Using Varying Kernel Function for Censored Data Type I

In this research, several estimators closely related to the hazard function are introduced using a nonparametric method, namely the kernel method, for Type I censored data with varying bandwidth and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four types of boundary kernel are used, namely Rectangle, Epanechnikov, Biquadratic and Triquadratic, and the proposed function was employed with all kernel functions. Two different simulation techniques are also used in two experiments to compare these estimators. In most cases, the results show that the local bandwidth is the best for all types of boundary kernel function, and suggest that the 2xRectangle and 2xEpanechnikov methods give the best results compared with the other estimators.
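As a rough, generic illustration of this family of estimators (not the paper's specific proposals), the sketch below computes a kernel hazard estimate for Type I right-censored data using an Epanechnikov kernel and one global bandwidth; the function names, the bandwidth value and the toy exponential data are assumptions added here. A local-bandwidth variant would pass a per-observation array of bandwidths instead of the scalar h.

```python
import numpy as np

def epanechnikov(u):
    """Epanechnikov kernel, zero outside [-1, 1]."""
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u**2), 0.0)

def kernel_hazard(t_grid, times, events, h):
    """Kernel estimate of the hazard rate for right-censored data.

    times  : observed times (event or censoring), shape (n,)
    events : 1 if the observation is an event, 0 if censored (Type I)
    h      : global bandwidth (a local bandwidth would be an array h_i)
    """
    order = np.argsort(times)
    t_sorted = times[order]
    d_sorted = events[order]
    n = len(times)
    at_risk = n - np.arange(n)            # number still at risk at each ordered time
    est = np.zeros_like(t_grid, dtype=float)
    for ti, di, ri in zip(t_sorted, d_sorted, at_risk):
        if di:                            # only uncensored times contribute
            est += epanechnikov((t_grid - ti) / h) / (h * ri)
    return est

# toy usage: exponential lifetimes with Type I censoring at time c = 2.0
rng = np.random.default_rng(0)
life = rng.exponential(1.0, size=200)
c = 2.0
times = np.minimum(life, c)
events = (life <= c).astype(int)
grid = np.linspace(0.05, 1.8, 50)
print(kernel_hazard(grid, times, events, h=0.3)[:5])   # true hazard is constant = 1
```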

Publication Date
Thu Dec 01 2022
Journal Name
Microbiology And Biotechnology Letters
Publication Date
Tue Jan 01 2019
Journal Name
Journal Of Global Pharma Technology
Serum levels of cross-linked N-telopeptide of type I collagen before and after non-surgical periodontal therapy in type 2 diabetic patients with chronic periodontitis

Background: Diabetes mellitus is a major risk factor for chronic periodontitis (CP), and hyperglycemia plays an important role in increasing the severity of periodontitis. It has been reported that the progression of CP shifts the balance between bone formation and resorption toward osteoclastic resorption, and this leads to the release of collagenous bone breakdown products into the local tissues and the systemic circulation. Cross-linked N-telopeptide of type I collagen (NTx) is the amino-terminal peptide of type I collagen released during the process of bone resorption. This study was conducted to determine the effects of non-surgical periodontal therapy on the serum level of NTx in type 2 diabetic patients …

Publication Date
Mon Aug 26 2019
Journal Name
Iraqi Journal Of Science
Finding Best Clustering For Big Networks with Minimum Objective Function by Using Probabilistic Tabu Search

Fuzzy C-means (FCM) is a clustering method that groups similar data elements according to specific measures. Tabu Search is a heuristic algorithm. In this paper, a Probabilistic Tabu Search for FCM is implemented to find a global clustering based on the minimum value of the fuzzy objective function. Experiments were designed for different networks and numbers of clusters; the results show the best performance based on a comparison between the objective-function values obtained using standard FCM and Tabu-FCM, averaged over ten runs.
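The paper's Probabilistic Tabu Search itself is not reproduced here; as a minimal sketch under that caveat, the code below only evaluates the fuzzy objective function J_m and uses a simple probabilistic restart as a stand-in for the search step. All names and the toy data are illustrative.

```python
import numpy as np

def fcm_objective(X, centers, m=2.0):
    """Fuzzy C-means objective J_m = sum_i sum_k u_ik^m * ||x_i - c_k||^2,
    with memberships u_ik computed from the given centers."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1) + 1e-12
    u = 1.0 / (d2 ** (1.0 / (m - 1.0)))
    u /= u.sum(axis=1, keepdims=True)
    return float((u ** m * d2).sum()), u

def probabilistic_restart_fcm(X, k, trials=20, seed=0):
    """Stand-in for a probabilistic Tabu-style search: keep the center set
    with the smallest fuzzy objective among random candidates."""
    rng = np.random.default_rng(seed)
    best_j, best_c = np.inf, None
    for _ in range(trials):
        centers = X[rng.choice(len(X), size=k, replace=False)]
        j, _ = fcm_objective(X, centers)
        if j < best_j:
            best_j, best_c = j, centers
    return best_j, best_c

# toy data: two well-separated 2-D groups
X = np.vstack([np.random.default_rng(1).normal(mu, 0.3, size=(50, 2)) for mu in (0, 3)])
print(probabilistic_restart_fcm(X, k=2)[0])
```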

Publication Date
Thu Dec 01 2011
Journal Name
Journal Of Economics And Administrative Sciences
Determine the optimal policy for the function of Pareto distribution reliability estimated using dynamic programming

The purpose is to use a development technique that requires a mathematical procedure of high quality and sufficiency for solving complex problems, namely dynamic programming with its recursive methods (forward and backward), by finding a series of associated decisions for the reliability function of the Pareto distribution, estimated using two approaches, maximum likelihood and the method of moments, in order to conclude the optimal policy.
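The dynamic-programming recursion is specific to the paper and is not reconstructed here; the sketch below only illustrates the two Pareto reliability estimators that would feed such a policy, i.e. maximum-likelihood and method-of-moments estimates of the shape parameter plugged into the survival function R(t) = (theta/t)^alpha. Function names and the simulated sample are assumptions.

```python
import numpy as np

def pareto_mle(x):
    """Maximum-likelihood estimates for a Pareto (Type I) sample:
    scale theta_hat = min(x), shape alpha_hat = n / sum(log(x/theta_hat))."""
    theta = x.min()
    alpha = len(x) / np.log(x / theta).sum()
    return alpha, theta

def pareto_moment(x, theta):
    """Method-of-moments shape estimate with the scale taken as given:
    from E[X] = alpha*theta/(alpha-1)  =>  alpha = mean / (mean - theta)."""
    m = x.mean()
    return m / (m - theta)

def reliability(t, alpha, theta):
    """Pareto reliability (survival) function R(t) = (theta/t)**alpha, t >= theta."""
    return (theta / t) ** alpha

rng = np.random.default_rng(0)
x = 2.0 * (1 + rng.pareto(3.0, size=500))     # Pareto sample with alpha = 3, theta = 2
a_mle, th = pareto_mle(x)
a_mom = pareto_moment(x, th)
t = 5.0
print(reliability(t, a_mle, th), reliability(t, a_mom, th))  # true value (2/5)**3 = 0.064
```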

Publication Date
Sun Oct 22 2023
Journal Name
Iraqi Journal Of Science
Characteristics of Electrical Power Generation by Wind for Al-Tweitha Location Using Weibull Distribution Function

In this paper, the 5-minute measured wind speed data for the year 2012 at a height of 10 meters for Tweitha have been statistically analyzed to assess the time of wind turbine electrical power generation. After collecting the Tweitha wind data and calculating the mean wind speed, the cumulative Weibull diagram and probability density function were plotted; then the cumulative Weibull distribution and the cut-in and furling turbine wind speeds could be used as mathematical input parameters to estimate the hours of electrical power generation of a wind turbine during one day or one year. At the Tweitha site, the average wind speed was found to be (v = 1.76 m/s), so five different wind turbines were selected to calculate the hours of electrical generation for A …
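As a hedged illustration of the calculation described, the sketch below evaluates the cumulative Weibull distribution and uses the cut-in and furling speeds to estimate yearly generation hours as 8760 * (F(v_furling) - F(v_cut-in)); the shape/scale values and turbine speeds are placeholders, not the fitted Tweitha parameters.

```python
import numpy as np

def weibull_cdf(v, k, c):
    """Cumulative Weibull distribution F(v) = 1 - exp(-(v/c)**k)."""
    return 1.0 - np.exp(-(v / c) ** k)

def generation_hours(k, c, v_cut_in, v_furling, hours=8760.0):
    """Expected hours per year with wind between cut-in and furling speed."""
    return hours * (weibull_cdf(v_furling, k, c) - weibull_cdf(v_cut_in, k, c))

# illustrative parameters only: k, c and the turbine speeds are assumptions,
# not the fitted Tweitha values (the paper reports a mean speed of 1.76 m/s)
k, c = 1.6, 2.0          # Weibull shape and scale (m/s)
print(generation_hours(k, c, v_cut_in=2.5, v_furling=25.0))
```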

Publication Date
Thu May 18 2023
Journal Name
Journal Of Engineering
Experimental Study for Materials Prosthetic above Knee Socket under Tensile or Fatigue Stress with Varying Temperatures Effect

The residual limb within the prosthesis is often subjected to tensile or fatigue stress under varying temperatures. The fatigue stress and temperature differences faced by the amputee during daily activities produce an environment for the growth of fungi and bacteria, in addition to the damage that occurs in the prosthesis, which shortens the life of the prosthetic limb and causes discomfort for the amputee.

In this paper, the mechanical and thermal properties of composite prosthetic socket materials made of different laminations of perlon/fiberglass/perlon are calculated using a tensile test device under varying temperatures (from 20 °C to 60 °C); also in this paper, a device for measuring rotational bending …
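As a small, hypothetical example of the kind of calculation a tensile test yields (not the paper's actual measurements), the sketch below converts force-elongation readings for one laminated specimen into engineering stress and strain and reports the ultimate strength and an elastic modulus; the specimen dimensions and readings are invented for illustration.

```python
import numpy as np

def tensile_properties(force_N, elongation_mm, area_mm2, gauge_mm):
    """Engineering stress-strain from raw tensile-test readings, returning the
    ultimate strength (MPa) and an elastic modulus (MPa) from the initial slope."""
    stress = np.asarray(force_N) / area_mm2          # MPa (N/mm^2)
    strain = np.asarray(elongation_mm) / gauge_mm    # dimensionless
    ultimate = stress.max()
    n_elastic = max(2, len(stress) // 5)             # early, roughly linear portion
    modulus = np.polyfit(strain[:n_elastic], stress[:n_elastic], 1)[0]
    return ultimate, modulus

# hypothetical readings for one perlon/fiberglass/perlon specimen at one temperature
force = [0, 120, 240, 360, 430, 450]          # N
elong = [0.0, 0.1, 0.2, 0.3, 0.45, 0.6]       # mm
print(tensile_properties(force, elong, area_mm2=40.0, gauge_mm=50.0))
```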

Publication Date
Fri Feb 01 2019
Journal Name
Journal Of Economics And Administrative Sciences
Comparison of estimations methods of the entropy function to the random coefficients for two models: the general regression and swamy of the panel data

In this study, we focused on the random coefficient estimation of the general regression and Swamy models of panel data. Using this type of data gives a better chance of obtaining a better method and better indicators. Entropy methods have been used to estimate the random coefficients of the general regression and Swamy models of panel data, presented in two forms: the first is the maximum dual entropy and the second is the general maximum entropy; a comparison between them was carried out using simulation to choose the optimal method.

The results were compared using mean squared error and mean absolute percentage error for different cases in terms of correlation values …
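For the comparison criteria mentioned (mean squared error and mean absolute percentage error), a minimal sketch is given below; the "GME" and "dual" estimates and their noise levels are illustrative placeholders purely to show how the two criteria are computed, not the study's results.

```python
import numpy as np

def mse(true, est):
    """Mean squared error of an estimator over simulation replications."""
    true, est = np.asarray(true), np.asarray(est)
    return float(np.mean((est - true) ** 2))

def mape(true, est):
    """Mean absolute percentage error (in %)."""
    true, est = np.asarray(true), np.asarray(est)
    return float(np.mean(np.abs((est - true) / true)) * 100.0)

# hypothetical example: a true random coefficient vs. estimates from two entropy methods
beta_true = np.full(100, 1.5)
beta_gme  = beta_true + np.random.default_rng(0).normal(0, 0.10, 100)
beta_dual = beta_true + np.random.default_rng(1).normal(0, 0.15, 100)
print("GME :", mse(beta_true, beta_gme),  mape(beta_true, beta_gme))
print("Dual:", mse(beta_true, beta_dual), mape(beta_true, beta_dual))
```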

Publication Date
Wed Jan 01 2020
Journal Name
Research Journal Of Pharmacy And Technology
Publication Date
Sat May 09 2015
Journal Name
International Journal Of Innovations In Scientific Engineering
Using Artificial Neural Network Technique for the Estimation of Cd Concentration in Contaminated Soils

The aim of this paper is to design an artificial neural network as an alternative, accurate tool to estimate the concentration of cadmium in contaminated soils for any depth and time. First, fifty soil samples were harvested from a phytoremediated contaminated site located in Qanat Aljaeesh in Baghdad city, Iraq. Second, a series of measurements were performed on the soil samples. The inputs are the soil depth, the time, and the soil parameters, while the output is the concentration of Cu in the soil for depth x and time t. Third, an ANN was designed, its performance was evaluated using a test data set, and it was then applied to estimate the concentration of cadmium. The performance of the ANN technique was compared with the traditional laboratory inspecting …
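As a hedged sketch of the general approach (design an ANN, evaluate it on a test set, then predict), the code below trains a small multilayer perceptron with scikit-learn on stand-in data; the network size, feature set and synthetic samples are assumptions, not the paper's fifty field measurements.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# hypothetical stand-in data: 50 samples with inputs (depth, time, soil parameters)
rng = np.random.default_rng(0)
X = rng.uniform(size=(50, 4))                                  # e.g. depth, time, pH, organic matter
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.05, 50)    # synthetic concentration target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
ann.fit(X_tr, y_tr)
print("test R^2:", ann.score(X_te, y_te))
print("prediction for one (depth, time, ...) input:", ann.predict(X_te[:1]))
```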

Publication Date
Fri Mar 31 2017
Journal Name
Al-khwarizmi Engineering Journal
Big-data Management using Map Reduce on Cloud: Case study, EEG Images' Data

A database is characterized as an arrangement of data that is organized and distributed in a way that allows the client to access the stored data simply and conveniently. However, in the era of big data, traditional methods of data analytics may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. This approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear enhancement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r …
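A minimal Hadoop Streaming style sketch of the MapReduce idea is shown below, assuming hypothetical CSV records of the form record_id,channel,amplitude; it is not the paper's pipeline, only a small stand-in that averages an amplitude value per EEG channel.

```python
# mapper.py -- Hadoop Streaming mapper: emit (channel, amplitude) pairs
# from CSV lines of the hypothetical form "record_id,channel,amplitude"
import sys

for line in sys.stdin:
    parts = line.strip().split(",")
    if len(parts) == 3:
        _, channel, amplitude = parts
        print(f"{channel}\t{amplitude}")
```

The matching reducer, which receives the mapper output sorted by key:

```python
# reducer.py -- Hadoop Streaming reducer: average amplitude per channel
# (input arrives sorted by key, one "channel<TAB>amplitude" pair per line)
import sys

current, total, count = None, 0.0, 0
for line in sys.stdin:
    channel, amplitude = line.strip().split("\t")
    if channel != current:
        if current is not None:
            print(f"{current}\t{total / count}")
        current, total, count = channel, 0.0, 0
    total += float(amplitude)
    count += 1
if current is not None:
    print(f"{current}\t{total / count}")
```

Both scripts would typically be passed to the Hadoop Streaming jar through its -mapper and -reducer options.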
