In this research, several estimators of the hazard function are introduced. They are based on a nonparametric method, namely kernel estimation for censored data with varying bandwidth and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Four boundary kernels are used, namely Rectangle, Epanechnikov, Biquadratic and Triquadratic, and the proposed function is employed with all of them. Two different simulation techniques are used in two experiments to compare these estimators. In most cases, the results show that the local bandwidth is the best choice for all boundary kernel functions and suggest that the 2xRectangle and 2xEpanechnikov methods give the best results compared to the other estimators.
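To make the idea concrete, the Python sketch below smooths the increments of the Nelson-Aalen estimator with an Epanechnikov kernel and a single global bandwidth; the varying-bandwidth and boundary-corrected estimators compared in the study are not reproduced here, and the censored sample, bandwidth value and function names are invented for illustration.

```python
import numpy as np

def epanechnikov(u):
    """Epanechnikov kernel, supported on [-1, 1]."""
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u**2), 0.0)

def kernel_hazard(t_grid, times, events, bandwidth):
    """Kernel-smoothed hazard for right-censored data.

    Smooths the increments of the Nelson-Aalen estimator:
    h(t) ~ sum_i K_b(t - t_i) * d_i / n_i over event times t_i,
    where d_i is the event indicator and n_i the number at risk.
    """
    order = np.argsort(times)
    times, events = np.asarray(times)[order], np.asarray(events)[order]
    n = len(times)
    at_risk = n - np.arange(n)            # number still at risk just before each time
    h = np.zeros_like(t_grid, dtype=float)
    for t_i, d_i, n_i in zip(times, events, at_risk):
        if d_i:                           # only observed failures contribute
            h += epanechnikov((t_grid - t_i) / bandwidth) * d_i / (n_i * bandwidth)
    return h

# toy right-censored sample: indicator 1 = observed failure, 0 = censored
rng = np.random.default_rng(0)
lifetimes = rng.exponential(5.0, size=200)
censor = rng.exponential(8.0, size=200)
obs = np.minimum(lifetimes, censor)
delta = (lifetimes <= censor).astype(int)

grid = np.linspace(0.5, 10, 50)
print(kernel_hazard(grid, obs, delta, bandwidth=1.0)[:5])
```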
Background: Diabetes mellitus is a major risk factor for chronic periodontitis (CP), and hyperglycemia plays an important role in increasing the severity of periodontitis. It has been reported that the progression of CP shifts the balance between bone formation and resorption toward osteoclastic resorption, leading to the release of collagenous bone breakdown products into the local tissues and the systemic circulation. Cross-linked N-telopeptide of type I collagen (NTx) is the amino-terminal peptide of type I collagen released during bone resorption. This study was conducted to determine the effects of nonsurgical periodontal therapy on the serum level of NTx in type 2 diabetic patients.
Fuzzy C-means (FCM) is a clustering method that groups similar data elements according to specific distance measures. Tabu search is a heuristic algorithm. In this paper, a Probabilistic Tabu Search for FCM is implemented to find a global clustering based on the minimum value of the fuzzy objective function. The experiments are designed for different networks and numbers of clusters; the results show the best performance based on a comparison of the objective function values obtained with standard FCM and Tabu-FCM, averaged over ten runs.
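The following minimal Python sketch shows the fuzzy objective function J_m that both standard FCM and Tabu-FCM minimize, together with a simplified random-perturbation search over cluster centers that keeps a short tabu list; it is only a stand-in for the probabilistic tabu search described above, and all data, step sizes and function names are assumptions made for the example.

```python
import numpy as np

def fcm_objective(X, centers, m=2.0):
    """Fuzzy c-means objective J_m = sum_i sum_k u_ik^m * ||x_i - c_k||^2,
    with memberships u_ik computed from the current centers."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1) + 1e-12
    u = 1.0 / (d2 ** (1.0 / (m - 1)))
    u /= u.sum(axis=1, keepdims=True)          # membership rows sum to 1
    return float((u ** m * d2).sum()), u

def tabu_like_search(X, c=3, iters=200, step=0.5, tabu_len=20, seed=0):
    """Tiny perturbation search over centers with a short tabu list of
    recently visited (rounded) solutions; a simplified stand-in only."""
    rng = np.random.default_rng(seed)
    best = X[rng.choice(len(X), c, replace=False)].copy()
    best_J = fcm_objective(X, best)[0]
    tabu = []
    for _ in range(iters):
        cand = best + rng.normal(0, step, best.shape)
        key = tuple(np.round(cand, 1).ravel())
        if key in tabu:                        # skip recently visited solutions
            continue
        tabu = (tabu + [key])[-tabu_len:]
        J, _ = fcm_objective(X, cand)
        if J < best_J:
            best, best_J = cand, J
    return best, best_J

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(mu, 0.3, (50, 2)) for mu in (0, 3, 6)])
centers, J = tabu_like_search(X)
print("objective:", round(J, 3))
```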
The purpose of this work is to use a technique that relies on a mathematical procedure well suited to solving complex problems, namely dynamic programming with recursive methods (forward and backward), to find a series of associated decisions for the reliability function of the Pareto distribution, estimated by two approaches, maximum likelihood and the method of moments, in order to reach an optimal policy.
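As a hedged illustration of the two estimation approaches mentioned above, the Python sketch below computes the Pareto reliability function R(t) = (x_m/t)^alpha from maximum-likelihood and method-of-moments estimates; the dynamic-programming decision stages themselves are not shown, and the simulated sample and parameter values are invented.

```python
import numpy as np

def pareto_mle(x):
    """MLE for Pareto(x_m, alpha): x_m = min(x), alpha = n / sum(log(x / x_m))."""
    xm = x.min()
    alpha = len(x) / np.log(x / xm).sum()
    return xm, alpha

def pareto_moments(x):
    """Method of moments using E[X] = alpha*x_m/(alpha - 1), valid for alpha > 1."""
    xm = x.min()
    alpha = x.mean() / (x.mean() - xm)
    return xm, alpha

def reliability(t, xm, alpha):
    """Pareto reliability (survival) function R(t) = (x_m / t)^alpha for t >= x_m."""
    return np.where(t >= xm, (xm / t) ** alpha, 1.0)

rng = np.random.default_rng(2)
sample = 2.0 * (1 + rng.pareto(3.0, size=500))    # classical Pareto, x_m = 2, alpha = 3
t = np.array([3.0, 5.0, 10.0])
for name, est in (("MLE", pareto_mle), ("moments", pareto_moments)):
    xm, a = est(sample)
    print(name, np.round(reliability(t, xm, a), 4))
```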
In this paper, the 5-minute wind speed data measured during 2012 at a height of 10 meters at the Tweitha site have been statistically analyzed to assess the time of wind turbine electrical power generation. After collecting the Tweitha wind data and calculating the mean wind speed, the cumulative Weibull distribution and probability density function were plotted; the cumulative Weibull distribution together with the cut-in and furling turbine wind speeds can then be used as mathematical input parameters to estimate the hours of electrical power generation for a wind turbine during one day or one year. At the Tweitha site, the average wind speed was found to be v = 1.76 m/s, so five different wind turbines were selected to calculate the hours of electrical generation for A
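A minimal sketch of the estimation step, assuming a fitted Weibull shape k and scale c: the expected hours of generation follow from the Weibull CDF evaluated between the cut-in and furling speeds. The turbine speeds and Weibull parameters below are illustrative values only, chosen to be roughly consistent with the reported 1.76 m/s mean speed, not the study's actual inputs.

```python
import numpy as np

def weibull_cdf(v, k, c):
    """Weibull cumulative distribution F(v) = 1 - exp(-(v/c)^k)."""
    return 1.0 - np.exp(-(v / c) ** k)

def generation_hours(k, c, v_cutin, v_furling, hours=8760):
    """Expected hours per year with wind between cut-in and furling speed:
    hours * [F(v_furling) - F(v_cutin)]."""
    return hours * (weibull_cdf(v_furling, k, c) - weibull_cdf(v_cutin, k, c))

# illustrative assumptions: shape k, scale c, and a turbine with 2.5 m/s cut-in
k, c = 1.8, 2.0        # low-wind site, mean speed close to 1.76 m/s
print(round(generation_hours(k, c, v_cutin=2.5, v_furling=25.0), 1), "h/year")
```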
The residual limb within the prosthesis is often subjected to tensile or fatigue stress under varying temperatures. The fatigue stress and temperature differences faced by the amputee during daily activities produce an environment for the growth of fungi and bacteria, in addition to the damage that occurs in the prosthesis, which shortens the life of the prosthetic limb and causes discomfort for the amputee.
In this paper, the mechanical and thermal properties of composite prosthetic socket materials, made of different laminations of perlon/fiberglass/perlon, are calculated using a tensile test device under varying temperatures (from 20°C to 60°C). Also presented is a device for measuring rotational bending
In this study, we focus on random coefficient estimation for the general regression and Swamy models of panel data; this type of data offers a better chance of obtaining better estimates and indicators. Entropy methods are used to estimate the random coefficients of the general regression and Swamy models of panel data in two forms: the first is the maximum dual entropy and the second is the general maximum entropy, and a comparison between them is made by simulation to choose the optimal method.
The results have been compared using the mean squared error and the mean absolute percentage error for different cases in terms of correlation values
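For reference, the two comparison criteria can be computed as below; the coefficient vectors are hypothetical numbers used only to illustrate how the dual-entropy and general-maximum-entropy estimates would be scored, not results from the study.

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean squared error."""
    return float(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2))

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100)

beta_true = np.array([1.2, -0.5, 0.8])
beta_dual = np.array([1.15, -0.48, 0.85])   # hypothetical maximum dual entropy estimates
beta_gme  = np.array([1.05, -0.60, 0.70])   # hypothetical general maximum entropy estimates
for name, b in (("dual entropy", beta_dual), ("GME", beta_gme)):
    print(name, "MSE =", round(mse(beta_true, b), 4), "MAPE =", round(mape(beta_true, b), 2))
```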
The aim of this paper is to design an artificial neural network (ANN) as an alternative, accurate tool to estimate the concentration of cadmium in contaminated soils at any depth and time. First, fifty soil samples were harvested from a phytoremediated contaminated site located in Qanat Aljaeesh in Baghdad city in Iraq. Second, a series of measurements were performed on the soil samples. The inputs are the soil depth, the time, and the soil parameters, while the output is the concentration of cadmium in the soil at depth x and time t. Third, an ANN was designed and its performance evaluated using a test data set; it was then applied to estimate the concentration of cadmium. The performance of the ANN technique was compared with traditional laboratory inspection
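A rough sketch of such an estimator, using scikit-learn's MLPRegressor on synthetic stand-in data; the input columns (depth, time, pH, organic matter), the network size and all numbers are assumptions for illustration and do not correspond to the Qanat Aljaeesh measurements.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# synthetic stand-in data: columns are depth (cm), time (weeks), pH, organic matter (%)
rng = np.random.default_rng(3)
X = rng.uniform([0, 0, 5.5, 0.5], [60, 52, 8.5, 5.0], size=(200, 4))
y = 3.0 - 0.02 * X[:, 0] - 0.03 * X[:, 1] + rng.normal(0, 0.1, 200)   # fake Cd concentration

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
model.fit(X_tr, y_tr)                                     # input scaling omitted for brevity
print("R^2 on test set:", round(model.score(X_te, y_te), 3))
print("Cd estimate at depth=30 cm, t=26 weeks:",
      round(model.predict([[30, 26, 7.0, 2.0]])[0], 3))
```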
A database is an organized collection of data, stored and distributed in a way that allows the user to access it simply and conveniently. However, in the era of big data, traditional data analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. This approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear enhancement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r
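A toy Python illustration of the Map-Reduce pattern applied to EEG-like records: mappers emit per-channel partial sums and a reducer merges them, mirroring what a Hadoop job would do at scale. The channel names and values are made up, and this in-process version is only a stand-in for the cloud deployment described above.

```python
from collections import defaultdict
from functools import reduce

# tiny stand-in for EEG records: (channel, sample_value) pairs split into chunks,
# mirroring how Hadoop would split the input across mappers
chunks = [
    [("Fp1", 12.1), ("Fp2", -3.4), ("Fp1", 10.8)],
    [("Fp2", -2.9), ("Fp1", 11.5), ("Cz", 0.7)],
]

def mapper(chunk):
    """Map phase: emit (channel, (sum, count)) partial aggregates for one chunk."""
    partial = defaultdict(lambda: (0.0, 0))
    for ch, v in chunk:
        s, n = partial[ch]
        partial[ch] = (s + v, n + 1)
    return dict(partial)

def reducer(a, b):
    """Reduce phase: merge two partial aggregates keyed by channel."""
    out = dict(a)
    for ch, (s, n) in b.items():
        s0, n0 = out.get(ch, (0.0, 0))
        out[ch] = (s0 + s, n0 + n)
    return out

merged = reduce(reducer, map(mapper, chunks))
means = {ch: s / n for ch, (s, n) in merged.items()}
print(means)   # per-channel mean amplitude
```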