In this research, several estimators of the hazard function are introduced, based on one of the nonparametric methods, namely the kernel method, for censored data with varying bandwidth and boundary kernels. Two types of bandwidth are used: local bandwidth and global bandwidth. Moreover, four boundary kernels are used, namely Rectangle, Epanechnikov, Biquadratic, and Triquadratic, and the proposed function is employed with all of them. Two different simulation techniques are also used in two experiments to compare these estimators. In most cases, the results show that the local bandwidth is best for all types of boundary kernel function, and suggest that the 2xRectangle and 2xEpanechnikov methods give the best results compared with the other estimators.
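As a concrete illustration (not the paper's own code), a kernel hazard estimator for right-censored data can be sketched in Python. The Epanechnikov kernel stands in for the four boundary kernels compared, and the global bandwidth b, the function names, and the simulated data are all assumptions for this sketch; the study additionally varies b locally and corrects the kernel near the boundary.

    import numpy as np

    def epanechnikov(u):
        # One of the four kernels compared in the study.
        return np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u**2), 0.0)

    def kernel_hazard(t_grid, times, events, b):
        # Smoothed Nelson-Aalen hazard estimate for right-censored data
        # with a single (global) bandwidth b.
        order = np.argsort(times)
        times, events = times[order], events[order]
        n = len(times)
        at_risk = n - np.arange(n)                 # risk-set size at each ordered time
        increments = events / at_risk              # Nelson-Aalen jump sizes
        u = (t_grid[:, None] - times[None, :]) / b
        return (epanechnikov(u) * increments).sum(axis=1) / b

    # Usage on simulated censored exponential lifetimes:
    rng = np.random.default_rng(1)
    life, cens = rng.exponential(1.0, 200), rng.exponential(1.5, 200)
    times = np.minimum(life, cens)
    events = (life <= cens).astype(float)          # 1 = observed failure, 0 = censored
    grid = np.linspace(0.1, 2.0, 50)
    hazard = kernel_hazard(grid, times, events, b=0.3)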
Machine learning offers a significant advantage for many difficulties in the oil and gas industry, especially when it comes to resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. Clarifications of the workflow methodology are presented alongside comprehensive models in this study. The purpose of this study is to provide a more robust technique for predicting permeability; previous studies of the Bazirgan field have attempted to do so, but their estimates have been vague, and the methods they give are obsolete and make no concessions to real or rigid conditions in solving the permeability computation. To …
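As an illustrative sketch only (not the study's model, and with synthetic stand-ins rather than the Bazirgan-field well logs; the feature names are assumptions), a tree-ensemble regression shows the general shape of log-based permeability prediction:

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import r2_score

    # Synthetic stand-ins for well-log features, e.g. gamma ray,
    # neutron porosity, bulk density, resistivity (hypothetical).
    rng = np.random.default_rng(42)
    n = 500
    logs = rng.normal(size=(n, 4))
    perm = np.exp(0.8 * logs[:, 1] - 0.5 * logs[:, 2]
                  + rng.normal(scale=0.3, size=n))   # permeability, mD

    # Model log-permeability, as is common for its skewed distribution.
    X_tr, X_te, y_tr, y_te = train_test_split(logs, np.log(perm), random_state=0)
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X_tr, y_tr)
    print("held-out R^2 on log-permeability:", r2_score(y_te, model.predict(X_te)))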
In this study, we focus on the random coefficient estimation of the general regression and Swamy models for panel data. Using this type of data gives a better chance of obtaining a better method and better indicators. Entropy methods have been used to estimate the random coefficients for the general regression and Swamy panel-data models in two ways: the first represents maximum dual entropy and the second is general maximum entropy; a comparison between them has been carried out by simulation to choose the optimal method.
The results have been compared using mean squared error and mean absolute percentage error for different cases in terms of correlation value…
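As a hedged sketch of the general maximum entropy idea (a simple cross-sectional regression rather than the paper's panel/Swamy setting; the supports, sizes, and variable names are all illustrative), each coefficient is reparameterized as a probability-weighted combination of support points and the entropy of the weights is maximized subject to the data constraints:

    import numpy as np
    from scipy.optimize import minimize

    # Illustrative GME sketch for y = X beta + e (not the paper's model).
    rng = np.random.default_rng(0)
    n, k, M = 30, 2, 3
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=n)

    z = np.array([-5.0, 0.0, 5.0])             # coefficient support points
    v = np.array([-3.0, 0.0, 3.0]) * y.std()   # error support (three-sigma rule)

    def unpack(theta):
        p = theta[:k * M].reshape(k, M)        # coefficient weights
        w = theta[k * M:].reshape(n, M)        # disturbance weights
        return p, w

    def neg_entropy(theta):
        t = np.clip(theta, 1e-10, 1.0)
        return np.sum(t * np.log(t))           # minimizing this maximizes entropy

    constraints = [
        {"type": "eq",                          # data constraint: y = X(Zp) + Vw
         "fun": lambda th: y - X @ (unpack(th)[0] @ z) - unpack(th)[1] @ v},
        {"type": "eq",                          # weights sum to one per coefficient
         "fun": lambda th: unpack(th)[0].sum(axis=1) - 1.0},
        {"type": "eq",                          # weights sum to one per observation
         "fun": lambda th: unpack(th)[1].sum(axis=1) - 1.0},
    ]
    theta0 = np.full((k + n) * M, 1.0 / M)
    res = minimize(neg_entropy, theta0, method="SLSQP",
                   bounds=[(0.0, 1.0)] * len(theta0), constraints=constraints)
    print("GME coefficient estimates:", unpack(res.x)[0] @ z)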
Background: Diabetic nephropathy is rapidly becoming the leading cause of end-stage renal disease (ESRD). The onset and course of DN can be ameliorated to a very significant degree if intervention is instituted at a point very early in the development of this complication.
Objective: The aim of this study was to characterize risk factors associated with nephropathy in type I diabetes and to construct a model for early prediction of diabetic nephropathy (DN) by analyzing these risk factors.
Methods: A case-control design of 400 patients with type I diabetes mellitus (IDDM), aged 19–45 years. The cases were 200 diabetic patients with overt proteinuria, while the controls were 200 diabetic patients with no proteinuria or microalbuminuria…
Type-1 diabetes is defined as destruction of the pancreatic beta cells; viruses and bacteria are among the environmental factors for this disease. The study included 25 patients with type-1 diabetes mellitus aged between 8 and 25 years from a Baghdad hospital and 20 healthy persons as a control group. Anti-rubella IgG and IgM and anti-Chlamydia pneumoniae IgG and IgM were measured by the ELISA technique, while anti-CMV antibodies were measured by the immunofluorescence technique. The aim of the current study was to identify the trigger factors for type-1 diabetes. There were significant differences (P<0.05) between the studied groups according to the parameters, and the results suggest that Chlamydia pneumoniae, CMV, and rubella virus may trigger type-1 diabetes mellitus in Iraqi patients…
Entropy, defined as a measure of uncertainty, is transferred here by means of the cumulative distribution function and the reliability function of the Burr Type-XII distribution, in order to build a probability model for data that suffer from volatility, fitted at every failure of a sample subject to the conditions of a probability function. A formula for the probability distribution resulting from this entropy transfer applied to the continuous Burr Type-XII distribution is derived; the new function is tested and shown to satisfy the conditions of a probability function, and its mean and cumulative distribution function are derived so that they can be used to generate data for the simulation.
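For reference, and assuming the standard parameterization (the paper's exact transfer formula is not reproduced here), the Burr Type-XII distribution with shape parameters $c, k > 0$ has density, CDF, and reliability function

\[
f(x) = c\,k\,x^{c-1}\,(1 + x^{c})^{-(k+1)}, \qquad
F(x) = 1 - (1 + x^{c})^{-k}, \qquad
R(x) = (1 + x^{c})^{-k}, \qquad x > 0,
\]

and the Shannon entropy together with its CDF- and reliability-based analogues (the cumulative entropy and the cumulative residual entropy, which are the usual ways an entropy measure is "transferred" to $F$ and $R$) are

\[
H(f) = -\int_{0}^{\infty} f(x)\ln f(x)\,dx, \qquad
\mathcal{CE} = -\int_{0}^{\infty} F(x)\ln F(x)\,dx, \qquad
\mathcal{CRE} = -\int_{0}^{\infty} R(x)\ln R(x)\,dx .
\]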
Background: Diabetes mellitus is a major risk factor for chronic periodontitis (CP), and hyperglycemia has an important role in enhancing the severity of periodontitis. It has been reported that the progression of CP shifts the balance between bone formation and resorption toward osteoclastic resorption, which leads to the release of collagenous bone breakdown products into the local tissues and the systemic circulation. Cross-linked N-telopeptide of type I collagen (NTx) is the amino-terminal peptide of type I collagen released during the process of bone resorption. This study was conducted to determine the effects of nonsurgical periodontal therapy on the serum level of NTx in type 2 diabetic patients…
... Show MoreThe goal (purpose) from using development technology that require mathematical procedure related with high Quality & sufficiency of solving complex problem called Dynamic Programming with in recursive method (forward & backward) through finding series of associated decisions for reliability function of Pareto distribution estimator by using two approach Maximum likelihood & moment .to conclude optimal policy
Kepler's equation is used to solve different problems associated with celestial mechanics and the dynamics of the orbit. It is an exact description of the motion of any two bodies in space under the effect of gravity. The equation represents a body in space in terms of polar coordinates; thus, it can also specify the time required for the body to complete its period along its orbit around another body. This paper is a review of previously published papers on solving Kepler's equation for the eccentric anomaly. It aims to collect and assess the iterative initial values for the eccentric anomaly proposed over the previous forty years. Those initial values are tested to select the finest one based on the number of iterations, as well as the …
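As a concrete example of the kind of iterative scheme being compared, here is a standard Newton-Raphson sketch for Kepler's equation M = E - e sin E (the function and parameter names are illustrative; the choice of the starter E0 is exactly what the surveyed papers vary, with E0 = M being a classic default):

    import numpy as np

    def solve_kepler(M, e, E0=None, tol=1e-12, max_iter=50):
        # Newton-Raphson iteration on f(E) = E - e*sin(E) - M.
        E = M if E0 is None else E0
        for i in range(max_iter):
            f = E - e * np.sin(E) - M
            E_new = E - f / (1.0 - e * np.cos(E))   # f'(E) = 1 - e*cos(E)
            if abs(E_new - E) < tol:
                return E_new, i + 1                 # iterations used: the comparison metric
            E = E_new
        return E, max_iter

    # Usage: mean anomaly 1.2 rad, eccentricity 0.6.
    E, iters = solve_kepler(1.2, 0.6)
    print(E, iters)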
A database is characterized as an arrangement of data organized and distributed in a way that allows the client to access the stored data simply and conveniently. However, in the era of big data, traditional data-analytics methods may not be able to manage and process such large amounts of data. In order to develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. This approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear enhancement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…
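The Map-Reduce pattern itself can be illustrated with a tiny in-process Python sketch (the record layout and channel names are hypothetical, and Hadoop would distribute the map, shuffle, and reduce phases across the cluster rather than run them in one process):

    from functools import reduce
    from itertools import groupby

    # Hypothetical EEG records as (channel, sample_value) pairs.
    records = [("C3", 0.12), ("C4", -0.05), ("C3", 0.33), ("C4", 0.20)]

    def mapper(record):
        channel, value = record
        return (channel, (value, 1))          # emit partial (sum, count) per key

    def reducer(a, b):
        return (a[0] + b[0], a[1] + b[1])     # merge partial (sum, count) pairs

    mapped = sorted(map(mapper, records))     # sorting stands in for Hadoop's shuffle
    means = {}
    for channel, group in groupby(mapped, key=lambda kv: kv[0]):
        total, count = reduce(reducer, (v for _, v in group))
        means[channel] = total / count
    print(means)                              # per-channel mean amplitude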