In this research, we study the inverse Gompertz (IG) distribution and estimate its survival function. The survival function was estimated using three methods (the maximum likelihood, least squares, and percentile estimators), and the best estimation method was selected: the least squares method proved to be the best estimator of the survival function, since it has the lowest integrated mean squared error (IMSE) for all sample sizes.
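
As a hedged illustration of the quantity being estimated, the sketch below computes the IG survival function and a least-squares estimate of its parameters. It assumes the common parameterization S(x) = 1 − exp(−(a/b)(e^{b/x} − 1)), which may differ from the one used in the paper, and the plotting positions i/(n+1) are one conventional choice, not necessarily the authors'.

```python
import numpy as np
from scipy.optimize import least_squares

# Survival function of the inverse Gompertz distribution, assuming the
# common parameterization S(x) = 1 - exp(-(a/b) * (exp(b/x) - 1)).
def ig_survival(x, a, b):
    return 1.0 - np.exp(-(a / b) * (np.exp(b / x) - 1.0))

# Least-squares estimation: match the model CDF to the empirical
# plotting positions i/(n+1) at the ordered sample values.
def ig_lse(sample):
    x = np.sort(sample)
    n = len(x)
    p = np.arange(1, n + 1) / (n + 1)   # empirical CDF positions
    resid = lambda t: (1.0 - ig_survival(x, *t)) - p
    fit = least_squares(resid, x0=[1.0, 1.0], bounds=(1e-6, np.inf))
    return fit.x                         # estimated (a, b)
```
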
Estimation of the parameters of linear regression is usually based on the ordinary least squares method, which rests on several basic assumptions, so the accuracy of the parameter estimates depends on the validity of these assumptions. Among them are homogeneity of variance and normality of the errors, and they are not achievable when the problem under study involves complex data from more than one model, in which case the use of the model becomes unrealistic. The most successful technique for this purpose proved to be the robust MM-estimator (minimizing maximum likelihood estimator), which has demonstrated its efficiency. To …
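
MM-estimation itself is not available in the common Python libraries, so as a sketch of the robust-regression idea the example below uses Huber M-estimation from scikit-learn on contaminated data instead; it illustrates the same motivation (resistance to violated error assumptions), not the authors' MM procedure.

```python
import numpy as np
from sklearn.linear_model import HuberRegressor, LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))
y = 2.0 + 1.5 * X.ravel() + rng.normal(0, 1, 200)
y[:20] += 25                          # contaminate 10% of the responses

ols = LinearRegression().fit(X, y)
huber = HuberRegressor().fit(X, y)    # down-weights the outliers

print("OLS slope:  ", ols.coef_[0])   # pulled toward the outliers
print("Huber slope:", huber.coef_[0]) # close to the true 1.5
```
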
Use of least squares and restricted least squares in estimating the first-order autoregressive parameter AR(1) (a simulation study)
Excessive skewness, which sometimes occurs in the data, is an obstacle to assuming a normal distribution. Recent studies have therefore been active in studying the skew-normal distribution (SND), which fits skewed data; the normal distribution is a special case of the SND, whose additional skewness parameter (α) gives it more flexibility. When estimating the parameters of the SND we face non-linear likelihood equations, and solving them by the maximum likelihood (ML) method yields inaccurate and unreliable solutions. To solve this problem, two methods can be used: the genetic algorithm (GA) and the iterative reweighting algorithm (IR) based on the M…
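
A minimal sketch of the ML step the abstract refers to, using scipy's skewnorm; scipy solves the likelihood equations numerically with a general-purpose optimizer, so, as the abstract notes, the solution can be sensitive to starting values. The parameter values here are arbitrary.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
data = stats.skewnorm.rvs(a=4.0, loc=0.0, scale=2.0, size=500, random_state=rng)

# Numerical ML fit of the three SND parameters (shape a, location, scale);
# the optimizer works on the non-linear likelihood equations directly.
a_hat, loc_hat, scale_hat = stats.skewnorm.fit(data)
print(a_hat, loc_hat, scale_hat)
```
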
CD-nanosponges were prepared by crosslinking β-CD with diphenyl carbonate (DPC) using an ultrasound-assisted technique. 5-FU was incorporated into the nanosponges by freeze drying, and a phase solubility study and determinations of complexation efficiency (CE) and entrapment efficiency were performed. Particle morphology was studied using SEM and AFM, and the in-vitro release of 5-FU from the prepared nanosponges was carried out in 0.1 N HCl.
The 5-FU nanosponge particles were in the nano size range: the optimum formula showed a particle size of (405.46 ± 30) nm, a polydispersity index (PDI) of (0.328 ± 0.002), and a negative zeta potential (−18.75 ± 1.8). The drug entrapment efficiency varied with the CD:DPC molar ratio, from 15.6% to 30%. The SEM an…
This paper presents a hybrid software copy-protection scheme applied to prevent illegal copying of software by producing a license key that is unique and easy to generate. The work employs the unique hard-disk identifier of a personal computer, which can be obtained by software, to create a license key after it is processed with the SHA-1 one-way hash function. Two main measures are used to evaluate the proposed method: complexity and processing time. SHA-1 ensures complexity high enough to prevent hackers from producing unauthorized copies, and many experiments were executed using different sizes of software to measure the time consumed. The measures show high complexity and short execution time for the proposed method.
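
A minimal sketch of the key-generation idea, assuming the hard-disk identifier has already been read (obtaining it is platform-specific, e.g. via WMI on Windows); the serial number shown is hypothetical, and this is an illustration of the scheme's principle rather than the authors' implementation.

```python
import hashlib

def license_key(hard_disk_id: str) -> str:
    # SHA-1 is a one-way function: the 160-bit digest serves as a
    # machine-bound license key that is easy to generate but hard to forge.
    return hashlib.sha1(hard_disk_id.encode("utf-8")).hexdigest()

print(license_key("WD-WCC6Y4ZK9XXX"))  # hypothetical disk serial number
```
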
In this paper, we first reformulated the finite element model (FEM) into a neural network structure using a simple two-dimensional problem. The structure of this neural network is described, followed by its application to solving the forward and inverse problems. The model is then extended to the general case, and the advantages and disadvantages of this approach are described along with an analysis of the sensitivity of …
Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed for analyzing this type of data.
This research focuses on grouping and analyzing such data: cluster analysis plays an important role in identifying and grouping co-expressed profiles over time, which are then employed in a nonparametric smoothing cubic B-spline model. The cubic B-spline provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope; it is also more flexible and can pick up more complex patterns and fluctuations in the data.
The balanced longitudinal data profiles were compiled into subgroup…
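
A small sketch of the smoothing step under stated assumptions: scipy's UnivariateSpline with k=3 fits a cubic smoothing spline to one synthetic longitudinal profile, giving the continuous first and second derivatives the abstract mentions. The smoothing factor s is an arbitrary choice here.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# One subject's longitudinal profile: noisy measurements at 20 time points.
t = np.linspace(0, 10, 20)
rng = np.random.default_rng(2)
y = np.sin(t) + 0.3 * t + rng.normal(0, 0.2, t.size)

# k=3 gives a cubic spline, so the fitted curve has continuous first and
# second derivatives; the smoothing factor s trades fidelity for smoothness.
spline = UnivariateSpline(t, y, k=3, s=0.5)

t_fine = np.linspace(0, 10, 200)
smooth = spline(t_fine)                # smoothed profile, ready for clustering
slope = spline.derivative(1)(t_fine)   # continuous first derivative
```
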
The progress of science in all its branches and levels has produced great civilizational changes in our present-day societies. As a result of the huge amount of knowledge, the growing number of students, and the community's increasing awareness of the importance of education in schools and universities, it has become necessary for us as educators to look at science from another point of view, one based on the idea of scientific development of curricula, teaching methods, means of education, and the classroom environment as a whole. The use of the computer and the internet in education has led to the emergence of the term educational technology, which relies on the use of modern technology to provide educational content to…
This research deals with estimating the reliability function of the two-parameter exponential distribution using different estimation methods: maximum likelihood, median/first-order statistics, ridge regression, modified Thompson-type shrinkage, and single-stage shrinkage. Comparisons among the estimators were made by Monte Carlo simulation based on the mean squared error (MSE) as the statistical indicator, and they conclude that the shrinkage methods perform better than the other methods.
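
As a hedged sketch of the kind of Monte Carlo comparison described, the example below computes the MSE of the maximum likelihood estimator of the reliability R(t) = exp(−(t − θ)/λ) for the two-parameter exponential distribution; the parameter values and mission time are arbitrary, and the paper's shrinkage estimators are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)
theta, lam, t0 = 1.0, 2.0, 3.0           # location, scale, mission time
true_R = np.exp(-(t0 - theta) / lam)     # R(t) = exp(-(t - theta)/lambda), t > theta

def mle_reliability(x, t):
    # Standard MLEs: theta_hat = min(x), lambda_hat = mean(x) - min(x).
    theta_hat = x.min()
    lam_hat = x.mean() - theta_hat
    return np.exp(-(t - theta_hat) / lam_hat)

# Small Monte Carlo experiment estimating the MSE of the ML estimator of R(t).
reps, n = 5000, 30
est = np.array([
    mle_reliability(theta + rng.exponential(lam, n), t0) for _ in range(reps)
])
print("MSE of ML estimator of R(t):", np.mean((est - true_R) ** 2))
```
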