This paper is concerned with estimating the unknown parameters of the generalized Rayleigh distribution (GRD) model based on singly Type-I censored samples. The probability density function of the generalized Rayleigh distribution is defined together with its properties. The maximum likelihood method is used to derive point estimates of all unknown parameters through an iterative procedure, namely the Newton-Raphson method, and confidence interval estimates are then derived based on the Fisher information matrix. Finally, we test whether the current model (GRD) fits a set of real data, then compute the survival function and hazard function for these real data.
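As a concrete illustration of the estimation steps summarised above, the following minimal sketch (under an assumed two-parameter form of the generalized Rayleigh density, not necessarily the paper's exact parameterisation) maximises the singly Type-I censored log-likelihood with a Newton-Raphson iteration driven by finite-difference derivatives; the inverse of the observed information matrix then supplies approximate confidence intervals.

```python
# Minimal sketch: Newton-Raphson MLE for a generalized Rayleigh distribution under
# singly Type-I censoring.  Assumed parameterisation (alpha = shape, lam = scale):
#   f(x) = 2*alpha*lam^2*x*exp(-(lam*x)^2)*(1 - exp(-(lam*x)^2))^(alpha-1),
#   F(x) = (1 - exp(-(lam*x)^2))^alpha.
import numpy as np

def log_lik(theta, x_obs, T, n):
    """Censored log-likelihood: d observed failures x_obs <= T, n-d units censored at T."""
    alpha, lam = theta
    if alpha <= 0 or lam <= 0:
        return -np.inf
    z = 1.0 - np.exp(-(lam * x_obs) ** 2)
    d = x_obs.size
    ll = (d * np.log(2 * alpha * lam ** 2) + np.sum(np.log(x_obs))
          - np.sum((lam * x_obs) ** 2) + (alpha - 1) * np.sum(np.log(z)))
    surv_T = 1.0 - (1.0 - np.exp(-(lam * T) ** 2)) ** alpha   # survival at censoring time T
    return ll + (n - d) * np.log(surv_T)

def newton_raphson_mle(theta0, x_obs, T, n, h=1e-5, tol=1e-8, max_iter=100):
    f = lambda th: log_lik(th, x_obs, T, n)
    theta = np.asarray(theta0, dtype=float)
    I2 = np.eye(2)
    for _ in range(max_iter):
        # finite-difference gradient and Hessian of the log-likelihood
        grad = np.array([(f(theta + h * e) - f(theta - h * e)) / (2 * h) for e in I2])
        H = np.array([[(f(theta + h * ei + h * ej) - f(theta + h * ei - h * ej)
                        - f(theta - h * ei + h * ej) + f(theta - h * ei - h * ej)) / (4 * h * h)
                       for ej in I2] for ei in I2])
        step = np.linalg.solve(H, grad)
        theta = theta - step                      # Newton-Raphson update
        if np.max(np.abs(step)) < tol:
            break
    return theta, -np.linalg.inv(H)               # estimates and approximate covariance matrix
```

Approximate 95% confidence intervals then follow as each estimate plus or minus 1.96 times the square root of the corresponding diagonal entry of the returned covariance matrix.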
In this research, the parameters of the Type-1 Gumbel distribution for maximum values were estimated using two estimation methods: the method of moments (MoM) and the modified moments (MM) method. Simulation was used to compare the estimation methods and identify the best one for estimating the parameters, where random data following the Gumbel distribution were generated according to three models of real parameter values for different sample sizes, with R = 500 replications per sample. The results of the assessment were placed in tables prepared for the purpose of comparison, which was made based on the mean squared error (MSE).
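For reference, the method-of-moments estimators for the Type-1 Gumbel (maximum) distribution follow from E[X] = mu + gamma*beta and Var[X] = pi^2*beta^2/6, where gamma is the Euler-Mascheroni constant. The sketch below shows the MoM estimators and a small MSE simulation of the kind described; the parameter values are hypothetical, not the paper's three models, and the modified-moments variant is not reproduced.

```python
# Method-of-moments (MoM) estimators for the Type-1 Gumbel (maximum) distribution and
# a small MSE simulation sketch.  Parameter values below are hypothetical placeholders.
import numpy as np

GAMMA = 0.5772156649015329            # Euler-Mascheroni constant

def mom_gumbel(x):
    beta_hat = np.std(x, ddof=1) * np.sqrt(6) / np.pi   # from Var[X] = pi^2 * beta^2 / 6
    mu_hat = np.mean(x) - GAMMA * beta_hat               # from E[X] = mu + gamma * beta
    return mu_hat, beta_hat

def mse_of_mom(mu=2.0, beta=1.5, n=50, R=500, seed=0):
    rng = np.random.default_rng(seed)
    estimates = np.array([mom_gumbel(rng.gumbel(loc=mu, scale=beta, size=n))
                          for _ in range(R)])
    return np.mean((estimates - np.array([mu, beta])) ** 2, axis=0)   # MSE of (mu, beta)

print(mse_of_mom())   # mean squared errors over R = 500 replications
```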
Background: Measurement of hemoglobin A1c (A1C) is a well-established approach to gauging long-term glycemic control and makes an outstanding contribution to the quality of care in diabetic patients. The concept of targets is open to criticism; they may be unattainable, or limit what could be attained, and in addition they may be economically difficult to attain. However, without some form of targeted control of an asymptomatic condition, it becomes difficult to promote care at all. Objectives: The present article aims to address the most recent evidence-based global guidelines on A1C targets intended for glycemic control in Type 2 Diabetes Mellitus (T2D). Key messages: The rationale for treatment targets of A1C includes evidence for microvascular and macrovascular
Most statistical research generally relies on studying the behaviour of different phenomena during specific time periods and using the results of these studies to develop appropriate recommendations and support decision-making, as well as for statistical inference on the parameters of the statistical distribution of lifetimes. The technical staff of most manufacturers, in the research units of these companies, deal with censored data. The main objective of a survival study is the need to provide information that forms the basis for decision-making; the problem must be clarified, followed by the goals and limitations of the study, which may have different possibilities to perform the
Survival analysis is the analysis of data in the form of times from a time origin until the occurrence of an end event. In medical research, the time origin is the date of registration of the individual or patient in a study, such as a clinical trial comparing two or more types of medicine, where the endpoint is the death of the patient or the withdrawal of the individual. The data resulting from this process are called survival times. If the end event is not death, the resulting data are called time-to-event data. That is, survival analysis is a set of statistical steps and procedures for analyzing data when the adopted variable is the time to an event. It could be d
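As an illustrative aside, a standard first step when analysing right-censored survival times of the kind described above is the Kaplan-Meier estimate of the survival function; a minimal sketch, assuming follow-up times paired with event indicators (1 = event observed, 0 = censored):

```python
# Minimal Kaplan-Meier sketch for right-censored survival times (illustrative only).
import numpy as np

def kaplan_meier(times, events):
    """Return event times and the estimated survival probability just after each one."""
    times, events = np.asarray(times, float), np.asarray(events, int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    n = times.size
    event_times, surv, s = [], [], 1.0
    for i, (t, e) in enumerate(zip(times, events)):
        at_risk = n - i                    # subjects still under observation at time t
        if e == 1:                         # an observed end event contributes a drop
            s *= (at_risk - 1) / at_risk
            event_times.append(t)
            surv.append(s)
    return np.array(event_times), np.array(surv)

# Hypothetical follow-up times in months; 0 marks a censored (withdrawn) individual.
t, S = kaplan_meier([5, 8, 11, 12, 15, 20], [1, 0, 1, 1, 0, 1])
```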
The Internet of Things (IoT) is an information network that connects gadgets and sensors to enable new autonomous tasks. The Industrial Internet of Things (IIoT) refers to the integration of IoT with industrial applications. Some vital infrastructures, such as water delivery networks, use IIoT. With the rapid expansion of the IIoT, its scattered topology and the resource limits of edge computing pose new difficulties for traditional data storage, transport, and security protection. In this paper, a recovery mechanism for edge network failures is proposed by considering repair cost and computational demands. The NP-hard problem was divided into interdependent major and minor problems that could be solved in polynomial time
Computer systems and networks are increasingly used for many types of applications; as a result, the security threats to computers and networks have also increased significantly. Traditionally, password-based user authentication is widely used to authenticate legitimate users, but this method has many loopholes such as password sharing, brute-force attacks, dictionary attacks, and more. The aim of this paper is to improve the password authentication method using Probabilistic Neural Networks (PNNs) with three types of distance, namely Euclidean distance, Manhattan distance, and squared Euclidean distance, and four features of keystroke dynamics, namely Dwell Time (DT), Flight Time (FT), a mixture of DT and FT, and finally Up-Up Time (UUT). The results
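To make the feature and distance definitions concrete, the sketch below (an illustrative reconstruction, not the paper's code) computes DT, FT, and UUT from press/release timestamps and evaluates the three distance measures that would feed the PNN comparison; the sample timing values are hypothetical.

```python
# Keystroke-dynamics features and distance measures (illustrative sketch).
# Each keystroke is assumed to be a (press_time, release_time) pair in seconds.
import numpy as np

def keystroke_features(strokes):
    press = np.array([p for p, _ in strokes])
    release = np.array([r for _, r in strokes])
    dwell = release - press                 # DT: how long each key is held down
    flight = press[1:] - release[:-1]       # FT: release of one key to press of the next
    up_up = release[1:] - release[:-1]      # UUT: release-to-release interval
    return dwell, flight, up_up

def euclidean(a, b):         return np.sqrt(np.sum((a - b) ** 2))
def manhattan(a, b):         return np.sum(np.abs(a - b))
def euclidean_squared(a, b): return np.sum((a - b) ** 2)

# Hypothetical example: compare a login attempt's dwell-time vector with a stored template.
template = np.array([0.11, 0.09, 0.13, 0.10])
attempt  = np.array([0.12, 0.08, 0.15, 0.10])
print(euclidean(template, attempt), manhattan(template, attempt),
      euclidean_squared(template, attempt))
```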
Pattern matching algorithms are usually used as the detection process in intrusion detection systems. The efficiency of these algorithms affects the performance of the intrusion detection system, which reflects the need for new investigation in this field. Four matching algorithms, together with a combination of two of them, are applied for an intrusion detection system based on a new DNA encoding and evaluated for their achievements. These algorithms are the brute-force algorithm, the Boyer-Moore algorithm, the Horspool algorithm, the Knuth-Morris-Pratt algorithm, and the combination of the Boyer-Moore and Knuth-Morris-Pratt algorithms. The performance of the proposed approach is calculated based on the execution time, where these algorithms are applied on
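For illustration, the following sketch shows one of the matchers named above, the Boyer-Moore-Horspool algorithm, searching for a signature in a DNA-encoded string; the encoding of traffic into A/C/G/T and the signature shown are placeholder assumptions, not the paper's encoding.

```python
# Minimal Boyer-Moore-Horspool matcher applied to a DNA-encoded string (sketch only).
def horspool_search(text, pattern):
    m, n = len(pattern), len(text)
    if m == 0 or m > n:
        return -1
    # bad-character shift table: how far to slide the pattern after a mismatch
    shift = {c: m for c in set(text)}
    for i, c in enumerate(pattern[:-1]):
        shift[c] = m - 1 - i
    pos = 0
    while pos <= n - m:
        if text[pos:pos + m] == pattern:
            return pos                      # signature found at this offset
        pos += shift.get(text[pos + m - 1], m)
    return -1

dna_traffic = "ACGTTGACCGTACGATTACGT"          # hypothetical DNA-encoded traffic
signature   = "CGTACGA"                        # hypothetical attack signature
print(horspool_search(dna_traffic, signature)) # -> 8
```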
Intrusion detection systems detect attacks inside computers and networks, where the attacks must be detected quickly and with a high detection rate. Various proposed methods have achieved a high detection rate, either by improving the algorithm or by hybridizing it with another algorithm. However, they suffer in terms of time, especially after improving the algorithm and when dealing with large traffic data. On the other hand, past research has successfully applied DNA sequence detection approaches to intrusion detection systems; the achieved detection rates were very low, while the processing time was fast. Feature selection was also used to reduce the computation and complexity and to speed up the system
A hiding technique for dynamically encrypted text using an encoding table and a symmetric encryption method (the AES algorithm) is presented in this paper. The encoding table is generated dynamically from the MSBs of the cover image points and is used as the first phase of encryption. The Harris corner detection algorithm is applied to the cover image to generate the corner points, which are used to generate a dynamic AES key for the second phase of text encryption. The embedding process uses the LSBs of the image pixels, excluding the Harris corner points, for greater robustness. Experimental results have demonstrated that the proposed scheme achieves good embedding quality, error-free text recovery, and high PSNR values.
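A rough sketch of the embedding idea described above follows, assuming OpenCV's Harris corner detector: message bits are hidden in the LSBs of non-corner pixels. This is an illustrative simplification; the dynamic encoding table and the AES key derived from the corner points are omitted.

```python
# Sketch: Harris-corner-aware LSB embedding (illustrative, not the paper's full scheme).
import cv2
import numpy as np

def embed_lsb_avoiding_corners(cover_gray, message_bits, corner_thresh=0.01):
    response = cv2.cornerHarris(np.float32(cover_gray), 2, 3, 0.04)
    corner_mask = response > corner_thresh * response.max()   # pixels to leave untouched
    stego = cover_gray.copy()
    flat_idx = np.flatnonzero(~corner_mask.ravel())            # embeddable pixel positions
    if len(message_bits) > flat_idx.size:
        raise ValueError("message too long for this cover image")
    flat = stego.ravel()
    for bit, idx in zip(message_bits, flat_idx):
        flat[idx] = (flat[idx] & 0xFE) | bit                    # overwrite the LSB
    return stego

# Usage (hypothetical): the bits would come from the AES-encrypted text in the paper.
# cover = cv2.imread("cover.png", cv2.IMREAD_GRAYSCALE)
# stego = embed_lsb_avoiding_corners(cover, [1, 0, 1, 1, 0, 0, 1, 0])
```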
An aircraft's landing stage involves inherent hazards and problems associated with many factors, such as weather, runway conditions, and pilot experience. The pilot is responsible for selecting the proper landing procedure based on information provided by the landing console operator (LCO). Given that human decisions are prone to errors and biases, creating an intelligent system to predict accurate decisions becomes important. This paper proposes a fuzzy logic method intended to handle the uncertainty and ambiguity inherent in the landing phase, providing intelligent decision support to the pilot while reducing the workload of the LCO. The fuzzy system, built using the Mamdani approach in MATLAB software, considers critical
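The paper builds its fuzzy system with the Mamdani approach in MATLAB; purely to illustrate the mechanics, the sketch below hand-rolls a tiny Mamdani inference in Python with hypothetical inputs (wind speed and visibility), hypothetical membership break points, and two placeholder rules, using min for implication, max for aggregation, and centroid defuzzification.

```python
# Hand-rolled Mamdani-style inference sketch (illustrative; the paper uses MATLAB).
# Inputs, ranges, membership break points, and rules below are hypothetical.
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership function evaluated on x (scalar or array)."""
    x = np.asarray(x, dtype=float)
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

risk_x = np.linspace(0, 10, 201)                       # output universe: landing risk 0..10
risk_low = trimf(risk_x, 0, 2, 5)
risk_high = trimf(risk_x, 5, 8, 10)

def landing_risk(wind_kt, visibility_km):
    wind_high = float(trimf(wind_kt, 10, 25, 40))      # membership of "wind is high"
    vis_low = float(trimf(visibility_km, 0, 1, 3))     # membership of "visibility is low"
    # Rule 1: IF wind is high OR visibility is low THEN risk is high   (OR = max)
    fire_high = max(wind_high, vis_low)
    # Rule 2: IF wind is not high AND visibility is not low THEN risk is low   (AND = min)
    fire_low = min(1 - wind_high, 1 - vis_low)
    # Mamdani implication (min), aggregation (max), then centroid defuzzification.
    aggregated = np.maximum(np.minimum(fire_high, risk_high),
                            np.minimum(fire_low, risk_low))
    return float(np.sum(aggregated * risk_x) / (np.sum(aggregated) + 1e-12))

print(landing_risk(wind_kt=30, visibility_km=1.5))     # riskier in strong wind, poor visibility
```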