This paper estimates the two scale parameters of the Exponential-Rayleigh (ER) distribution for singly type-I censored data, one of the most important forms of right censoring, using the maximum likelihood estimation method (MLEM), one of the most popular and widely used classical methods. The estimates are obtained through an iterative procedure, the Newton-Raphson method, applied to real COVID-19 data taken from the Iraqi Ministry of Health and Environment, AL-Karkh General Hospital. The study covered the interval 4/5/2020 to 31/8/2020, equivalent to 120 days. The number of patients who entered the hospital (the sample size) was n = 785, the number of patients who died during the study period was m = 88, and the number of patients who survived was n - m = 697. The Chi-square test, one of the most important non-parametric tests, was then used to determine whether the data follow the ER distribution. Finally, after estimating the parameters of the ER distribution for singly type-I censored data, the survival function, hazard function, and probability density function were computed.
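Below is a minimal Python sketch of the estimation step described in this abstract, assuming the Exponential-Rayleigh hazard takes the additive form h(t) = λ + θt (an exponential plus a Rayleigh component), so that S(t) = exp(-(λt + θt²/2)) and f(t) = h(t)·S(t); the paper's exact parameterization, starting values, and data are not reproduced here, and the failure times generated below are synthetic placeholders.

```python
import numpy as np

def _loglik(lam, theta, d, c, T0):
    """Censored log-likelihood: deaths contribute log f(t_i); the c survivors
    censored at T0 contribute log S(T0)."""
    if lam <= 0 or theta <= 0:
        return -np.inf
    return (np.sum(np.log(lam + theta * d))
            - lam * (d.sum() + c * T0)
            - 0.5 * theta * (np.sum(d**2) + c * T0**2))

def er_mle_type1(deaths, n, T0, lam=1e-3, theta=1e-5, tol=1e-9, max_iter=200):
    """Newton-Raphson MLE of (lam, theta) for singly type-I censored data."""
    d = np.asarray(deaths, dtype=float)
    c = n - len(d)                                    # number censored at T0
    for _ in range(max_iter):
        h = lam + theta * d                           # hazard at each death time
        # score vector (first derivatives of the log-likelihood)
        g = np.array([np.sum(1.0 / h) - d.sum() - c * T0,
                      np.sum(d / h) - 0.5 * np.sum(d**2) - 0.5 * c * T0**2])
        # observed information (negative Hessian), used for the Newton step
        H = np.array([[np.sum(1.0 / h**2), np.sum(d / h**2)],
                      [np.sum(d / h**2),   np.sum(d**2 / h**2)]])
        step, scale = np.linalg.solve(H, g), 1.0
        ll_old = _loglik(lam, theta, d, c, T0)
        # damped update: halve the step until the log-likelihood does not drop
        while scale > 1e-8 and _loglik(lam + scale * step[0],
                                       theta + scale * step[1], d, c, T0) < ll_old:
            scale *= 0.5
        lam, theta = lam + scale * step[0], theta + scale * step[1]
        if np.max(np.abs(scale * step)) < tol:
            break
    S = lambda t: np.exp(-(lam * t + 0.5 * theta * t**2))   # survival function
    f = lambda t: (lam + theta * t) * S(t)                   # density
    hz = lambda t: lam + theta * t                           # hazard function
    return lam, theta, S, f, hz

# Illustrative run on synthetic times drawn from the same ER model and censored
# at T0 = 120 days with n = 785 subjects (not the hospital data):
rng = np.random.default_rng(0)
u = rng.uniform(size=785)
lam0, th0 = 1e-3, 5e-6
t_all = (-lam0 + np.sqrt(lam0**2 - 2.0 * th0 * np.log(u))) / th0  # inverse CDF
lam_hat, th_hat, S, f, hz = er_mle_type1(t_all[t_all <= 120.0], n=785, T0=120.0)
print(lam_hat, th_hat, S(60.0), hz(60.0))
```

Because this log-likelihood is concave wherever the hazard is positive, the damped Newton-Raphson step cannot decrease it, and the observed information matrix at convergence can also supply approximate standard errors for the two estimates.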
Today, the role of cloud computing in our day-to-day lives is very prominent. The cloud computing paradigm makes it possible to provide demand-based resources. Cloud computing has changed the way organizations manage resources due to its robustness, low cost, and pervasive nature. Data security is usually realized using different methods such as encryption. However, the privacy of data is another important challenge that should be considered when transporting, storing, and analyzing data in the public cloud. In this paper, a new method is proposed to track malicious users who use their private key to decrypt data in a system, share it with others, and cause system information leakage. Security policies are also considered to be int
This research studies dimension-reduction methods that overcome the curse of dimensionality, which arises when traditional methods fail to provide good parameter estimates, so the problem must be dealt with directly. Two approaches were used for high-dimensional data. The first is the non-classical sliced inverse regression (SIR) method together with the proposed weighted standard SIR (WSIR); the second is principal component analysis (PCA), the general method used for reducing dimensions. Both SIR and PCA are based on linear combinations of a subset of the original explanatory variables, which may suffer from the problem of heterogeneity and the problem of linear
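As a point of reference for the two baselines named above, here is a hedged Python sketch of classical SIR and of PCA; the proposed WSIR weighting is not reproduced, and the data, slice count, and number of retained directions are illustrative assumptions.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_dirs=2):
    """Classical sliced inverse regression: standardize X, average it within
    slices of the ordered response, and eigen-decompose the weighted covariance
    of the slice means."""
    n, p = X.shape
    mu, cov = X.mean(axis=0), np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    cov_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T   # Sigma^(-1/2)
    Z = (X - mu) @ cov_inv_sqrt
    M = np.zeros((p, p))
    for idx in np.array_split(np.argsort(y), n_slices):
        zbar = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(zbar, zbar)            # weighted slice means
    w, v = np.linalg.eigh(M)
    # leading eigenvectors, mapped back to the original predictor scale
    return cov_inv_sqrt @ v[:, ::-1][:, :n_dirs]

def pca_directions(X, n_dirs=2):
    """Principal components: leading eigenvectors of the sample covariance."""
    w, v = np.linalg.eigh(np.cov(X, rowvar=False))
    return v[:, ::-1][:, :n_dirs]

# Illustrative run on synthetic high-dimensional data (20 predictors).
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 20))
y = X[:, 0] + 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.1, size=200)
print(sir_directions(X, y).shape, pca_directions(X).shape)
```

Both routines return a p × n_dirs matrix whose columns are the linear combinations used to project the explanatory variables.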
Diabetes mellitus type 2 (T2DM) is a chronic and progressive condition that affects people all around the world. The risk of complications increases with age if the disease is not managed properly. Diabetic neuropathy is caused by excessive blood glucose and lipid levels, resulting in nerve damage. Apelin is a peptide hormone found in different human organs, including the central nervous system and adipose tissue. The aim of this study is to estimate Apelin levels in Iraqi patients with type 2 diabetes and diabetic peripheral neuropathy (DPN) and to show the extent of peripheral nerve damage. The current study included 120 participants: 40 patients with diabetes mellitus, 40 patients with diabetic peripheral neuropathy, and 40 healthy
Objective: This research investigates real breast cancer data for Iraqi women, acquired manually from several Iraqi hospitals for the early detection of breast cancer. Data mining techniques are used to discover hidden knowledge, unexpected patterns, and new rules from the dataset, which involves a large number of attributes. Methods: Data mining techniques handle redundant or simply irrelevant attributes to discover interesting patterns. The dataset is processed via the Weka (Waikato Environment for Knowledge Analysis) platform. The OneR technique is used as a machine learning classifier to evaluate attribute worth according to the class value. Results: The evaluation is performed using
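A hedged, plain-Python illustration of the OneR idea mentioned above (not Weka's implementation): each attribute gets a one-level rule mapping every attribute value to its majority class, and attributes are ranked by that rule's training error. The toy records and attribute values are hypothetical.

```python
from collections import Counter, defaultdict

def one_r(rows, class_index):
    """Rank attributes the OneR way: for each attribute build a rule mapping
    every attribute value to its majority class, then keep the attribute whose
    rule makes the fewest training errors."""
    best = None
    for a in range(len(rows[0])):
        if a == class_index:
            continue
        by_value = defaultdict(Counter)          # attribute value -> class counts
        for row in rows:
            by_value[row[a]][row[class_index]] += 1
        rule = {v: counts.most_common(1)[0][0] for v, counts in by_value.items()}
        errors = sum(1 for row in rows if rule[row[a]] != row[class_index])
        if best is None or errors < best[2]:
            best = (a, rule, errors)
    return best                                  # (attribute index, rule, errors)

# Toy nominal records (hypothetical, not the hospital dataset); class is column 2.
data = [
    ("postmenopausal", "left",  "malignant"),
    ("premenopausal",  "right", "benign"),
    ("postmenopausal", "right", "malignant"),
    ("premenopausal",  "left",  "benign"),
]
print(one_r(data, class_index=2))
```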
In this paper, Nordhaus-Gaddum-type relations on the open support independence number of some derived graphs of path-related graphs under addition and multiplication are studied.
The main objective of this research is to design and select a composite plate to be used in fabricating the wing skins of a light unmanned air vehicle (UAV). Mechanical properties, weight, and cost are the basic criteria for this selection. Fiber volume fraction, filler, and fiber type, each at three levels, were considered to optimize the composite plate selection. The finite element method was used to investigate the stress distribution on the wing under cruise flight conditions and to estimate the maximum stress. An experimental plan was designed to obtain the data on the basis of the Taguchi technique. The most effective process parameters are to be found by employing L9
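The truncated sentence above appears to refer to a Taguchi L9 orthogonal array; assuming so, the sketch below shows how an L9 array covers three factors at three levels each. The level values and the smaller-is-better signal-to-noise ratio are illustrative assumptions, not the study's actual settings or response.

```python
import numpy as np

# Standard L9 array: 9 runs, 3 factors, entries are level indices 0..2.
L9 = np.array([
    [0, 0, 0], [0, 1, 1], [0, 2, 2],
    [1, 0, 1], [1, 1, 2], [1, 2, 0],
    [2, 0, 2], [2, 1, 0], [2, 2, 1],
])

# Hypothetical level settings for the three factors named in the abstract.
levels = {
    "fiber_volume_fraction": [0.30, 0.40, 0.50],      # hypothetical values
    "filler": ["none", "filler_A", "filler_B"],       # hypothetical fillers
    "fiber_type": ["glass", "carbon", "aramid"],      # hypothetical fiber types
}
factors = list(levels)

for run, row in enumerate(L9, start=1):
    setting = {f: levels[f][lvl] for f, lvl in zip(factors, row)}
    print(f"run {run}: {setting}")

def sn_smaller_is_better(responses):
    """Taguchi S/N ratio for a 'smaller is better' response such as the
    maximum stress from the finite element runs: -10*log10(mean(y^2))."""
    y = np.asarray(responses, dtype=float)
    return -10.0 * np.log10(np.mean(y**2))

print(sn_smaller_is_better([215.0, 220.0, 212.0]))    # hypothetical stresses (MPa)
```

The three columns are mutually orthogonal: every level of each factor appears equally often against every level of the other factors, which is what lets nine runs stand in for the full 27-run factorial.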
Atenolol was used with ammonium molybdate to prove the efficiency, reliability, and repeatability of the long-distance chasing photometer (NAG-ADF-300-2) using continuous flow injection analysis. The method is based on the reaction between atenolol and ammonium molybdate in an aqueous medium to obtain a dark brown precipitate. Optimum parameters were studied to increase the sensitivity of the developed method. The linear range of the calibration graph was 0.1-3.5 mmol/L for cell A and 0.3-3.5 mmol/L for cell B, with LOD values of 133.1680 ng/100 µL and 532.6720 ng/100 µL for cell A and cell B respectively, correlation coefficients (r) of 0.9910 for cell A and 0.9901 for cell B, and RSD% lower than 1% (n=8) for the determination of atenolol
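A hedged sketch of the calibration statistics quoted above (linear fit, correlation coefficient r, RSD%, and a detection limit). The LOD convention used here (3.3·s_blank/slope) and all numbers in the example are assumptions for illustration, not the published method or data.

```python
import numpy as np

def calibration_stats(conc_mmol_L, response, blank_replicates):
    """Least-squares calibration line, correlation coefficient r, and an LOD
    estimated as 3.3 * s(blank) / slope (an assumed convention)."""
    slope, intercept = np.polyfit(conc_mmol_L, response, 1)
    r = np.corrcoef(conc_mmol_L, response)[0, 1]
    lod = 3.3 * np.std(blank_replicates, ddof=1) / slope
    return slope, intercept, r, lod

def rsd_percent(replicates):
    """Relative standard deviation of repeat measurements, in percent."""
    y = np.asarray(replicates, dtype=float)
    return 100.0 * y.std(ddof=1) / y.mean()

# Illustrative numbers only (not the published data); the concentration range
# mirrors the 0.1-3.5 mmol/L span quoted for cell A.
rng = np.random.default_rng(2)
conc = np.array([0.1, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5])            # mmol/L
resp = 2.1 * conc + 0.05 + rng.normal(scale=0.02, size=conc.size)     # detector signal
blank = rng.normal(loc=0.05, scale=0.01, size=10)                     # blank replicates
print(calibration_stats(conc, resp, blank))
print(rsd_percent([1.00, 1.01, 0.99, 1.02, 1.00, 0.98, 1.01, 1.00]))  # n = 8 repeats
```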
There has been growing interest in the use of chaotic techniques for enabling secure communication in recent years. This need has been motivated by the emergence of a number of wireless services that require the channel to provide low bit error rates (BER) along with information security. The aim of eavesdropping activity is to steal or distort the information being conveyed. Optical wireless systems (essentially free-space optic systems, FSO) are no exception to this trend. Thus, there is an urgent need for techniques that can secure privileged information against unauthorized eavesdroppers while simultaneously protecting it against channel-induced perturbations and errors. Conventional cryptographic techniques are not designed
This study focused on determining the markers macrophage migration inhibitory factor (MIF) and N-telopeptides of type I bone collagen (NTX), together with some other parameters (alkaline phosphatase (ALP), vitamin D (Vit D), calcium (Ca), phosphorus (P), and magnesium (Mg)), and their correlation with other parameters in osteoporosis. One hundred and ten subjects were involved in the current study. There were two groups of patients: group I, 30 women with severe osteoporosis, and group II, 30 women with mild osteoporosis. For comparison, 50 apparently healthy individuals were included as a control. Serum levels of MIF and NTX were significantly higher in groups I and II compared to the control group, which indicates that these two parameters
In this research work, a simulator with time-domain visualizers and configurable parameters, using a continuous-time simulation approach in Matlab R2019a, is presented for modeling and investigating the performance of optical fiber and free-space quantum channels as part of a generic quantum key distribution system simulator. The modeled optical fiber quantum channel is characterized by a maximum allowable distance of 150 km with 0.2 dB/km attenuation at λ = 1550 nm, while at λ = 900 nm and λ = 830 nm the attenuation values are 2 dB/km and 3 dB/km respectively. The modeled free-space quantum channel is characterized by 0.1 dB/km attenuation at λ = 860 nm, also with a maximum allowable distance of 150 km. The simulator was investigated in terms of the execution of the BB84 protocol
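A small Python sketch of the loss model underlying such a channel simulator: converting the quoted attenuation coefficients into total loss and photon transmittance over distance. The function and dictionary names are hypothetical; only the attenuation figures come from the abstract above.

```python
# Attenuation coefficients quoted in the abstract, keyed by (medium, wavelength in nm).
ATTENUATION_DB_PER_KM = {
    ("fiber", 1550): 0.2,
    ("fiber", 900): 2.0,
    ("fiber", 830): 3.0,
    ("free_space", 860): 0.1,
}

def channel_transmittance(medium, wavelength_nm, distance_km):
    """Total loss L = alpha * d in dB and photon transmittance T = 10**(-L/10)."""
    alpha = ATTENUATION_DB_PER_KM[(medium, wavelength_nm)]
    loss_db = alpha * distance_km
    return loss_db, 10 ** (-loss_db / 10)

# Evaluate every modeled channel at the 150 km maximum allowable distance.
for medium, wl in ATTENUATION_DB_PER_KM:
    loss, t = channel_transmittance(medium, wl, 150.0)
    print(f"{medium} @ {wl} nm over 150 km: {loss:.1f} dB loss, T = {t:.2e}")
```

In a BB84-style simulation this transmittance scales the detection rate, and hence the sifted-key rate, achievable at a given distance.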