In this paper, a Monte Carlo simulation technique is used to compare the performance of the maximum likelihood estimator (MLE) and standard Bayes estimators of the reliability function of the one-parameter exponential distribution. Two types of loss functions are adopted, namely the squared error loss function (SELF) and the modified squared error loss function (MSELF), with informative and non-informative priors. The integrated mean squared error (IMSE) criterion is employed to assess the performance of these estimators.
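As an illustration of the kind of comparison described above, the following Python sketch contrasts the MLE and a Bayes estimator of the exponential reliability function R(t) = exp(-θt) under SELF with a gamma prior, approximating IMSE over a grid of t values. The prior hyper-parameters, sample size, and grid are placeholder assumptions; the paper's MSELF setup and prior choices are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

theta_true = 1.5              # true rate of the exponential distribution
n, replications = 20, 5000    # sample size and Monte Carlo replications
a, b = 2.0, 1.0               # assumed Gamma(shape=a, rate=b) prior hyper-parameters
t_grid = np.linspace(0.1, 3.0, 30)

R_true = np.exp(-theta_true * t_grid)
sq_err_mle = np.zeros_like(t_grid)
sq_err_bayes = np.zeros_like(t_grid)

for _ in range(replications):
    x = rng.exponential(scale=1.0 / theta_true, size=n)
    S = x.sum()
    # MLE of R(t): plug in the rate MLE theta_hat = n / S
    R_mle = np.exp(-(n / S) * t_grid)
    # Bayes estimator under SELF: posterior mean of exp(-theta * t);
    # for a Gamma(a + n, b + S) posterior this is ((b + S) / (b + S + t))^(a + n)
    R_bayes = ((b + S) / (b + S + t_grid)) ** (a + n)
    sq_err_mle += (R_mle - R_true) ** 2
    sq_err_bayes += (R_bayes - R_true) ** 2

# IMSE approximated by averaging the pointwise MSE over the t grid
imse_mle = np.mean(sq_err_mle / replications)
imse_bayes = np.mean(sq_err_bayes / replications)
print(f"IMSE (MLE)  : {imse_mle:.5f}")
print(f"IMSE (Bayes): {imse_bayes:.5f}")
```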
The paper proposes a methodology for predicting packet flow at the data plane in a smart SDN based on an intelligent spiking neural network (SNN) controller. The methodology is applied to predict the next step of the packet flow, consequently reducing the congestion that might otherwise occur. In the proposed model, the centralized controller acts as a reactive controller for managing the cluster-head process in the Software-Defined Network data layer. The simulation results show the capability of the spiking neural network controller in the SDN control layer to improve the quality of service (QoS) of the whole network by minimizing the packet loss ratio and increasing the buffer utilization ratio.
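The abstract does not give the controller's internal equations, so the sketch below only illustrates the kind of neuron model a spiking-neural-network controller builds on: a leaky integrate-and-fire (LIF) unit driven by a binary spike train, written in plain NumPy. The membrane parameters and the input train are illustrative placeholders, not the paper's configuration.

```python
import numpy as np

def lif_response(spikes, dt=1e-3, tau=20e-3, v_rest=0.0, v_thresh=1.0, w=0.6):
    """Simulate a leaky integrate-and-fire neuron driven by a binary spike train.

    Returns the membrane potential trace and the indices at which the neuron fires.
    """
    v = v_rest
    trace, out_spikes = [], []
    for i, s in enumerate(spikes):
        # leaky integration of the weighted input spike
        v += dt / tau * (v_rest - v) + w * s
        if v >= v_thresh:          # threshold crossing -> output spike, then reset
            out_spikes.append(i)
            v = v_rest
        trace.append(v)
    return np.array(trace), out_spikes

rng = np.random.default_rng(1)
incoming = (rng.random(200) < 0.15).astype(float)   # sparse input spike train
potential, fired = lif_response(incoming)
print(f"input spikes: {int(incoming.sum())}, output spikes: {len(fired)}")
```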
The reaction product was isolated and characterized by elemental analysis (C, H, N), 1H-NMR, mass spectrometry, and Fourier-transform infrared spectroscopy (FT-IR). The reaction of (L-AZD) with VO(II), Cr(III), Mn(II), Co(II), Ni(II), Cu(II), Zn(II), Cd(II), and Hg(II) has been investigated; the products were isolated as trinuclear clusters and characterized by FT-IR, UV-Visible spectroscopy, electrical conductivity, magnetic susceptibility at 25 °C, atomic absorption, and molar-ratio measurements. Spectroscopic evidence showed that the metal ions bind through the azide and carbonyl moieties, resulting in six-coordinate metal ions for Cr(III), Mn(II), Co(II), and Ni(II). The VO(II), Cu(II), Zn(II), Cd(II), and Hg(II) ions were coordinated through the azide group only, forming square-pyramidal geometries.
The issue of image captioning, which comprises automatic text generation to describe an image's visual content, has become feasible with developments in object recognition and image classification. Deep learning has received much interest from the scientific community and can be very useful in real-world applications. The proposed image-captioning approach uses pre-trained Convolutional Neural Network (CNN) models combined with Long Short-Term Memory (LSTM) networks to generate image captions. The process includes two stages: the first stage trains the CNN-LSTM models with baseline hyper-parameters, and the second stage trains the CNN-LSTM models while optimizing and adjusting their hyper-parameters.
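To make the encoder-decoder structure concrete, here is a minimal PyTorch sketch of a CNN feature extractor feeding an LSTM caption decoder. The backbone choice (ResNet-18), embedding size, and vocabulary size are placeholder assumptions; the paper's actual pre-trained models and hyper-parameters are not specified here.

```python
import torch
import torch.nn as nn
import torchvision.models as models

class EncoderCNN(nn.Module):
    """Extracts a fixed-size feature vector from an image with a CNN backbone."""
    def __init__(self, embed_size=256):
        super().__init__()
        backbone = models.resnet18()                 # placeholder backbone
        modules = list(backbone.children())[:-1]     # drop the final classifier layer
        self.cnn = nn.Sequential(*modules)
        self.fc = nn.Linear(backbone.fc.in_features, embed_size)

    def forward(self, images):
        feats = self.cnn(images).flatten(1)
        return self.fc(feats)

class DecoderLSTM(nn.Module):
    """Generates caption token logits conditioned on the image feature."""
    def __init__(self, embed_size=256, hidden_size=512, vocab_size=5000):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_size)
        self.lstm = nn.LSTM(embed_size, hidden_size, batch_first=True)
        self.fc = nn.Linear(hidden_size, vocab_size)

    def forward(self, features, captions):
        # prepend the image feature as the first "token" of the sequence
        inputs = torch.cat([features.unsqueeze(1), self.embed(captions)], dim=1)
        hidden, _ = self.lstm(inputs)
        return self.fc(hidden)

# one forward pass with dummy data
encoder, decoder = EncoderCNN(), DecoderLSTM()
images = torch.randn(2, 3, 224, 224)
captions = torch.randint(0, 5000, (2, 15))
logits = decoder(encoder(images), captions)
print(logits.shape)   # (2, 16, 5000)
```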
Two simple methods for the determination of eugenol were developed. The first depends on the oxidative coupling of eugenol with p-amino-N,N-dimethylaniline (PADA) in the presence of K3[Fe(CN)6]. A linear calibration plot for eugenol was constructed at 600 nm over a concentration range of 0.25-2.50 μg.mL–1, with a correlation coefficient (r) of 0.9988. The limits of detection (LOD) and quantitation (LOQ) were 0.086 and 0.284 μg.mL–1, respectively. The second method is based on dispersive liquid-liquid microextraction of the derivatized oxidative-coupling product of eugenol with PADA. Under the optimized extraction procedure, the extracted colored product was determined spectrophotometrically at 618 nm.
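As an illustration of how such calibration figures of merit are typically obtained, the short NumPy sketch below fits a straight line to absorbance readings and derives LOD and LOQ from the residual standard deviation and the slope (the common 3.3σ/S and 10σ/S convention). The data points are invented for demonstration and do not reproduce the paper's measurements.

```python
import numpy as np

# hypothetical calibration data: concentration (ug/mL) vs absorbance at 600 nm
conc = np.array([0.25, 0.50, 1.00, 1.50, 2.00, 2.50])
absorbance = np.array([0.061, 0.118, 0.239, 0.355, 0.472, 0.596])

slope, intercept = np.polyfit(conc, absorbance, 1)
predicted = slope * conc + intercept
residual_sd = np.sqrt(np.sum((absorbance - predicted) ** 2) / (len(conc) - 2))
r = np.corrcoef(conc, absorbance)[0, 1]

lod = 3.3 * residual_sd / slope    # limit of detection
loq = 10.0 * residual_sd / slope   # limit of quantitation
print(f"r = {r:.4f}, LOD = {lod:.3f} ug/mL, LOQ = {loq:.3f} ug/mL")
```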
In this paper, we studied the scheduling of jobs on a single machine. Each of the n jobs is to be processed without interruption and becomes available for processing at time zero. The objective is to find a processing order of the jobs that minimizes the sum of the maximum earliness and the maximum tardiness. Because the model minimizes both earliness and tardiness, it corresponds to a just-in-time production system. Our lower bound depends on decomposing the problem into two subproblems. We present a novel heuristic approach to find a near-optimal solution; it relies on finding efficient solutions for two subproblems, the first of which minimizes the total completion time.
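To make the objective concrete, the sketch below computes the maximum earliness plus maximum tardiness for a given job order and, for a small instance, enumerates all orders to find the optimum. The processing times and due dates are invented; the paper's lower bound and heuristic are not reproduced here.

```python
from itertools import permutations

def emax_plus_tmax(order, p, d):
    """Maximum earliness + maximum tardiness for a single-machine sequence."""
    t, e_max, t_max = 0, 0, 0
    for j in order:
        t += p[j]                        # completion time of job j
        e_max = max(e_max, d[j] - t)     # earliness = max(0, d_j - C_j)
        t_max = max(t_max, t - d[j])     # tardiness = max(0, C_j - d_j)
    return e_max + t_max

p = [4, 2, 6, 3]      # processing times
d = [6, 5, 14, 9]     # due dates

best = min(permutations(range(len(p))), key=lambda o: emax_plus_tmax(o, p, d))
print("best order:", best, "objective:", emax_plus_tmax(best, p, d))
```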
In this paper, we deal with games with fuzzy payoffs when there is uncertainty in the data. We use the trapezoidal membership function to transform the data into fuzzy numbers and apply three different ranking-function algorithms. We then compare these three ranking algorithms on trapezoidal fuzzy numbers so that the decision maker obtains the best gains.
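As a small illustration of ranking trapezoidal fuzzy payoffs, the sketch below implements three common ranking functions from the fuzzy literature (a simple average, a weighted average, and a centroid-style measure) and evaluates two payoffs with each. These particular formulas are assumptions for demonstration; they may not be the three algorithms used in the paper.

```python
def rank_average(a, b, c, d):
    """Simple average ranking of a trapezoidal fuzzy number (a, b, c, d)."""
    return (a + b + c + d) / 4

def rank_weighted(a, b, c, d):
    """Weighted average giving more weight to the core interval [b, c]."""
    return (a + 2 * b + 2 * c + d) / 6

def rank_centroid(a, b, c, d):
    """x-coordinate of the centroid of the trapezoidal membership area."""
    num = (c ** 2 + d ** 2 + c * d) - (a ** 2 + b ** 2 + a * b)
    den = 3 * ((c + d) - (a + b))
    return num / den if den != 0 else (a + b + c + d) / 4

payoff_1 = (2, 3, 5, 6)   # hypothetical fuzzy payoffs
payoff_2 = (1, 4, 4, 7)

for f in (rank_average, rank_weighted, rank_centroid):
    print(f.__name__, round(f(*payoff_1), 3), round(f(*payoff_2), 3))
```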
The primary purpose of this work is to define new games in ideal spaces via sets. The relationships between the games provided and the winning and losing strategies for each player are elucidated.
Scheduling timetables for courses in large university departments is a very hard problem that has often been addressed in previous work, with only partially optimal results. This work implements the principles of an evolutionary algorithm, using genetic operators to solve the timetabling problem and obtain a randomized, fully optimized timetable, with the ability to generate multiple candidate timetables for each stage in the college. The main idea is to generate course timetables automatically while exploring the constraint space, yielding an optimal and flexible schedule with no redundancy through changes to a feasible course timetable. The main contribution of this work is the increased flexibility of generating optimal timetables.
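At sketch level only, the following code shows the genetic-algorithm skeleton implied above on a toy timetabling instance: chromosomes assign courses to time slots, fitness counts hard-constraint violations (same-lecturer clashes), and selection, crossover, and mutation evolve the population. The course data, constraint model, and GA parameters are all illustrative assumptions, not the paper's setup.

```python
import random

random.seed(0)
COURSES = [("C1", "Dr.A"), ("C2", "Dr.A"), ("C3", "Dr.B"),
           ("C4", "Dr.B"), ("C5", "Dr.C"), ("C6", "Dr.C")]
SLOTS = 4            # available time slots
POP, GENS = 30, 200  # population size and generation limit

def conflicts(chrom):
    """Count pairs of courses sharing both a lecturer and a time slot."""
    hits = 0
    for i in range(len(chrom)):
        for j in range(i + 1, len(chrom)):
            if chrom[i] == chrom[j] and COURSES[i][1] == COURSES[j][1]:
                hits += 1
    return hits

def crossover(p1, p2):
    cut = random.randrange(1, len(p1))
    return p1[:cut] + p2[cut:]

def mutate(chrom, rate=0.1):
    return [random.randrange(SLOTS) if random.random() < rate else g for g in chrom]

population = [[random.randrange(SLOTS) for _ in COURSES] for _ in range(POP)]
for _ in range(GENS):
    population.sort(key=conflicts)
    if conflicts(population[0]) == 0:
        break
    parents = population[:POP // 2]                 # truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    population = parents + children

best = min(population, key=conflicts)
print("conflicts:", conflicts(best), "assignment:", best)
```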
A new human-based heuristic optimization method, named the Snooker-Based Optimization Algorithm (SBOA), is introduced in this study. The inspiration for this method is drawn from the traits of sales elites, the qualities every salesperson aspires to possess. Typically, salespersons strive to enhance their skills through autonomous learning or by seeking guidance from others, and they engage in regular communication with customers to gain approval for their products or services. Building upon this concept, SBOA aims to find the optimal solution within a given search space by traversing candidate positions to evaluate the possible values. To assess the feasibility and effectiveness of SBOA in comparison with other algorithms, we conducted a set of comparative experiments.
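The abstract does not spell out SBOA's update equations, so the following sketch only shows the generic population-based loop that such a human-inspired metaheuristic typically follows: candidates either move toward the current best (learning from an "elite") or explore randomly. The objective function, move rules, and parameters are placeholders, not the algorithm's actual operators.

```python
import numpy as np

rng = np.random.default_rng(2)

def sphere(x):                       # placeholder benchmark objective
    return float(np.sum(x ** 2))

dim, pop_size, iters = 5, 20, 200
lo, hi = -5.0, 5.0
pop = rng.uniform(lo, hi, size=(pop_size, dim))
fitness = np.array([sphere(x) for x in pop])

for _ in range(iters):
    best = pop[np.argmin(fitness)]
    for i in range(pop_size):
        if rng.random() < 0.5:
            # exploitation: move toward the current best solution
            candidate = pop[i] + rng.random(dim) * (best - pop[i])
        else:
            # exploration: random jump within the search bounds
            candidate = rng.uniform(lo, hi, size=dim)
        candidate = np.clip(candidate, lo, hi)
        f = sphere(candidate)
        if f < fitness[i]:           # greedy acceptance of improvements
            pop[i], fitness[i] = candidate, f

print("best objective found:", fitness.min())
```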
The Cox regression model has been used to estimate a proportional hazards model for patients with hepatitis recorded at the Gastrointestinal and Hepatic Diseases Hospital in Iraq during 2002-2005. The data consist of age, gender, survival time, and terminal status. The Kaplan-Meier method has been applied to estimate the survival function and the hazard function.
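As an illustration of the survival workflow described above, the sketch below fits a Kaplan-Meier curve and a Cox proportional-hazards model with the lifelines library on a small invented data frame whose columns mirror the variables mentioned (age, gender, survival time, terminal status). The records are fabricated for demonstration and are not the hospital data.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# hypothetical records: survival time (months), event indicator, covariates
df = pd.DataFrame({
    "time":   [5, 12, 9, 20, 7, 15, 3, 18, 11, 25],
    "event":  [1, 0, 1, 0, 1, 1, 1, 0, 1, 0],   # 1 = death observed, 0 = censored
    "age":    [45, 60, 52, 38, 70, 55, 63, 41, 58, 49],
    "gender": [0, 1, 0, 1, 1, 0, 1, 0, 0, 1],   # 0 = female, 1 = male
})

# Kaplan-Meier estimate of the survival function
kmf = KaplanMeierFitter()
kmf.fit(df["time"], event_observed=df["event"])
print(kmf.survival_function_.head())

# Cox proportional hazards model with age and gender as covariates
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()
```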