A mixture model is used to model data that arise from more than one component. In recent years, it has become an effective tool for drawing inferences about the complex data we encounter in real life, and it can serve as a powerful tool for classifying observations based on the similarities among them. In this paper, several mixture regression-based methods were applied under the assumption that the data come from a finite number of components. These methods were compared according to their accuracy in estimating the component parameters, and observation membership was inferred and assessed for each of them. The results showed that the flexible mixture model outperformed the others in most simulation scenarios according to the integrated mean square error and the integrated classification error.
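As a concrete illustration of the kind of model being compared, here is a minimal EM sketch for a two-component mixture of linear regressions on simulated data; the setup and all parameter values are illustrative assumptions, not the paper's simulation design:

```python
# Hypothetical sketch: EM for a two-component mixture of linear regressions.
import numpy as np

rng = np.random.default_rng(0)
n = 300
x = rng.uniform(0, 10, n)
z = rng.random(n) < 0.5                      # latent component labels
y = np.where(z, 1.0 + 2.0 * x, 8.0 - 1.5 * x) + rng.normal(0, 1.0, n)
X = np.column_stack([np.ones(n), x])         # design matrix with intercept

beta = np.array([[0.0, 1.0], [5.0, -1.0]])   # per-component coefficients
sigma = np.array([1.0, 1.0])
pi = np.array([0.5, 0.5])

for _ in range(100):
    # E-step: posterior membership probabilities
    # (the 1/sqrt(2*pi) normalising constant cancels in the ratio)
    dens = np.stack([
        pi[k] / sigma[k] * np.exp(-0.5 * ((y - X @ beta[k]) / sigma[k]) ** 2)
        for k in range(2)
    ])
    r = dens / dens.sum(axis=0)
    # M-step: weighted least squares per component
    for k in range(2):
        w = r[k]
        beta[k] = np.linalg.solve(X.T @ (X * w[:, None]), X.T @ (w * y))
        resid = y - X @ beta[k]
        sigma[k] = np.sqrt((w * resid ** 2).sum() / w.sum())
        pi[k] = w.mean()

labels = r.argmax(axis=0)                    # hard membership for classification
print(beta, sigma, pi)
```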
The Rivest–Shamir–Adleman (RSA) and Diffie-Hellman (DH) key exchange are famous methods for encryption. These methods depend on selecting the primes p and q so as to be secure enough. This paper shows that the named methods can use primes found by an arithmetical function. In other words, there is no need to worry about choosing primes p and q and whether they are secure enough, since the arithmetical function builds the primes in a way complicated enough to be secure. Moreover, this article gives a new construction of the RSA algorithm and the DH key exchange using the primes p, q derived from a real number x.
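For orientation, here is a minimal textbook RSA key generation and encryption round trip in Python. The primes come from sympy's randprime as a stand-in; the paper's contribution, deriving p and q from a real number x via an arithmetical function, is not reproduced here.

```python
# Minimal textbook RSA sketch; prime selection is a stand-in, not the
# paper's arithmetical-function construction.
from math import gcd
from sympy import randprime

def rsa_keygen(bits=512):
    p = randprime(2 ** (bits - 1), 2 ** bits)
    q = randprime(2 ** (bits - 1), 2 ** bits)
    while q == p:
        q = randprime(2 ** (bits - 1), 2 ** bits)
    n, phi = p * q, (p - 1) * (q - 1)
    e = 65537
    assert gcd(e, phi) == 1
    d = pow(e, -1, phi)          # modular inverse (Python 3.8+)
    return (n, e), (n, d)

(n, e), (_, d) = rsa_keygen()
m = 42
c = pow(m, e, n)                 # encrypt
assert pow(c, d, n) == m         # decrypt
```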
This paper presents a hybrid approach for solving the null values problem; it hybridizes rough set theory with an intelligent swarm algorithm. The proposed approach is a supervised learning model. A large set of complete data, called learning data, is used to find the decision rule sets that are then used to solve the incomplete data problem. The intelligent swarm algorithm, the bees algorithm, is used as a heuristic search for feature selection, combined with rough set theory as the evaluation function. Another feature selection algorithm, ID3, is also presented; it works as a statistical algorithm instead of an intelligent one. The two approaches are compared on their performance in null values estimation.
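As an illustration of "rough set theory as the evaluation function", the dependency degree gamma(C, D) = |POS_C(D)| / |U| is the usual fitness with which a bees-style search scores candidate feature subsets. A toy sketch on made-up data (not the paper's code):

```python
# Rough-set dependency degree as a feature-subset fitness function.
from collections import defaultdict

def dependency(rows, features, decision):
    """Fraction of rows whose feature-value combination determines the decision."""
    groups = defaultdict(set)
    counts = defaultdict(int)
    for row in rows:
        key = tuple(row[f] for f in features)
        groups[key].add(row[decision])
        counts[key] += 1
    consistent = sum(c for k, c in counts.items() if len(groups[k]) == 1)
    return consistent / len(rows)

table = [
    {"a": 1, "b": 0, "d": "yes"},
    {"a": 1, "b": 1, "d": "no"},
    {"a": 0, "b": 1, "d": "no"},
    {"a": 1, "b": 0, "d": "yes"},
]
print(dependency(table, ["a"], "d"))        # 0.25: feature a alone is ambiguous
print(dependency(table, ["a", "b"], "d"))   # 1.0: {a, b} determines d
```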
Localization is an essential demand in wireless sensor networks (WSNs). It relies on several types of measurements. This paper focuses on positioning in 3-D space using time-of-arrival (TOA) based distance measurements between the target node and a number of anchor nodes. Central localization is assumed, and either RF, acoustic, or UWB signals are used for the distance measurements. The problem is treated using iterative gradient descent (GD), and an iterative GD-based algorithm for localization of moving sensors in a WSN is proposed. To localize a node in 3-D space, at least four anchors are needed; in this work, however, five anchors are used for better accuracy. In GD localization of a moving sensor, the algorithm …
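A minimal sketch of what such an iterative GD-based TOA localizer can look like, assuming five anchors at invented positions, a synthetic target, and a hand-picked step size (none of these values are from the paper):

```python
# Gradient descent on the TOA cost  sum_i (||x - a_i|| - d_i)^2  in 3-D.
import numpy as np

rng = np.random.default_rng(1)
anchors = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0],
                    [0, 0, 10], [10, 10, 10]], dtype=float)
target = np.array([3.0, 4.0, 5.0])
d = np.linalg.norm(anchors - target, axis=1) + rng.normal(0, 0.05, 5)

x = np.array([5.0, 5.0, 5.0])        # initial guess
mu = 0.05                            # step size
for _ in range(500):
    diff = x - anchors               # (5, 3)
    ranges = np.linalg.norm(diff, axis=1)
    grad = 2 * ((ranges - d) / ranges) @ diff
    x -= mu * grad

print("estimate:", x, " error:", np.linalg.norm(x - target))
```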
In this paper, the reliability and maintenance scheduling of some medical devices were estimated from a single variable, the failure times, on the assumption that this time variable has the same distribution (the Weibull distribution) for all devices.
The distribution parameters for each device were estimated by the OLS method.
The main objective of this research is to determine the optimal time for preventive maintenance of medical devices. Two methods were adopted to estimate it. The first method builds the maintenance schedule from information on the cost of maintenance and the cost of stopping work and acc…
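A small sketch of the OLS route described here, using median-rank regression on the linearized Weibull CDF, ln(-ln(1-F)) = beta*ln(t) - beta*ln(eta); the failure times below are invented for illustration:

```python
# Weibull parameter estimation by ordinary least squares (median-rank regression).
import numpy as np

t = np.sort(np.array([310., 480., 560., 700., 850., 990., 1200.]))
n = len(t)
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # Bernard's median ranks

x = np.log(t)
y = np.log(-np.log(1.0 - F))
slope, intercept = np.polyfit(x, y, 1)

beta = slope                        # shape parameter
eta = np.exp(-intercept / beta)     # scale parameter
print(f"shape beta = {beta:.3f}, scale eta = {eta:.1f}")

# reliability at a chosen time under the fitted model:
t0 = 600.0
print("R(600) =", np.exp(-(t0 / eta) ** beta))
```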
This research aims to choose the appropriate probability distribution for the reliability analysis of an item, using data collected on the operating and stoppage times of the case study.
The appropriate probability distribution is the one for which the data lie on or close to the fitted line of the probability plot and pass the goodness-of-fit test.
Minitab 17 software was used for this purpose after arranging the collected data and entering it into the program.
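The paper works in Minitab 17; as a rough open-source analogue, the same check (points close to the probability-plot line, plus a goodness-of-fit test) could look like this in Python with scipy, on invented data and an exponential candidate distribution:

```python
# Probability-plot correlation and a KS goodness-of-fit test for one candidate.
import numpy as np
from scipy import stats

times = np.array([12., 31., 45., 58., 80., 96., 130., 150., 210., 260.])

# probability plot against the candidate; r close to 1 means the points
# hug the fitted line
(osm, osr), (slope, intercept, r) = stats.probplot(times, dist="expon")
print("probability-plot correlation r =", round(r, 4))

# Kolmogorov-Smirnov goodness-of-fit test for the fitted exponential
loc, scale = stats.expon.fit(times)
ks = stats.kstest(times, "expon", args=(loc, scale))
print("KS statistic =", round(ks.statistic, 4), " p-value =", round(ks.pvalue, 4))
```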
Quantum key distribution (QKD) provides unconditional security in theory. However, practical QKD systems face challenges in maximizing the secure key rate and extending transmission distances. In this paper, we introduce a comparative study of the BB84 protocol using coincidence detection over two different quantum channels: a free-space and an underwater quantum channel. Simulated seawater was used as an example of the underwater quantum channel. Different single-photon detection modules were used on Bob's side to capture the coincidence counts. Results showed that increasing the mean photon number generally leads to a higher rate of coincidence detection and therefore a higher possibility of increasing the secure key rate. The secure key rate …
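For context, the asymptotic BB84 secure-key fraction per sifted bit is often written as R = 1 - f*H2(Q) - H2(Q), where Q is the quantum bit error rate, H2 the binary entropy, and f the error-correction inefficiency. A tiny sketch of that textbook formula (not taken from the paper):

```python
# Asymptotic BB84 secure-key fraction per sifted bit (Shor-Preskill style bound).
from math import log2

def h2(p):
    """Binary entropy."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def secure_fraction(qber, f_ec=1.16):   # 1.16: a typical error-correction inefficiency
    return max(0.0, 1.0 - f_ec * h2(qber) - h2(qber))

for q in (0.01, 0.03, 0.05, 0.08, 0.11):
    print(f"QBER = {q:.2f}  ->  secure bits per sifted bit = {secure_fraction(q):.3f}")
```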
Hydrogen fuel is a good alternative to fossil fuels; it can be produced using clean energy without polluting emissions. This work presents an experimental study of hydrogen production via solar energy. A photovoltaic (PV) module converts solar radiation into electrical energy, which is then used to electrolyze water into hydrogen and oxygen in an alkaline water electrolyzer with stainless-steel electrodes. A MATLAB program was developed to solve a four-parameter model and predict the characteristics of the PV module under Baghdad climate conditions. The hydrogen production system was tested at NaOH mass concentrations of 50, 100, 200, and 300 grams. The maximum hydrogen production …
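The paper's solver is in MATLAB; a rough Python analogue of a four-parameter single-diode PV model, with invented placeholder parameter values (not the module's real ones), might look like:

```python
# Four-parameter single-diode model: I = Iph - I0*(exp((V + I*Rs)/a) - 1),
# solved implicitly for the current at each voltage.
import numpy as np
from scipy.optimize import brentq

# invented placeholders: photo-current, saturation current, series
# resistance, modified ideality factor a = n*Ns*kT/q
Iph, I0, Rs, a = 5.0, 5e-7, 0.4, 1.2

def current(V):
    """Root of the implicit diode equation at voltage V."""
    f = lambda I: Iph - I0 * (np.exp((V + I * Rs) / a) - 1.0) - I
    return brentq(f, -1.0, Iph + 1.0)

V = np.linspace(0.0, 19.0, 40)
I = np.array([current(v) for v in V])
P = V * I
k = P.argmax()
print(f"Isc ~ {I[0]:.2f} A, max power ~ {P[k]:.1f} W at {V[k]:.1f} V")
```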
Interval methods for verified integration of initial value problems (IVPs) for ODEs have been used for more than 40 years. For many classes of IVPs, these methods can compute guaranteed error bounds for the flow of an ODE, where traditional methods provide only approximations to a solution. Overestimation, however, is a potential drawback of verified methods: for some problems, the computed error bounds become overly pessimistic, or the integration even breaks down. The dependency problem and the wrapping effect are particular sources of overestimation in interval computations. Berz (see [1]) and his co-workers have developed Taylor model methods, which extend interval arithmetic with symbolic computations. The latter is an ef…
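A toy illustration of the dependency problem mentioned above: naive interval arithmetic treats the two occurrences of x in x - x as independent, so the enclosure is wider than the true range {0}.

```python
# Minimal interval type showing the dependency problem.
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float
    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)
    def __mul__(self, other):
        ps = [self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi]
        return Interval(min(ps), max(ps))

x = Interval(1.0, 2.0)
print(x - x)            # Interval(lo=-1.0, hi=1.0): true range is just {0}
print(x * x - x * x)    # even wider; Taylor models track the symbolic dependence
```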
Facial recognition has been an active field of imaging science. With recent progress in computer vision, it is extensively applied in various areas, especially in law enforcement and security. The human face is a viable biometric that can be used effectively in both identification and verification. Thus far, regardless of the facial model and metrics employed, its main shortcoming is that it requires a facial image against which the comparison is made; therefore, closed-circuit televisions and a facial database are always needed in an operational system. Over the last few decades, unfortunately, we have experienced an emergence of asymmetric warfare, where acts of terrorism are often committed in secluded areas with no …
The nuclear level density parameter in the non-Equi-Spacing Model (NON-ESM), the Equi-Spacing Model (ESM), and the Back-Shifted Energy-Dependent Fermi Gas model (BSEDFG) was determined for 106 nuclei; the results are tabulated and compared with experimental work. No recognizable differences were found between our results and the experimental values. The calculated level density parameters were then used to compute the state density as a function of excitation energy for the 58Fe and 246Cm nuclei. The results are in good agreement with experimental results from earlier published work.
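For context, a standard back-shifted Fermi gas expression for the level density (textbook form; the paper's BSEDFG conventions may differ) is:

```latex
% U = E - \Delta is the back-shifted excitation energy, a the level
% density parameter, \sigma the spin cutoff parameter.
\rho(E) = \frac{\exp\!\left(2\sqrt{a\,(E-\Delta)}\right)}
               {12\sqrt{2}\,\sigma\, a^{1/4}\,(E-\Delta)^{5/4}}
```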