A mixture model is used to model data that come from more than one component. In recent years, it has become an effective tool for drawing inferences about the complex data we encounter in real life. Moreover, it can serve as a powerful confirmatory tool for classifying observations based on the similarities among them. In this paper, several mixture regression-based methods were evaluated under the assumption that the data come from a finite number of components. These methods were compared according to their results in estimating the component parameters, and observation membership was inferred and assessed for each method. The results showed that the flexible mixture model outperformed the others in most simulation scenarios according to the integrated mean square error and the integrated classification error.
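As an illustration only, the following is a minimal sketch of how a finite mixture of linear regressions can be fitted by the EM algorithm and how observation membership can be read off the posterior responsibilities. The function name, the simulated two-component data, and all parameter values are hypothetical and are not taken from the paper.

```python
import numpy as np

def fit_mixture_regression(X, y, n_components=2, n_iter=200, seed=0):
    """EM for a finite mixture of linear regressions with Gaussian noise.
    Returns per-component coefficients, noise SDs, mixing weights and the
    posterior membership probabilities (responsibilities)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Xb = np.column_stack([np.ones(n), X])                 # add an intercept column
    betas = rng.normal(scale=1.0, size=(n_components, d + 1))
    sigmas = np.full(n_components, y.std())
    weights = np.full(n_components, 1.0 / n_components)
    for _ in range(n_iter):
        # E-step: posterior probability that each point belongs to each component
        dens = np.column_stack([
            weights[k] / (np.sqrt(2 * np.pi) * sigmas[k])
            * np.exp(-0.5 * ((y - Xb @ betas[k]) / sigmas[k]) ** 2)
            for k in range(n_components)
        ])
        dens = np.clip(dens, 1e-300, None)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weighted least squares and variance update per component
        for k in range(n_components):
            w = resp[:, k]
            betas[k] = np.linalg.solve(Xb.T @ (Xb * w[:, None]), Xb.T @ (w * y))
            res = y - Xb @ betas[k]
            sigmas[k] = max(np.sqrt((w * res ** 2).sum() / w.sum()), 1e-6)
        weights = resp.mean(axis=0)
    return betas, sigmas, weights, resp

# quick check on simulated two-component data (hypothetical values)
rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(300, 1))
z = rng.integers(0, 2, size=300)
y = np.where(z == 0, 1 + 2 * X[:, 0], 8 - X[:, 0]) + rng.normal(0, 0.5, size=300)
betas, sigmas, weights, resp = fit_mixture_regression(X, y)
print(np.round(betas, 2), np.round(weights, 2))
print("agreement with true labels:", (resp.argmax(axis=1) == z).mean())  # label switching may flip this
```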
The research addresses the following questions:
What sustainable development goals received priority in the press coverage of the newspapers under study?
What journalistic genres did these newspapers adopt in addressing the sustainable development goals?
What journalistic sources did the Arab newspapers rely on when addressing the sustainable development goals?
What geographic scope did the Arab newspapers adopt in addressing the sustainable development goals? The research is classified as descriptive research, adopting the survey method and using content analysis.
The research sample was determined on the basis of the Arabic newspapers (Al-
In this paper, the reliability and the maintenance schedules of some medical devices were estimated using a single variable, the time variable (failure times), on the assumption that the time variable for all devices follows the same distribution (the Weibull distribution).
The distribution parameters for each device were estimated using the ordinary least squares (OLS) method.
The main objective of this research is to determine the optimal time for preventive maintenance of medical devices. Two methods were adopted to estimate the optimal preventive maintenance time. The first method depends on the maintenance schedule, relying on information about the cost of maintenance and the cost of stopping work and acc
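As a minimal sketch of the OLS estimation step described above, the snippet below fits the Weibull shape and scale parameters by ordinary least squares on the linearised CDF, using Bernard's median-rank approximation for the plotting positions. The failure times shown are hypothetical, and the exact procedure used in the paper may differ.

```python
import numpy as np

def weibull_ols_fit(failure_times):
    """Estimate Weibull shape (beta) and scale (eta) by OLS on the
    linearised CDF: ln(-ln(1 - F)) = beta*ln(t) - beta*ln(eta),
    with median ranks approximating F for the ordered failure times."""
    t = np.sort(np.asarray(failure_times, dtype=float))
    n = len(t)
    ranks = np.arange(1, n + 1)
    F = (ranks - 0.3) / (n + 0.4)          # Bernard's median-rank approximation
    x = np.log(t)
    y = np.log(-np.log(1.0 - F))
    beta, c = np.polyfit(x, y, 1)          # OLS line y = beta*x + c, c = -beta*ln(eta)
    eta = np.exp(-c / beta)
    return beta, eta

# example with hypothetical failure times (hours)
times = [120, 340, 510, 640, 800, 950, 1200]
beta, eta = weibull_ols_fit(times)
print(f"shape beta = {beta:.3f}, scale eta = {eta:.1f}")
```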
Quantum key distribution (QKD) provides unconditional security in theory. However, practical QKD systems face challenges in maximizing the secure key rate and extending transmission distances. In this paper, we present a comparative study of the BB84 protocol using coincidence detection over two different quantum channels: a free-space channel and an underwater channel. Simulated seawater was used as an example of an underwater quantum channel. Different single-photon detection modules were used on Bob's side to capture the coincidence counts. The results showed that increasing the mean photon number generally leads to a higher rate of coincidence detection and therefore a higher possibility of increasing the secure key rate. The secure key rate
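The dependence of coincidence detection on the mean photon number can be illustrated with a simple model that is not the paper's: assuming Poissonian weak coherent pulses split on a 50/50 beamsplitter onto two detectors, each detector clicks independently, so the coincidence probability per pulse grows with the mean photon number. The efficiency and dark-count values below are hypothetical.

```python
import numpy as np

def coincidence_probability(mu, eta=0.6, dark=0.0):
    """Coincidence-click probability per pulse for a weak coherent pulse of
    mean photon number mu split on a 50/50 beamsplitter onto two detectors
    of efficiency eta (optional dark-click probability per gate). A coherent
    state splits into two independent coherent states of mean mu/2, so each
    detector clicks independently with probability 1 - (1 - dark)*exp(-eta*mu/2)."""
    p_click = 1.0 - (1.0 - dark) * np.exp(-eta * mu / 2.0)
    return p_click ** 2

for mu in (0.1, 0.5, 1.0, 2.0):
    print(f"mu = {mu}: coincidence probability = {coincidence_probability(mu):.4f}")
```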
This research aims to choose the appropriate probability distribution for the reliability analysis of an item, using data collected on the operating and stoppage times of the case study.
A probability distribution is considered appropriate when the data lie on or close to the fitted line of the probability plot and when the data pass the goodness-of-fit test.
Minitab 17 software was used for this purpose after arranging the collected data and entering them into the program.
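The paper performs this check in Minitab 17; purely as an illustrative sketch, an equivalent probability-plot and goodness-of-fit check could look like the following in Python. The candidate distribution, the data, and the use of a Kolmogorov-Smirnov test here are assumptions, not the paper's workflow.

```python
import numpy as np
from scipy import stats

# hypothetical operating times between failures (hours)
data = np.array([55, 80, 120, 150, 210, 260, 310, 400, 520, 610], dtype=float)

# fit a candidate distribution (Weibull, location fixed at 0) and test the fit
shape, loc, scale = stats.weibull_min.fit(data, floc=0)
# note: KS p-values are optimistic when the parameters are fitted from the same data
ks_stat, p_value = stats.kstest(data, "weibull_min", args=(shape, loc, scale))
print(f"shape={shape:.3f}, scale={scale:.1f}, KS p-value={p_value:.3f}")

# probability-plot check: points close to the fitted line indicate a good fit
(osm, osr), (slope, intercept, r) = stats.probplot(
    data, dist=stats.weibull_min, sparams=(shape, loc, scale))
print(f"probability-plot correlation r = {r:.4f}")
```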
This work compares double informative priors assumed for the reliability function of the Pareto type I distribution. To estimate the reliability function of the Pareto type I distribution by Bayes estimation, two different kinds of information are used; two different priors are selected for the parameter of the Pareto type I distribution. Three double priors are assumed: the chi-squared-gamma distribution, the gamma-Erlang distribution, and the Erlang-exponential distribution. The estimators are derived under the squared error loss function with the different double priors, and the simulation technique is used to compare the performance of
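The double-prior derivations themselves are not reproduced here. For orientation only, the sketch below shows the simpler single-prior case: a Gamma prior on the Pareto type I shape parameter, with the posterior mean (the Bayes estimator under squared error loss) of the reliability function R(t) = (theta/t)^alpha obtained in closed form. All names, data, and hyperparameters are hypothetical.

```python
import numpy as np

def pareto_reliability_bayes(data, theta, t, a=1.0, b=1.0):
    """Bayes (posterior-mean) estimate of the Pareto type I reliability
    R(t) = (theta/t)**alpha under squared error loss, with a single
    Gamma(a, b) prior on the shape alpha (rate parameterisation).
    Posterior: alpha | data ~ Gamma(a + n, b + sum(log(x_i/theta))),
    and E[(theta/t)**alpha] is the gamma MGF evaluated at -log(t/theta)."""
    x = np.asarray(data, dtype=float)
    n = len(x)
    A = a + n
    B = b + np.log(x / theta).sum()
    return (B / (B + np.log(t / theta))) ** A

# hypothetical sample with known scale theta = 1
sample = np.array([1.3, 1.8, 2.5, 3.1, 4.4, 6.0])
print(pareto_reliability_bayes(sample, theta=1.0, t=2.0))
```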
Localization is an essential requirement in wireless sensor networks (WSNs). It relies on several types of measurements. This paper focuses on positioning in 3-D space using time-of-arrival- (TOA-) based distance measurements between the target node and a number of anchor nodes. Central localization is assumed, and either RF, acoustic, or UWB signals are used for the distance measurements. The problem is treated using iterative gradient descent (GD), and an iterative GD-based algorithm for the localization of moving sensors in a WSN is proposed. To localize a node in 3-D space, at least four anchors are needed; in this work, however, five anchors are used to obtain better accuracy. In GD localization of a moving sensor, the algo
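A minimal sketch of TOA-based 3-D localization by gradient descent, with five anchors as in the setting described above: it minimises the sum of squared range residuals. The anchor layout, the learning rate, and the noise-free ranges are hypothetical, and the paper's algorithm for moving sensors is not reproduced.

```python
import numpy as np

def gd_localize(anchors, distances, lr=0.05, n_iter=500):
    """Estimate a node's 3-D position from TOA-based range measurements
    to known anchors by gradient descent on the squared range residuals:
    J(p) = sum_i (||p - a_i|| - d_i)^2."""
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    p = anchors.mean(axis=0)                   # start at the anchor centroid
    for _ in range(n_iter):
        diff = p - anchors                     # (n_anchors, 3)
        rng = np.linalg.norm(diff, axis=1)     # current estimated ranges
        grad = 2 * ((rng - d) / rng) @ diff    # dJ/dp
        p -= lr * grad
    return p

# hypothetical scenario: five anchors and a true node position
anchors = [[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10], [10, 10, 10]]
true_p = np.array([3.0, 4.0, 2.0])
dists = np.linalg.norm(np.asarray(anchors) - true_p, axis=1)  # noise-free TOA ranges
print(gd_localize(anchors, dists))
```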
This paper presents a hybrid approach for solving the null values problem; it hybridizes rough set theory with an intelligent swarm algorithm. The proposed approach is a supervised learning model. A large set of complete data, called learning data, is used to find the decision rule sets that are then used in solving the incomplete data problem. The intelligent swarm algorithm is used for feature selection; it employs the bees algorithm as a heuristic search combined with rough set theory as the evaluation function. Another feature selection algorithm, ID3, is also presented; it works as a statistical algorithm rather than an intelligent one. A comparison is made between the two approaches in terms of their performance in null value estimation.
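As a small illustration of the rough-set side of such an evaluation function (not the paper's implementation), the snippet below computes the dependency degree of a decision attribute on a candidate feature subset: the fraction of objects whose equivalence class under those features maps to a single decision value. The table and the column roles are hypothetical.

```python
from collections import defaultdict

def dependency_degree(data, feature_idx, decision_idx):
    """Rough-set dependency degree (gamma) used as an evaluation function:
    the fraction of objects lying in the positive region, i.e. whose
    equivalence class under the selected features has one decision value."""
    blocks = defaultdict(list)
    for row in data:
        key = tuple(row[i] for i in feature_idx)
        blocks[key].append(row[decision_idx])
    consistent = sum(len(v) for v in blocks.values() if len(set(v)) == 1)
    return consistent / len(data)

# hypothetical complete 'learning' table: last column is the decision attribute
table = [
    ["low",  "yes", "A"],
    ["low",  "no",  "A"],
    ["high", "yes", "B"],
    ["high", "no",  "B"],
    ["low",  "yes", "B"],
]
print(dependency_degree(table, feature_idx=[0], decision_idx=2))     # 0.4
print(dependency_degree(table, feature_idx=[0, 1], decision_idx=2))  # 0.6
```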
Two simple methods for the determination of eugenol were developed. The first depends on the oxidative coupling of eugenol with p-amino-N,N-dimethylaniline (PADA) in the presence of K3[Fe(CN)6]. A linear regression calibration plot for eugenol was constructed at 600 nm within a concentration range of 0.25-2.50 μg.mL–1, with a correlation coefficient (r) of 0.9988. The limits of detection (LOD) and quantitation (LOQ) were 0.086 and 0.284 μg.mL–1, respectively. The second method is based on the dispersive liquid-liquid microextraction of the derivatized oxidative coupling product of eugenol with PADA. Under the optimized extraction procedure, the extracted colored product was determined spectrophotometrically at 618 nm. A l
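For readers who want to reproduce this type of calibration arithmetic, the following sketch fits a least-squares calibration line and derives LOD and LOQ from the residual standard deviation of the regression using the common 3.3σ/slope and 10σ/slope convention. The absorbance values are invented for illustration, and the paper's exact LOD/LOQ procedure may differ.

```python
import numpy as np

# hypothetical calibration data: concentration (ug/mL) vs absorbance at 600 nm
conc = np.array([0.25, 0.50, 1.00, 1.50, 2.00, 2.50])
absb = np.array([0.052, 0.101, 0.205, 0.306, 0.411, 0.513])

# ordinary least-squares calibration line: A = m*C + b
m, b = np.polyfit(conc, absb, 1)
pred = m * conc + b
r = np.corrcoef(conc, absb)[0, 1]

# residual standard deviation of the regression (sigma)
sigma = np.sqrt(np.sum((absb - pred) ** 2) / (len(conc) - 2))

# ICH-style limits: LOD = 3.3*sigma/slope, LOQ = 10*sigma/slope
lod = 3.3 * sigma / m
loq = 10.0 * sigma / m
print(f"r = {r:.4f}, LOD = {lod:.3f} ug/mL, LOQ = {loq:.3f} ug/mL")
```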
Wireless Sensor Networks (WSNs) are promoting the spread of the Internet to devices in all areas of life, which makes them a promising technology for the future. In the coming days, as attack technologies become more advanced, security will play an important role in WSNs. Currently, quantum computers pose a significant risk to the encryption technologies that work in tandem with intrusion detection systems, because it is difficult to implement quantum properties on sensors due to their resource limitations. In this paper, quantum computing is used to develop a future-proof, robust, lightweight and resource-conscious approach to sensor networks. Great emphasis is placed on the concepts of using the BB84
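For context on the protocol named above (and only as a toy illustration, since the paper's lightweight scheme is not reproduced), the following sketch shows the basis-sifting step of BB84 on an ideal channel with no eavesdropper.

```python
import secrets

def bb84_sift(n_bits=32):
    """Toy BB84 sifting (ideal channel, no eavesdropper): Alice encodes
    random bits in random bases, Bob measures in random bases, and the
    sifted key keeps only the positions where the bases match."""
    alice_bits  = [secrets.randbelow(2) for _ in range(n_bits)]
    alice_bases = [secrets.randbelow(2) for _ in range(n_bits)]  # 0 = rectilinear, 1 = diagonal
    bob_bases   = [secrets.randbelow(2) for _ in range(n_bits)]
    # with matching bases Bob recovers Alice's bit; otherwise his outcome is random
    bob_bits = [b if ab == bb else secrets.randbelow(2)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    sifted = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
    return sifted

print(bb84_sift())
```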
Aspect-based sentiment analysis is an important research topic that aims to extract and categorize aspect-terms from online reviews. Recent efforts have shown that topic modelling is actively used for this task. In this paper, we integrated word embeddings into collapsed Gibbs sampling in Latent Dirichlet Allocation (LDA). Specifically, the conditional distribution in the topic model is improved using a word embedding model trained on a (customer review) training dataset. Semantic similarity (the cosine measure) was leveraged to assign the aspect-terms to their related aspect categories. The experiment was conducted to extract and categorize the aspect terms from the SemEval 2014 dataset.
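The cosine-similarity categorization step can be sketched as follows, assuming the term and category-prototype vectors come from a trained word embedding model; the 4-dimensional vectors and the vocabulary here are invented purely for illustration.

```python
import numpy as np

def assign_aspect_category(term_vec, category_vecs):
    """Assign an aspect-term embedding to the aspect category whose
    prototype embedding has the highest cosine similarity."""
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    scores = {cat: cosine(term_vec, vec) for cat, vec in category_vecs.items()}
    return max(scores, key=scores.get), scores

# hypothetical 4-d embeddings standing in for vectors from a trained word2vec model
embeddings = {
    "pizza":   np.array([0.9, 0.1, 0.0, 0.2]),
    "waiter":  np.array([0.1, 0.8, 0.3, 0.0]),
    "food":    np.array([1.0, 0.0, 0.1, 0.1]),   # category prototype
    "service": np.array([0.0, 0.9, 0.2, 0.1]),   # category prototype
}
categories = {c: embeddings[c] for c in ("food", "service")}
print(assign_aspect_category(embeddings["pizza"], categories))
print(assign_aspect_category(embeddings["waiter"], categories))
```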