A mixture model is used to model data that come from more than one component. In recent years, it has become an effective tool for drawing inferences about the complex data we encounter in real life. Moreover, it can serve as a powerful confirmatory tool for classifying observations based on the similarities amongst them. In this paper, several mixture regression-based methods were applied under the assumption that the data come from a finite number of components. These methods were compared according to their results in estimating the component parameters, and observation membership was inferred and assessed for each of them. The results showed that the flexible mixture model outperformed the others in most simulation scenarios according to the integrated mean square error and the integrated classification error.
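The abstract does not state which estimation algorithm was used; as an illustrative sketch only (not the paper's method), a two-component mixture of linear regressions can be fitted by EM, alternating weighted least squares with responsibility updates. The function name and parameters below are hypothetical:

```python
import numpy as np

def em_mixture_regression(x, y, n_iter=200, seed=0):
    """EM sketch for a two-component mixture of simple linear regressions.
    Returns (betas, sigmas, mixing weights, responsibilities)."""
    rng = np.random.default_rng(seed)
    X = np.column_stack([np.ones_like(x), x])      # design matrix with intercept
    resp = rng.uniform(0.3, 0.7, size=len(x))      # random initial responsibilities
    R = np.column_stack([resp, 1.0 - resp])
    betas = np.zeros((2, 2))
    sigmas = np.ones(2)
    pis = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # M-step: weighted least squares per component
        for k in range(2):
            w = R[:, k]
            betas[k] = np.linalg.solve((X * w[:, None]).T @ X, X.T @ (w * y))
            resid = y - X @ betas[k]
            sigmas[k] = max(np.sqrt((w * resid**2).sum() / w.sum()), 1e-6)
            pis[k] = w.mean()
        # E-step: Gaussian density of each point under each regression line
        dens = np.zeros((len(x), 2))
        for k in range(2):
            resid = y - X @ betas[k]
            dens[:, k] = pis[k] * np.exp(-0.5 * (resid / sigmas[k])**2) \
                         / (sigmas[k] * np.sqrt(2.0 * np.pi))
        dens += 1e-12                               # guard against underflow
        R = dens / dens.sum(axis=1, keepdims=True)  # posterior memberships
    return betas, sigmas, pis, R
```

The responsibilities `R` give the inferred observation memberships mentioned in the abstract: each row is the posterior probability of the point belonging to each component.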
Simulation experiments are a problem-solving tool in many fields. Simulation is the process of designing a model of a real system in order to follow it and identify its behavior through models and formulas written in a repetitive software style over a number of iterations. The aim of this study is to build a model for behavior exhibiting heteroskedasticity by studying the APGARCH and NAGARCH models under Gaussian and non-Gaussian distributions for different sample sizes (500, 1000, 1500, 2000), through the stages of time series analysis (identification, estimation, diagnostic checking, and prediction). The data were generated using the estimates of the parameters resulting from
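For context on the kind of data-generating step the study describes, here is a hedged sketch of simulating a NAGARCH(1,1)-type series with Gaussian innovations; the recursion and the parameter values are illustrative assumptions, not the paper's fitted estimates:

```python
import numpy as np

def simulate_nagarch(n, omega=0.05, alpha=0.08, beta=0.85, gamma=0.5, seed=1):
    """Simulate n returns from a NAGARCH(1,1)-type recursion (sketch):
        sigma2[t] = omega + alpha * sigma2[t-1] * (z[t-1] - gamma)**2
                    + beta * sigma2[t-1]
    with z[t] i.i.d. standard normal. Illustrative parameters only."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n)
    sigma2 = np.empty(n)
    # start at the (approximate) unconditional variance of this recursion
    sigma2[0] = omega / (1.0 - alpha * (1.0 + gamma**2) - beta)
    for t in range(1, n):
        sigma2[t] = omega + alpha * sigma2[t - 1] * (z[t - 1] - gamma)**2 \
                    + beta * sigma2[t - 1]
    return np.sqrt(sigma2) * z, sigma2
```

Repeating this for each sample size in (500, 1000, 1500, 2000) and each innovation distribution yields the iteration structure the abstract refers to; a non-Gaussian variant would swap `standard_normal` for, e.g., a Student-t draw.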
This work aims to analyze a three-dimensional discrete-time biological system, a prey-predator model with a constant harvesting amount, in which the stage structure lies in the predator species. The analysis is done by finding all possible equilibria and investigating their stability. In order to obtain an optimal harvesting strategy, we then suppose that harvesting occurs at a non-constant rate. Finally, numerical simulations are given to confirm the outcome of the mathematical analysis.
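The abstract gives no equations, so the following toy discrete-time prey-predator map with constant predator harvesting is only a hypothetical illustration of the model class; the parameters r, a, b, d, h are invented for the sketch, not taken from the paper:

```python
def step(x, y, r=3.0, a=1.0, b=0.5, d=0.4, h=0.05):
    """One iteration of a toy discrete prey-predator map with constant
    predator harvesting h; populations are clipped at zero."""
    x_next = max(x + r * x * (1.0 - x) - a * x * y, 0.0)  # logistic prey minus predation
    y_next = max(y + b * a * x * y - d * y - h, 0.0)      # predator growth minus death and harvest
    return x_next, y_next

def orbit(x0, y0, n):
    """Iterate the map n times and return the trajectory."""
    traj = [(x0, y0)]
    for _ in range(n):
        traj.append(step(*traj[-1]))
    return traj
```

Numerical simulation of the kind the abstract mentions amounts to computing such orbits from several initial points and observing whether they settle onto the analytically found equilibria.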
KE Sharquie, JR Al-Rawi, AA Noaimi, RA Al-Khammasi, Iraqi Journal of Community Medicine, 2018
The challenge of incorporating usability evaluation values and practices into the agile development process is not only persistent but also systemic. Notable contributions by researchers have attempted to isolate and close the gaps between the two fields, with the aim of developing usable software. However, there is currently no reference model that specifies where and how usability activities need to be considered in the agile development process. This paper therefore proposes a model for identifying appropriate usability evaluation methods alongside the agile development process. By using this model, the development team can apply usability evaluations at the right time and in the right place to get the necessary feedback from end-users. Verification
A roundabout is a highway engineering concept meant to calm traffic, increase safety, reduce stop-and-go travel, reduce accidents and congestion, and decrease traffic delays. It is circular and facilitates one-way traffic flow around a central point. The first part of this study evaluated the principles and methods used to compare roundabout capacity methods under different traffic conditions and geometric configurations. These methods include gap-acceptance, empirical, and simulation-software methods. The previous studies reviewed in this research used various methods, as well as other new models developed by several researchers. However, this paper's main aim is to compare different roundabout capacity models for acceptable
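As one concrete example of the gap-acceptance family mentioned above, the Siegloch formula relates entry-lane capacity to the circulating flow; the default critical gap and follow-up time below are typical illustrative values, not figures from this paper:

```python
import math

def siegloch_capacity(q_c, t_c=4.5, t_f=2.6):
    """Entry capacity (veh/h) of one roundabout lane by the Siegloch
    gap-acceptance formula:
        C = (3600 / t_f) * exp(-q_c * (t_c - t_f / 2) / 3600)
    q_c : circulating flow (veh/h); t_c, t_f : critical gap and
    follow-up time (s). Defaults are illustrative only."""
    return (3600.0 / t_f) * math.exp(-q_c * (t_c - t_f / 2.0) / 3600.0)
```

With zero circulating flow the capacity reduces to 3600/t_f, the saturation rate of the follow-up headway; capacity then decays exponentially as circulating flow grows, which is the qualitative behavior the compared models share.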
This study explores the challenges Artificial Intelligence (AI) systems face in generating image captions, a task that requires effective integration of computer vision and natural language processing techniques. A comparative analysis is made between traditional approaches (such as retrieval-based methods and linguistic templates) and modern deep-learning approaches (such as encoder-decoder models, attention mechanisms, and transformers). Theoretical results show that modern models perform better in accuracy and in the ability to generate more complex descriptions, while traditional methods excel in speed and simplicity. The paper proposes a hybrid framework that combines the advantages of both approaches, where conventional methods produce
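To make the retrieval-based approach concrete, here is a minimal sketch (with hypothetical feature vectors and captions): the caption of the database image most cosine-similar to the query image's features is simply reused, which is why such methods are fast and simple but cannot compose novel descriptions:

```python
import numpy as np

def retrieve_caption(query_feat, db_feats, db_captions):
    """Toy retrieval-based captioner: return the caption of the database
    image whose feature vector is most cosine-similar to the query."""
    q = query_feat / np.linalg.norm(query_feat)                    # normalize query
    D = db_feats / np.linalg.norm(db_feats, axis=1, keepdims=True)  # normalize database rows
    return db_captions[int(np.argmax(D @ q))]                       # best cosine match
```

In a real system the feature vectors would come from a pretrained vision encoder; here they are stand-ins to show the retrieval step itself.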
Two spectrophotometric techniques are presented for quantifying ceftazidime (CFT) in bulk medications and pharmaceutical formulations. The methods are simple, sensitive, selective, accurate, and efficient. The first method uses an alkaline medium to convert ceftazidime to its diazonium salt, which is then coupled with the 1-Naphthol (1-NPT) and 2-Naphthol (2-NPT) reagents. The azo dyes produced are brown and red in color, with absorption maxima at λmax 585 and 545 nm, respectively. Beer's law was obeyed over the concentration range (3-40) µg·ml-1. For (CFT-1-NPT) and (CFT-2-NPT), the detection limits were 1.0096 and 0.8017 µg·ml-1, respectively.
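The Beer's-law calibration and detection-limit computation implied by the abstract can be sketched as follows; the 3.3·s/m convention for the detection limit is a common choice, not necessarily the one used in the paper:

```python
import numpy as np

def calibration(conc, absorb):
    """Fit the calibration line A = m*c + b over the Beer's-law range and
    estimate the detection limit as LOD = 3.3 * s / m, where s is the
    residual standard deviation (a common convention)."""
    m, b = np.polyfit(conc, absorb, 1)        # least-squares line
    resid = absorb - (m * conc + b)
    s = resid.std(ddof=2)                     # residual standard deviation
    return m, b, 3.3 * s / m
```

Running this on the measured absorbances at λmax for each dye would reproduce the slope, intercept, and detection-limit figures of the kind reported above.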
The current research presents an overall comparative analysis concerning the estimation of Meixner process parameters via the wavelet packet transform. Of particular relevance, it compares the moment method and the wavelet packet estimator for the four parameters of the Meixner process. The research focuses on finding the best threshold value, using the square-root-log and modified square-root-log methods with wavelet packets in the presence of noise, to enhance the efficiency and effectiveness of denoising the financial asset market signal. In this regard, a simulation study compares the performance of moment estimation and wavelet packets for different sample sizes. The results show that wavelet packets
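The "square-root-log" threshold mentioned above is commonly understood as the universal threshold λ = σ√(2 log n), applied by soft thresholding to the noisy wavelet-packet coefficients; a minimal sketch under that reading (the paper's exact modified variant is not specified here):

```python
import numpy as np

def soft_threshold(coeffs, sigma):
    """Soft-threshold wavelet(-packet) coefficients with the universal
    ('square-root-log') threshold lambda = sigma * sqrt(2 * log(n))."""
    lam = sigma * np.sqrt(2.0 * np.log(len(coeffs)))
    # shrink everything toward zero by lam; small coefficients become exactly 0
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - lam, 0.0)
```

Denoising then consists of transforming the signal into the wavelet-packet domain, applying this shrinkage, and inverting the transform before estimating the Meixner parameters.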
KE Sharquie, HR Al-Hamamy, AA Noaimi, AF Tahir, Journal of Cosmetics, Dermatological Sciences and Applications, 2012