Employee churn is a serious problem for organizations. Turnover has a negative impact on an organization and therefore needs to be addressed. Because manual detection of employee churn is difficult, machine learning (ML) algorithms have frequently been used for churn detection as well as for categorizing employees according to turnover. To date, only one study has investigated employee categorization using machine learning. A novel multi-criteria decision-making (MCDM) approach coupled with the DE-PARETO principle, referred to as the SNEC scheme, is proposed to categorize employees. An AHP-TOPSIS DE-PARETO principle model (AHPTOPDE) is designed that uses a two-stage MCDM scheme for categorizing employees. In the first stage, the analytic hierarchy process (AHP) assigns relative weights to employee accomplishment factors. In the second stage, TOPSIS expresses the significance of each employee and performs the categorization. A simple 20-30-50 rule from the DE-PARETO principle is then applied to categorize employees into three major groups: enthusiastic, behavioral, and distressed employees. The Random Forest algorithm is applied as the baseline algorithm in the proposed employee churn framework to predict class-wise employee churn; the framework is tested on a standard Human Resource Information System (HRIS) dataset, and the results are compared with other ML methods. The Random Forest algorithm in the SNEC scheme achieves similar or slightly better overall accuracy and Matthews correlation coefficient (MCC), with significantly lower time complexity, than the ECPR scheme using the CatBoost algorithm.
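To make the second-stage scoring and the 20-30-50 split concrete, the sketch below implements a generic TOPSIS ranking over hypothetical accomplishment factors in Python. The factor weights, the all-benefit-criteria assumption, the toy data, and the mapping of the 20-30-50 bands onto the three groups are illustrative assumptions, not details taken from the SNEC paper.

import numpy as np

def topsis_scores(X, weights, benefit):
    """Generic TOPSIS closeness scores.
    X: (n_employees, n_factors) decision matrix,
    weights: factor weights (e.g. AHP-derived, summing to 1),
    benefit: boolean mask, True where larger values are better."""
    # Vector-normalize each column, then apply the weights.
    V = weights * X / np.linalg.norm(X, axis=0)
    ideal_best = np.where(benefit, V.max(axis=0), V.min(axis=0))
    ideal_worst = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_best = np.linalg.norm(V - ideal_best, axis=1)
    d_worst = np.linalg.norm(V - ideal_worst, axis=1)
    return d_worst / (d_best + d_worst)   # closeness coefficient in [0, 1]

def categorize_20_30_50(scores):
    """Label employees by a 20-30-50 split on ranked TOPSIS scores
    (assumed mapping: top 20% enthusiastic, next 30% behavioral, rest distressed)."""
    order = np.argsort(-scores)            # best first
    n = len(scores)
    labels = np.empty(n, dtype=object)
    labels[order[: int(0.2 * n)]] = "enthusiastic"
    labels[order[int(0.2 * n): int(0.5 * n)]] = "behavioral"
    labels[order[int(0.5 * n):]] = "distressed"
    return labels

# Toy example: 10 employees, 3 accomplishment factors (all treated as benefit criteria).
rng = np.random.default_rng(0)
X = rng.random((10, 3))
weights = np.array([0.5, 0.3, 0.2])        # stand-in for AHP weights
print(categorize_20_30_50(topsis_scores(X, weights, np.array([True, True, True]))))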
Heart disease is a significant and impactful health condition that ranks as the leading cause of death in many countries. Clinical datasets are available to aid physicians in diagnosing cardiovascular disease. However, with the rise of big data and large medical datasets, accurately predicting heart disease has become increasingly challenging for medical practitioners, because an abundance of irrelevant and redundant features increases computational complexity and reduces accuracy. This study therefore aims to identify the most discriminative features within high-dimensional datasets, minimizing complexity and improving accuracy through an Extra Trees based feature selection technique. The study assesses the efficacy …
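As a rough illustration of Extra Trees based feature selection of the kind described, the sketch below uses scikit-learn's ExtraTreesClassifier with SelectFromModel on synthetic placeholder data; the importance threshold and the downstream logistic-regression classifier are assumptions for the example, not the study's exact pipeline.

# Sketch: rank features by Extra Trees importance and keep the most discriminative ones.
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Placeholder data standing in for a high-dimensional clinical dataset.
X, y = make_classification(n_samples=500, n_features=40, n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Fit Extra Trees and keep features whose importance exceeds the mean importance.
selector = SelectFromModel(ExtraTreesClassifier(n_estimators=200, random_state=0),
                           threshold="mean").fit(X_tr, y_tr)

# Train a simple classifier on the reduced feature set and report test accuracy.
clf = LogisticRegression(max_iter=1000).fit(selector.transform(X_tr), y_tr)
print("selected features:", selector.get_support().sum())
print("test accuracy:", accuracy_score(y_te, clf.predict(selector.transform(X_te))))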
This abstract focuses on the significance of wireless body area networks (WBANs) as a cutting-edge, self-governing technology that has garnered substantial attention from researchers. The central challenge faced by WBANs is upholding quality of service (QoS) in rapidly evolving sectors such as healthcare. The task of managing diverse traffic types with limited resources further compounds this challenge. In medical WBANs in particular, vital data must be prioritized to ensure prompt delivery of critical information. Given the stringent requirements of these systems, any data loss or delay is untenable, necessitating intelligent algorithms. These algorithms play a pivotal role …
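To illustrate the kind of priority handling such algorithms perform, the sketch below is a generic strict-priority packet queue in which critical vital-sign packets are always forwarded before routine ones; the traffic classes and packet contents are hypothetical and not drawn from the abstract.

import heapq
from itertools import count

# Hypothetical traffic classes: lower number = higher priority.
PRIORITY = {"critical": 0, "vital": 1, "routine": 2}

queue, seq = [], count()

def enqueue(kind, payload):
    # seq keeps FIFO order within a class and breaks priority ties.
    heapq.heappush(queue, (PRIORITY[kind], next(seq), kind, payload))

for kind, payload in [("routine", "temperature"), ("critical", "ECG alarm"),
                      ("vital", "SpO2"), ("routine", "step count")]:
    enqueue(kind, payload)

while queue:                                   # critical packets leave the queue first
    _, _, kind, payload = heapq.heappop(queue)
    print(f"send {kind}: {payload}")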
This research attempts to shed light on a topic considered one of the most important in human resource management (HRM): the employee-centric approach, by examining and understanding its philosophy. To achieve this goal, the research relied on the philosophical analytical method, one of the approaches used in theoretical studies. The research reached a set of conclusions, the most important of which is that theoretical studies have addressed this approach in English while, to the researcher's knowledge, it is lacking in Arabic. The research also reached a set of recommendations, the most important of which is that this approach needs more research, analysis, and study at the practical and theoretical …
In this research, we estimate the survival function for data affected by the disturbances and confusion of the Iraq Household Socio-Economic Survey (IHSES II 2012), using data from five-year age groups that follow the generalized gamma (GG) distribution. Two methods were used for estimation and fitting: the principle of maximum entropy (POME) and a bootstrap method with a nonparametric kernel smoothing function, to overcome the mathematical problems posed by the integrals in this distribution, particularly the incomplete gamma function, alongside the traditional maximum likelihood (ML) method. The comparison was carried out on the basis of the method of the Cen…
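As a minimal illustration of the maximum-likelihood route only, the sketch below fits a generalized gamma distribution with SciPy (assuming its gengamma parameterization) and evaluates the survival function; the simulated sample merely stands in for the IHSES age-group data, and the POME and bootstrap-kernel estimators are not shown.

import numpy as np
from scipy import stats

# Simulated sample standing in for the survey data (true parameters chosen arbitrarily).
sample = stats.gengamma.rvs(a=2.0, c=1.5, scale=3.0, size=1000, random_state=1)

# Maximum-likelihood fit of the generalized gamma (location fixed at zero).
a_hat, c_hat, loc_hat, scale_hat = stats.gengamma.fit(sample, floc=0)

# Estimated survival function S(t) = 1 - F(t) at a few time points.
t = np.array([1.0, 5.0, 10.0])
S_ml = stats.gengamma.sf(t, a_hat, c_hat, loc=loc_hat, scale=scale_hat)

# Nonparametric check: empirical survival at the same points.
S_emp = [(sample > ti).mean() for ti in t]
print("ML survival:       ", np.round(S_ml, 3))
print("Empirical survival:", np.round(S_emp, 3))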
The research problem consisted in answering the question of what impact the localization of employees' salaries has on bank deposits, and whether it has led to an increase in their size. The research also aimed to propose some initial solutions to improve the role of salary localization by reviewing the concept of electronic payment systems, their tools and channels; identifying the concept of salary localization, its importance, objectives, and obstacles to its application; and then analyzing the actual state of salary localization and bank deposits for the banks in the research sample over the period (2017-2021), using the statistical program SPSS V25 to test the research hypotheses. The study …
In this paper, we use frequentist and Bayesian approaches to the linear regression model to predict future unemployment rates in Iraq. Parameters are estimated using the ordinary least squares method for the frequentist approach and the Markov chain Monte Carlo (MCMC) method for the Bayesian approach. Calculations are done using the R program. The analysis showed that the linear regression model under the Bayesian approach performs better and can be used as an alternative to the frequentist approach. Two criteria, the root mean square error (RMSE) and the median absolute deviation (MAD), were used to compare the performance of the estimates. The results showed that unemployment rates will continue to increase over the next two decades.
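The paper's calculations were done in R; purely as an illustration of the comparison it describes, the Python sketch below fits a linear trend to a hypothetical unemployment series by ordinary least squares and by a deliberately minimal random-walk Metropolis sampler, then compares the fits with RMSE and MAD. The series, priors, and sampler settings are assumptions, not the paper's model.

import numpy as np

# Hypothetical yearly unemployment-rate series standing in for the Iraqi data.
rng = np.random.default_rng(2)
years = np.arange(2004, 2024)
rate = 10 + 0.3 * (years - 2004) + rng.normal(0, 0.8, len(years))
X = np.column_stack([np.ones_like(years, dtype=float), years - years.mean()])

# Frequentist fit: ordinary least squares.
beta_ols, *_ = np.linalg.lstsq(X, rate, rcond=None)

# Bayesian fit: random-walk Metropolis over (intercept, slope) with flat priors
# and a fixed noise scale, as a minimal stand-in for a full MCMC analysis.
def log_lik(beta, sigma=1.0):
    resid = rate - X @ beta
    return -0.5 * np.sum((resid / sigma) ** 2)

beta, draws = beta_ols.copy(), []
for _ in range(20000):
    prop = beta + rng.normal(0, 0.05, size=2)
    if np.log(rng.random()) < log_lik(prop) - log_lik(beta):
        beta = prop
    draws.append(beta)
beta_bayes = np.mean(draws[5000:], axis=0)      # posterior mean after burn-in

# Compare in-sample fits with the two criteria named in the abstract.
for name, b in [("OLS", beta_ols), ("Bayes", beta_bayes)]:
    err = rate - X @ b
    print(f"{name}: RMSE={np.sqrt(np.mean(err**2)):.3f}  MAD={np.median(np.abs(err)):.3f}")

# Point forecast two decades ahead under the Bayesian posterior mean.
future = np.column_stack([np.ones(20), np.arange(2024, 2044) - years.mean()])
print("2043 forecast:", round(float(future[-1] @ beta_bayes), 2))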
In this literature review, the focus is on the role of stem cells as an approach to periodontal regeneration.