Employee churn is a serious problem for organizations: turnover has a negative impact and needs to be addressed. Because manual detection of employee churn is difficult, machine learning (ML) algorithms have frequently been used both to detect churn and to categorize employees according to turnover. To date, only one study has investigated employee categorization using machine learning. This work proposes a novel multi-criteria decision-making (MCDM) approach coupled with the de-Pareto principle to categorize employees, referred to as the SNEC scheme. An AHP-TOPSIS de-Pareto model (AHPTOPDE) is designed that uses a two-stage MCDM scheme for categorizing employees. In the first stage, the analytic hierarchy process (AHP) assigns relative weights to employee accomplishment factors. In the second stage, TOPSIS expresses the significance of each employee in order to perform the categorization. A simple 20-30-50 rule from the de-Pareto principle is then applied to partition employees into three major groups: enthusiastic, behavioral, and distressed. The random forest algorithm is applied as the baseline in the proposed employee churn framework to predict class-wise churn; the framework is tested on a standard human resource information system (HRIS) dataset, and the results are compared with those of other ML methods. The random forest algorithm in the SNEC scheme achieves similar or slightly better overall accuracy and MCC, with significantly lower time complexity, than the ECPR scheme using the CatBoost algorithm.
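The 20-30-50 split and the random-forest baseline described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the TOPSIS closeness scores and the five-feature matrix are randomly generated stand-ins, and all thresholds follow only the 20-30-50 rule stated in the abstract.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Hypothetical TOPSIS closeness scores for 200 employees (higher = better performer).
n = 200
scores = rng.random(n)

# 20-30-50 rule: top 20% enthusiastic, next 30% behavioral, bottom 50% distressed.
order = np.argsort(-scores)                      # rank employees by score, descending
labels = np.empty(n, dtype=object)
labels[order[: int(0.20 * n)]] = "enthusiastic"
labels[order[int(0.20 * n): int(0.50 * n)]] = "behavioral"
labels[order[int(0.50 * n):]] = "distressed"

# Random forest as the baseline classifier on stand-in accomplishment factors.
X = rng.random((n, 5))
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
print(sorted(set(clf.predict(X))))
```

In the actual scheme the scores would come from the AHP-weighted TOPSIS stage rather than a random generator.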
In light of the globalization that surrounds the business environment, and whose impact is reflected on industrial economic units, the whole world has become a single market whose variables affect all units, each of which contributes to, and is affected by, the economy in proportion to its share. The research problem is that Pareto analysis enables industrial economic units to diagnose the risks surrounding them; the main objective of the research was therefore to classify risks into internal and external types and to identify any risks that require more attention.
The research was based on the hypothesis that, by using Pareto analysis, risks can be identified and addressed before they occur.
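A Pareto analysis of a risk register can be sketched as below. The risk names, impact figures, and the classic 80% cut-off are illustrative assumptions, not data from the research; the idea is simply to rank risks by impact and flag the "vital few" that account for most of the total exposure.

```python
# Hypothetical risk register: risk name -> estimated loss impact (arbitrary units).
risks = {
    "supply-chain disruption": 120.0,   # external
    "currency fluctuation": 90.0,       # external
    "equipment failure": 45.0,          # internal
    "staff turnover": 25.0,             # internal
    "data entry errors": 12.0,          # internal
    "minor compliance gaps": 8.0,       # internal
}

total = sum(risks.values())
cumulative, vital_few = 0.0, []
# Walk the risks from largest to smallest impact until 80% of exposure is covered.
for name, impact in sorted(risks.items(), key=lambda kv: -kv[1]):
    cumulative += impact
    vital_few.append(name)
    if cumulative / total >= 0.8:       # classic 80/20 cut-off
        break

print(vital_few)
```

The risks in `vital_few` are the ones that, under this toy register, would require the most attention.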
Ethnographic research is perhaps the most commonly applied type of qualitative research method in psychology and medicine. In ethnographic studies, the researcher immerses themselves in the participants' environment to understand the cultures, challenges, motivations, and topics that arise among them by investigating that environment directly. This type of research can last from a few days to a few years because it involves in-depth monitoring and data collection on these foundations. For this reason, the findings of the current study encourage researchers in psychology and medicine to conduct studies applying the ethnographic research method to investigate common cultural patterns of language, thinking, beliefs, and behavior.
Each Intensity Modulated Radiation Therapy (IMRT) plan needs to be tested and verified before any treatment to check its quality. The Octavius 4D-1500 phantom detector is a modern, qualified device for the quality assurance procedure. This study aims to compare the common dosimetric criteria of 3%/3 mm and 2%/2 mm for head and neck (H&N) plans delivered with the IMRT technique. Plans for twenty-five patients with H&N tumors were created with a 6 MV x-ray photon beam using the Monaco 5.1 treatment planning software, exported to an Elekta Synergy linear accelerator, and then tested in a pretreatment verification study using the Octavius 4D-1500 phantom detector. The difference between planned and measured dose was assessed using the local and global gamma index (GI) analysis method at both criteria.
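The two acceptance criteria compared above can be illustrated with a simplified one-dimensional global gamma index. This is a textbook sketch, not the Octavius/VeriSoft analysis: the dose profiles are synthetic Gaussians, and the function takes, for each evaluated point, the minimum combined distance-to-agreement and dose-difference penalty over the reference curve.

```python
import numpy as np

def gamma_index_1d(ref_pos, ref_dose, eval_pos, eval_dose, dta_mm, dd_percent):
    """Simplified global 1D gamma index (per-point minimum over the reference curve)."""
    dd = dd_percent / 100.0 * ref_dose.max()      # global dose criterion
    gammas = []
    for x, d in zip(eval_pos, eval_dose):
        dist2 = ((x - ref_pos) / dta_mm) ** 2     # distance-to-agreement term
        dose2 = ((d - ref_dose) / dd) ** 2        # dose-difference term
        gammas.append(np.sqrt(np.min(dist2 + dose2)))
    return np.array(gammas)

# Toy profiles: evaluated (measured) dose shifted 1 mm against the planned reference.
xs = np.linspace(-50, 50, 101)                    # positions in mm
ref = np.exp(-(xs / 20.0) ** 2) * 100.0
ev = np.exp(-((xs - 1.0) / 20.0) ** 2) * 100.0

g33 = gamma_index_1d(xs, ref, xs, ev, dta_mm=3.0, dd_percent=3.0)
g22 = gamma_index_1d(xs, ref, xs, ev, dta_mm=2.0, dd_percent=2.0)
print(f"pass rate 3%/3mm: {np.mean(g33 <= 1.0):.3f}, 2%/2mm: {np.mean(g22 <= 1.0):.3f}")
```

A point passes when its gamma value is at most 1; the tighter 2%/2 mm criterion yields gamma values at least as large as 3%/3 mm, so its pass rate can only be lower or equal.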
Heart diseases are diverse, common, and dangerous diseases that affect the heart's function. They appear as a result of genetic factors or unhealthy practices, and they are the leading cause of mortality in the world. Cardiovascular diseases seriously affect the health and activity of the heart by narrowing the arteries and reducing the amount of blood the heart receives, which leads to high blood pressure and high cholesterol. In addition, healthcare workers and physicians need intelligent technologies that help them analyze patients' data and predict heart disease early, in order to find appropriate treatment, because these diseases can appear without pain or noticeable symptoms.
In this paper, we introduce for the first time a new four-parameter model, called the Gumbel-Pareto distribution, obtained using the T-X method. We derive some of its mathematical properties and study several structural properties of the new distribution. The method of maximum likelihood is used to estimate the model parameters. A numerical illustration and an application to a real data set are given to show the flexibility and potential of the new model.
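The abstract does not restate the construction, but the general T-X family on which such a model is built can be sketched as follows. Assuming $T$ is a random variable with pdf $r(t)$ and cdf $R(t)$ on $[a,b]$, $G(x)$ is the baseline (here Pareto) cdf, and $W(\cdot)$ is a suitable monotone link mapping $[0,1]$ into $[a,b]$, the T-X cdf is

```latex
F(x) \;=\; \int_{a}^{W\!\left(G(x)\right)} r(t)\,dt \;=\; R\!\left(W\!\left(G(x)\right)\right).
```

Taking $T$ to be Gumbel-distributed and the baseline $G$ to be Pareto yields a four-parameter Gumbel-Pareto family; the specific link $W$ used in the paper is not shown here.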
This study seeks to highlight the behavioral approach in organization theory as a modern and effective entry point for constructing that theory, and the extent to which it is reflected in the behavior of both the producer and the user of accounting and financial information.
The study also focuses on the role of the behavioral approach in consolidating accounting concepts by harmonizing them, so that the accountant can influence user behavior through accounting concepts and principles, in an effort to provide the qualitative characteristics of the accounting information he produces in a manner consistent with his own behavior and that of the information user, and its impact on the latter's decision-making process.
OpenStreetMap (OSM), recognised for its current and readily accessible spatial database, frequently serves regions lacking precise data at the necessary granularity. Global collaboration among OSM contributors presents challenges to data quality and uniformity, exacerbated by the sheer volume of input and indistinct data annotation protocols. This study presents a methodological improvement in the spatial accuracy of OSM datasets centred over Baghdad, Iraq, utilising data derived from OSM services and satellite imagery. An analytical focus was placed on two geometric correction methods: a two-dimensional polynomial affine transformation and a two-dimensional polynomial conformal transformation, the former involving twelve coefficients.
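The two correction methods contrasted above differ mainly in how many degrees of freedom they fit. A minimal least-squares sketch, under assumed control points rather than the study's Baghdad data, is shown below: the conformal (similarity) model uses four parameters, while the basic affine model uses six per coordinate pair (the study's polynomial variants add higher-order terms).

```python
import numpy as np

def fit_conformal(src, dst):
    """4-parameter 2D conformal fit: x' = a*x - b*y + c,  y' = b*x + a*y + d."""
    x, y = src[:, 0], src[:, 1]
    A = np.zeros((2 * len(src), 4))
    A[0::2] = np.column_stack([x, -y, np.ones_like(x), np.zeros_like(x)])
    A[1::2] = np.column_stack([y, x, np.zeros_like(x), np.ones_like(x)])
    params, *_ = np.linalg.lstsq(A, dst.reshape(-1), rcond=None)
    return params                                  # a, b, c, d

def fit_affine(src, dst):
    """6-parameter 2D affine fit: each output coordinate is a linear map of (x, y)."""
    A = np.column_stack([src, np.ones(len(src))])
    coeffs, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return coeffs                                  # 3x2 coefficient matrix

# Toy control points: dst is src rotated 30 degrees, scaled by 1.5, and shifted.
theta, s = np.deg2rad(30), 1.5
R = s * np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
src = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0], [5.0, 3.0]])
dst = src @ R.T + np.array([100.0, 200.0])

a, b, c, d = fit_conformal(src, dst)
print(round(float(np.hypot(a, b)), 3))             # recovered scale factor
```

Because a similarity transform is a special case of an affine one, both models fit these synthetic points exactly; with real OSM-to-imagery control points the affine model's extra shear/anisotropy freedom is what distinguishes the residuals.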
Objective(s): This study was conducted to examine the importance and effect of various variables which might have an influence on the occurrence of hydrocephaly.
Methodology: A retrospective design was performed and continued for 4 months. It included 89 non-randomized consecutive samples collected from the Early Detection of Childhood Disabilities Center (E.D.C.D.C.), Duhok. The population involved was all cases of both sexes that attended the centre during the period from 1st Jan. 1998 to 30th Dec. 2008 with a final diagnosis of hydrocephaly. Patients' records from the centre were used to collect data.
Results: Hydrocephaly has been recognized as a public health problem in Duhok province, Iraqi Kurdistan region.
Tremendous efforts have been exerted to understand first language acquisition in order to facilitate second language learning. The problem lies in the difficulty of mastering the English language and adopting a theory that helps in overcoming the difficulties facing students. This study aims to apply Tomasello's usage-based theory of language mastery. It assumes that adults can learn faster than children and can learn the language separately, far from academic education. Tomasello (2003) studied the stages of language acquisition in children and developed his theory accordingly. Some studies (Ghalebi and Sadighi, 2015; Arvidsson, 2019; Munoz, 2019; Verspoor and Hong, 2013) have used this theory when examining language acquisition.
This research presents a study, with an application, of principal component regression computed from some of the explanatory variables in order to mitigate the multicollinearity problem among those variables and to obtain more stable estimates than those yielded by ordinary least squares, at the cost of losing a little of the predictive regression function's power in explaining the essential variation. A numerical formula has been proposed and applied by the researchers as an optimal solution, and its efficiency was verified by a program written by the researchers themselves for this purpose, using several criteria: Cumulative Percentage Variance, Coefficient of Determination, Variance
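The idea behind principal component regression can be sketched as follows. This is a generic illustration, not the researchers' program: the data are synthetic, with one predictor deliberately made a near-copy of another to create multicollinearity, and the number of retained components is chosen by the cumulative-percentage-of-variance criterion mentioned above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic collinear predictors: x2 is nearly a copy of x1.
n = 100
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])
y = 2.0 * x1 + 0.5 * x3 + rng.normal(scale=0.1, size=n)

# Principal component regression: project onto the leading components, regress there.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = S**2 / np.sum(S**2)                  # cumulative percentage variance criterion
k = int(np.searchsorted(np.cumsum(explained), 0.95)) + 1
Z = Xc @ Vt[:k].T                                # scores on the retained components
beta_z, *_ = np.linalg.lstsq(Z, y - y.mean(), rcond=None)
beta = Vt[:k].T @ beta_z                         # back-transform to original predictors
print(k, np.round(beta, 2))
```

Dropping the near-degenerate component stabilizes the coefficients: the effect of the collinear pair is shared between them instead of exploding in opposite directions as it would under ordinary least squares.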