Employee churn, or turnover, is a serious problem for organizations because of its negative impact, and it needs to be addressed. Since manual detection of employee churn is quite difficult, machine learning (ML) algorithms have frequently been used both for churn detection and for categorizing employees according to turnover. To date, only one study has looked into employee categorization using machine learning. A novel multi-criteria decision-making (MCDM) approach coupled with the De Pareto principle, referred to as the SNEC scheme, is proposed for categorizing employees. An AHP-TOPSIS De Pareto model (AHPTOPDE) is designed that uses a two-stage MCDM scheme. In the first stage, the analytic hierarchy process (AHP) assigns relative weights to the employee accomplishment factors. In the second stage, TOPSIS expresses the significance of each employee and performs the categorization. The simple 20-30-50 rule of the De Pareto principle then splits employees into three major groups: enthusiastic, behavioral, and distressed. The random forest algorithm is applied as the baseline algorithm in the proposed employee churn framework to predict class-wise churn; it is tested on a standard HRIS dataset, and the results are compared with other ML methods. Within the SNEC scheme, random forest achieves similar or slightly better overall accuracy and MCC than the ECPR scheme using the CatBoost algorithm, with significantly lower time complexity.
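As a rough illustration of the second MCDM stage, the sketch below scores employees with TOPSIS under pre-computed AHP weights and then applies the 20-30-50 split. The factor matrix, weight values, and group boundaries here are invented placeholders, not values taken from the study.

```python
import numpy as np

def topsis_scores(X, weights):
    """Rank alternatives (employees) with TOPSIS.
    X: (n_employees, n_factors) decision matrix, higher = better.
    weights: AHP-derived relative weights, summing to 1."""
    # Vector-normalize each criterion column, then apply the AHP weights.
    V = weights * X / np.linalg.norm(X, axis=0)
    ideal, anti_ideal = V.max(axis=0), V.min(axis=0)   # benefit criteria assumed
    d_pos = np.linalg.norm(V - ideal, axis=1)          # distance to ideal point
    d_neg = np.linalg.norm(V - anti_ideal, axis=1)     # distance to anti-ideal
    return d_neg / (d_pos + d_neg)                     # closeness coefficient in [0, 1]

# Illustrative data: 10 employees scored on 3 accomplishment factors
rng = np.random.default_rng(0)
X = rng.uniform(1, 10, size=(10, 3))
w = np.array([0.5, 0.3, 0.2])       # stand-in AHP weights (assumed)

scores = topsis_scores(X, w)
order = np.argsort(scores)[::-1]    # best first

# 20-30-50 rule of the De Pareto principle: top 20% enthusiastic,
# next 30% behavioral, remaining 50% distressed.
n = len(scores)
labels = np.empty(n, dtype=object)
labels[order[: int(0.2 * n)]] = "enthusiastic"
labels[order[int(0.2 * n): int(0.5 * n)]] = "behavioral"
labels[order[int(0.5 * n):]] = "distressed"
print(list(zip(np.round(scores, 3), labels)))
```

The class labels produced this way would then serve as the target groups for the class-wise churn classifier.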
Audio Compression Using Transform Coding with LZW and Double Shift Coding. Zainab J. Ahmed and Loay E. George. In: New Trends in Information and Communications Technology Applications, Communications in Computer and Information Science, vol. 1511, first online 11 January 2022. Abstract: The need for audio compression is still a vital issue because of its significance in reducing the size of one of the most common digital media exchanged between distant parties. In this paper, the efficiencies of two audio compression modules were investigated: the first module is based on the discrete cosine transform and the second module is based on the discrete wavelet transform …
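To illustrate the transform-coding idea behind the first module, here is a minimal sketch of frame-wise DCT coding with coefficient truncation and uniform quantization; the LZW and double shift coding stages of the paper are omitted, and the frame length, retention fraction, and quantization step are assumed values.

```python
import numpy as np
from scipy.fft import dct, idct

def dct_compress_frame(frame, keep=0.25, q_step=0.02):
    """Transform-code one audio frame: DCT, discard high-frequency
    coefficients, uniformly quantize the rest."""
    c = dct(frame, norm='ortho')
    k = max(1, int(keep * len(c)))      # retain the first `keep` fraction
    q = np.round(c[:k] / q_step).astype(np.int32)
    return q, len(frame)

def dct_decompress_frame(q, n, q_step=0.02):
    c = np.zeros(n)
    c[:len(q)] = q * q_step             # dequantize, zero-fill the rest
    return idct(c, norm='ortho')

# Illustrative signal: a 1 kHz tone sampled at 16 kHz, framed at 512 samples
fs, n = 16000, 512
t = np.arange(n) / fs
frame = np.sin(2 * np.pi * 1000 * t)

q, length = dct_compress_frame(frame)
rec = dct_decompress_frame(q, length)
print(f"kept {len(q)}/{length} coefficients, "
      f"MSE = {np.mean((frame - rec) ** 2):.2e}")
```

In a full codec the quantized integers would then be fed to the lossless back end (here, LZW and double shift coding) to produce the compressed bitstream.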
Text-based image clustering (TBIC) is an insufficient approach for clustering related web images. It is a challenging task to abstract the visual features of images with the support of textual information in a database. In content-based image clustering (CBIC), image data are clustered on the basis of specific features such as texture, colors, boundaries, and shapes. In this paper, an effective CBIC technique is presented that uses texture and statistical features of the images. The statistical features, or color moments (mean, skewness, standard deviation, kurtosis, and variance), are extracted from the images. These features are collected in a one-dimensional array, and a genetic algorithm (GA) is then applied for image clustering.
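A minimal sketch of the color-moment feature extraction described above, assuming an RGB image and SciPy's sample statistics; the GA clustering stage itself is not shown.

```python
import numpy as np
from scipy.stats import skew, kurtosis

def color_moment_vector(img):
    """Build a 1-D feature array of the five statistical color moments
    (mean, standard deviation, skewness, kurtosis, variance) per channel.
    img: H x W x 3 array (e.g. RGB)."""
    feats = []
    for ch in range(img.shape[2]):
        x = img[..., ch].ravel().astype(np.float64)
        feats += [x.mean(), x.std(), skew(x), kurtosis(x), x.var()]
    return np.array(feats)   # length 15 for an RGB image

# Illustrative use on a random "image"
rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(64, 64, 3))
print(color_moment_vector(img).round(3))
```

Each image is thereby reduced to one short feature vector, and the GA searches over cluster assignments of these vectors.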
Due to the large population of motorway users in Iraq, various approaches have been adopted to manage queues, such as the implementation of traffic lights and the avoidance of illegal parking. However, defaulters are recorded daily, hence the need to develop a means of identifying them and bringing them to book. This article discusses the development of an approach for recognizing Iraqi licence plates so that defaulters of queue management systems can be identified. Multiple agencies worldwide have quickly and widely adopted vehicle license plate recognition technology to expand their investigative and security capabilities. License plate recognition helps detect a vehicle's information automatically …
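The abstract does not detail the recognition pipeline, so the following is only a generic sketch of one common approach: contour-based plate localization with OpenCV followed by Tesseract OCR. The file name, filter parameters, and OCR configuration are all assumptions, not the paper's method.

```python
import cv2
import pytesseract

def read_plate(path):
    """Locate a rectangular plate candidate by contour analysis,
    then pass the cropped region to Tesseract OCR."""
    img = cv2.imread(path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    gray = cv2.bilateralFilter(gray, 11, 17, 17)   # smooth while keeping edges
    edges = cv2.Canny(gray, 30, 200)

    contours, _ = cv2.findContours(edges, cv2.RETR_TREE,
                                   cv2.CHAIN_APPROX_SIMPLE)
    contours = sorted(contours, key=cv2.contourArea, reverse=True)[:10]
    for c in contours:
        approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
        if len(approx) == 4:                       # quadrilateral: plate candidate
            x, y, w, h = cv2.boundingRect(approx)
            plate = gray[y:y + h, x:x + w]
            # --psm 7: treat the crop as a single line of text
            return pytesseract.image_to_string(plate, config='--psm 7').strip()
    return None

print(read_plate('car.jpg'))   # 'car.jpg' is a placeholder path
```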
Regression analysis is a cornerstone of statistics and mostly depends on the ordinary least squares (OLS) method. As is well known, this method requires several conditions to operate accurately, and its results can be unreliable when they are violated; moreover, the absence of certain conditions makes it impossible to complete the analysis. Among those conditions is the multicollinearity problem, which we detect among the independent variables using the Farrar-Glauber test. In addition to the linearity requirement of the data, the failure of the last condition led to resorting to the …
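A minimal sketch of the Farrar-Glauber chi-square test for the presence of multicollinearity, as mentioned above; the synthetic regressors are illustrative.

```python
import numpy as np
from scipy.stats import chi2

def farrar_glauber_chi2(X):
    """Farrar-Glauber chi-square test for multicollinearity among the
    columns (regressors) of X.  H0: the regressors are orthogonal (|R| = 1).
    Statistic: -[n - 1 - (2k + 5)/6] * ln|R|, with k(k-1)/2 df."""
    n, k = X.shape
    R = np.corrcoef(X, rowvar=False)          # correlation matrix of regressors
    stat = -(n - 1 - (2 * k + 5) / 6) * np.log(np.linalg.det(R))
    df = k * (k - 1) / 2
    return stat, chi2.sf(stat, df)            # statistic and p-value

# Illustrative data with two nearly collinear regressors
rng = np.random.default_rng(2)
x1 = rng.normal(size=200)
X = np.column_stack([x1, x1 + 0.05 * rng.normal(size=200),
                     rng.normal(size=200)])
stat, p = farrar_glauber_chi2(X)
print(f"chi2 = {stat:.1f}, p = {p:.3g}")      # tiny p => multicollinearity present
```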
Pure SnSe thin films and films doped with S at different percentages (0, 3, 5, 7)% were deposited from an alloy by the thermal evaporation technique on glass substrates at room temperature, with a thickness of 400±20 nm. The influence of the S dopant ratio on the characteristics of the nanocrystalline SnSe thin films was investigated using atomic force microscopy (AFM), X-ray diffraction (XRD), energy-dispersive spectroscopy (EDS), Hall effect measurements, and UV-Vis absorption spectroscopy to study the morphological, structural, electrical, and optical properties, respectively. The XRD showed that all the films are polycrystalline in nature with an orthorhombic structure and a preferred orientation along the (111) plane. The films were composed of very fine crystallites in the range …
Removal of solar brown and direct black dyes by coagulation with two aluminum-based coagulants was conducted. The main objective is to examine the efficiency of these coagulants in the treatment of dye-polluted water discharged from the Al-Kadhymia Textile Company (Baghdad, Iraq). The performance of these coagulants was investigated through jar tests by comparing dye percent removal at different wastewater pH values, coagulant doses, and initial dye concentrations. Results show that alum works better than PAC under acidic media (pH 5-6), while PAC works better under basic media (pH 7-8), in the removal of both solar brown and direct black dyes. Higher doses of PAC were required to achieve the maximum removal efficiency under optimum pH conditions …
In this paper, the transfer function model in time series was estimated using different methods: a parametric method, represented by the conditional likelihood function, as well as two nonparametric methods, local linear regression and the cubic smoothing spline. This research aims to compare these estimators for the nonlinear transfer function model using simulation, studying two models for the output variable and one model for the input variable, in addition to …
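As a small illustration of one of the nonparametric estimators named above, the following fits a cubic smoothing spline with SciPy on synthetic data; the transfer-function simulation design of the paper is not reproduced, and the smoothing factor is an assumed value.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Synthetic nonlinear input-output relation with noise
rng = np.random.default_rng(3)
x = np.linspace(0, 4 * np.pi, 200)
y = np.sin(x) * np.exp(-0.1 * x) + 0.1 * rng.normal(size=x.size)

# k=3 gives a cubic spline; `s` trades smoothness against fidelity to the data
spline = UnivariateSpline(x, y, k=3, s=len(x) * 0.01)
y_hat = spline(x)
print("residual SSE:", float(np.sum((y - y_hat) ** 2)))
```

Local linear regression would play the same role, replacing the global spline fit with a kernel-weighted least-squares fit around each evaluation point.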
The denoising of a natural image corrupted by Gaussian noise is a classical problem in signal and image processing. Much work has been done in the field of wavelet thresholding, but most of it has focused on statistical modeling of wavelet coefficients and the optimal choice of thresholds. This paper describes a new method for the suppression of noise in images that fuses the stationary wavelet denoising technique with an adaptive Wiener filter. The Wiener filter is applied to the image reconstructed from the approximation coefficients only, while the thresholding technique is applied to the detail coefficients of the transform; the final denoised image is then obtained by combining the two results. The proposed method was applied using …
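A minimal sketch of the fusion idea, assuming PyWavelets for the stationary wavelet transform: the approximation-only reconstruction is Wiener-filtered, the soft-thresholded details are reconstructed separately, and the two partial images are summed. The wavelet, decomposition level, and universal threshold are illustrative choices, not necessarily those of the paper.

```python
import numpy as np
import pywt
from scipy.signal import wiener

def swt_wiener_denoise(img, wavelet='db4', level=2):
    """Fuse stationary-wavelet thresholding with an adaptive Wiener filter."""
    coeffs = pywt.swt2(img, wavelet, level=level)
    # Robust noise estimate from the finest diagonal detail band (last entry)
    sigma = np.median(np.abs(coeffs[-1][1][2])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(img.size))     # universal threshold

    # Branch 1: approximation only -> reconstruct -> adaptive Wiener filter
    approx = [(cA, tuple(np.zeros_like(d) for d in det))
              for cA, det in coeffs]
    base = wiener(pywt.iswt2(approx, wavelet), mysize=5)

    # Branch 2: soft-thresholded details only -> reconstruct
    detail_only = [(np.zeros_like(cA),
                    tuple(pywt.threshold(d, thr, mode='soft') for d in det))
                   for cA, det in coeffs]
    detail = pywt.iswt2(detail_only, wavelet)

    return base + detail     # combine the two partial reconstructions

# Illustrative use; image side must be divisible by 2**level for the SWT
rng = np.random.default_rng(4)
clean = np.kron(rng.uniform(size=(16, 16)), np.ones((8, 8)))  # 128 x 128
noisy = clean + 0.1 * rng.normal(size=clean.shape)
denoised = swt_wiener_denoise(noisy)
print("noisy MSE   :", float(np.mean((noisy - clean) ** 2)))
print("denoised MSE:", float(np.mean((denoised - clean) ** 2)))
```

Summing the two branches is valid because the inverse SWT is linear in its coefficients, so the split reconstruction equals reconstructing the full, modified coefficient set.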
Image quality plays a vital role in improving and assessing image compression performance. Image compression maps large image data to a new image with a smaller size suitable for storage and transmission. This paper aims to evaluate the implementation of hybrid techniques based on the tensor product mixed transform. Compression and quality metrics such as compression ratio (CR), rate distortion (RD), peak signal-to-noise ratio (PSNR), and structural content (SC) are utilized to evaluate the hybrid techniques. A comparison between the techniques is then made according to these metrics to determine the best one. The main contribution is to improve the hybrid techniques, which consist of discrete wavelet …
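For reference, minimal implementations of three of the metrics named above (PSNR, CR, and SC) under their standard definitions; the sample data are placeholders.

```python
import numpy as np

def psnr(original, compressed, peak=255.0):
    """Peak signal-to-noise ratio in dB between two images."""
    mse = np.mean((original.astype(np.float64)
                   - compressed.astype(np.float64)) ** 2)
    return np.inf if mse == 0 else 10 * np.log10(peak ** 2 / mse)

def compression_ratio(original_bytes, compressed_bytes):
    """CR = uncompressed size / compressed size."""
    return original_bytes / compressed_bytes

def structural_content(original, compressed):
    """SC = sum of squared original pixels / sum of squared compressed pixels."""
    o = original.astype(np.float64)
    c = compressed.astype(np.float64)
    return np.sum(o ** 2) / np.sum(c ** 2)

# Illustrative use
rng = np.random.default_rng(5)
img = rng.integers(0, 256, size=(64, 64)).astype(np.uint8)
degraded = np.clip(img + rng.normal(0, 5, img.shape), 0, 255).astype(np.uint8)
print(f"PSNR = {psnr(img, degraded):.2f} dB")
print(f"SC   = {structural_content(img, degraded):.4f}")
print(f"CR   = {compression_ratio(4096, 1024):.1f}:1")
```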