Information and communication technology has a significant influence on employee procedures. Businesses are investing in e-CRM technologies, yet it is difficult to assess the performance of their e-CRM platforms. The DeLone and McLean Information Systems Success framework can be adapted to current e-CRM assessment challenges, and the dimensions of the updated framework provide a concise structure for organizing the e-CRM key metrics identified in this study. The purpose of this study is to apply and verify that the Updated DeLone and McLean IS Model can be employed to explain e-CRM adoption among employees, along with the extended Updated DeLone and McLean Model with its five factors, namely system quality, service quality, information quality, ease of use, and employee satisfaction. For this study, data were collected from 300 employees working with e-CRM and analyzed using PLS-SEM. The estimated framework shows a significant effect, and most of the hypotheses of the study are supported. Moreover, the framework contributes to the area of e-CRM success and individual performance.
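A rough way to probe such a construct model on survey data is sketched below. This is not the PLS-SEM estimation used in the study, only a simplified proxy: each construct is scored as the mean of its indicator items and employee satisfaction is regressed on the four quality constructs by ordinary least squares. The file name and indicator column names are illustrative assumptions, not taken from the study.

```python
import numpy as np
import pandas as pd

# Hypothetical survey file with Likert-scale items; names are assumptions.
df = pd.read_csv("ecrm_survey.csv")

constructs = {
    "system_quality":      ["SQ1", "SQ2", "SQ3"],
    "service_quality":     ["SVQ1", "SVQ2", "SVQ3"],
    "information_quality": ["IQ1", "IQ2", "IQ3"],
    "ease_of_use":         ["EOU1", "EOU2", "EOU3"],
    "satisfaction":        ["SAT1", "SAT2", "SAT3"],
}

# Unit-weighted composite scores (a simplification of PLS outer estimation).
scores = pd.DataFrame({name: df[items].mean(axis=1)
                       for name, items in constructs.items()})

# Inner-model proxy: OLS regression of satisfaction on the four quality constructs.
predictors = ["system_quality", "service_quality", "information_quality", "ease_of_use"]
X = np.column_stack([np.ones(len(scores)), scores[predictors].to_numpy()])
y = scores["satisfaction"].to_numpy()
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["intercept"] + predictors, beta.round(3))))
```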
In data transmission, a change in a single bit of the received data may lead to misunderstanding or even a disaster. Every bit of the transmitted information is important, especially fields such as the receiver's address. Detecting every single-bit change is therefore a key issue in the data transmission field.
The ordinary single-parity detection method can detect an odd number of errors efficiently but fails when the number of errors is even. Other detection methods, such as two-dimensional parity and checksum, show better results but still fail to cope with an increasing number of errors.
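The limitation of single parity is easy to reproduce: flipping one bit (or any odd number of bits) changes the parity, while flipping two bits cancels out. The sketch below is a minimal illustration of even parity over a data word, not one of the methods proposed in the paper.

```python
def even_parity_bit(bits):
    """Return the parity bit that makes the total number of 1s even."""
    return sum(bits) % 2

def check(bits, parity):
    """True if the received word passes the even-parity check."""
    return (sum(bits) + parity) % 2 == 0

data = [1, 0, 1, 1, 0, 0, 1, 0]
p = even_parity_bit(data)

one_error = data.copy();  one_error[3] ^= 1                          # one bit flipped
two_errors = data.copy(); two_errors[3] ^= 1; two_errors[5] ^= 1     # two bits flipped

print(check(one_error, p))   # False -> an odd number of errors is detected
print(check(two_errors, p))  # True  -> an even number of errors slips through
```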
Two novel methods were suggested to detect binary bit-change errors when transmitting data over a noisy medium. Those methods were: 2D-Checksum me
Most medical datasets suffer from missing data, due to the expense of some tests or human error while recording them. This issue affects the performance of machine learning models because the values of some features will be missing. Therefore, there is a need for dedicated methods for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used for generating and imputing the missing values in the Pima Indian diabetes disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, which are support vector machine (SVM), K-nearest neighbour (KNN), and Naïve B
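A minimal version of the imputation idea is sketched below: the missing cells of a feature matrix are treated as the search dimensions of a standard salp swarm optimizer, and a user-supplied objective (for example, the cross-validated error of a classifier trained on the imputed data) drives the search. This is a generic SSA sketch under the standard leader/follower update rules, not the ISSA variant proposed in the paper; the bounds and objective are assumptions.

```python
import numpy as np

def ssa_impute(X, objective, n_salps=20, iters=100, lb=0.0, ub=1.0, seed=0):
    """Impute the NaN cells of X by minimizing objective(X_filled) with a salp swarm."""
    rng = np.random.default_rng(seed)
    mask = np.isnan(X)                       # positions of the missing cells
    dim = int(mask.sum())

    def fill(candidate):
        Xf = X.copy()
        Xf[mask] = candidate
        return Xf

    salps = rng.uniform(lb, ub, size=(n_salps, dim))
    fitness = np.array([objective(fill(s)) for s in salps])
    best, best_fit = salps[fitness.argmin()].copy(), fitness.min()

    for t in range(iters):
        c1 = 2 * np.exp(-(4 * (t + 1) / iters) ** 2)   # exploration -> exploitation schedule
        for i in range(n_salps):
            if i == 0:                                  # leader moves around the best solution
                c2, c3 = rng.random(dim), rng.random(dim)
                step = c1 * ((ub - lb) * c2 + lb)
                salps[i] = np.where(c3 >= 0.5, best + step, best - step)
            else:                                       # followers average with the previous salp
                salps[i] = (salps[i] + salps[i - 1]) / 2
            salps[i] = np.clip(salps[i], lb, ub)
            f = objective(fill(salps[i]))
            if f < best_fit:
                best_fit, best = f, salps[i].copy()
    return fill(best)
```

In practice, `objective` could be the 5-fold cross-validated error of a KNN or SVM classifier trained on the imputed matrix, which mirrors how the imputed dataset is evaluated in the abstract.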
Electrocoagulation is an electrochemical method for treating different types of wastewater, whereby sacrificial anodes corrode to release an active coagulant (usually aluminium or iron cations) into solution, while simultaneous evolution of hydrogen at the cathode allows pollutant removal by flotation or settling. The Taguchi method was applied as an experimental design to determine the best conditions for chromium (VI) removal from wastewater. Various parameters in a batch stirred tank with iron electrodes, namely pH, initial chromium concentration, current density, distance between electrodes, and KCl concentration, were investigated, and the results were analyzed using the signal-to-noise (S/N) ratio. It was found that the r
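For a removal-efficiency response, the Taguchi analysis typically uses the larger-is-better signal-to-noise ratio, S/N = -10 log10((1/n) Σ 1/y_i²), compared across factor levels. The snippet below shows that calculation for hypothetical replicate data; the run labels and removal percentages are illustrative, not the study's measurements.

```python
import numpy as np

def sn_larger_is_better(y):
    """Taguchi larger-is-better S/N ratio for replicate responses y (e.g. removal %)."""
    y = np.asarray(y, dtype=float)
    return -10 * np.log10(np.mean(1.0 / y**2))

# Hypothetical Cr(VI) removal efficiencies (%) for three replicates of two trial runs.
runs = {
    "run 1 (pH 4, higher current density)": [92.1, 90.4, 93.0],
    "run 2 (pH 7, higher current density)": [78.5, 80.2, 79.1],
}
for name, y in runs.items():
    print(f"{name}: S/N = {sn_larger_is_better(y):.2f} dB")
```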
The current study uses the flame fragment deposition (FFD) method to synthesize carbon nanotubes (CNTs) from Iraqi liquefied petroleum gas (LPG), which is used as the carbon source. A homemade reactor was used to carry out the synthesis steps. To eliminate amorphous impurities, the CNTs were sonicated in a 30 percent hydrogen peroxide (H2O2) solution at ambient temperature. To remove the polycyclic aromatic hydrocarbons (PAHs) generated during LPG combustion, sonication in an acetone bath was used. The produced products were investigated and compared with standard multi-walled carbon nanotubes, MWCNTs (95%, Sigma-Aldrich), using X-ray diffraction (XRD), thermogravimetric analysis (TGA), Raman spectroscopy, scanning el
In this paper, the goal of the proposed method is to protect data against different types of attacks by unauthorized parties. The basic idea is to generate a private key from specific features of a digital color image, namely its color channels (red, green, and blue). The key is generated by computing the frequency of each blue-channel value in the image, finding the maximum frequency, multiplying it by its corresponding value, and applying an addition step to produce the generated key. After the private key is generated, it must be converted into binary form. The generated key is extracted from the blue color of the keyed image, and then we select a c
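A minimal sketch of that key-derivation idea, as far as it can be read from the abstract, is given below: take the blue channel, build its histogram, multiply the most frequent value by its count, and fold the result into a fixed-length binary string. The exact "adding process" is not fully specified in the abstract, so the additive term and the file name used here are assumptions.

```python
import numpy as np
from PIL import Image

def derive_key_bits(image_path, key_len=128):
    """Derive a binary key string from the blue channel of an RGB image (illustrative sketch)."""
    blue = np.asarray(Image.open(image_path).convert("RGB"))[:, :, 2]
    counts = np.bincount(blue.ravel(), minlength=256)   # frequency of each blue value 0..255
    value = int(counts.argmax())                        # most frequent blue value
    freq = int(counts[value])                           # its frequency
    combined = value * freq + counts.sum()              # multiply, then an additive step (assumed)
    bits = bin(combined)[2:]                            # binary representation of the key
    return bits.zfill(key_len)[-key_len:]

print(derive_key_bits("keyed_image.png"))               # placeholder image path
```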
This paper focuses on developing a self-starting numerical approach that can be used for the direct integration of higher-order initial value problems of ordinary differential equations. The method is derived from a power series approximation, with the resulting equations discretized at selected grid and off-grid points. The method is applied in a block-by-block fashion as a numerical integrator of higher-order initial value problems. The basic properties of the block method are investigated to authenticate its performance, and it is then implemented on some test experiments to validate the accuracy and convergence of the method.
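To make the block idea concrete, the sketch below uses the classical one-step block pair obtained by collocating a power-series approximant at the grid point x_n, the off-grid point x_{n+1/2}, and x_{n+1} for a first-order test problem y' = f(x, y). The paper itself treats higher-order problems directly, so this is only an illustration of the mechanism: both block values are produced simultaneously at each step, which is what makes the scheme self-starting.

```python
import numpy as np

def block_step(f, x, y, h, sweeps=5):
    """One block step returning (y_{n+1/2}, y_{n+1}) from the implicit block pair
       y_{n+1/2} = y_n + h(5 f_n + 8 f_{n+1/2} - f_{n+1})/24
       y_{n+1}   = y_n + h(  f_n + 4 f_{n+1/2} + f_{n+1})/6
       solved here by simple fixed-point iteration."""
    fn = f(x, y)
    y_half, y_one = y + 0.5 * h * fn, y + h * fn          # Euler predictor starting values
    for _ in range(sweeps):
        f_half, f_one = f(x + h / 2, y_half), f(x + h, y_one)
        y_half = y + h * (5 * fn + 8 * f_half - f_one) / 24
        y_one = y + h * (fn + 4 * f_half + f_one) / 6
    return y_half, y_one

# Test problem y' = -y, y(0) = 1, exact solution exp(-x).
f = lambda x, y: -y
x, y, h = 0.0, 1.0, 0.1
for _ in range(10):                                       # integrate block by block to x = 1
    _, y = block_step(f, x, y, h)
    x += h
print(y, np.exp(-1.0))                                    # numerical vs exact value at x = 1
```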
The present work aims to study the effect of using an automatic thresholding technique to convert the edge features of images to binary images in order to separate the object from its background. The edge features of the sampled images are obtained from first-order edge detection operators (Roberts, Prewitt and Sobel) and second-order edge detection operators (Laplacian operators). The optimum automatic threshold is calculated using the fast Otsu method. The study is applied to a personal image (Roben) and a satellite image to examine the compatibility of this procedure with two different kinds of images. The obtained results are discussed.
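A compact way to reproduce this kind of pipeline with OpenCV is sketched below: compute an edge-strength map with one of the operators (Sobel here), rescale it to 8-bit, and let Otsu's method pick the threshold automatically. The file names are placeholders, and the Sobel/Otsu pairing stands in for the full set of operators studied in the paper.

```python
import cv2
import numpy as np

img = cv2.imread("input.png", cv2.IMREAD_GRAYSCALE)        # placeholder file name

# First-order edges: Sobel gradient magnitude (Roberts/Prewitt could be applied via filter2D).
gx = cv2.Sobel(img, cv2.CV_64F, 1, 0, ksize=3)
gy = cv2.Sobel(img, cv2.CV_64F, 0, 1, ksize=3)
edges = cv2.magnitude(gx, gy)

# Rescale to 8-bit so Otsu thresholding can be applied.
edges8 = cv2.normalize(edges, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

# Otsu automatically selects the threshold that separates edge pixels from background.
thresh, binary = cv2.threshold(edges8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
print("Otsu threshold:", thresh)
cv2.imwrite("edges_binary.png", binary)
```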
The linear segment with parabolic blend (LSPB) trajectory deviates from the specified waypoints and requires that the acceleration be sufficiently high. In this work, it is proposed to combine a modified LSPB trajectory with particle swarm optimization (PSO) so as to create through points on the trajectory. The assumption of the normal LSPB method that the parabolic part is centered in time around each waypoint is replaced by proposed coefficients for calculating the time duration of the linear part. These coefficients are functions of the velocities between through points. The velocities are obtained by PSO so as to force the LSPB trajectory to pass exactly through the specified path points. Also, relations for velocity correction and exact v
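For reference, a single standard LSPB segment (acceleration blend, constant-velocity middle, deceleration blend) can be generated as below. This is the conventional formulation whose waypoint deviation motivates the paper, not the modified, PSO-tuned version proposed in it.

```python
import numpy as np

def lspb(q0, qf, tf, tb, t):
    """Position on a standard LSPB segment from q0 to qf over duration tf with blend time tb."""
    assert 0 < tb <= tf / 2, "blend time must satisfy 0 < tb <= tf/2"
    v = (qf - q0) / (tf - tb)             # cruise velocity of the linear segment
    a = v / tb                            # blend acceleration
    t = np.asarray(t, dtype=float)
    return np.where(
        t < tb, q0 + 0.5 * a * t**2,                                   # acceleration blend
        np.where(t <= tf - tb, q0 + v * (t - tb / 2),                  # linear segment
                 qf - 0.5 * a * (tf - t)**2))                          # deceleration blend

t = np.linspace(0.0, 2.0, 9)
print(lspb(q0=0.0, qf=1.0, tf=2.0, tb=0.5, t=t))
```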