Governmental establishments maintain historical data on job applicants for future analysis: prediction, improvement of benefits and profits, and development of organizations and institutions. In e-government, decisions about job seekers can be made by mining their information, which leads to beneficial insights. This paper proposes the development and implementation of a system that predicts the job most appropriate to an applicant's skills using web content classification algorithms (LogitBoost, J48, PART, Hoeffding Tree, Naive Bayes). Furthermore, the results of these classification algorithms are compared on a dataset called the "job classification" dataset. Experimental results indicate that J48 achieved the highest precision (94.80%) among the compared algorithms on this dataset.
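For illustration, a minimal sketch of such a precision comparison in Python, using scikit-learn stand-ins (DecisionTreeClassifier for Weka's J48, GaussianNB for Naive Bayes); the file name and column names are hypothetical, and the paper's own Weka setup is not shown here:

```python
# Hedged sketch: compare classifiers by cross-validated precision.
# "job_classification.csv" and the "job_class" column are hypothetical.
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

df = pd.read_csv("job_classification.csv")           # hypothetical dataset file
X, y = df.drop(columns=["job_class"]), df["job_class"]

models = {
    "J48-like tree": DecisionTreeClassifier(criterion="entropy"),  # C4.5 stand-in
    "Naive Bayes": GaussianNB(),
}
for name, model in models.items():
    p = cross_val_score(model, X, y, cv=10, scoring="precision_weighted")
    print(f"{name}: mean precision = {p.mean():.2%}")
```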
The objective of the present work was to estimate the water requirements and water use efficiency of broccoli under normal irrigation and sewage irrigation. A field experiment was carried out during the 2018 season at the Sulaimani agricultural station, Bakrajo, College of Agricultural Sciences. The experiment included three treatments: irrigation with river water throughout the growing season (I1), irrigation with sewage water throughout the growing season (I2), and alternate irrigation (one river-water irrigation followed by two sewage-water irrigations) throughout the growing season (I3). The experimental design was a Randomized Complete Block Design (RCBD).
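The abstract does not state the formula used; a standard field definition of water use efficiency, assumed here for orientation:

```latex
% Assumed standard definition; the paper's exact formula is not shown above.
\[
\mathrm{WUE} = \frac{Y}{W}
\qquad \left[\frac{\text{kg marketable yield}}{\text{m}^{3}\ \text{water}}\right]
\]
% Y : broccoli yield per unit area (kg ha^{-1})
% W : seasonal water applied (or crop evapotranspiration), m^3 ha^{-1}
```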
The aim of this paper is to derive a posteriori error estimates for semilinear parabolic interface problems. More specifically, an optimal-order a posteriori error analysis in the - norm for semidiscrete semilinear parabolic interface problems is derived using the elliptic reconstruction technique introduced by Makridakis and Nochetto (2003). The key idea of this technique is to use error estimators derived for elliptic interface problems to obtain parabolic estimators that are of optimal order in space and time.
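A brief sketch of the elliptic reconstruction device referenced above may help; the notation here is generic and not necessarily the paper's:

```latex
% Sketch of elliptic reconstruction (Makridakis & Nochetto, 2003); generic notation.
Let $u_h(t)$ be the semidiscrete Galerkin solution and $A_h$ the discrete
elliptic operator. The elliptic reconstruction $w(t)$ solves the continuous
elliptic problem
\[
  a\bigl(w(t), v\bigr) = \bigl(A_h u_h(t), v\bigr) \qquad \forall\, v \in H^1_0(\Omega),
\]
so that $u_h(t)$ is exactly the Galerkin approximation of $w(t)$. Splitting
\[
  u - u_h = \underbrace{(u - w)}_{\text{parabolic error}}
          + \underbrace{(w - u_h)}_{\text{elliptic error}},
\]
the second term is controlled at each time by any optimal-order a posteriori
estimator for the elliptic (interface) problem, while the first satisfies a
parabolic equation with a computable residual.
```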
In this paper we investigate the use of two local search methods (LSMs), Simulated Annealing (SA) and Particle Swarm Optimization (PSO), to solve the problems ( ) and . The results of the two LSMs are compared with the Branch and Bound method and with good heuristic methods. This work shows the good performance of SA and PSO compared with the exact and heuristic methods in terms of best solutions and CPU time.
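Since the two objective functions are elided above, here is a minimal simulated-annealing sketch over job permutations with a stand-in objective (total weighted completion time); the swap neighborhood, cooling schedule, and toy data are illustrative assumptions:

```python
# Hedged SA sketch for a permutation scheduling problem.
import math, random

def cost(seq, p, w):
    """Total weighted completion time of job sequence `seq` (stand-in objective)."""
    t, total = 0.0, 0.0
    for j in seq:
        t += p[j]
        total += w[j] * t
    return total

def simulated_annealing(p, w, T=100.0, alpha=0.95, iters=5000):
    cur = list(range(len(p)))
    random.shuffle(cur)
    cur_c = cost(cur, p, w)
    best, best_c = cur[:], cur_c
    for _ in range(iters):
        i, j = random.sample(range(len(p)), 2)
        cand = cur[:]
        cand[i], cand[j] = cand[j], cand[i]          # swap-neighborhood move
        cand_c = cost(cand, p, w)
        if cand_c < cur_c or random.random() < math.exp((cur_c - cand_c) / T):
            cur, cur_c = cand, cand_c                # accept (possibly uphill) move
            if cur_c < best_c:
                best, best_c = cur[:], cur_c
        T *= alpha                                   # geometric cooling
    return best, best_c

p = [4, 2, 7, 3, 5]    # toy processing times
w = [1, 5, 2, 4, 3]    # toy job weights
print(simulated_annealing(p, w))
```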
Image compression is a serious issue in computer storage and transmission; it simply makes efficient use of the redundancy embedded within an image itself and may additionally exploit the limitations of human vision or perception to discard imperceivable information. Polynomial coding is a modern image compression technique based on a modelling concept that effectively removes the spatial redundancy embedded within the image; it is composed of two parts, the mathematical model and the residual. In this paper, a two-stage technique is proposed: the first stage utilizes a lossy predictor model along with a multiresolution base and thresholding techniques, and the second stage incorporates near-lossless compression.
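As an illustration of the model-plus-residual decomposition, a sketch fitting a first-degree (planar) polynomial model to one image block; the model order, block size, and threshold are assumptions, and the multiresolution stage is not shown:

```python
# Hedged sketch: polynomial (model + residual) decomposition of one block.
import numpy as np

def plane_model(block):
    """Least-squares fit block ~ a0 + a1*x + a2*y; returns coefficients, residual."""
    h, wd = block.shape
    yy, xx = np.mgrid[0:h, 0:wd]
    A = np.column_stack([np.ones(block.size), xx.ravel(), yy.ravel()])
    coeffs, *_ = np.linalg.lstsq(A, block.ravel().astype(float), rcond=None)
    model = (A @ coeffs).reshape(h, wd)
    return coeffs, block - model        # residual carries what the model missed

block = np.arange(16, dtype=float).reshape(4, 4)   # toy 4x4 block
coeffs, residual = plane_model(block)
residual[np.abs(residual) < 1.0] = 0               # simple thresholding step
print(coeffs, residual)
```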
Grabisch and Labreuche have recently proposed a generalization of capacities, called bi-capacities. More recently, a new approach for studying bi-capacities through a notion of ternary-element sets was proposed by the author. In this paper, we present several results based on our approach, such as the bipolar Möbius transform, importance index, and interaction index of bi-capacities.
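For background, the standard definitions (the paper's ternary-element-set formulation may state these differently):

```latex
% Standard bi-capacity definitions; hedged background, not the paper's notation.
Let $N=\{1,\dots,n\}$ and
$\mathcal{Q}(N)=\{(A,B): A,B\subseteq N,\ A\cap B=\emptyset\}$.
A \emph{bi-capacity} is a map $v:\mathcal{Q}(N)\to\mathbb{R}$ with
$v(\emptyset,\emptyset)=0$, $v(N,\emptyset)=1$, $v(\emptyset,N)=-1$,
nondecreasing in the first argument and nonincreasing in the second.
Its \emph{bipolar M\"obius transform} and the inverse relation are
\[
  b(A,B)=\sum_{C\subseteq A}\sum_{D\subseteq B}
         (-1)^{|A\setminus C|+|B\setminus D|}\,v(C,D),
  \qquad
  v(A,B)=\sum_{C\subseteq A}\sum_{D\subseteq B} b(C,D).
\]
```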
Currently, the prominence of the automatic multi-document summarization task stems from the rapid increase of information on the Internet. Automatic document summarization technology is progressing and may offer a solution to the problem of information overload.
An automatic text summarization system faces the challenge of producing a high-quality summary. In this study, the design of a generic text summarization model based on sentence extraction is redirected toward a more semantic measure reflecting, individually, two significant objectives: content coverage and diversity when generating summaries from multiple documents, formulated as an explicit optimization model. The two proposed models are then coupled.
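As a hedged illustration of trading off content coverage against diversity in extractive summarization, an MMR-style greedy selection (not the paper's coupled model):

```python
# Illustrative coverage-vs-diversity trade-off for sentence extraction.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def extract_summary(sentences, k=3, lam=0.7):
    tfidf = TfidfVectorizer().fit_transform(sentences)
    centroid = np.asarray(tfidf.mean(axis=0))
    coverage = cosine_similarity(tfidf, centroid).ravel()   # relevance to the corpus
    pairwise = cosine_similarity(tfidf)
    chosen = []
    while len(chosen) < min(k, len(sentences)):
        best, best_score = None, -np.inf
        for i in range(len(sentences)):
            if i in chosen:
                continue
            redundancy = max((pairwise[i][j] for j in chosen), default=0.0)
            score = lam * coverage[i] - (1 - lam) * redundancy  # coverage vs diversity
            if score > best_score:
                best, best_score = i, score
        chosen.append(best)
    return [sentences[i] for i in chosen]
```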
Akaike's Information Criterion (AIC) is a popular method for estimating the number of sources impinging on an array of sensors, a problem of great interest in several applications. The performance of AIC degrades under low Signal-to-Noise Ratio (SNR). This paper is concerned with the development and application of quadrature mirror filters (QMF) to improve the performance of AIC. A new system is proposed that estimates the number of sources by applying AIC to the outputs of a filter bank consisting of quadrature mirror filters. The proposed system can estimate the number of sources even under low SNR.
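The underlying criterion is the standard Wax–Kailath form of AIC for source enumeration; a sketch computing it from sample-covariance eigenvalues (in the proposed system, each QMF subband output would supply such eigenvalues):

```python
# Wax-Kailath AIC source-number estimator from covariance eigenvalues.
import numpy as np

def aic_num_sources(eigvals, n_snapshots):
    """eigvals: eigenvalues of the sample covariance matrix, any order."""
    lam = np.sort(np.asarray(eigvals, dtype=float))[::-1]
    p = len(lam)
    scores = []
    for k in range(p):
        tail = lam[k:]                           # smallest p-k eigenvalues
        geo = np.exp(np.mean(np.log(tail)))      # geometric mean
        arith = np.mean(tail)                    # arithmetic mean
        scores.append(-2 * n_snapshots * (p - k) * np.log(geo / arith)
                      + 2 * k * (2 * p - k))     # penalty term
    return int(np.argmin(scores))                # estimated number of sources
```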
A substantial matter in the exchange of confidential messages through the Internet is transmitting information safely. For example, consumers and producers of digital products are keen to know that those products are genuine and can be distinguished from worthless ones. The science of encryption can be defined as the technique of embedding data in an image, audio, or video file in a manner that meets safety requirements. Steganography is a branch of the science of data concealment that aims to reach a desired level of security in the exchange of private commercial and military data. This research offers a novel steganography technique based on hiding data inside the clusters that result from fuzzy clustering.
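A minimal sketch of the general idea, assuming fuzzy c-means memberships over pixel intensities and LSB embedding within one cluster; the cluster count, centers, and selection rule are illustrative assumptions, not the paper's exact scheme:

```python
# Hedged sketch: fuzzy-cluster pixel intensities, hide bits in one cluster's LSBs.
import numpy as np

def fuzzy_memberships(pixels, centers, m=2.0):
    """Fuzzy c-means membership of each pixel to each (fixed) center."""
    d = np.abs(pixels[:, None] - centers[None, :]) + 1e-9
    inv = d ** (-2.0 / (m - 1.0))
    return inv / inv.sum(axis=1, keepdims=True)

def embed(img, bits, centers=(40.0, 128.0, 215.0), cluster=1):
    flat = img.ravel().astype(np.uint8)                    # working copy
    u = fuzzy_memberships(flat.astype(float), np.array(centers))
    idx = np.where(u.argmax(axis=1) == cluster)[0][:len(bits)]
    assert idx.size == len(bits), "cover image too small for message"
    flat[idx] = (flat[idx] & 0xFE) | np.array(bits, dtype=np.uint8)
    return flat.reshape(img.shape), idx                    # idx needed to extract
```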