Many users nowadays need to search the web for information. They enter a query or question into a search engine and wait for an answer or for the best search results. However, in response to a user's query, search engines often return pages that are irrelevant or unrelated to the information need. This paper presents a proposed model for providing the user with efficient and effective results through a search engine, based on a modified chicken swarm algorithm and cosine similarity, to eliminate irrelevant pages (outliers) from the ranked result list and to improve the results of the user's query. The proposed model is applied to an Arabic dataset, the ZAD corpus of 27,300 documents. The experimental results show that the proposed model improves precision, recall, and accuracy.
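As a rough illustration of the cosine-similarity filtering step only (the modified chicken swarm part of the model is not reproduced here), the following sketch drops retrieved pages whose similarity to the query falls below a threshold; the TF-IDF setup and the threshold value are assumptions, not the paper's configuration.

```python
# Illustrative sketch: use cosine similarity between a query and each
# retrieved page to drop low-similarity pages (outliers) from a ranked list.
# Threshold and TF-IDF settings are assumptions, not the paper's values.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def filter_outliers(query, pages, threshold=0.1):
    vectorizer = TfidfVectorizer()
    matrix = vectorizer.fit_transform([query] + pages)
    sims = cosine_similarity(matrix[0:1], matrix[1:]).ravel()
    # rank pages by similarity and discard those below the threshold
    ranked = sorted(zip(pages, sims), key=lambda p: p[1], reverse=True)
    return [(page, sim) for page, sim in ranked if sim >= threshold]

results = filter_outliers("user query text", ["relevant page ...", "unrelated page ..."])
```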
Purpose: Iraq's sudden and ill-considered trade openness after 2003 to the countries of the world in general, and to the neighbouring countries in particular, together with the absence of the necessary support for the national productive forces and the lack of effective standardization and quality-control bodies, exposed most local products, especially agricultural ones, to decline and an inability to compete, and thus the Iraqi market, especially for agricultural products, was dumped with imported goods. This study was conducted to determine the effect that dumping has on the local production of chicken meat, the impact of that effect on the size of the food gap, and whether the results of the practical…
Solving problems via artificial intelligence techniques has become widespread in many fields, yet implementing artificial intelligence optimization algorithms for NP-hard problems is still challenging. In this manuscript, we implement the Naked Mole-Rat Algorithm (NMRA) to solve the n-queens problem and overcome the challenge of applying NMRA to a discrete search space. NMRA is improved using the local-search aspect of the Variable Neighborhood Search algorithm (VNS) with 2-opt and 3-opt moves, introducing the Naked Mole-Rat algorithm based on variable neighborhood search (NMRAVNS) to solve n-queens problems of different sizes, finding the best solution or set of solutions within a plausible amount of time…
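For readers unfamiliar with the local-search ingredient, the sketch below shows a 2-opt style move on a permutation encoding of the n-queens board (queen i in row i, column perm[i]); it is a minimal illustration only and does not reproduce the NMRAVNS hybrid described in the paper.

```python
# Minimal sketch of a 2-opt style neighbourhood move for n-queens under a
# permutation encoding; row/column conflicts are impossible, so only
# diagonal conflicts are counted. Not the paper's NMRAVNS algorithm.
import random

def conflicts(perm):
    # count pairs of queens that attack each other along a diagonal
    n = len(perm)
    return sum(1 for i in range(n) for j in range(i + 1, n)
               if abs(perm[i] - perm[j]) == j - i)

def two_opt_step(perm):
    # swap two columns and keep the move only if it reduces conflicts
    i, j = random.sample(range(len(perm)), 2)
    candidate = perm[:]
    candidate[i], candidate[j] = candidate[j], candidate[i]
    return candidate if conflicts(candidate) < conflicts(perm) else perm
```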
This paper presents a combination of enhancement techniques for fingerprint images affected by different types of noise. These techniques were applied to improve image quality and achieve acceptable image contrast. The proposed method included five enhancement techniques: normalization, histogram equalization, binarization, skeletonization, and fusion. The normalization process standardized the pixel intensities, which facilitated the subsequent image enhancement stages. The histogram equalization technique then increased the contrast of the images. Furthermore, the binarization and skeletonization techniques were implemented to differentiate between the ridge and valley structures and to obtain one…
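A minimal sketch of a comparable pipeline (normalization, histogram equalization, binarization, skeletonization) is given below using OpenCV and scikit-image; the exact operators, parameters, and the fusion stage used in the paper may differ.

```python
# Rough sketch of a fingerprint enhancement chain with OpenCV and
# scikit-image; operators and parameters are illustrative assumptions.
import cv2
from skimage.morphology import skeletonize

img = cv2.imread("fingerprint.png", cv2.IMREAD_GRAYSCALE)
norm = cv2.normalize(img, None, 0, 255, cv2.NORM_MINMAX)        # normalization
equalized = cv2.equalizeHist(norm)                              # histogram equalization
_, binary = cv2.threshold(equalized, 0, 255,
                          cv2.THRESH_BINARY + cv2.THRESH_OTSU)  # binarization
skeleton = skeletonize(binary > 0)   # one-pixel-wide ridges (assumes bright ridges; invert if needed)
```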
Most medical datasets suffer from missing data, due to the expense of some tests or to human error while recording them. This issue affects the performance of machine learning models because the values of some features will be missing, so specific methods are needed for imputing these missing data. In this research, the salp swarm algorithm (SSA) is used for generating and imputing the missing values in the Pima Indian diabetes disease (PIDD) dataset; the proposed algorithm is called ISSA. The obtained results showed that the classification performance of three different classifiers, support vector machine (SVM), K-nearest neighbour (KNN), and Naïve Bayes…
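To illustrate only the evaluation side of such a study, the sketch below imputes missing values with a simple mean imputer (a stand-in, not the SSA-based ISSA) and scores SVM, KNN, and Naive Bayes classifiers with scikit-learn; the file name and cross-validation setup are assumptions.

```python
# Sketch of the evaluation loop only: mean imputation stands in for the
# SSA-based imputation; file name, target column and CV folds are assumed.
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB

data = pd.read_csv("pima_diabetes.csv")                  # hypothetical file name
X, y = data.drop(columns="Outcome"), data["Outcome"]
X_imputed = SimpleImputer(strategy="mean").fit_transform(X)

for name, clf in [("SVM", SVC()), ("KNN", KNeighborsClassifier()),
                  ("Naive Bayes", GaussianNB())]:
    score = cross_val_score(clf, X_imputed, y, cv=5).mean()
    print(f"{name}: {score:.3f}")
```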
Feature extraction is the most critical step in image discrimination: it represents the content of images as distinctively as possible. A Gaussian blur filter is used to eliminate noise and improve the purity of images. The principal component analysis (PCA) algorithm is a straightforward and effective method for deriving a feature vector and reducing the dimensionality of a dataset. This paper proposes using the Gaussian blur filter to eliminate image noise and an improved PCA for feature extraction. The traditional PCA achieves total average recall and precision of 93% and 97%, while the improved PCA achieves average recall and precision of 98% and 100%, showing that the improved PCA is more effective in terms of recall and precision.
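A minimal sketch of the two stages described above, assuming a 5×5 Gaussian kernel and 50 principal components (both illustrative choices, not the paper's settings):

```python
# Sketch: Gaussian blurring to suppress noise, then PCA to turn each image
# into a compact feature vector. Kernel size and component count are assumed.
import cv2
import numpy as np
from sklearn.decomposition import PCA

def extract_features(images, n_components=50):
    blurred = [cv2.GaussianBlur(img, (5, 5), 0) for img in images]   # denoise
    flattened = np.array([b.flatten() for b in blurred])             # one row per image
    return PCA(n_components=n_components).fit_transform(flattened)   # feature vectors
```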
Orthogonal polynomials and their moments play a significant role in image processing and computer vision. One such family is the discrete Hahn polynomials (DHaPs), which are used for compression and feature extraction. However, when the moment order becomes high, they suffer from numerical instability. This paper proposes a fast approach for computing high-order DHaPs. The work takes advantage of multithreading for the calculation of the Hahn polynomial coefficients: to exploit the available processing capabilities, independent calculations are divided among threads. The research provides a distribution method to achieve a more balanced processing burden among the threads. The proposed methods are tested for various…
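As a hedged illustration of dividing independent per-order computations among threads, the sketch below uses a thread pool; hahn_row is a hypothetical placeholder for the actual Hahn coefficient recurrence, which is not reproduced here, and a real implementation would push the numeric work into compiled code rather than pure Python.

```python
# Illustrative sketch of spreading independent per-order rows across threads.
# hahn_row is a hypothetical placeholder, not the paper's recurrence.
from concurrent.futures import ThreadPoolExecutor

def hahn_row(order, size):
    # placeholder: compute the coefficients of one polynomial order
    return [0.0] * size

def compute_polynomials(max_order, size, workers=4):
    orders = list(range(max_order + 1))
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # each order is independent, so the pool spreads the rows across threads
        rows = list(pool.map(hahn_row, orders, [size] * len(orders)))
    return rows
```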
The enhancement of heat exchanger performance was investigated using dimpled tubes tested at different Reynolds numbers. In the present work, four types of dimpled tubes with specified configurations were manufactured and tested, and their performance was compared with that of the smooth tube and of other passive techniques. Two dimple arrangements along the tube were investigated, inline and staggered, at a constant pitch ratio X/d = 4; the test results showed that the Nusselt number (heat transfer) of the staggered array is higher than that of the inline array by 13%. The effect of different dimple depths (14.5 mm and 18.5 mm) was also investigated; a tube with a large dimple diameter enhanced the Nusselt number by about 25% for the range…
The main parameter that drives oil industry contract investment and underpins the economic feasibility study for approving a field development plan is the hydrocarbon reservoir potential. Considerable qualified experience must therefore be applied to correctly evaluate hydrocarbon reserves, using different techniques at each phase of field management and collecting valid, representative data sources, starting from the exploration phase and tuning up through the development phase. Commonly, volumetric calculation is the main technique for estimating reservoir potential with the information available at the exploration stage, which is quite limited; in most cases, this technique yields a large reserve figure. In this study…
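For context, the standard volumetric original-oil-in-place estimate commonly used at the exploration stage can be sketched as follows; the input values are purely illustrative and are not taken from the study.

```python
# Standard volumetric original-oil-in-place estimate (field units):
# OOIP [STB] = 7758 * A [acres] * h [ft] * porosity * (1 - Sw) / Bo
# The input values below are purely illustrative, not from the study.
def ooip_stb(area_acres, thickness_ft, porosity, water_saturation, bo):
    return 7758 * area_acres * thickness_ft * porosity * (1 - water_saturation) / bo

print(ooip_stb(area_acres=2000, thickness_ft=50, porosity=0.2,
               water_saturation=0.3, bo=1.2))   # about 90 million STB
```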