The need for an efficient method to find the most relevant document for a given search query has become crucial due to the exponential growth in the number of documents readily available on the web. The vector space model (VSM), a widely used model in information retrieval, represents words as vectors in space and assigns them weights via a popular weighting scheme known as term frequency-inverse document frequency (TF-IDF). In this research, a method is proposed to retrieve the most relevant documents by representing documents and queries as vectors of average term frequency-inverse sentence frequency (TF-ISF) weights instead of vectors of TF-IDF weights; two simple and effective similarity measures, Cosine and Jaccard, were used. Using the MS MARCO dataset (Microsoft-curated data of Bing queries), this article analyzes and assesses the retrieval effectiveness of the TF-ISF weighting scheme. The results show that the TF-ISF model with the Cosine similarity measure retrieves more relevant documents, and the model performs significantly better than the conventional TF-IDF technique on MS MARCO.
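As a hedged illustration of the weighting and similarity measures named above, the sketch below computes per-term TF-ISF weights (reading "inverse sentence frequency" as computed over the sentences of a document, an assumption rather than the paper's exact formulation) together with Cosine and Jaccard similarity. All function names are illustrative, not the paper's code.

```python
import math
from collections import Counter

def tf_isf_vector(document, vocab=None):
    """Average TF-ISF weight per term, treating `document` as a list of
    sentence strings (a hedged reading of the paper's scheme)."""
    sentences = [s.lower().split() for s in document]
    n = len(sentences)
    sf = Counter()                       # sentence frequency of each term
    for s in sentences:
        sf.update(set(s))
    vocab = sorted(vocab) if vocab else sorted(sf)
    vec = []
    for term in vocab:
        # average term frequency across sentences, times log inverse
        # sentence frequency (0 when the term occurs in every sentence)
        tf = sum(s.count(term) for s in sentences) / n
        isf = math.log(n / sf[term]) if sf.get(term) else 0.0
        vec.append(tf * isf)
    return vocab, vec

def cosine(u, v):
    """Cosine similarity of two weight vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def jaccard(tokens_a, tokens_b):
    """Jaccard similarity of the two term sets."""
    a, b = set(tokens_a), set(tokens_b)
    return len(a & b) / len(a | b) if a | b else 0.0
```

A term occurring in every sentence receives weight 0, mirroring how IDF suppresses terms that occur in every document.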
The importance of topology as a tool in preference theory motivates this study, in which we characterize topologies generated by digraphs. We generalize rough-set concepts using two topological structures generated by the out-degree (resp. in-degree) sets of the vertices of a general digraph. New types of topological rough sets are introduced and studied using new types of topological sets, and several propositions establish properties of the resulting topological rough approximations.
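One common way such a topology can be generated, sketched below under the assumption that the out-degree sets are the out-neighbourhoods N+(v) used as a subbase (the paper's exact construction may differ), is to close the subbase under finite intersections and then arbitrary unions:

```python
def out_neighbourhoods(vertices, edges):
    """Out-neighbourhood N+(v) of each vertex of a digraph (v, u) in edges."""
    return {v: frozenset(u for (a, u) in edges if a == v) for v in vertices}

def topology_from_subbase(universe, subbase):
    """Topology having `subbase` as a subbase: finite intersections of
    subbase members form a base; arbitrary unions of base members are open."""
    universe = frozenset(universe)
    base = {universe}
    for S in subbase:                    # close under finite intersections
        base |= {B & frozenset(S) for B in base} | {frozenset(S)}
    opens = {frozenset()}                # close under arbitrary unions
    changed = True
    while changed:
        changed = False
        for B in base:
            new = {O | B for O in opens}
            if not new <= opens:
                opens |= new
                changed = True
    return opens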
The study explores the use of ergative verbs in constructing clauses and their impact on backgrounding the agent's role in two selected short stories. Contrary to hypothesis No. 1, the research indicates that changes in sentence patterns do not affect the meaning of the process. Hypothesis No. 2 is also refuted, as the middle structure is found to highlight the agent's role in the science fiction short story "Terra Infirmum" rather than concealing it, as hypothesized for "The Invisible Man." The analysis shows that writers utilize ergative processes to narrate stories in various ways, including transitive/active voice, intransitive/active voice, and transitive/passive voice. Furthermore, the findings suggest that writers emp
This study applies a discourse analysis framework to explore the portrayal of women in Maysloon Hadi's novel *The Black Eyes* (2011), using Critical Discourse Analysis (CDA) and Norman Fairclough's tri-dimensional model (1989) as the analytical foundation. It investigates the roles and challenges women face in the novel. While there is growing interest in the portrayal of women in literature, Iraqi literature, especially from the perspective of Iraqi women writers, remains underexplored. Hadi's *The Black Eyes* provides a unique case for examining this intersection. Despite the novel's rich narrative, which offers insight into Iraqi women's lives, there is a lack of comprehensive CDA to understand how its language constructs
In this paper we study necessary and sufficient conditions for a reverse-centralizer of a semiprime ring R to be orthogonal. We also prove that a reverse-centralizer T of a semiprime ring R having a commuting generalized inverse is orthogonal.
In this study, an unknown force function dependent on space in the wave equation is investigated. Numerically, the wave equation is split into two parts: the first part is solved using the finite-difference method (FDM), and the second using the method of separation of variables. This continues and modifies the technique for solving the inverse problem in [1, 2]: instead of the boundary element method (BEM) used in [1, 2], the finite-difference method (FDM) is applied. Boundary data play the role of overdetermination data. The second part of the problem is inverse and ill-posed, since small errors in the extra boundary data cause large errors in the force solution. Zeroth-order Tikhonov regularization, with several choices of the regularization parameter, is employed to decrease the error.
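The two numerical ingredients named above can be sketched as follows: an explicit finite-difference scheme for the forced wave equation u_tt = u_xx + f(x) with zero initial and Dirichlet boundary conditions, and zeroth-order Tikhonov regularization for the resulting ill-posed linear system. Both are illustrative assumptions about the discretization, not the paper's exact scheme.

```python
import numpy as np

def wave_fdm(f, L=1.0, T=1.0, nx=50, nt=200):
    """Explicit FDM for u_tt = u_xx + f(x), zero initial data and
    Dirichlet boundaries; stable when (dt/dx)^2 <= 1."""
    dx, dt = L / nx, T / nt
    r = (dt / dx) ** 2
    x = np.linspace(0, L, nx + 1)
    u_prev = np.zeros(nx + 1)
    u = np.zeros(nx + 1)
    for _ in range(nt):
        u_next = np.zeros(nx + 1)       # boundary values stay zero
        u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                        + r * (u[2:] - 2 * u[1:-1] + u[:-2])
                        + dt ** 2 * f(x[1:-1]))
        u_prev, u = u, u_next
    return x, u

def tikhonov_zeroth(A, b, lam):
    """Zeroth-order Tikhonov: minimise ||Ax - b||^2 + lam * ||x||^2
    via the normal equations (A^T A + lam I) x = A^T b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)
```

With lam = 0 the Tikhonov solution reduces to ordinary least squares; increasing lam trades fidelity to the noisy data for a smaller-norm, more stable force estimate.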
This research discusses the application of Artificial Neural Network (ANN) and Geographical Information System (GIS) models to the water quality of the Diyala River using the Water Quality Index (WQI). Fourteen water parameters were used for estimating the WQI: pH, temperature, dissolved oxygen, orthophosphate, nitrate, calcium, magnesium, total hardness, sodium, sulphate, chloride, total dissolved solids, electrical conductivity, and total alkalinity. These parameters were provided by the Water Resources Ministry from seven stations along the river for the period 2011 to 2016. The results of the WQI analysis revealed that the Diyala River is good to poor in the north of Diyala province, while it is poor to very polluted south of Baghdad City. The selected parameters wer
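A minimal sketch of a weighted-arithmetic WQI of the kind described above is given below; the quality-rating formula q_i = 100 * C_i / S_i and the permissible standards S_i are assumptions for illustration, and the paper's exact sub-index formulas (e.g. special handling of pH and dissolved oxygen) may differ.

```python
def weighted_wqi(concentrations, standards, weights):
    """Weighted-arithmetic WQI sketch: quality rating q_i = 100 * C_i / S_i,
    index = sum(w_i * q_i) / sum(w_i)."""
    qs = [100.0 * c / s for c, s in zip(concentrations, standards)]
    return sum(w * q for w, q in zip(weights, qs)) / sum(weights)
```

When every measured concentration equals its permissible standard, the index is exactly 100; higher values indicate poorer water quality under this convention.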
In this paper, the researcher suggests using the genetic algorithm method to estimate the parameters of the Wiener degradation process, which is based on the Wiener process, in order to estimate the reliability of high-efficiency products, given the difficulty of estimating their reliability using traditional techniques that depend only on product failure times. Monte Carlo simulation was applied to demonstrate the efficiency of the proposed method in estimating the parameters; it was compared with the maximum likelihood estimation method. The results show that the genetic algorithm method is best according to the AMSE comparison criterion, then the reliab
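The idea can be sketched as a toy genetic algorithm maximising the Wiener-process likelihood of observed degradation increments, which are normal with mean mu*dt and variance sigma^2*dt. The elitist selection and Gaussian-mutation operators below are illustrative assumptions, not the paper's exact algorithm.

```python
import math
import random

def wiener_loglik(params, increments, dt):
    """Log-likelihood of Wiener-process increments ~ N(mu*dt, sigma^2*dt)."""
    mu, sigma = params
    if sigma <= 0:
        return -math.inf
    var = sigma ** 2 * dt
    return sum(-0.5 * math.log(2 * math.pi * var) - (x - mu * dt) ** 2 / (2 * var)
               for x in increments)

def ga_estimate(increments, dt, pop_size=30, gens=60, seed=0):
    """Toy elitist GA for (mu, sigma): keep the top third each generation
    and fill the rest with Gaussian-mutated copies of elite members."""
    rng = random.Random(seed)
    pop = [(rng.uniform(-2, 2), rng.uniform(0.1, 3)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda p: wiener_loglik(p, increments, dt), reverse=True)
        elite = pop[:pop_size // 3]
        children = []
        while len(elite) + len(children) < pop_size:
            mu, sigma = rng.choice(elite)
            children.append((mu + rng.gauss(0, 0.1),
                             max(1e-3, sigma + rng.gauss(0, 0.1))))
        pop = elite + children
    return max(pop, key=lambda p: wiener_loglik(p, increments, dt))
```

Because the elite survive unchanged, the best log-likelihood in the population never decreases across generations.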
This paper presents a comparison between denoising techniques using a statistical approach: principal component analysis with local pixel grouping (PCA-LPG), in which the procedure is iterated a second time to further improve denoising performance, alongside other enhancement filters. These include an adaptive Wiener low-pass filter applied to a grayscale image degraded by constant-power additive noise, based on statistics estimated from a local neighborhood of each pixel; a median filter of the noisy input image, in which each output pixel contains the median value of the M-by-N neighborhood around the corresponding pixel in the input image; a Gaussian low-pass filter; and an order-statistic filter. Experimental results show that the LPG-PCA method
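The neighborhood median step described above can be sketched in a few lines of NumPy; this is an illustrative square-window (M-by-M) implementation with edge padding, not the paper's code, and the general M-by-N case follows the same pattern.

```python
import numpy as np

def median_filter(img, size=3):
    """Median filter: each output pixel is the median of the size-by-size
    neighborhood around the corresponding input pixel (edge-padded)."""
    pad = size // 2
    padded = np.pad(img, pad, mode='edge')
    out = np.empty_like(img, dtype=float)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = np.median(padded[i:i + size, j:j + size])
    return out
```

Because the median ignores extreme values in the window, an isolated impulse ("salt-and-pepper" pixel) is removed entirely, while constant regions pass through unchanged.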