The need for an efficient method to find the most relevant document for a given search query has become crucial due to the exponential growth in the number of documents readily available on the web. The vector space model (VSM), a widely used model in information retrieval, represents documents as vectors in space and weights their terms via the popular term frequency-inverse document frequency (TF-IDF) scheme. In this research, a method is proposed to retrieve the most relevant documents by representing documents and queries as vectors of average term frequency-inverse sentence frequency (TF-ISF) weights instead of vectors of TF-IDF weights; two simple and effective similarity measures, Cosine and Jaccard, were used. Using the MS MARCO dataset, this article analyzes and assesses the retrieval effectiveness of the TF-ISF weighting scheme. The results show that the TF-ISF model with the Cosine similarity measure retrieves more relevant documents. The model was evaluated against the conventional TF-IDF technique and performs significantly better on MS MARCO data (Microsoft-curated data of Bing queries).
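The weighting and matching steps described above can be sketched as follows. This is an illustrative interpretation, assuming ISF is computed like IDF but counted at sentence level, and that a document vector holds the mean per-sentence TF of each term scaled by its ISF; the exact formulation in the paper may differ.

```python
import math
from collections import Counter

def tf_isf_vector(doc_sentences, all_sentences, vocab):
    """Average TF-ISF weight per vocabulary term.
    Assumption: isf(t) = log(N / n_t), where N is the total number of
    sentences and n_t the number of sentences containing term t."""
    n = len(all_sentences)
    sent_freq = Counter(t for s in all_sentences for t in set(s))
    vec = []
    for t in vocab:
        # mean term frequency of t across this document's sentences
        tf = sum(s.count(t) for s in doc_sentences) / len(doc_sentences)
        isf = math.log(n / sent_freq[t]) if sent_freq[t] else 0.0
        vec.append(tf * isf)
    return vec

def cosine(u, v):
    """Cosine similarity between two weight vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0
```

A query is treated as a tiny document (a list of tokenized sentences), vectorized the same way, and ranked against each document by cosine similarity.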
The rise in the general level of prices in Iraq makes the local commodity less able to compete with other commodities, which leads to an increase in the amount of imports and a decrease in the amount of exports, since it raises demand for foreign currencies while decreasing demand for the local currency, which leads to a decrease in the exchange rate of the local currency in exchange for an increase in the exchange rate of currencies. This is one of the most important factors affecting the determination of the exchange rate and its fluctuations. This research deals with the currency of the European Euro and its impact against the Iraqi dinar. To make an accurate prediction for any process, modern methods can be used through which
In this paper, a decoder for a binary BCH code is implemented on a PIC microcontroller for code length n = 127 bits with multiple-error-correction capability; results are presented for correcting up to 13 errors. The Berlekamp-Massey decoding algorithm was chosen for its efficiency. The PIC18F45K22 microcontroller was chosen for the implementation and was programmed in assembly language to achieve the highest performance. This makes the BCH decoder implementable as a low-cost module that can be used as part of larger systems. The performance evaluation is presented in terms of the total number of instructions and the bit rate.
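At the heart of Berlekamp-Massey is the synthesis of the shortest LFSR that reproduces a given sequence; in BCH decoding this is run on the syndromes (over GF(2^m)) to obtain the error-locator polynomial. The sketch below shows only the GF(2) core of the algorithm, not the paper's assembly implementation or the full GF(2^m) version.

```python
def berlekamp_massey_gf2(s):
    """Return (L, C): the length and connection polynomial
    C(x) = c[0] + c[1]x + ... of the shortest LFSR generating the
    binary sequence s. Simplified to GF(2) for illustration."""
    n = len(s)
    c = [0] * n
    b = [0] * n
    c[0] = b[0] = 1
    L, m = 0, -1
    for i in range(n):
        # discrepancy between the LFSR's prediction and the actual bit
        d = s[i]
        for j in range(1, L + 1):
            d ^= c[j] & s[i - j]
        if d:
            t = c[:]
            for j in range(n - i + m):
                c[i - m + j] ^= b[j]
            if 2 * L <= i:
                L, m, b = i + 1 - L, i, t
    return L, c[:L + 1]
```

For example, a sequence satisfying s[n] = s[n-1] XOR s[n-2] yields L = 2 with connection polynomial 1 + x + x^2.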
In this paper, the reliability and maintenance scheduling of some medical devices were estimated from a single variable, time (failure times), on the assumption that the time variable for all devices follows the same distribution (the Weibull distribution).
The distribution parameters for each device were estimated by the ordinary least squares (OLS) method.
The main objective of this research is to determine the optimal time for preventive maintenance of medical devices. Two methods were adopted to estimate this optimal time. The first method builds the maintenance schedule from information on the cost of maintenance and the cost of stopping work and acc
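The OLS estimation step for the Weibull parameters can be sketched as median-rank regression: linearize the Weibull CDF, plot ln(-ln(1-F)) against ln(t), and fit a line. This is a generic sketch of that technique, assuming Benard's median-rank plotting positions, not the paper's exact procedure.

```python
import math

def weibull_ols(times):
    """OLS (median-rank regression) estimate of the Weibull shape
    (beta) and scale (eta) from a list of failure times.
    Linearization: ln(-ln(1-F)) = beta*ln(t) - beta*ln(eta)."""
    t = sorted(times)
    n = len(t)
    xs, ys = [], []
    for i, ti in enumerate(t, start=1):
        f = (i - 0.3) / (n + 0.4)  # Benard's median-rank approximation
        xs.append(math.log(ti))
        ys.append(math.log(-math.log(1.0 - f)))
    # ordinary least squares fit y = a*x + b
    xbar, ybar = sum(xs) / n, sum(ys) / n
    a = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    b = ybar - a * xbar
    beta = a                      # slope is the shape parameter
    eta = math.exp(-b / a)        # intercept gives the scale parameter
    return beta, eta
```

With the fitted parameters, reliability at time t is R(t) = exp(-(t/eta)^beta), which feeds directly into the cost-based preventive-maintenance scheduling described above.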
The main objective of this work is to propose a new routing protocol for wireless sensor networks employed to serve IoT systems. The routing protocol has to adapt to different requirements in order to enhance the performance of IoT applications. Link quality, node depth, and energy are used as metrics for making routing decisions. Comparison with other protocols is essential to show the improvements achieved by this work, so protocols designed to serve the same purpose, such as AODV, REL, and LABILE, are chosen for comparison with the proposed protocol. To make the evaluation more integrative and holistic, some important features, such as actuation and mobility, are added and tested. These features are greatly required by some IoT applications and im
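One common way to combine link quality, node depth, and residual energy into a single routing decision is a weighted cost function. The weights and normalizations below are assumptions for illustration, not values from the paper:

```python
def route_cost(link_quality, depth, energy,
               w_lq=0.4, w_depth=0.3, w_energy=0.3):
    """Composite routing cost (lower is better). Assumptions:
    link_quality and energy are normalised to [0, 1]; depth counts
    hops to the sink; the weights w_* are illustrative only."""
    return (w_lq * (1.0 - link_quality)          # penalise weak links
            + w_depth * depth / (depth + 1.0)    # penalise deep nodes
            + w_energy * (1.0 - energy))         # penalise drained nodes

def best_next_hop(neighbours):
    """Pick the neighbour tuple (id, link_quality, depth, energy)
    with the lowest composite cost."""
    return min(neighbours, key=lambda nb: route_cost(*nb[1:]))
```

A node with a strong link, small depth, and plenty of energy then wins the next-hop selection over a deeper, weaker-linked alternative.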
Flow measurements have gained importance in recent decades due to the shortage of water resources resulting from climate change, which demands tight control of the water available for different uses. The classical technique of open-channel flow measurement by the integrating-float method is needed for measuring flow at locations where modern devices are unavailable for various reasons, such as their cost, so this classical technique was adopted to solve the problem. The present study examines the integrating-float method and defines the parameters affecting the acceleration of floating spheres in flowing water, analyzed using experimental measurements. The me
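The principle behind the integrating-float method can be sketched numerically: a float released at the bed rises through the whole water column, so its horizontal drift integrates the velocity profile over depth. Assuming a constant rise velocity w (the idealization the sphere-acceleration study above refines), the depth-averaged velocity follows directly; variable names here are illustrative.

```python
def mean_velocity_integrating_float(horizontal_drift, depth, rise_velocity):
    """Depth-averaged velocity from the integrating-float principle.
    Assumption: the float rises at constant speed w, so rise time is
    h / w and v_mean = L / (h / w) = L * w / h."""
    rise_time = depth / rise_velocity
    return horizontal_drift / rise_time

def discharge(v_mean, area):
    """Velocity-area method: Q = v * A."""
    return v_mean * area
```

For example, a float rising at 0.25 m/s through 1 m of water while drifting 2 m downstream implies v_mean = 0.5 m/s; over a 3 m^2 cross-section that gives Q = 1.5 m^3/s.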
In this paper, we deal with the problem of games with fuzzy payoffs, where there is uncertainty in the data. We use the trapezoidal membership function to transform the data into fuzzy numbers and apply three different ranking-function algorithms. We then compare these three ranking algorithms on trapezoidal fuzzy numbers so that the decision maker can obtain the best gains.
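A ranking function maps a fuzzy payoff to a crisp number so that payoffs can be compared. The sketch below uses one common choice for a trapezoidal fuzzy number (a, b, c, d), the average of its four defining points; this is illustrative only and not necessarily one of the three ranking functions compared in the paper.

```python
def centroid_rank(a, b, c, d):
    """Crisp rank of the trapezoidal fuzzy number (a, b, c, d):
    the mean of its four defining points (one common ranking choice)."""
    return (a + b + c + d) / 4.0

def compare_payoffs(p1, p2):
    """Return the preferred fuzzy payoff: the one with the larger rank."""
    return p1 if centroid_rank(*p1) >= centroid_rank(*p2) else p2
```

Once every fuzzy payoff is ranked, the game reduces to an ordinary crisp matrix game, which is what allows the three ranking algorithms to be compared on the same problems.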
Banks are one of the public services that must be available in a city to ensure easy financial dealings between citizens and state departments, among the state departments themselves, and among the citizens themselves, and to ensure easy access to them. It is therefore very important to choose the best location for a bank, one that can serve the largest number of people while remaining easily accessible. Due to the difficulty of obtaining accurate information with exact coordinates according to the country's specific projection, the researcher will resort to default work using some of the files available in the ArcView program
Groupwise non-rigid image alignment is a difficult non-linear optimization problem involving many parameters and often large datasets. Previous methods have explored various metrics and optimization strategies. Good results have been previously achieved with simple metrics, requiring complex optimization, often with many unintuitive parameters that require careful tuning for each dataset. In this chapter, the problem is restructured to use a simpler, iterative optimization algorithm, with very few free parameters. The warps are refined using an iterative Levenberg-Marquardt minimization to the mean, based on updating the locations of a small number of points and incorporating a stiffness constraint. This optimization approach is eff
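The iterative Levenberg-Marquardt refinement mentioned above follows the standard damped Gauss-Newton pattern: solve the damped normal equations, accept the step if the residual shrinks, and adapt the damping. The sketch below is a generic minimal LM loop on a toy curve-fitting problem, not the chapter's groupwise-alignment code (which additionally updates warp control points under a stiffness constraint).

```python
import numpy as np

def levenberg_marquardt(residual, p0, n_iter=50, lam=1e-3):
    """Minimal Levenberg-Marquardt: damped Gauss-Newton steps with a
    forward-difference numerical Jacobian. Illustrative sketch only."""
    p = np.asarray(p0, dtype=float)
    r = residual(p)
    for _ in range(n_iter):
        # numerical Jacobian of the residual vector
        J = np.empty((r.size, p.size))
        for j in range(p.size):
            dp = np.zeros_like(p)
            dp[j] = 1e-6
            J[:, j] = (residual(p + dp) - r) / 1e-6
        A = J.T @ J + lam * np.eye(p.size)   # damped normal equations
        step = np.linalg.solve(A, -J.T @ r)
        r_new = residual(p + step)
        if r_new @ r_new < r @ r:            # accept: reduce damping
            p, r, lam = p + step, r_new, lam * 0.5
        else:                                # reject: increase damping
            lam *= 10.0
    return p

# toy usage: recover (a, b) in y = a * exp(b * x) from noiseless data
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(0.5 * x)
res = lambda p: p[0] * np.exp(p[1] * x) - y
p_fit = levenberg_marquardt(res, [1.0, 0.0])
```

The damping parameter lam interpolates between gradient descent (large lam, robust far from the optimum) and Gauss-Newton (small lam, fast near it), which is why the scheme needs so few free parameters.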
This paper discusses the problem of decoding codewords in Reed-Muller codes. We use Hadamard matrices as a method to decode codewords in Reed-Muller codes. In addition, Reed-Muller codes are defined and the encoding matrices are discussed. Finally, a method of decoding is explained and an example is given to clarify it; this method is also compared with the classical method based on minimum Hamming distance.
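For the first-order code RM(1, m), Hadamard-based decoding amounts to correlating the received word (mapped to +-1) against every affine function; the largest-magnitude correlation identifies the message. The sketch below computes these correlations directly (equivalent to multiplying by the Hadamard matrix; a fast Hadamard transform does the same in n log n operations). It is an illustrative sketch of the technique, not the paper's worked example.

```python
import itertools

def rm1_encode(a0, a, m):
    """RM(1, m) encoding: the codeword bit at position x (a binary
    m-tuple) is a0 XOR (a . x) over GF(2); length is 2**m."""
    code = []
    for x in itertools.product([0, 1], repeat=m):
        bit = a0
        for ai, xi in zip(a, x):
            bit ^= ai & xi
        code.append(bit)
    return code

def rm1_decode(word, m):
    """Hadamard-style decoding: map bits to +-1 and correlate with
    (-1)^(a.x) for every a; the largest |correlation| gives a, and
    its sign gives a0. Corrects up to 2**(m-2) - 1 errors."""
    signs = [1 - 2 * b for b in word]            # 0 -> +1, 1 -> -1
    best, best_a, best_a0 = -1, None, 0
    for a in itertools.product([0, 1], repeat=m):
        corr = 0
        for x, s in zip(itertools.product([0, 1], repeat=m), signs):
            dot = sum(ai & xi for ai, xi in zip(a, x)) % 2
            corr += s * (1 - 2 * dot)
        if abs(corr) > best:
            best, best_a, best_a0 = abs(corr), list(a), int(corr < 0)
    return best_a0, best_a
```

For m = 3 (length 8, minimum distance 4) the decoder corrects any single bit error, whereas exhaustive minimum-Hamming-distance decoding would compare against all 16 codewords.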