In this study, a fast block-matching search algorithm based on block descriptors and multilevel block filtering is introduced. The adopted descriptors are the mean and a set of centralized low-order moments. Hierarchical filtering with the MAE similarity measure is used to nominate the best-matching blocks within the pool of neighboring blocks. After nomination, the similarity of the mean and moments is used to classify the nominated blocks into one of three sub-pools, each representing a nomination priority level (i.e., most, less, and least). The main benefit of the nomination and classification steps is a significant reduction in the number of pixel-matching operations between compared blocks: instead of pixel-wise comparisons, a set of hierarchical similarity comparisons between a few block descriptors is performed. Computing the block descriptors has linear complexity, O(n), and only a small number of similarity comparisons is required. In the final stage, only the blocks selected as most similar according to their descriptors are passed to the pixel-wise comparison stage. The performance of the proposed system was tested for two cases: (i) without using prediction to assess the initial motion vector, and (ii) with prediction based on the motion vectors of already-scanned neighboring blocks. The test results indicate that, in both cases (without/with prediction), the introduced method yields promising results in terms of time and error level, reducing both search time and error compared with the exhaustive search and three-step search (TSS) algorithms.
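The descriptor-based nomination idea can be illustrated with a minimal sketch (not the authors' implementation): candidates are first filtered by comparing a few cheap descriptors, and only the nominees reach the pixel-wise MAE stage. The function names and the single tolerance `desc_tol` are illustrative assumptions, and the paper's three priority sub-pools are collapsed into one filter level here.

```python
import numpy as np

def descriptors(block):
    """Mean plus 2nd and 3rd centralized moments of a block."""
    m = block.mean()
    d = block - m
    return np.array([m, (d ** 2).mean(), (np.abs(d) ** 3).mean()])

def mae(a, b):
    """Pixel-wise mean absolute error between two equal-sized blocks."""
    return float(np.abs(a.astype(float) - b.astype(float)).mean())

def match_block(target, candidates, desc_tol=5.0):
    """Nominate candidates whose descriptors lie within desc_tol of the
    target's (the cheap O(n) filter), then run the expensive pixel-wise
    MAE comparison only on the nominees."""
    t_desc = descriptors(target)
    nominees = [c for c in candidates
                if np.all(np.abs(descriptors(c) - t_desc) <= desc_tol)]
    pool = nominees if nominees else candidates  # fall back to the full pool
    return min(pool, key=lambda c: mae(target, c))
```

In this sketch, a block identical to the target always survives the descriptor filter (descriptor distance zero) and wins the MAE stage.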
Solid waste is a major issue in today's world and can contribute to pollution and the spread of vector-borne diseases. Because of its complicated nonlinear processes, this problem is difficult to model and optimize using traditional methods. In this study, a mathematical model was developed to optimize the cost of solid waste recycling and management. In the optimization phase, the salp swarm algorithm (SSA) is used to determine the levels of discarded and reclaimed solid waste. SSA is a recent technique for finding the optimal solution of a mathematical relationship based on leaders and followers. It takes many random solutions, together with their outward or inward fluctuations, t…
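A minimal sketch of SSA's leader/follower dynamics, following common formulations of the algorithm; the cost function and bounds below are illustrative placeholders, not the waste-management model of the study:

```python
import numpy as np

def ssa_minimize(f, lb, ub, n_salps=20, iters=200, seed=0):
    """Salp swarm algorithm sketch: the leader explores around the best
    solution found so far (the 'food source'); each follower moves to the
    midpoint between itself and its predecessor in the chain."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    dim = len(lb)
    X = rng.uniform(lb, ub, (n_salps, dim))          # random initial salps
    food = X[np.argmin([f(x) for x in X])].copy()    # best solution so far
    for t in range(iters):
        # Exploration coefficient decays exponentially over iterations.
        c1 = 2.0 * np.exp(-((4.0 * (t + 1) / iters) ** 2))
        for i in range(n_salps):
            if i == 0:  # leader: step outward/inward around the food source
                c2, c3 = rng.random(dim), rng.random(dim)
                step = c1 * ((ub - lb) * c2 + lb)
                X[i] = np.where(c3 < 0.5, food + step, food - step)
            else:       # follower: average with the previous salp
                X[i] = (X[i] + X[i - 1]) / 2.0
            X[i] = np.clip(X[i], lb, ub)
            if f(X[i]) < f(food):
                food = X[i].copy()
    return food
```

The decaying coefficient `c1` makes early iterations explore the whole search space and late iterations refine around the best solution, which is the leader/follower balance the abstract refers to.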
Regression testing is expensive and therefore calls for optimization. Typically, test-case optimization selects a reduced subset of test cases, or prioritizes the test cases so that potential faults are detected at an earlier phase. Many former studies relied on heuristic mechanisms to attain optimality while reducing or prioritizing test cases. Nevertheless, those studies lacked systematic procedures to manage the issue of tied test cases. Moreover, evolutionary algorithms such as the genetic algorithm often help reduce the number of test cases, together with a concurrent decrease in computational runtime. However, when the fault-detection capability must be examined along with other parameters, the method falls sh…
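For context, the widely used greedy "additional coverage" prioritization that such studies build on can be sketched as follows. The deterministic alphabetical tie-break is an illustrative placeholder for the systematic tie-handling the abstract says is missing from earlier work:

```python
def prioritize(tests):
    """Greedy 'additional' test-case prioritization: repeatedly pick the
    test covering the most not-yet-covered faults. tests maps a test name
    to the set of faults (or coverage items) it detects. Ties are broken
    alphabetically for determinism."""
    remaining = dict(tests)
    covered, order = set(), []
    while remaining:
        # sorted() fixes the scan order, so max() resolves ties by name.
        name = max(sorted(remaining),
                   key=lambda n: len(remaining[n] - covered))
        order.append(name)
        covered |= remaining.pop(name)
    return order
```

With `{"t1": {1, 2}, "t2": {2, 3, 4}, "t3": {5}}`, `t2` is picked first (three new faults); `t1` and `t3` then tie with one new fault each, and the tie-break orders `t1` before `t3`.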
In petroleum reservoir engineering, history matching refers to the calibration process in which a reservoir simulation model is validated by matching simulation outputs against observed measured data. Traditional history matching is performed manually by engineers: the most uncertain parameters are adjusted until a satisfactory match is obtained between the generated model and the historical information. This study focuses on step-by-step, trial-and-error history matching of the Mishrif reservoir to constrain an appropriate simulated model. Up to 1 January 2021, the Buzurgan Oilfield, which has eighty-five producers and sixteen injectors, had been under production for 45 years since it started…
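The manual trial-and-error loop can be schematized as a parameter sweep: a forward model is run for each candidate value of an uncertain parameter, and the value minimizing the misfit against observed history is kept. This is only a caricature of the workflow with a hypothetical forward model, not the reservoir simulator used in the study:

```python
import numpy as np

def misfit(simulated, observed):
    """Sum of squared differences between simulated and observed data."""
    return float(np.sum((np.asarray(simulated, float)
                         - np.asarray(observed, float)) ** 2))

def history_match(forward_model, observed, candidates):
    """Trial-and-error sweep: simulate each candidate value of the
    uncertain parameter and keep the one that best matches history."""
    best = min(candidates, key=lambda m: misfit(forward_model(m), observed))
    return best, misfit(forward_model(best), observed)
```

In practice each `forward_model` call is a full reservoir simulation, which is why the abstract emphasizes a systematic step-by-step procedure over blind sweeps.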
The demand for single-photon sources in quantum key distribution (QKD) systems has necessitated the use of weak coherent pulses (WCPs), characterized by a Poissonian photon-number distribution. Ensuring security against eavesdropping attacks requires keeping the mean photon number (µ) small and known to the legitimate partners. However, accurately determining µ poses challenges due to discrepancies between theoretical calculations and practical implementation. This paper introduces two experiments. The first involves theoretical calculation of µ using several filters to generate the WCPs. The second utilizes a variable attenuator to generate the WCPs, and the value of µ was estimated from the photons detected by the BB…
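For reference, the Poissonian photon-number statistics of a WCP follow directly from µ. This small helper (illustrative, not the paper's estimation procedure) shows why µ must be kept small: the multiphoton fraction, which an eavesdropper can exploit, grows with µ.

```python
from math import exp, factorial

def p_n(mu, n):
    """Poissonian probability that a weak coherent pulse with mean photon
    number mu contains exactly n photons: P(n) = e^{-mu} * mu^n / n!."""
    return exp(-mu) * mu ** n / factorial(n)

def multiphoton_prob(mu):
    """Probability that a pulse contains more than one photon, the
    quantity that must stay small for QKD security."""
    return 1.0 - p_n(mu, 0) - p_n(mu, 1)
```

For a typical µ = 0.1, roughly 90% of pulses are empty, about 9% carry one photon, and under 0.5% carry more than one.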
In this article, a numerical method integrated with a statistical data-simulation technique is introduced to solve a nonlinear system of ordinary differential equations with multiple random variable coefficients. Monte Carlo simulation combined with the central divided-difference formula of the finite difference (FD) method is repeated n times, simulating the variable coefficients as random samples rather than fixing them as real values with respect to time. The mean of the n final solutions of this integrated technique, named in short the mean Monte Carlo finite difference (MMCFD) method, represents the final solution of the system. This method is proposed for the first time to calculate the numerical solution obtained fo…
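A minimal sketch of the MMCFD idea, under the assumption of a single random coefficient and a central-difference (leapfrog) scheme started with one Euler step; the ODE, sampling law, and step size below are illustrative, not those of the article:

```python
import numpy as np

def mmcfd(rhs, y0, t_end, h, sample_coeff, n_runs=200, seed=0):
    """Mean Monte Carlo finite difference sketch: repeat a finite-difference
    solve n_runs times, drawing the random coefficient fresh each run, and
    return the mean of the n_runs final solutions."""
    rng = np.random.default_rng(seed)
    steps = int(t_end / h)
    finals = []
    for _ in range(n_runs):
        a = sample_coeff(rng)                    # random coefficient, this run
        y_prev, y = y0, y0 + h * rhs(a, y0)      # Euler starting step
        for _ in range(steps - 1):
            # Central divided difference: y_{k+1} = y_{k-1} + 2h f(y_k)
            y_prev, y = y, y_prev + 2.0 * h * rhs(a, y)
        finals.append(y)
    return float(np.mean(finals))
```

For dy/dt = -a·y with y(0) = 1 and a drawn uniformly from [0.9, 1.1], the averaged final value at t = 1 approximates E[e^(-a)] ≈ 0.368.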