Registration techniques remain challenging tasks for remote sensing users, especially after the enormous increase in the volume of remotely sensed data acquired by an ever-growing number of earth observation sensors. This surge mandates the development of accurate and robust registration procedures that can handle data with varying geometric and radiometric properties. This paper aims to develop the traditional registration scenarios to reduce discrepancies between registered datasets in two-dimensional (2D) space for remote sensing images. This is achieved by designing a computer program written in the Visual Basic language that follows two main stages. The first stage is a traditional registration process: a set of control point pairs is defined by manual selection, then the parameters of a global affine transformation model are computed to match them and resample the images. The second stage is refinement of the matching process by determining the shift in control point (CP) locations based on a radiometric similarity measure; the shift map technique is then applied to adjust the process using a 2nd order polynomial transformation function. This function was chosen after conducting statistical analyses comparing the common transformation functions (similarity, affine, projective and 2nd order polynomial). The results showed that the developed approach reduced the root mean square error (RMSE) of the registration process and decreased the discrepancies between registered datasets by 60%, 57% and 48%, respectively, for the three tested datasets.
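A minimal sketch of the first stage, assuming numpy and hypothetical control point coordinates: it fits the global affine model to CP pairs by least squares and reports the registration RMSE (the shift-map refinement stage is not shown).

```python
# Illustrative sketch: least-squares global affine fit over manually
# selected control point (CP) pairs, plus the RMSE over those CPs.
# Coordinates and function names are hypothetical, not from the paper.
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine transform mapping src CPs onto dst CPs.

    src, dst: (n, 2) arrays of (x, y) coordinates. Returns a 2x3 matrix.
    """
    n = src.shape[0]
    A = np.hstack([src, np.ones((n, 1))])      # design matrix [x, y, 1]
    params, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return params.T                            # 2x3 affine parameters

def rmse(src, dst, params):
    """Root mean square error of the fitted transform over the CPs."""
    n = src.shape[0]
    A = np.hstack([src, np.ones((n, 1))])
    residuals = A @ params.T - dst
    return np.sqrt(np.mean(np.sum(residuals**2, axis=1)))

# Hypothetical CP pairs (reference image vs. image to be registered).
src = np.array([[10.0, 12.0], [200.0, 30.0], [50.0, 180.0], [220.0, 210.0]])
dst = np.array([[12.5, 14.1], [203.0, 28.9], [51.8, 183.2], [224.1, 212.7]])
P = fit_affine(src, dst)
print("affine parameters:\n", P)
print("RMSE:", rmse(src, dst, P))
```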
Digital image manipulation has become increasingly prevalent due to the widespread availability of sophisticated image editing tools. In copy-move forgery, a portion of an image is copied and pasted into another area within the same image. The proposed methodology begins with extracting the image's features using the Local Binary Pattern (LBP) algorithm. Two main statistical functions, Standard Deviation (STD) and Angular Second Moment (ASM), are computed for each LBP feature, capturing additional statistical information about the local textures. Next, a multi-level LBP feature selection is applied to select the most relevant features. This process involves performing LBP computation at multiple scales or levels, capturing textures at different …
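As a rough illustration of the feature-extraction step, the sketch below computes LBP codes and then the STD and ASM of their distribution; the scikit-image call is real, but the parameter choices (P=8, R=1, 'uniform') are assumptions rather than the paper's settings.

```python
# Hedged sketch: LBP codes for one grayscale block, followed by the
# Standard Deviation (STD) and Angular Second Moment (ASM) of the LBP
# distribution. Parameters are illustrative assumptions.
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_std_asm(gray, points=8, radius=1):
    codes = local_binary_pattern(gray, points, radius, method="uniform")
    n_bins = points + 2                  # number of 'uniform' LBP codes
    hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
    std = np.std(codes)                  # spread of the local texture codes
    asm = np.sum(hist ** 2)              # energy/uniformity of the histogram
    return std, asm

# Hypothetical 8-bit grayscale block (e.g., one tile of the suspect image).
rng = np.random.default_rng(0)
block = rng.integers(0, 256, size=(64, 64)).astype(np.uint8)
print(lbp_std_asm(block))
```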
Most Internet of Vehicles (IoV) applications are delay-sensitive and require resources for data storage and task processing, which are very difficult for vehicles to afford. Such tasks are often offloaded to more powerful entities, like cloud and fog servers. Fog computing is a decentralized infrastructure located between the data source and the cloud that supplies several benefits, making it a non-frivolous extension of the cloud. The high volume of data generated by vehicles' sensors and the limited computation capabilities of vehicles have imposed several challenges on VANET systems. Therefore, VANETs are integrated with fog computing to form a paradigm named Vehicular Fog Computing (VFC), which provides low-latency services to mo…
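A hedged sketch of the offloading idea this paradigm enables, with a deliberately simple latency model and made-up figures (not the paper's scheme): a task goes to the nearby fog node when its estimated completion time meets the deadline, otherwise to the cloud.

```python
# Illustrative offloading rule for a VFC setting. All link speeds, CPU
# capacities and task figures below are placeholder assumptions.
from dataclasses import dataclass

@dataclass
class Task:
    size_mb: float      # data to transfer
    cycles_g: float     # gigacycles of computation
    deadline_s: float

def completion_time(task, link_mbps, cpu_ghz):
    # transfer time + processing time (intentionally simple model)
    return task.size_mb * 8 / link_mbps + task.cycles_g / cpu_ghz

def offload(task, fog=(100.0, 4.0), cloud=(20.0, 32.0)):
    t_fog = completion_time(task, *fog)      # nearby fog: fast link, modest CPU
    t_cloud = completion_time(task, *cloud)  # remote cloud: slow link, fast CPU
    if t_fog <= task.deadline_s:
        return "fog", t_fog
    return ("cloud", t_cloud) if t_cloud <= task.deadline_s else ("drop", None)

print(offload(Task(size_mb=5.0, cycles_g=2.0, deadline_s=1.0)))
```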
The main focus of this research is to examine the Travelling Salesman Problem (TSP) and the methods used to solve it. The TSP is one of the combinatorial optimization problems that has received wide publicity and attention from researchers, owing to its simple formulation, important applications and connections to other combinatorial problems. It is based on finding the optimal path through a known number of cities, where the salesman visits each city only once before returning to the city of departure. In this research, the (FMOLP) algorithm is employed as one of the best methods to solve the TSP, and the algorithm is applied in conjun…
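The FMOLP formulation itself is not reproduced here; as a minimal illustration of the underlying TSP, the brute-force sketch below (with hypothetical distances) enumerates all tours from a fixed departure city, which is feasible only for small instances.

```python
# Illustrative brute-force TSP: visit every city exactly once and return
# to the start, minimizing total distance. Distances are hypothetical.
from itertools import permutations

def tsp_brute_force(dist):
    """dist: symmetric matrix dist[i][j]; returns (best_cost, best_tour)."""
    n = len(dist)
    best = (float("inf"), None)
    for perm in permutations(range(1, n)):      # fix city 0 as departure
        tour = (0,) + perm + (0,)
        cost = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        best = min(best, (cost, tour))
    return best

dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 3],
        [10, 4, 3, 0]]
print(tsp_brute_force(dist))    # (18, (0, 1, 3, 2, 0))
```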
In this paper, estimation of the system reliability of the multi-component stress-strength model R(s,k) is considered, when the stress and strength are independent random variables that follow the Exponentiated Weibull Distribution (EWD) with known first shape parameter θ, while the second shape parameter α is unknown and is estimated using different estimation methods. Comparisons among the proposed estimators were made through the Monte Carlo simulation technique, based on the mean squared error (MSE) criterion.
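A hedged Monte Carlo sketch of R(s,k), the probability that at least s of k strength components exceed the common stress; the EWD parametrization F(x) = (1 - exp(-x^θ))^α and all parameter values below are assumptions for illustration, not the paper's settings.

```python
# Illustrative Monte Carlo estimate of the multicomponent stress-strength
# reliability R(s,k) under an assumed EWD parametrization.
import numpy as np

rng = np.random.default_rng(1)

def ewd_sample(alpha, theta, size):
    """Inverse-transform sampling from F(x) = (1 - exp(-x**theta))**alpha."""
    u = rng.uniform(size=size)
    return (-np.log(1.0 - u ** (1.0 / alpha))) ** (1.0 / theta)

def r_s_k(s, k, alpha_strength, alpha_stress, theta, n=200_000):
    stress = ewd_sample(alpha_stress, theta, n)            # one stress per trial
    strength = ewd_sample(alpha_strength, theta, (n, k))   # k strengths per trial
    exceed = (strength > stress[:, None]).sum(axis=1)
    return np.mean(exceed >= s)                            # P(at least s exceed)

print(r_s_k(s=2, k=4, alpha_strength=2.0, alpha_stress=1.0, theta=1.5))
```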
This paper proposes a completion that can allow fracturing four zones in a single trip in the well called “Y” (for confidential reasons) of the field named “X” (for confidential reasons). The steps to design a well completion for multiple fracturing are first to select the best completion method, then the required equipment and the materials it is made of. After that, the completion schematic must be drawn, by using Power Draw in this case, and the summary installation procedures explained. The data used to design the completion are the well trajectory, the reservoir data (including temperature, pressure and fluid properties), and the production and injection strategy. The results suggest that multi-stage hydraulic fracturing can …
XML is being incorporated into the foundation of e-business data applications. This paper addresses the problem of the freeform information stored in any organization and how XML, using this new approach, makes the search operation very efficient and time-saving. This paper introduces a new solution and methodology developed to capture and manage such unstructured freeform information (multi-information), depending on the use of XML schema technologies, the neural network idea and an object-oriented relational database, in order to provide a practical solution for efficiently managing a multi-freeform information system.
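Purely illustrative (the element names are hypothetical, and the neural-network and object-relational components are not shown), the sketch below captures one freeform item as XML and runs a simple keyword search over it.

```python
# Illustrative only: one freeform organizational record held as XML,
# with a basic keyword search. Element/attribute names are made up.
import xml.etree.ElementTree as ET

doc = ET.Element("freeform")
item = ET.SubElement(doc, "item", {"id": "1", "category": "memo"})
ET.SubElement(item, "title").text = "Quarterly maintenance note"
ET.SubElement(item, "body").text = "Pump B requires inspection before June."

def search(root, keyword):
    """Return ids of items whose body contains the keyword."""
    return [it.get("id") for it in root.iter("item")
            if keyword.lower() in (it.findtext("body") or "").lower()]

print(search(doc, "pump"))   # ['1']
```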
This paper is an attempt to study nonlinear second order delay multi-value problems. We aim to show that the properties of such problems are the same as the properties of those without delay, just more technically involved. Our results discuss several known properties and introduce some notations and definitions. We also give an approximate solution to the coined problems using Galerkin's method.
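As a hedged sketch of the Galerkin step, written here for a single-valued right-hand side for simplicity (the paper's multi-value setting, basis and boundary conditions may differ):

```latex
% Illustrative Galerkin formulation for a second-order delay problem;
% the basis and conditions are assumptions, not the paper's.
\[
  u''(t) = f\bigl(t,\, u(t),\, u(t-\tau)\bigr), \qquad t \in [0, T],
\]
\[
  u_N(t) = \sum_{i=1}^{N} c_i\,\varphi_i(t), \qquad
  \int_0^T \Bigl[ u_N''(t) - f\bigl(t,\, u_N(t),\, u_N(t-\tau)\bigr) \Bigr]
  \varphi_j(t)\,dt = 0, \quad j = 1, \dots, N,
\]
% giving N (generally nonlinear) equations for the coefficients c_1, ..., c_N.
```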
The rise of edge-cloud continuum computing is a result of the growing significance of edge computing, which has become a complementary or substitute option for traditional cloud services. The convergence of networking and computing presents a notable challenge due to their distinct historical development. Task scheduling is a major challenge in the context of edge-cloud continuum computing. The selection of the execution location of tasks is crucial in meeting the quality-of-service (QoS) requirements of applications. An efficient scheduling strategy for distributing workloads among virtual machines in the edge-cloud continuum data center is mandatory to ensure the fulfilment of QoS requirements for both customer and service provider. E…
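As one hedged example of such a strategy (not necessarily the paper's), the Min-Min heuristic below assigns each task to the VM giving the smallest completion time, using placeholder execution-time estimates.

```python
# Illustrative Min-Min scheduling over edge/cloud VMs. Each eta[i][j] is
# the assumed execution time of task i on VM j; figures are placeholders.
def min_min(eta):
    n_tasks, n_vms = len(eta), len(eta[0])
    ready = [0.0] * n_vms            # time at which each VM is next free
    unassigned = set(range(n_tasks))
    schedule = {}
    while unassigned:
        # Pick the (task, vm) pair with the smallest completion time.
        t, v = min(((i, j) for i in unassigned for j in range(n_vms)),
                   key=lambda p: ready[p[1]] + eta[p[0]][p[1]])
        schedule[t] = v
        ready[v] += eta[t][v]
        unassigned.remove(t)
    return schedule, max(ready)      # assignment and overall makespan

eta = [[4.0, 6.0], [2.0, 3.0], [5.0, 2.5]]   # 3 tasks, 2 VMs (edge, cloud)
print(min_min(eta))
```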