The Internet of Vehicles (IoV) is one of the most fundamental branches of the Internet of Things (IoT) and provides many advantages for drivers and passengers in terms of safety and traffic efficiency. Most IoV applications are delay-sensitive and require storage and computation resources that vehicles cannot afford on their own. Such tasks are therefore offloaded to more powerful nodes, such as the cloud or the fog. Vehicular Fog Computing (VFC), which extends cloud computing and brings resources closer to the edge of the network, has the potential to reduce both traffic congestion and the load on the cloud. The resource management and allocation process is critical for satisfying both user and provider needs; however, the strategy of offloading tasks to fog nodes under energy and latency constraints is still an open issue. Several research works have tackled the resource scheduling problem in the field of VFC, yet recent studies have not carefully addressed the transmission path to the destination node, nor have they considered the energy consumption of vehicles. This paper aims to optimize the task offloading process in the VFC system with respect to latency and energy objectives, while taking the deadline constraint into consideration, by adopting a Multi-Objective Evolutionary Algorithm (MOEA). Four different execution/transmission models are proposed in which vehicle resources are utilized for task execution and transmission, and the well-known Dijkstra's algorithm is adopted to find the minimum path between each pair of nodes. The simulation results show that the models which involve vehicles in the transmission process significantly reduce the latency and the total energy of the VFC system in comparison with the other models and current state-of-the-art methods.
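The abstract above adopts Dijkstra's algorithm for the shortest-path step. The following is a minimal sketch of that step only, assuming a simple weighted-graph representation; the node names and edge weights are illustrative and are not taken from the paper.

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path costs from `source` in a graph given as
    {node: [(neighbour, edge_weight), ...]} (e.g. link latencies)."""
    dist = {source: 0.0}
    visited = set()
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u in visited:
            continue
        visited.add(u)
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Illustrative topology: vehicle -> road-side units -> fog node.
topology = {
    "vehicle": [("rsu1", 2.0), ("rsu2", 5.0)],
    "rsu1": [("fog", 4.0)],
    "rsu2": [("fog", 1.0)],
    "fog": [],
}
print(dijkstra(topology, "vehicle"))  # {'vehicle': 0.0, 'rsu1': 2.0, 'rsu2': 5.0, 'fog': 6.0}
```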
Analysis is the core strength of GIS, because GIS can process and analyze different spatial and attribute data, leading to new results that support decision makers. This research aims to perform advanced hydrologic analyses for the western part of Karbala province, Iraq. The hydrologic analysis is based on a DEM created from a field survey. The analysis yields digital maps and tables showing the region's major and minor hydrological properties, such as flow direction, flow accumulation, stream order, stream-to-feature, basin, and watershed maps. The area, perimeter, stream lengths, and number of streams per order can also be calculated for the main watersheds affecting the study area. These analy…
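As an illustration of the flow-direction step mentioned above, here is a minimal D8-style sketch on a toy DEM grid; the elevation values, direction codes, and use of NumPy are assumptions for illustration, not the study's actual data or toolchain.

```python
import numpy as np

# Toy DEM (elevations); the study would use the DEM built from the field survey.
dem = np.array([[9.0, 8.0, 7.0],
                [8.0, 6.0, 5.0],
                [7.0, 5.0, 3.0]])

# D8 neighbour offsets with the usual power-of-two direction codes
# (E=1, SE=2, S=4, SW=8, W=16, NW=32, N=64, NE=128).
offsets = [(0, 1, 1), (1, 1, 2), (1, 0, 4), (1, -1, 8),
           (0, -1, 16), (-1, -1, 32), (-1, 0, 64), (-1, 1, 128)]

def d8_flow_direction(dem):
    """Assign each cell the code of its steepest downslope neighbour (0 = pit or edge sink)."""
    rows, cols = dem.shape
    direction = np.zeros((rows, cols), dtype=int)
    for r in range(rows):
        for c in range(cols):
            best_slope, best_code = 0.0, 0
            for dr, dc, code in offsets:
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    dist = 2 ** 0.5 if dr and dc else 1.0
                    slope = (dem[r, c] - dem[rr, cc]) / dist
                    if slope > best_slope:
                        best_slope, best_code = slope, code
            direction[r, c] = best_code
    return direction

print(d8_flow_direction(dem))
```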
In the current worldwide health crisis produced by the coronavirus disease (COVID-19), researchers and medical specialists began looking for new ways to tackle the epidemic. According to recent studies, Machine Learning (ML) has been effectively deployed in the health sector. Medical imaging sources (radiography and computed tomography) have aided in the development of artificial intelligence (AI) strategies to tackle the coronavirus outbreak. As a result, a classical machine learning approach for coronavirus detection from Computerized Tomography (CT) images was developed. In this study, a convolutional neural network (CNN) model is used for feature extraction and a support vector machine (SVM) for the classification of axial…
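The approach above pairs CNN feature extraction with an SVM classifier. A minimal sketch of the classification stage is given below, assuming the CNN features have already been extracted into a matrix; the random stand-in data, label names, and scikit-learn usage are illustrative, not the paper's pipeline.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Stand-in for CNN features extracted from CT slices: one row per image.
rng = np.random.default_rng(0)
features = rng.normal(size=(200, 512))   # hypothetical 512-D CNN feature vectors
labels = rng.integers(0, 2, size=200)    # toy labels: 0 = non-COVID, 1 = COVID

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=0)

# SVM classifier trained on top of the CNN features.
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(X_train, y_train)
print("toy accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```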
The necessary optimality conditions with Lagrange multipliers are studied and derived for a new class of optimal control problems governed by systems of Caputo–Katugampola fractional derivatives with free final time. An integration-by-parts formula is proven for the left Caputo–Katugampola fractional derivative, which contributes to finding and deriving the necessary optimality conditions. Three special cases are also obtained, including the necessary optimality conditions when both the final time and the final state are fixed. Under convexity assumptions, the necessary optimality conditions are shown to be sufficient.
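For reference, a commonly used form of the left Caputo–Katugampola fractional derivative of order \(0<\alpha<1\) with parameter \(\rho>0\) is
\[
{}^{C}D^{\alpha,\rho}_{a^{+}}x(t)
  = \frac{\rho^{\alpha}}{\Gamma(1-\alpha)}
    \int_{a}^{t} \frac{x'(s)}{\left(t^{\rho}-s^{\rho}\right)^{\alpha}}\,ds ,
\]
where setting \(\rho = 1\) recovers the classical Caputo derivative. This notation follows the standard literature and is given here as an assumption; it is not quoted from the paper itself.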
Various theories have been proposed since the last century to predict the first sighting of a new crescent moon. None of them uses machine or deep learning to process, interpret, and simulate the patterns hidden in the available databases. Many of these theories use interpolation and extrapolation techniques to identify sighting regions from such data. In this study, a pattern-recognizer artificial neural network was trained to distinguish between visibility regions. Essential parameters of crescent moon sighting were collected from moon-sighting datasets and used to build an intelligent pattern recognition system that predicts the crescent sighting conditions. The proposed ANN learned the datasets with an accuracy of more than 72% in comp…
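As an illustration of the pattern-recognition setup described above, here is a minimal sketch using a small feed-forward network on synthetic sighting parameters; the feature choices, the toy visibility rule, and the scikit-learn MLP are assumptions, not the study's actual dataset or architecture.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for crescent sighting parameters
# (e.g. moon altitude, elongation, lag time) with a binary visibility label.
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 3))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # toy "visible / not visible" rule

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

# Small feed-forward ANN acting as the pattern recognizer.
ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=1)
ann.fit(X_train, y_train)
print("toy visibility accuracy:", ann.score(X_test, y_test))
```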
In this paper, the experimentally obtained conditions for fusion splicing of photonic crystal fibers (PCFs) with large mode areas are reported. The physical mechanism of the splice loss and the microhole collapse property of the PCF were studied. By controlling the arc power and the arc time of a conventional electric arc fusion splicer (FSM-60S), the minimum splice loss for fusing two conventional single-mode fibers (SMF-28), which have similar mode field diameters, was 0.00 dB. For splicing a PCF (LMA-10) to a conventional single-mode fiber (SMF-28), the loss increased due to the mode field mismatch.
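The extra loss attributed above to mode field mismatch is commonly estimated with the Gaussian mode-overlap formula
\[
\alpha_{\mathrm{MFD}} \;=\; -20\,\log_{10}\!\left(\frac{2\,w_{1}w_{2}}{w_{1}^{2}+w_{2}^{2}}\right)\ \mathrm{dB},
\]
where \(w_1\) and \(w_2\) are the mode field radii of the two fibers; identical radii give 0 dB, consistent with the SMF-28-to-SMF-28 result. Using this formula here is an assumption for illustration rather than an expression quoted from the paper.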
Storing, transferring, and processing high-dimensional electroencephalogram (EEG) signals is a critical challenge. The goal of EEG compression is to remove redundant data in EEG signals. Medical signals like EEG must be of high quality for medical diagnosis. This paper uses a compression system with near-zero Mean Squared Error (MSE) based on the Discrete Cosine Transform (DCT) and double shift coding for fast and efficient EEG data compression. The paper investigates and compares the use or non-use of delta modulation, which is applied to the transformed and quantized input signal. As a final step, double shift coding is applied after mapping the output to positive values. The system performance is tested using EEG data files from the C…
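A minimal sketch of the transform, quantization, optional delta-modulation, and positive-mapping stages described above is given below. The segment length, quantization step, and synthetic signal are assumptions, and the double shift coding stage that would follow is only indicated in a comment.

```python
import numpy as np
from scipy.fft import dct, idct

rng = np.random.default_rng(0)
eeg = rng.normal(size=256)                  # stand-in for one EEG segment

# 1) Transform: the DCT concentrates signal energy in few coefficients.
coeffs = dct(eeg, norm="ortho")

# 2) Quantize with a uniform step (controls the near-zero MSE trade-off).
step = 0.05
q = np.round(coeffs / step).astype(int)

# 3) Optional delta modulation: keep differences between successive values.
delta = np.diff(q, prepend=0)

# 4) Map signed integers to non-negative ones before entropy coding
#    (even codes for x >= 0, odd codes for x < 0).
mapped = np.where(delta >= 0, 2 * delta, -2 * delta - 1)
# ... double shift coding of `mapped` would follow here.

# Decoder side (inverse of the steps above), to check reconstruction error.
unmapped = np.where(mapped % 2 == 0, mapped // 2, -(mapped + 1) // 2)
restored = idct(np.cumsum(unmapped) * step, norm="ortho")
print("MSE:", np.mean((eeg - restored) ** 2))
```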
Dense deployment of sensors is generally employed in wireless sensor networks (WSNs) to ensure energy-efficient coverage of a target area. Many sensor scheduling techniques have recently been proposed for designing such energy-efficient WSNs. Sensor scheduling has been modeled in the literature as a generalization of the minimum set covering problem (MSCP). The MSCP is a well-known NP-hard optimization problem used to model a large range of problems arising in scheduling, manufacturing, service planning, information retrieval, etc. In this paper, the MSCP is modeled to design an energy-efficient wireless sensor network (WSN) that can reliably cover a target area. Unlike other attempts in the literature, which consider only a si…
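Since the scheduling problem above is modeled as a minimum set covering problem, the classic greedy MSCP heuristic is sketched below as a point of reference; the sensor and target identifiers are illustrative, and the paper's own (richer) formulation is not reproduced here.

```python
def greedy_set_cover(universe, subsets):
    """Greedy MSCP heuristic: repeatedly pick the subset covering the most
    still-uncovered elements. `subsets` maps a name (e.g. a sensor) to the
    set of elements (e.g. target points) it covers."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        best = max(subsets, key=lambda s: len(subsets[s] & uncovered))
        if not subsets[best] & uncovered:
            raise ValueError("universe cannot be fully covered")
        chosen.append(best)
        uncovered -= subsets[best]
    return chosen

# Illustrative instance: which sensors to activate to cover all target points.
targets = range(1, 7)
coverage = {"s1": {1, 2, 3}, "s2": {3, 4}, "s3": {4, 5, 6}, "s4": {1, 6}}
print(greedy_set_cover(targets, coverage))   # e.g. ['s1', 's3']
```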
In this paper, a numerical approximation for a time-fractional one-dimensional bioheat equation (a heat transfer paradigm) describing the temperature distribution in tissues is introduced. It employs the Caputo fractional derivative for the time-fractional term and a new mixed nonpolynomial spline for the second-order space derivative. The convergence and stability of the present scheme are also analyzed by employing the Von Neumann method.
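For orientation, a time-fractional form of the one-dimensional Pennes bioheat equation with a Caputo derivative of order \(0<\alpha\le 1\) is commonly written as
\[
\rho c\,\frac{\partial^{\alpha} T(x,t)}{\partial t^{\alpha}}
 = k\,\frac{\partial^{2} T(x,t)}{\partial x^{2}}
 + \omega_{b}\rho_{b}c_{b}\bigl(T_{a}-T(x,t)\bigr) + Q_{m},
\]
where \(\rho, c, k\) are the tissue density, specific heat, and thermal conductivity, \(\omega_b, \rho_b, c_b\) the blood perfusion rate, density, and specific heat, \(T_a\) the arterial temperature, and \(Q_m\) the metabolic heat source. These symbols follow the standard literature and are given here as an assumption, since the paper's exact formulation is not reproduced in the abstract.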
In most manufacturing processes, and in spite of statistical control, several process capability indices indicate non-conformance of the true mean (µc) with the target mean (µT), and the variation is also high. In this paper, data for a blow-molded plastic product, the Zahi Bottle (ZB), have been analyzed and studied. WinQSB software was used to facilitate statistical process control, process capability analysis, and the calculation of some capability indices. The relationships between the different process capability indices and the true mean of the process, and then with the standard deviation (σ), were represented, in order to achieve a process capability value that can reduce the standard deviation and improve production out of theoretical con…
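For reference, the standard short-term capability indices implied above are Cp and Cpk; a minimal sketch of their calculation is shown below. The specification limits and sample data are hypothetical, not the Zahi Bottle measurements.

```python
import numpy as np

def capability_indices(data, lsl, usl):
    """Cp and Cpk from the sample mean and standard deviation,
    given lower (lsl) and upper (usl) specification limits."""
    mean, sigma = np.mean(data), np.std(data, ddof=1)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min((usl - mean) / (3 * sigma), (mean - lsl) / (3 * sigma))
    return cp, cpk

# Hypothetical bottle-weight sample with specification limits.
rng = np.random.default_rng(0)
sample = rng.normal(loc=50.2, scale=0.4, size=100)
print(capability_indices(sample, lsl=49.0, usl=51.0))
```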