Most Internet of Vehicles (IoV) applications are delay-sensitive and require resources for data storage and task processing that vehicles can hardly afford. Such tasks are therefore often offloaded to more powerful entities, such as cloud and fog servers. Fog computing is a decentralized infrastructure located between the data source and the cloud, and it offers several benefits that make it a non-trivial extension of the cloud. The high volume of data generated by vehicles' sensors, together with the limited computation capabilities of vehicles, imposes several challenges on VANET systems. VANETs are therefore integrated with fog computing to form a paradigm known as Vehicular Fog Computing (VFC), which provides low-latency services to mobile vehicles. Several studies have tackled the task offloading problem in the VFC field. However, recent studies have not carefully addressed the transmission path to the destination node and have not considered the energy consumption of vehicles. This paper aims to optimize the task offloading process in the VFC system with respect to latency and energy objectives under a deadline constraint by adopting a Multi-Objective Evolutionary Algorithm (MOEA). The Road Side Units (RSUs) x-Vehicles Multi-Objective Computation offloading method (RxV-MOC) is proposed, where an elite subset of vehicles is utilized as fog nodes for task execution and all vehicles in the system are utilized for task transmission. The well-known Dijkstra's algorithm is adopted to find the minimum-cost path between each pair of nodes. The simulation results show that RxV-MOC significantly reduces energy consumption and latency for the VFC system in comparison with the First-Fit algorithm, the Best-Fit algorithm, and the MOC method.
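As context for the path-finding step, the sketch below shows a standard Dijkstra shortest-path routine over a toy vehicle/RSU network; the graph structure, node names, and edge weights are illustrative assumptions, not taken from the paper.

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest paths over a weighted graph.

    graph: dict mapping node -> list of (neighbor, edge_weight) pairs,
    where the weight models the transmission cost of a link
    (e.g., vehicle-to-vehicle or vehicle-to-RSU).
    Returns (dist, prev) for path reconstruction.
    """
    dist = {source: 0.0}
    prev = {}
    heap = [(0.0, source)]
    visited = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in visited:
            continue
        visited.add(u)
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    return dist, prev

def shortest_path(prev, source, target):
    """Rebuild the minimum-cost transmission path from prev pointers."""
    path = [target]
    while path[-1] != source:
        path.append(prev[path[-1]])
    return path[::-1]

# Toy network: two vehicles relaying a task toward an RSU fog node.
graph = {
    "v1": [("v2", 1.5), ("rsu1", 4.0)],
    "v2": [("rsu1", 1.0)],
    "rsu1": [],
}
dist, prev = dijkstra(graph, "v1")
print(shortest_path(prev, "v1", "rsu1"))  # ['v1', 'v2', 'rsu1']
```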
Background: The types of glomerular disease found in patients requiring renal biopsy vary with their demographic characteristics.
Objective: To study the types of glomerular disease among adult Iraqi patients in a single center in Baghdad, Iraq.
Material and Methods: A total of 120 native kidney biopsies were studied. All biopsies were adequate and were processed for light microscopy.
The age range of the study patients was 17-67 years, with a mean of 38.5 years. The mean follow-up period was 28 weeks (range 4-52 weeks).
Indications for biopsy included: nephrotic syndrome (N=72; 60%), asymptomatic proteinuria (N=21; 17.5%), acute nephritic presentation (N=17; 14.16%), and asymptomatic haematuria (N=10; 8.33%).
Results: Primary glomerulonephritis
Continuous social development, the multiplicity of customers' desires, and customers' search for quality and durability in goods that deliver the best performance and meet their needs have led to quality being regarded as one of the competitive advantages that many industrial companies compete for and that customers look for. The research problem showed that the Diyala State Company for Electrical Industries relies on simple methods and personal experience to monitor product quality and does not adopt scientific methods and modern programs. The aim of this research is to design
The growing use of tele
This paper presents a new secret diffusion scheme called Round Key Permutation (RKP), based on a nonlinear, dynamic, and pseudorandom permutation, for encrypting images block by block. Images are a particular kind of data because of their size and their two-dimensional nature, characterized by high redundancy and strong correlation. Firstly, the permutation table is calculated according to the master key and sub-keys. Secondly, the pixels of each block to be encrypted are scrambled according to the permutation table. The AES encryption algorithm is then used in the proposed cryptosystem, replacing the linear permutation of the ShiftRows step with the nonlinear and secret permutation.
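To illustrate the block-scrambling step, here is a minimal sketch that derives a permutation table from a key and applies it to a 16-byte block; the key-to-table derivation (hashing the key to seed a Fisher-Yates shuffle) is an illustrative stand-in, not the actual RKP construction.

```python
import hashlib
import random

BLOCK = 16  # AES block size in bytes; one byte stands in for one pixel

def permutation_from_key(key: bytes, size: int = BLOCK) -> list:
    """Derive a pseudorandom permutation table from a key.

    Illustrative stand-in for the paper's RKP table: the key is hashed
    and the digest seeds a Fisher-Yates shuffle of the index range.
    """
    seed = int.from_bytes(hashlib.sha256(key).digest(), "big")
    rng = random.Random(seed)
    table = list(range(size))
    rng.shuffle(table)
    return table

def scramble_block(block: bytes, table: list) -> bytes:
    """Permute the bytes of one block according to the table."""
    return bytes(block[i] for i in table)

def unscramble_block(block: bytes, table: list) -> bytes:
    """Invert the permutation to recover the original block."""
    out = bytearray(len(block))
    for dst, src in enumerate(table):
        out[src] = block[dst]
    return bytes(out)

key = b"master-key"  # hypothetical key, for demonstration only
table = permutation_from_key(key)
blk = bytes(range(16))
assert unscramble_block(scramble_block(blk, table), table) == blk
```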
Today in the digital realm, images constitute a massive share of social media content but suffer from two issues, size and transmission cost, for which compression is the ideal solution. Pixel-based techniques are modern spatially optimized modeling techniques with deterministic and probabilistic bases that involve a mean, an index, and a residual. This paper introduces adaptive pixel-based coding techniques for the probabilistic part of a lossy scheme by incorporating the MMSA of the C321 base, along with lossless utilization of the deterministic part. The tested results achieved higher size-reduction performance compared to traditional pixel-based techniques and standard JPEG by about 40% and 50%,
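The mean/residual decomposition mentioned above can be illustrated with a toy block coder; the 2x2 block size and the encode/decode routines below are hypothetical simplifications, not the MMSA/C321 scheme under test.

```python
import numpy as np

def encode_blocks(img: np.ndarray, b: int = 2):
    """Toy pixel-based coder: per-block mean plus residuals.

    Splits the image into b x b blocks, storing each block's rounded
    mean (the deterministic part) and the pixel residuals around it.
    Illustrative only -- not the paper's MMSA/C321 scheme.
    """
    h, w = img.shape
    means, residuals = [], []
    for y in range(0, h, b):
        for x in range(0, w, b):
            block = img[y:y+b, x:x+b].astype(np.int16)
            m = int(round(block.mean()))
            means.append(m)
            residuals.append(block - m)  # small values compress well
    return means, residuals

def decode_blocks(means, residuals, shape, b: int = 2):
    """Lossless reconstruction from the stored means and residuals."""
    h, w = shape
    out = np.zeros(shape, dtype=np.int16)
    i = 0
    for y in range(0, h, b):
        for x in range(0, w, b):
            out[y:y+b, x:x+b] = residuals[i] + means[i]
            i += 1
    return out.astype(np.uint8)

img = np.random.randint(0, 256, (4, 4), dtype=np.uint8)
m, r = encode_blocks(img)
assert np.array_equal(decode_blocks(m, r, img.shape), img)
```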
As a follow-up to their previous work, the researchers set out to prepare a new azo imidazole. The ligand 4-[(2-Amino-4-phenylazo)-methyl]-cyclohexane carboxylic acid was prepared as a derivative of the diazonium salt of trans-4-(aminomethyl)cyclohexane carboxylic acid, and a series of its chelate complexes with metal ions was synthesized. These compounds were characterized using a variety of techniques, including elemental analysis, FTIR, LC-Mass, 1H-NMR, and UV-Vis spectroscopy, as well as TGA, conductivity, and magnetic measurements. Analytical data showed that the Co(II) complex adopts a 1:1 metal-ligand ratio with square planar and tetrahedral geometry, respectively, while a 1:2 metal-ligand ratio was found in the Cu(II), Cr(III), Mn(II), Zn(II), Ru(III), and Rh(III) complexes
Micro-perforated panel (MPP) absorbers are increasingly gaining popularity as an alternative sound absorber in buildings compared to the well-known synthetic porous materials. A single MPP has the typical features of a Helmholtz resonator: a high absorption amplitude but a narrow absorption frequency bandwidth. To improve the bandwidth, a single MPP can be cascaded with another single MPP to form a double-layer MPP. This paper proposes introducing inhomogeneous perforation into the double-layer MPP system (DL-iMPP) to enhance the absorption bandwidth of a double-layer MPP. Mathematical models are proposed using the equivalent electrical circuit model and are validated against experiments with good agreement. It is revealed that the DL-iMPP
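As a rough illustration of the equivalent-circuit approach, the sketch below computes the normal-incidence absorption coefficient of a generic double-layer MPP by cascading Maa's classical MPP impedance (panels as lumped series impedances) with air-gap transmission lines; the panel geometry is a placeholder, and this is a generic homogeneous model, not the paper's DL-iMPP formulation.

```python
import numpy as np

RHO, C, ETA = 1.21, 343.0, 1.81e-5  # air density, sound speed, viscosity

def maa_impedance(f, t, d, sigma):
    """Normalized impedance of one MPP layer (Maa's model).

    t: panel thickness [m], d: hole diameter [m], sigma: porosity.
    """
    omega = 2 * np.pi * f
    k = d * np.sqrt(omega * RHO / (4 * ETA))  # perforation constant
    r = (32 * ETA * t) / (sigma * RHO * C * d**2) * (
        np.sqrt(1 + k**2 / 32) + np.sqrt(2) / 32 * k * d / t)
    x = (omega * t) / (sigma * C) * (
        1 + 1 / np.sqrt(9 + k**2 / 2) + 0.85 * d / t)
    return r + 1j * x

def alpha_double_layer(f, mpp1, D1, mpp2, D2):
    """Normal-incidence absorption of MPP1 | gap D1 | MPP2 | gap D2 | wall."""
    k0 = 2 * np.pi * f / C
    def gap(D):  # air gap as a normalized transmission line
        return np.array([[np.cos(k0 * D), 1j * np.sin(k0 * D)],
                         [1j * np.sin(k0 * D), np.cos(k0 * D)]])
    def panel(z):  # MPP as a lumped series impedance
        return np.array([[1, z], [0, 1]])
    T = panel(maa_impedance(f, *mpp1)) @ gap(D1) \
        @ panel(maa_impedance(f, *mpp2)) @ gap(D2)
    zs = T[0, 0] / T[1, 0]  # surface impedance with rigid backing
    return 4 * zs.real / ((1 + zs.real)**2 + zs.imag**2)

# Placeholder geometry: 0.5 mm panels, 0.5 mm holes, 1% porosity.
print(alpha_double_layer(1000.0, (5e-4, 5e-4, 0.01), 0.02,
                         (5e-4, 5e-4, 0.01), 0.03))
```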
This research aimed to evaluate the activity of rosemary volatile oil and Nisin A in vivo and against B. cereus isolated from some canned meat products in vitro. The results showed that rosemary volatile oil (2000 µg/ml) and Nisin A (350 µg/ml) produced inhibition zone diameters of 27 and 19 mm, respectively, in the well diffusion method. The viable plate counts from samples of canned meat treated with the effective concentrations of rosemary volatile oil and Nisin A were examined. The samples with rosemary volatile oil showed no CFU/g after 9 days of preservation, while the sample with Nisin A and the control showed 49 and 45 CFU/g, respectively. In the in vivo experiment on mice, two weeks after an oral dose of rosemary volatile oil (2000
The soil wetting pattern from a subsurface drip is of great importance in the design of a subsurface drip irrigation (SDI) system for delivering the required water directly to the roots of the plant. An equation to estimate the dimensions of the wetted area in soil, taking water uptake by roots into account, is developed from numerical simulations using HYDRUS (2D/3D) software. In this paper, three soil textures, namely loamy sand, sandy loam, and loam, were used with three different crops, tomato, pepper, and cucumber, respectively, and different values of drip discharge, drip depth, and initial soil moisture content were proposed. The soil wetting patterns were obtained every thirty minutes for a total time of irrigation equ
The Dirichlet process is a fundamental object in nonparametric Bayesian modelling, applied to a wide range of problems in machine learning, statistics, and bioinformatics, among other fields. This flexible stochastic process models rich data structures with an unknown or evolving number of clusters. It is a valuable tool for encoding the true complexity of real-world data in computer models. Our results show that the Dirichlet process improves, both in distribution density and in signal-to-noise ratio, with larger sample size; achieves a slow decay rate to its base distribution; has improved convergence and stability; and thrives with a Gaussian base distribution, which performs much better than the Gamma distribution. The performance depends
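For reference, here is a minimal sketch of sampling from a Dirichlet process via its stick-breaking construction, with a Gaussian base distribution as in the abstract's comparison; the concentration parameter and truncation level are illustrative choices, not values from the study.

```python
import numpy as np

def stick_breaking_dp(alpha, base_sampler, n_atoms, rng=None):
    """Draw a truncated sample from a Dirichlet process DP(alpha, G0).

    Stick-breaking construction: weights w_k = v_k * prod_{j<k}(1 - v_j)
    with v_k ~ Beta(1, alpha); atoms are i.i.d. draws from the base
    distribution G0. Larger alpha spreads mass over more clusters.
    """
    rng = rng or np.random.default_rng()
    v = rng.beta(1.0, alpha, size=n_atoms)
    w = v * np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    atoms = base_sampler(rng, n_atoms)
    return w, atoms

# Gaussian base distribution, as favored in the abstract's results.
w, atoms = stick_breaking_dp(
    alpha=2.0,
    base_sampler=lambda rng, n: rng.normal(0.0, 1.0, size=n),
    n_atoms=100)
# Draw observations from the (truncated) random measure.
samples = np.random.default_rng().choice(atoms, size=10, p=w / w.sum())
print(samples)
```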