As cities across the world grow and populations become more mobile, the number of vehicles on roads has increased correspondingly. The result has been a proliferation of road traffic management challenges for authorities: traffic congestion, more accidents, and pollution. Despite the development of sophisticated traffic management systems and other vehicle-linked technologies, accidents remain a major cause of death; hence, a common system for accident management needs to be developed. For instance, traffic congestion in most urban areas can be alleviated by real-time route planning. However, the design of an efficie…
Cyber security is a term describing the collection of technologies, procedures, and practices that protect the online environment of a user or an organization. Medical images are among the most important and sensitive kinds of data in computer systems, and for medical reasons healthcare companies must encrypt all patient data, including images, before transferring it over computer networks. This paper presents a new direction in encryption research: the image is encrypted based on the domain of the extracted features, which are used to generate a key for the encryption process. The encryption process starts by applying edge detection. After dividing the bits of the edge image into (3×3) windows, the diffusions…
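The abstract above describes deriving an encryption key from extracted edge features. A minimal sketch of one way this could work, assuming a gradient-based edge detector and a SHA-256 hash of the binary edge map; the function name `edge_based_key` and the mean-threshold rule are illustrative assumptions, not the paper's actual method:

```python
import hashlib

import numpy as np

def edge_based_key(image: np.ndarray) -> bytes:
    """Derive a 32-byte key from an image's edge map (illustrative sketch)."""
    # Approximate the intensity gradient along each axis.
    gy, gx = np.gradient(image.astype(float))
    magnitude = np.hypot(gx, gy)
    # Binarize: pixels whose gradient exceeds the mean count as edges.
    edges = (magnitude > magnitude.mean()).astype(np.uint8)
    # Hash the binary edge map to obtain a fixed-length encryption key.
    return hashlib.sha256(edges.tobytes()).digest()
```

Because the key is a function of the image content itself, the same image always reproduces the same key, while visually different images yield unrelated keys.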
Nanofluids (i.e., nanoparticles dispersed in a fluid) have tremendous potential in a broad range of applications, including pharmacy, medicine, water treatment, soil decontamination, oil recovery, and CO2 geo-sequestration. In these applications nanofluid stability plays a key role, and robust stability is typically required. However, the fluids in these applications are saline, and no stability data is available for such salt-containing fluids. We thus measured and quantified nanofluid stability for a wide range of nanofluid formulations as a function of salinity, nanoparticle content, and various additives, and we investigated how this stability can be improved. Zeta sizer and dynamic light scattering (DLS) principles were used to invest…
In the task of intrinsic plagiarism detection, cases where a reference corpus is absent must be dealt with. The task is based entirely on inconsistencies within a given document. Detection of internal plagiarism has been treated as a classification problem; it can be estimated by taking into consideration self-based information from the given document.
The core contribution of the work proposed in this paper concerns the document representation: the document, as well as the disjoint segments generated from it, are represented as weight vectors that capture their main content, where each element of these vectors holds the term's average weight rather than its raw frequency.
Th
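The representation described above can be sketched as follows. This is an illustrative reading only: it assumes "average weight" means term frequency normalized by segment length, and uses cosine similarity against the whole-document vector to flag stylistically inconsistent segments. The function names and the threshold value are hypothetical, not taken from the paper:

```python
import math
from collections import Counter

def weight_vector(text: str) -> dict:
    """Represent text as a term vector of average weights
    (count normalized by token total) rather than raw counts."""
    tokens = text.lower().split()
    counts = Counter(tokens)
    n = len(tokens) or 1
    return {term: c / n for term, c in counts.items()}

def cosine(u: dict, v: dict) -> float:
    """Cosine similarity between two sparse term vectors."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def flag_outlier_segments(document: str, segments: list, threshold: float = 0.25) -> list:
    """Flag segments whose term-weight vector diverges from the whole document,
    a rough proxy for an internal style inconsistency."""
    doc_vec = weight_vector(document)
    return [seg for seg in segments if cosine(weight_vector(seg), doc_vec) < threshold]
```

Segments whose vocabulary distribution departs sharply from the document-wide vector fall below the similarity threshold and are returned as candidate plagiarized passages.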
Spraying pesticides is one of the most common procedures conducted to control pests. However, excessive use of these chemicals adversely affects the surrounding environment, including the soil, plants, animals, and the operators themselves. Therefore, researchers have been encouraged to...
Because of the rapid development and use of the Internet as a communication medium, a need has emerged for a high level of security during data transmission, and steganography is one such approach. This paper reviews Least Significant Bit (LSB) steganography for embedding a text file, together with a related image, in a gray-scale image. We also discuss the bit planes: an image can be divided into eight bit-plane images which, when combined, yield the actual image. The findings show that the stego-image is indistinguishable to the naked eye from the original cover image when the bit value is less than four; thus the goal of concealing the existence of a connection or hidden data is achieved. The Peak Signal-to-Noise Ratio (PSNR) and Mean Square Error (…
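As an illustration of the LSB technique the abstract reviews, the sketch below embeds message bits into the least significant bit of each pixel of a gray-scale cover array and computes PSNR against the cover. The helper names are illustrative, and the paper's exact embedding scheme may differ:

```python
import numpy as np

def embed_lsb(cover: np.ndarray, message: bytes) -> np.ndarray:
    """Hide message bits in the least significant bit of each pixel."""
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    flat = cover.flatten()
    if bits.size > flat.size:
        raise ValueError("message too large for cover image")
    # Clear each target pixel's LSB, then OR in one message bit.
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(cover.shape)

def extract_lsb(stego: np.ndarray, n_bytes: int) -> bytes:
    """Recover n_bytes of hidden data from the stego image's LSB plane."""
    bits = (stego.flatten()[:n_bytes * 8] & 1).astype(np.uint8)
    return np.packbits(bits).tobytes()

def psnr(cover: np.ndarray, stego: np.ndarray) -> float:
    """Peak Signal-to-Noise Ratio in dB for 8-bit images."""
    mse = np.mean((cover.astype(float) - stego.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255**2 / mse)
```

Since embedding perturbs each affected pixel by at most 1 gray level, the MSE stays tiny and the PSNR stays high, which matches the abstract's observation that the stego-image is visually indistinguishable from the cover.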
Most Internet of Vehicles (IoV) applications are delay-sensitive and require resources for data storage and task processing, which vehicles can rarely afford on their own. Such tasks are often offloaded to more powerful entities, like cloud and fog servers. Fog computing is a decentralized infrastructure located between the data source and the cloud; it supplies several benefits that make it a non-trivial extension of the cloud. The high volume of data generated by vehicles' sensors and the limited computation capabilities of vehicles have imposed several challenges on VANET systems. Therefore, VANETs are integrated with fog computing to form a paradigm named Vehicular Fog Computing (VFC), which provides low-latency services to mo…