Traffic management at road intersections is a complex problem and has long been an important topic of research. Solutions have focused primarily on vehicular ad hoc networks (VANETs). Key issues in VANETs are high mobility, road-layout constraints, frequent topology changes, failed network links, and timely data delivery, all of which make routing packets to a particular destination problematic. To address these issues, a new dependable routing algorithm is proposed that utilizes wireless communication between vehicles in urban vehicular networks. This position-based scheme, the maximum distance on-demand routing algorithm (MDORA), finds an optimal route hop by hop, selecting at each step the neighbor with the maximum distance toward the destination from the sender and a sufficient communication lifetime, which guarantees completion of the data transmission. Moreover, communication overhead is minimized by finding the next hop and forwarding the packet directly to it, without discovering the whole route first. MDORA is compared with the ad hoc on-demand distance vector (AODV) protocol in terms of throughput, packet delivery ratio, delay, and communication overhead, and it outperforms AODV on all of these metrics.
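The hop-by-hop selection rule described in the abstract can be sketched roughly as follows; this is a minimal illustration, not the paper's exact criterion. The coordinate-based neighbor table and the `min_lifetime` threshold are illustrative assumptions.

```python
import math

def next_hop(sender, dest, neighbors, min_lifetime=2.0):
    """Pick the neighbor that makes the most progress toward dest while
    its predicted link lifetime is long enough to finish the transfer.
    neighbors: list of (node_id, (x, y) position, predicted link lifetime).
    Returns None when no neighbor qualifies (local recovery would apply)."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    best, best_progress = None, 0.0
    for node, pos, lifetime in neighbors:
        # Progress = how much closer to the destination this hop gets us.
        progress = dist(sender, dest) - dist(pos, dest)
        if lifetime >= min_lifetime and progress > best_progress:
            best, best_progress = node, progress
    return best
```

Because only the next hop is computed, the sender never floods a full route-discovery request, which is where the overhead saving over AODV comes from.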
Stereolithography (SLA) has become an essential photocuring 3D printing process for producing parts of complex shape from photosensitive resin exposed to UV light. Selecting the best printing parameters for good accuracy and surface quality is further complicated by the geometric complexity of the models. This work introduces multiobjective optimization of SLA printing of 3D dental bridges based on simple CAD objects. It studies the best combination of a low-cost resin 3D printer's machine settings, namely normal exposure time, bottom exposure time, and number of bottom layers, for minimizing dimensional deviation and surface roughness. A multiobjective optimization method was utilized, combining the Taguchi method…
Bipedal robotic mechanisms are unstable because of the unilateral, passive contact joint between the sole and the ground. Since the system has many degrees of freedom, hierarchical control layers are crucial for generating walking patterns, stabilizing locomotion, and tracking the correct angular trajectories of the bipedal joints. This work presents a hierarchical control scheme for a bipedal robot that addresses balance (stabilization) and low-level tracking control while accounting for flexible joints. The stabilization control method uses the Newton–Euler formulation to establish a mathematical relationship between the zero-moment point (ZMP) and the center of mass (COM), resulting in highly nonlinear and coupled dynamic equations. Adaptive…
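For orientation, the ZMP–COM relationship the abstract refers to is commonly written, under the cart-table / linear inverted pendulum simplification (constant COM height, no angular-momentum term), as the following; the paper's full Newton–Euler derivation retains the nonlinear coupled terms that this simplified form drops.

```latex
% ZMP position from COM motion, assuming constant COM height z_c
% and gravitational acceleration g (simplified form, not the paper's
% full nonlinear model):
x_{\mathrm{zmp}} = x_c - \frac{z_c}{g}\,\ddot{x}_c, \qquad
y_{\mathrm{zmp}} = y_c - \frac{z_c}{g}\,\ddot{y}_c
```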
The performance of most heuristic search methods depends on parameter choices. These settings govern how new candidate solutions are generated and applied by the algorithm, and they play a key role in determining both the quality of the solution obtained and the efficiency of the search. Techniques for fine-tuning them remain an active research area. The Differential Evolution (DE) algorithm is a very powerful optimization method that has become popular in many fields. Owing to prolonged research on DE, it is now arguably one of the most outstanding stochastic optimization algorithms for real-parameter optimization. One reason for its popularity is its widely appreciated property of having only a small number of parameters…
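The small set of DE control parameters the abstract mentions is essentially the scale factor `F` and the crossover rate `CR`. A minimal sketch of one generation of the classic DE/rand/1/bin variant shows where they enter (the objective function and population here are illustrative):

```python
import random

def de_generation(pop, fitness, F=0.8, CR=0.9, rng=None):
    """One DE/rand/1/bin generation: mutation, crossover, greedy selection.
    F scales the difference vector; CR controls how many genes the trial
    inherits from the mutant. Assumes len(pop) >= 4."""
    rng = rng or random.Random(0)
    dim = len(pop[0])
    new_pop = []
    for i, target in enumerate(pop):
        # Mutation: combine three distinct vectors other than the target.
        a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
        mutant = [a[k] + F * (b[k] - c[k]) for k in range(dim)]
        # Binomial crossover: at least one gene always comes from the mutant.
        j_rand = rng.randrange(dim)
        trial = [mutant[k] if (rng.random() < CR or k == j_rand) else target[k]
                 for k in range(dim)]
        # Greedy selection (minimization): keep whichever is fitter.
        new_pop.append(trial if fitness(trial) <= fitness(target) else target)
    return new_pop
```

Because selection is greedy per individual, the best fitness in the population never worsens from one generation to the next.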
The Internet provides vital communications between millions of individuals and is increasingly used as a commerce tool; security is therefore of high importance for protecting communications and vital information. Cryptographic algorithms are essential to the field of security. Brute-force attacks are the major attacks on the Data Encryption Standard (DES), which is the main reason an improved DES structure is needed. This paper proposes a new, improved structure for DES that makes it secure and immune to such attacks. The improved structure was accomplished using standard DES with a new way of two-key generation…
This study aims to enhance the RC5 algorithm to improve encryption and decryption speeds on devices with limited power and memory. Such resource-constrained applications, ranging from wearables and smart cards to microscopic sensors, frequently operate in settings where traditional cryptographic techniques are impracticable because of their high computational overhead and memory requirements. The Enhanced RC5 (ERC5) algorithm integrates the PKCS#7 padding method to adapt effectively to various data sizes. Empirical investigation reveals significant improvements in encryption speed with ERC5, ranging from 50.90% to 64.18% for audio files and 46.97% to 56.84% for image files, depending on file size. A substantial…
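The PKCS#7 padding that ERC5 adopts is a standard scheme: each pad byte equals the pad length, so the padding is self-describing and removable without a separate length field. A minimal sketch (the 8-byte block size matches RC5's default 64-bit block; ERC5's own block size may differ):

```python
def pkcs7_pad(data: bytes, block_size: int = 8) -> bytes:
    """Append PKCS#7 padding so len(result) is a multiple of block_size.
    A full block of padding is added when data already fits exactly."""
    pad_len = block_size - (len(data) % block_size)
    return data + bytes([pad_len] * pad_len)

def pkcs7_unpad(data: bytes) -> bytes:
    """Strip PKCS#7 padding; the last byte tells how many bytes to drop
    (assumes well-formed, validated input)."""
    return data[:-data[-1]]
```

Usage: `pkcs7_pad(b"hello", 8)` yields `b"hello\x03\x03\x03"`, and `pkcs7_unpad` recovers the original message exactly.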
Merging biometrics with cryptography has become familiar and has opened a rich scientific field for researchers. Biometrics adds a distinctive property to security systems, because biometric features are unique to each person. This study presents a new method for ciphering data based on fingerprint features: a plaintext message is addressed, according to the positions of minutiae extracted from a fingerprint, into a generated random text file, regardless of the size of the data. The proposed method covers three scenarios. In the first scenario, the message is placed directly in the random text at the minutiae positions; in the second scenario, the message is encrypted with a chosen word before ciphering…
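The first scenario, placing message characters into a random cover text at minutiae-derived positions, can be sketched as follows. This is an illustrative reconstruction; the paper's exact mapping from (x, y) minutiae coordinates to text offsets is not specified here, so positions are taken as precomputed indices:

```python
def embed(message: str, positions: list[int], cover: str) -> str:
    """Write message characters into the cover text at the given
    minutiae-derived index positions (sorted for a stable order)."""
    assert len(positions) >= len(message), "need one position per character"
    chars = list(cover)
    for ch, pos in zip(message, sorted(positions)):
        chars[pos] = ch
    return "".join(chars)

def extract(stego: str, positions: list[int], length: int) -> str:
    """Read the message back from the same positions, in the same order."""
    return "".join(stego[p] for p in sorted(positions)[:length])
```

Both sides must share the fingerprint (hence the positions), which is what ties the scheme's secrecy to the biometric rather than to a conventional key.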
Vol. 6, Issue 1 (2025)
Dust is a frequent contributor to health risks and climate change, and is among the most serious issues facing people today. It is driven by desertification, drought, agricultural practices, and sand and dust storms from neighboring regions. Deep learning (DL) regression based on long short-term memory (LSTM) is proposed to increase the accuracy of dust forecasting and monitoring. The proposed system has two parts: in the first step, LSTM and dense layers are used to build a dust-detection system, while in the second step, a Wireless Sensor Network (WSN) and Internet of Things (IoT) model is used for forecasting and monitoring. The experimental DL system…
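Before an LSTM regressor can forecast dust levels, the raw time series must be windowed into (input sequence, next value) pairs. A minimal sketch of that preprocessing step; the lookback length of 3 is an illustrative choice, not the paper's setting:

```python
def make_windows(series: list[float], lookback: int = 3):
    """Turn a dust-concentration time series into supervised pairs:
    each input is `lookback` consecutive readings, each target is the
    reading that immediately follows, which is what an LSTM regression
    layer (followed by a dense output layer) is trained on."""
    X, y = [], []
    for i in range(len(series) - lookback):
        X.append(series[i:i + lookback])
        y.append(series[i + lookback])
    return X, y
```

In deployment, the WSN/IoT nodes would stream readings into this windowing step, and each new window yields the next one-step-ahead forecast.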