This work focuses on developing a robust and feasible algorithm for estimating vehicle travel times on a highway from traffic information extracted from stored camera image sequences. The travel-time estimation strategy relies on identifying the prevailing traffic state. Individual vehicle velocities are obtained from detected vehicle positions in two consecutive images by computing the distance covered over the elapsed time, interpolating between the extracted traffic-flow data, and developing a scheme to accurately predict vehicle travel times. The Erbil road database is used to identify road regions around road segments; these regions are projected into the calibrated camera images, and detected vehicles are then assigned to the corresponding route segment so that instantaneous and current velocities can be calculated. All data were processed and visualized using MATLAB and Python and their libraries.
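As a concrete illustration of the velocity computation described above, the sketch below derives an instantaneous speed from two consecutive detections of the same vehicle and a segment travel time from it. It is a minimal sketch under assumed inputs (vehicle positions already projected into road coordinates in metres, frame timestamps in seconds); the function names are hypothetical, not taken from the paper.

```python
import math

def instantaneous_speed(p1, p2, t1, t2):
    """Speed (m/s) from two consecutive detections of the same vehicle."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    distance = math.hypot(dx, dy)   # distance covered between the two frames
    return distance / (t2 - t1)     # divided by the elapsed time

def segment_travel_time(length_m, mean_speed_ms):
    """Predicted travel time (s) for a road segment at the current mean speed."""
    return length_m / mean_speed_ms

# Example: a vehicle moves 8.3 m between frames captured 0.5 s apart.
v = instantaneous_speed((0.0, 0.0), (8.3, 0.0), 0.0, 0.5)   # 16.6 m/s, about 60 km/h
print(segment_travel_time(500.0, v))                        # about 30 s for a 500 m segment
```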
Loud noise can be extremely harmful to the auditory system and to human health in general, and traffic is the primary source of noise pollution. The study's goal was to determine how vehicle type and speed affect the amount of noise generated by traffic. These two factors were investigated at seven arterial streets throughout Kirkuk city, where noise levels were measured. The measurements were taken during peak hours and compared against WHO noise standards. Traffic volume and vehicle speed are shown to be the key factors driving increases in noise level.
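The abstract does not state how the readings were aggregated; a standard aggregate for such comparisons is the equivalent continuous sound level L_eq = 10 log10((1/N) Σ 10^(L_i/10)). A minimal sketch, with hypothetical readings and the commonly cited WHO daytime outdoor guideline of 55 dB(A):

```python
import math

def leq(samples_db):
    """Equivalent continuous sound level L_eq from short-interval dB(A) readings."""
    return 10 * math.log10(sum(10 ** (l / 10) for l in samples_db) / len(samples_db))

readings = [68.0, 72.5, 70.1, 75.3, 69.8]   # hypothetical peak-hour readings, dB(A)
who_daytime_limit = 55.0                     # commonly cited WHO outdoor guideline, dB(A)
print(leq(readings), leq(readings) > who_daytime_limit)
```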
In this research, we discuss how to improve development practice by addressing the factors that help a small IT organization produce software using a suitable development process, supported by empirical studies, to achieve its goals. This starts with selecting the methodology used to implement the software: the steps chosen should be compatible with the type of product the organization will produce, which here is Web-Based Project Development.
The researcher suggests Extreme Programming (XP) as a methodology for Web-Based Project Development and justifies this suggestion, showing how important and effective the methodology is in software development.
Big data usually runs in large-scale, centralized key management systems. However, centralized key management raises problems such as a single point of failure, exchanging a secret key over insecure channels, third-party queries, and the key escrow problem. To avoid these problems, we propose an improved certificate-based encryption scheme that ensures data confidentiality by combining symmetric and asymmetric cryptography. The combination is implemented using the Advanced Encryption Standard (AES) and Elliptic Curve Diffie-Hellman (ECDH). The proposed scheme is an enhanced version of the Certificate-Based Encryption (CBE) scheme and preserves all its advantages.
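A minimal sketch of the AES/ECDH combination the abstract describes, using the Python cryptography package; it illustrates hybrid encryption only and is not the authors' full certificate-based scheme (the certificate handling and key-escrow avoidance are omitted):

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Each party generates an EC key pair; in a CBE setting the public keys
# would be certified before the exchange.
alice_priv = ec.generate_private_key(ec.SECP256R1())
bob_priv = ec.generate_private_key(ec.SECP256R1())

# ECDH: both sides derive the same shared secret from their own private key
# and the peer's public key, so no secret ever crosses the channel.
shared = alice_priv.exchange(ec.ECDH(), bob_priv.public_key())

# Derive a 256-bit AES key from the shared secret.
aes_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
               info=b"big-data-session").derive(shared)

# Bulk data is protected with symmetric AES-GCM (fast and authenticated).
nonce = os.urandom(12)
ciphertext = AESGCM(aes_key).encrypt(nonce, b"sensitive big-data record", None)
plaintext = AESGCM(aes_key).decrypt(nonce, ciphertext, None)
```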
Undoubtedly, Road Traffic Accidents (RTAs) are a major dilemma, in terms of mortality and morbidity, facing road users as well as traffic and road authorities. Since 2002, the population of Iraq has increased by 49 percent and the number of vehicles has tripled. Unfortunately, these increases were accompanied by rises in the number of RTAs and in the associated mortality and morbidity. Alongside the humanitarian tragedies, Iraq suffers considerable economic losses every year due to the epidemic of RTAs. Understanding the contributory factors related to RTAs is therefore essential if traffic and road authorities are to improve road safety, and the need has arisen for such an investigation.
Fuzzy logic is used to solve the load flow and contingency analysis problems, decreasing computing time; it is a better choice than the traditional methods. The proposed method is very accurate with outstanding computation time, which makes fuzzy load flow (FLF) suitable for real-time application in small- as well as large-scale power systems. In addition, FLF can efficiently solve the load flow problem for ill-conditioned power systems and perform contingency analysis. The FLF method using a Gaussian membership function requires fewer iterations and less computing time than the FLF method using a triangular membership function. A sparsity technique is applied to the input Ybus matrix data.
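For reference, the two membership-function shapes being compared look like this; the centre and spread parameters below are hypothetical, not taken from the paper:

```python
import numpy as np

def gaussian_mf(x, c, sigma):
    """Gaussian membership: smooth, nonzero everywhere, centre c, spread sigma."""
    return np.exp(-0.5 * ((x - c) / sigma) ** 2)

def triangular_mf(x, a, b, c):
    """Triangular membership with feet a, c and peak b."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

v = np.linspace(0.9, 1.1, 5)                 # e.g. per-unit bus voltages
print(gaussian_mf(v, 1.0, 0.03))             # smooth grades of membership
print(triangular_mf(v, 0.95, 1.0, 1.05))     # zero outside the feet
```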
At present, smooth movement on the roads is needed by every road user. Many roads, especially in urban areas, have been geometrically improved because the number of vehicles increases over time.
In this research, the Highway Capacity Software (HCS 2000) is adopted to determine the effectiveness of a roundabout in terms of its capacity, delay, and level of service.
The results of the analysis indicate that the Ahmed Urabi roundabout operates at level of service F, with an average control delay of 300 seconds per vehicle during peak hours.
The through movements of the Alkarrada-Aljadiriya direction (the major direction) represent the heaviest traffic.
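The LOS grade follows directly from the average control delay; the thresholds below are the HCM values commonly used for unsignalized intersections and roundabouts (the paper may use slightly different bounds):

```python
def roundabout_los(delay_s):
    """Level of service from average control delay (s/veh), using the
    HCM thresholds for unsignalized intersections and roundabouts."""
    for limit, grade in ((10, "A"), (15, "B"), (25, "C"), (35, "D"), (50, "E")):
        if delay_s <= limit:
            return grade
    return "F"

print(roundabout_los(300))   # the reported 300 s/veh peak-hour delay -> 'F'
```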
In this article, we design an optimal neural network based on a new LM training algorithm. The traditional LM algorithm requires high memory, storage, and computational overhead because it must update Hessian approximations in each iteration. The suggested design converts the original problem into a minimization problem, using a feed-forward network to solve nonlinear 3D PDEs. An optimal design is obtained by computing the learning parameters with high precision. Examples are provided to portray the efficiency and applicability of this technique. Comparisons with other designs are also conducted to demonstrate the accuracy of the proposed design.
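For orientation, the classic LM step that the new algorithm modifies looks like this; the toy fitting problem is illustrative only, not from the paper:

```python
import numpy as np

def lm_step(theta, residual_fn, jacobian_fn, mu):
    """One Levenberg-Marquardt update:
       theta <- theta - (J^T J + mu*I)^{-1} J^T r,
       where J^T J is the Hessian approximation of 0.5*||r||^2
       that the traditional algorithm must store and update."""
    r = residual_fn(theta)
    J = jacobian_fn(theta)
    H = J.T @ J + mu * np.eye(theta.size)   # damped Gauss-Newton Hessian approx.
    return theta - np.linalg.solve(H, J.T @ r)

# Toy example: fit y = a*x + b by driving the residuals to zero.
x = np.array([0.0, 1.0, 2.0, 3.0]); y = np.array([1.0, 3.1, 4.9, 7.2])
res = lambda th: th[0] * x + th[1] - y
jac = lambda th: np.stack([x, np.ones_like(x)], axis=1)
theta = np.zeros(2)
for _ in range(5):
    theta = lm_step(theta, res, jac, mu=1e-2)
print(theta)   # approaches (a, b) ~ (2.04, 0.99)
```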
The objective of this work is to combine human biometric characteristics with unique attributes of the computer in order to protect computer networks and resource environments through the development of authentication and authorization techniques. On the biometric side, the best methods and algorithms were studied, and the conclusion is that the fingerprint is the best, although it has some flaws. The fingerprint algorithm has been improved to enhance the clarity of the ridge and valley structures of fingerprint images, taking into account the estimation of local ridge orientation and frequency. On the computer side, the computer and its components, like humans, have unique attributes.
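The local ridge-orientation estimation mentioned above is commonly done with a gradient-based least-squares estimator; the sketch below shows that standard estimator, which may differ in detail from the authors' improved algorithm:

```python
import numpy as np

def ridge_orientation(block):
    """Dominant ridge direction of a grayscale fingerprint block (radians),
       via the classic gradient-based least-squares estimator; the ridge
       runs perpendicular to the average gradient direction."""
    gy, gx = np.gradient(block.astype(float))
    return 0.5 * np.arctan2(2 * np.sum(gx * gy),
                            np.sum(gx ** 2 - gy ** 2)) + np.pi / 2

# Synthetic diagonal ridges as a quick check.
yy, xx = np.mgrid[0:32, 0:32]
stripes = np.sin((xx + yy) * 0.8)
print(np.degrees(ridge_orientation(stripes)))   # ~135 degrees ridge direction
```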
Optimizing Access Point (AP) deployment plays a great role in wireless applications due to the need to provide efficient communication at low deployment cost. Quality of Service (QoS) is a significant parameter and objective to be considered alongside AP placement, as is the overall deployment cost. This study proposes and investigates a multi-level optimization algorithm called the Wireless Optimization Algorithm for Indoor Placement (WOAIP), based on Binary Particle Swarm Optimization (BPSO). WOAIP aims to obtain the optimum multi-floor AP placement with effective coverage, making it more capable of supporting QoS and cost-effectiveness. Five pairs of (coverage, AP deployment) weights, signal thresholds, and received signal strengths are investigated.
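A minimal binary-PSO sketch with a sigmoid transfer function, the mechanism WOAIP builds on; the coverage vector and fitness weighting below are hypothetical placeholders for the paper's coverage/deployment objective:

```python
import numpy as np

def bpso(fitness, n_bits, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5):
    """Minimal binary PSO: each bit of a particle marks whether an AP is
       placed at the corresponding candidate location."""
    rng = np.random.default_rng(0)
    x = rng.integers(0, 2, (n_particles, n_bits))
    v = np.zeros((n_particles, n_bits))
    pbest = x.copy(); pbest_f = np.array([fitness(p) for p in x])
    gbest = pbest[pbest_f.argmax()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = (rng.random(x.shape) < 1 / (1 + np.exp(-v))).astype(int)  # sigmoid transfer
        f = np.array([fitness(p) for p in x])
        improved = f > pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmax()].copy()
    return gbest

# Hypothetical objective: reward covered demand points, penalise AP count.
coverage = np.array([3, 5, 2, 4, 1, 6])   # demand covered by each candidate AP
fitness = lambda bits: bits @ coverage - 2.5 * bits.sum()
print(bpso(fitness, n_bits=6))            # selects only the cost-effective sites
```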