Routing protocols are responsible for providing reliable communication between source and destination nodes. Their performance in the ad hoc network family is influenced by several factors, such as the mobility model, traffic load, transmission range, and the number of mobile nodes, which makes evaluation a significant issue. Several simulation studies have explored individual routing protocols with selected performance parameters, but few compare multiple protocols across both routing and Quality of Service (QoS) metrics. This paper presents a simulation-based comparison of proactive, reactive, and multipath routing protocols in mobile ad hoc networks (MANETs). Specifically, the performance of the AODV, DSDV, and AOMDV protocols is evaluated and analyzed while varying the number of mobile nodes, the pause time, and the number of traffic connections. Routing and QoS performance metrics such as normalized routing load, routing packets, packet delivery ratio, packet drop, end-to-end delay, and throughput are measured to compare the three routing protocols. Simulation results indicate that AODV outperforms DSDV and AOMDV in most of the metrics, AOMDV is better than DSDV in terms of end-to-end delay, and DSDV provides the lowest throughput. Network topology parameters have only a slight impact on AODV performance.
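As an illustration of how the routing and QoS metrics above are typically derived from simulation traces, the following sketch computes them from aggregate counters. The counter names and example numbers are assumptions for illustration only, not values from the study.

```python
# Hypothetical post-processing of simulation counters (e.g., parsed from an
# ns-2 trace file); field names and values here are illustrative only.

def qos_metrics(sent_pkts, recv_pkts, routing_pkts, recv_bytes,
                total_delay_s, sim_time_s):
    """Return the common MANET routing/QoS metrics from aggregate counters."""
    pdr = recv_pkts / sent_pkts                      # packet delivery ratio
    packet_drop = sent_pkts - recv_pkts              # dropped data packets
    nrl = routing_pkts / recv_pkts                   # normalized routing load
    avg_delay = total_delay_s / recv_pkts            # average end-to-end delay (s)
    throughput_kbps = recv_bytes * 8 / sim_time_s / 1000
    return {"PDR": pdr, "drops": packet_drop, "NRL": nrl,
            "avg_delay_s": avg_delay, "throughput_kbps": throughput_kbps}

# Example with made-up numbers:
print(qos_metrics(sent_pkts=10_000, recv_pkts=9_450, routing_pkts=3_200,
                  recv_bytes=9_450 * 512, total_delay_s=180.0, sim_time_s=900.0))
```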
In this paper, horizontal and vertical OFETs with poly(3-hexylthiophene) (P3HT) as the active (p-type) semiconductor layer are compared using two different gate insulators (ZrO2 and PVA). The electrical performance, namely the output (Id-Vd) and transfer (Id-Vg) characteristics, was investigated using the gradual-channel approximation model. The devices show the typical output curves of a field-effect transistor (FET). The electrical characterization was analyzed in order to investigate the dependence of the current on the source-drain voltage (Vd) and the effects of the gate dielectric on the electrical performance of the OFET. This work also considered the effects of the semiconductor capacitance on the performance of OFETs. The value …
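For reference, these are the standard gradual-channel (Shichman-Hodges) drain-current expressions commonly used to fit OFET output and transfer curves; the paper may use an equivalent or extended form. Here W and L are the channel width and length, μ the field-effect mobility, C_i the gate-insulator capacitance per unit area, and V_T the threshold voltage.

```latex
% Standard gradual-channel drain-current expressions (sketch for reference):
% W, L: channel width/length; \mu: mobility; C_i = \varepsilon_0\varepsilon_r/d.
\[
  I_D =
  \begin{cases}
    \dfrac{W}{L}\,\mu C_i\!\left[(V_G - V_T)V_D - \dfrac{V_D^{2}}{2}\right],
      & |V_D| < |V_G - V_T| \quad \text{(linear)}\\[2ex]
    \dfrac{W}{2L}\,\mu C_i\,(V_G - V_T)^{2},
      & |V_D| \ge |V_G - V_T| \quad \text{(saturation)}
  \end{cases}
\]
```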
Abstract
The Non-Homogeneous Poisson process is one of the statistical subjects that is important in other sciences and has wide application in different areas such as waiting lines (queueing), repairable systems, computer and communication systems, and reliability theory, among many others. It is also used to model phenomena that occur in a non-constant way over time (events whose rate changes with time).
This research deals with some of the basic concepts related to the Non-Homogeneous Poisson process. It considers two models of the Non-Homogeneous Poisson process, the power law model and the Musa-Okumoto model, to estimate the …
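As a sketch of the two intensity models named above, the following code evaluates the commonly used power-law and Musa-Okumoto intensity functions and simulates a Non-Homogeneous Poisson process by Lewis-Shedler thinning. Parameter names and values are illustrative assumptions, not the estimates obtained in the research.

```python
import random

# Illustrative intensity functions (parameter names are assumptions, not the
# research's notation).

def power_law_intensity(t, alpha=1.5, beta=0.7):
    """Power-law (Crow/AMSAA) NHPP intensity: lambda(t) = alpha*beta*t**(beta-1)."""
    return alpha * beta * t ** (beta - 1) if t > 0 else float("inf")

def musa_okumoto_intensity(t, lam0=2.0, theta=0.05):
    """Musa-Okumoto logarithmic NHPP intensity: lambda(t) = lam0 / (1 + lam0*theta*t)."""
    return lam0 / (1.0 + lam0 * theta * t)

def simulate_nhpp(intensity, t_end, lam_max, rng=random.Random(0)):
    """Lewis-Shedler thinning: simulate NHPP event times on (0, t_end]."""
    t, events = 0.0, []
    while True:
        t += rng.expovariate(lam_max)                # candidate from homogeneous PP
        if t > t_end:
            return events
        if rng.random() <= intensity(t) / lam_max:   # accept with prob lambda(t)/lam_max
            events.append(t)

# Musa-Okumoto intensity is maximal at t = 0 (lam0), so lam_max = 2.0 dominates it.
print(simulate_nhpp(musa_okumoto_intensity, t_end=100.0, lam_max=2.0)[:5])
```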
In this research, a comparison has been made between robust M-estimators for the cubic smoothing splines technique, used to avoid the problem of non-normality in the data or contamination of the errors, and the traditional estimation method for cubic smoothing splines, using two criteria of differentiation (MADE, WASE) for different sample sizes and contamination levels, in order to estimate the time-varying coefficient functions for balanced longitudinal data. Such data are characterized by observations obtained from (n) independent subjects, each of which is measured repeatedly at a set of specific time points (m), since the repeated measurements within subjects are usually correlated …
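The following is a minimal sketch of an M-type (Huber-weighted) cubic smoothing spline, implemented here as iteratively reweighted calls to scipy's UnivariateSpline on simulated data with a few gross outliers. It illustrates the idea of a robust fit and a MADE-like criterion; it is not the estimator compared in the research.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def robust_smoothing_spline(x, y, s=None, c=1.345, n_iter=10):
    """Iteratively reweighted cubic smoothing spline with Huber weights."""
    w = np.ones_like(y)
    for _ in range(n_iter):
        spl = UnivariateSpline(x, y, w=w, k=3, s=s)
        r = y - spl(x)
        scale = 1.4826 * np.median(np.abs(r - np.median(r))) + 1e-12  # robust MAD scale
        u = np.abs(r) / scale
        w = np.where(u <= c, 1.0, c / np.maximum(u, 1e-12))  # down-weight outliers
    return spl

# Simulated longitudinal-style curve with contaminated observations.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 100)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, x.size)
y[::17] += 3.0                                   # gross outliers
fit = robust_smoothing_spline(x, y, s=1.0)
print(float(np.mean(np.abs(fit(x) - np.sin(2 * np.pi * x)))))  # MADE-like criterion
```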
In this paper, the double Sumudu and double Elzaki transform methods are used to compute numerical solutions for some types of fractional-order partial differential equations with constant coefficients. The efficiency of the methods is illustrated by numerical examples computed in Mathcad 15, with graphics produced in Matlab R2015a.
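For reference, these are the commonly used definitions of the single and double Sumudu and Elzaki transforms (the paper may adopt an equivalent normalization):

```latex
% Commonly used definitions of the single and double Sumudu and Elzaki
% transforms (a sketch for reference, not necessarily the paper's exact form).
\begin{align*}
  S[f(t)](u)       &= \int_{0}^{\infty} f(ut)\,e^{-t}\,dt, &
  E[f(t)](v)       &= v\int_{0}^{\infty} f(t)\,e^{-t/v}\,dt,\\
  S_2[f(x,t)](u,v) &= \frac{1}{uv}\int_{0}^{\infty}\!\!\int_{0}^{\infty}
                      f(x,t)\,e^{-\left(\frac{x}{u}+\frac{t}{v}\right)}dx\,dt, &
  E_2[f(x,t)](u,v) &= uv\int_{0}^{\infty}\!\!\int_{0}^{\infty}
                      f(x,t)\,e^{-\left(\frac{x}{u}+\frac{t}{v}\right)}dx\,dt.
\end{align*}
```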
Encryption of data translates data into another form or symbol so that only people with access to the secret key or a password can read it. Encrypted data are generally referred to as ciphertext, while unencrypted data are known as plaintext. Entropy can be used as a measure of the number of bits needed to code the data of an image: as the pixel values within an image are spread over more gray levels, the entropy increases. The aim of this research is to compare CAST-128 with a proposed adaptive key against RSA encryption of video frames, to determine the more accurate method with the highest entropy. The first method is achieved by applying CAST-128 and …
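As a sketch of the entropy measure used for the comparison, the following code computes the Shannon entropy of an 8-bit grayscale frame from its gray-level histogram; the frames here are simulated stand-ins, with a near-uniform random array playing the role of an encrypted frame.

```python
import numpy as np

def shannon_entropy(gray_frame):
    """Entropy H = -sum(p_i * log2(p_i)) over the 256 gray-level probabilities."""
    hist = np.bincount(gray_frame.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]                      # ignore empty bins (0*log 0 := 0)
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(1)
plain = rng.integers(0, 64, size=(480, 640), dtype=np.uint8)    # low-variety "plain" frame
cipher = rng.integers(0, 256, size=(480, 640), dtype=np.uint8)  # near-uniform "cipher" frame
print(shannon_entropy(plain), shannon_entropy(cipher))          # cipher approaches 8 bits
```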
ABSTRACT
The researcher seeks to shed light on the analysis of the relationship and the impact between organizational values in all their dimensions (administration management, mission, relationship management, environmental management) and strategic performance (the financial perspective, the customer perspective, the internal-processes perspective, and learning and development) in the presidencies of the two universities of Baghdad and Al-Nahrain; three hypotheses have been formulated for this purpose.
The main research problem is expressed in the following question: Is there a relationship and an impact between …
The predilection for 5G telemedicine networks has piqued the interest of industry researchers and academics. The most significant barrier to global telemedicine adoption is achieving secure and efficient transport of patients, which involves two critical responsibilities: the first is to get the patient to the nearest hospital as quickly as possible, and the second is to keep the connection secure while traveling to the hospital. As a result, a new network scheme has been suggested to expand the medical delivery system: an agile network scheme that securely redirects ambulance motorbikes to the nearest hospital in emergency cases. This research provides a secure and efficient telemedicine transport strategy compatible with the …
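As a purely hypothetical illustration of the first responsibility (reaching the nearest hospital as quickly as possible), the sketch below runs Dijkstra's algorithm over an invented road graph with travel times in minutes; the graph, node names, and weights are assumptions and do not describe the proposed scheme.

```python
import heapq

def dijkstra(graph, source):
    """Shortest travel times from source to every reachable node."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Toy road graph: edge weights are travel times in minutes (invented values).
road = {
    "ambulance": [("A", 4), ("B", 7)],
    "A": [("hospital_1", 6), ("B", 2)],
    "B": [("hospital_2", 3)],
    "hospital_1": [], "hospital_2": [],
}
dist = dijkstra(road, "ambulance")
nearest = min(["hospital_1", "hospital_2"], key=lambda h: dist.get(h, float("inf")))
print(nearest, dist[nearest])   # hospital_2, 9 minutes on this toy graph
```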
Plagiarism is becoming more of a problem in academia. It is made worse by the ease with which a wide range of resources can be found on the internet, as well as the ease with which they can be copied and pasted. It is academic theft, since the perpetrator has "taken" and presented the work of others as his or her own. Manual detection of plagiarism by a human being is difficult, imprecise, and time-consuming, because it is hard for anyone to compare a piece of work against all existing material. Plagiarism is a big problem in higher education, and it can happen on any topic. Plagiarism detection has been studied in many scientific articles, and methods for recognition have been created utilizing plagiarism analysis, authorship identification, and …
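As an illustrative baseline for the plagiarism-analysis step, the sketch below computes the Jaccard similarity between the word-trigram sets of two texts; this is a common building block, not the specific detection method discussed in the paper.

```python
def ngrams(text, n=3):
    """Set of word n-grams of a lowercased text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard_similarity(a, b, n=3):
    """Jaccard similarity |A ∩ B| / |A ∪ B| of the two n-gram sets."""
    A, B = ngrams(a, n), ngrams(b, n)
    if not A and not B:
        return 0.0
    return len(A & B) / len(A | B)

doc = "plagiarism is the act of presenting the work of others as your own"
suspect = "presenting the work of others as your own is the act of plagiarism"
print(round(jaccard_similarity(doc, suspect), 3))   # high overlap suggests copying
```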
In this paper, we implement and examine a Simulink model that uses electroencephalography (EEG) to control several actuators based on brain waves. This will be in great demand, since it is useful for individuals who are unable to operate control units that require direct physical contact. Initially, ten volunteers covering a wide age range (20-66) participated in this study, and statistical measurements were first calculated for all eight channels. The number of channels was then reduced by half according to the activation of brain regions within the utilized protocol, which also decreased the processing time. Consequently, four of the participants (three males and one female) were chosen to examine the Simulink model during …
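The following is a minimal sketch of the per-channel statistics and channel-reduction step described above, using simulated data for eight channels; the sampling rate and the variance-based "activity" criterion are illustrative assumptions, not the protocol used in the study.

```python
import numpy as np

fs = 256                                             # assumed sampling rate (Hz)
rng = np.random.default_rng(2)
eeg = rng.normal(0.0, 1.0, size=(8, 10 * fs))        # 8 channels x 10 s of samples

# Simple per-channel descriptive statistics.
means = eeg.mean(axis=1)
stds = eeg.std(axis=1)
variances = eeg.var(axis=1)

# Rank channels by variance and keep the top half (8 -> 4 channels).
keep = np.argsort(variances)[::-1][:4]
print("per-channel std:", np.round(stds, 3))
print("channels kept:", sorted(keep.tolist()))
```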