Multicast technology implements highly efficient point-to-multipoint data transmission over IP networks (IPv4 and IPv6). Multicast reduces network load, eliminates traffic redundancy, and saves network bandwidth; it is therefore widely used in LAN/WAN applications such as online gaming, video conferencing, and IPTV. Multicast is realised by several protocols, such as DVMRP (Distance Vector Multicast Routing Protocol), MOSPF (Multicast Open Shortest Path First), and PIM-DM (Protocol Independent Multicast - Dense Mode), which are considered source-tree protocols, while PIM-SM (Protocol Independent Multicast - Sparse Mode) and CBT (Core Based Tree) use a shared tree. This paper focuses on the performance evaluation of two multicast protocols, PIM-SMv4 and PIM-SMv6, based on QoS metrics such as throughput, jitter, datagram loss, and data received. PIM-SM over IPv6 showed better results than PIM-SM over IPv4, by 4..1%, ...1%, 65.2.% and 98.91% in terms of data received, throughput, jitter, and datagram loss, respectively. The GNS3 simulator/emulator and JPERF were used to carry out this evaluation.
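A minimal sketch of how such percentage comparisons between the IPv4 and IPv6 runs can be computed from JPERF/iperf-style UDP measurements. The metric names and values below are hypothetical placeholders, not the paper's measurements; note that for jitter and datagram loss a lower value is better, so the reduction is reported.

```python
def percent_improvement(ipv4_value, ipv6_value, lower_is_better=False):
    """Relative improvement of the IPv6 run over the IPv4 run, in percent.

    Throughput and data received: higher is better.
    Jitter and datagram loss: lower is better (percent reduction is returned).
    """
    if lower_is_better:
        return 100.0 * (ipv4_value - ipv6_value) / ipv4_value
    return 100.0 * (ipv6_value - ipv4_value) / ipv4_value


# Hypothetical UDP measurements (illustrative only, not the paper's data)
ipv4 = {"throughput_mbps": 9.2, "jitter_ms": 1.15, "loss_pct": 2.8}
ipv6 = {"throughput_mbps": 9.5, "jitter_ms": 0.40, "loss_pct": 0.03}

print(percent_improvement(ipv4["throughput_mbps"], ipv6["throughput_mbps"]))
print(percent_improvement(ipv4["jitter_ms"], ipv6["jitter_ms"], lower_is_better=True))
print(percent_improvement(ipv4["loss_pct"], ipv6["loss_pct"], lower_is_better=True))
```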
Moment invariants have found wide application in image recognition since they were first proposed.
Metaheuristics form one of the best-known fields of research for finding optimal solutions to non-deterministic polynomial-time hard (NP-hard) problems, for which it is difficult to find an optimal solution in polynomial time. This paper introduces metaheuristic-based algorithms, their classifications, and NP-hard problems. It also compares the performance of two metaheuristic-based algorithms (the Elephant Herding Optimization algorithm and Tabu Search) in solving the Traveling Salesman Problem (TSP), one of the best-known NP-hard problems and one widely used in performance evaluations of different metaheuristic optimization algorithms. The experimental results of the Elephant Herding Optimization …
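To make the comparison concrete, here is a minimal, generic Tabu Search sketch for the TSP using 2-opt segment-reversal moves. The function names, tabu tenure, iteration budget, and the small random instance are illustrative assumptions, not the implementation evaluated in the paper.

```python
import random

def tour_length(tour, dist):
    """Total length of a closed tour given a distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def two_opt_moves(tour):
    """Generate (i, j) moves that reverse the segment tour[i:j]."""
    n = len(tour)
    for i in range(1, n - 1):
        for j in range(i + 1, n):
            yield i, j

def tabu_search_tsp(dist, iterations=500, tabu_tenure=15):
    n = len(dist)
    current = list(range(n))
    random.shuffle(current)
    best, best_len = current[:], tour_length(current, dist)
    tabu = {}  # move -> iteration index until which it stays tabu
    for it in range(iterations):
        best_move, best_move_len = None, float("inf")
        for i, j in two_opt_moves(current):
            candidate = current[:i] + current[i:j][::-1] + current[j:]
            cand_len = tour_length(candidate, dist)
            is_tabu = tabu.get((i, j), -1) > it
            # aspiration criterion: a tabu move is allowed if it beats the global best
            if (not is_tabu or cand_len < best_len) and cand_len < best_move_len:
                best_move, best_move_len = (i, j), cand_len
        if best_move is None:
            break
        i, j = best_move
        current = current[:i] + current[i:j][::-1] + current[j:]
        tabu[best_move] = it + tabu_tenure
        if best_move_len < best_len:
            best, best_len = current[:], best_move_len
    return best, best_len

# Small random Euclidean instance (illustrative data, not a benchmark from the paper)
random.seed(0)
pts = [(random.random(), random.random()) for _ in range(12)]
dist = [[((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5 for b in pts] for a in pts]
tour, length = tabu_search_tsp(dist)
print(round(length, 3))
```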
In this study, a structural damage identification method based on changes in the dynamic characteristics (frequencies) of the structure is examined. The stiffness and mass matrices of the curved (in-plane and out-of-plane vibration) beam elements are formulated using Hamilton's principle, and each node of these elements possesses seven degrees of freedom, including the warping degree of freedom. The curved beam element is derived based on Kang and Yoo's thin-walled curved beam theory of 1994. A computer program was developed to carry out free vibration analyses of the curved beam as well as the straight beam, and the computed frequencies are compared with those of other researchers using the general-purpose program MATLAB. A fuzzy logic system …
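As a generic illustration of the free vibration step described above (not the paper's MATLAB program), the natural frequencies follow from the generalized eigenvalue problem K·φ = ω²·M·φ once the global stiffness and mass matrices are assembled. The small matrices below are placeholders, not the 7-DOF-per-node curved-beam matrices formulated in the paper.

```python
import numpy as np
from scipy.linalg import eigh

# Placeholder 2-DOF system: K in N/m, M in kg (illustrative values only)
K = np.array([[2.0e6, -1.0e6],
              [-1.0e6, 1.0e6]])
M = np.array([[10.0, 0.0],
              [0.0, 5.0]])

# Solve K·phi = lambda·M·phi, where lambda = omega^2
eigvals, eigvecs = eigh(K, M)
omegas = np.sqrt(eigvals)        # natural circular frequencies, rad/s
freqs_hz = omegas / (2 * np.pi)  # natural frequencies, Hz
print(freqs_hz)
```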
Human detection is a main problem of interest in video-based monitoring. In this paper, artificial neural networks, namely the multilayer perceptron (MLP) and the radial basis function (RBF) network, are used to detect humans among different objects in a sequence of frames (images) using a classification approach. The classification is based on the shape of the object rather than on the contents of the frame. Initially, background subtraction is applied to extract objects of interest from the frame; then statistical and geometric information is obtained from the vertical and horizontal projections of the detected objects to represent the shape of the object. Following this step, the two types …
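A minimal sketch of projection-based shape features followed by a neural-network classifier, under the assumption that binary object masks have already been obtained by background subtraction (for example an OpenCV MOG2 subtractor). The feature length, network size, and the random masks and labels are illustrative, not the paper's data or architecture; an RBF network could stand in for the MLP shown here.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def projection_features(obj_mask, bins=16):
    """Shape descriptor from the vertical and horizontal projections of a
    binary object mask (uses the object's shape, not the frame contents)."""
    h_profile = obj_mask.sum(axis=1).astype(float)   # row sums
    v_profile = obj_mask.sum(axis=0).astype(float)   # column sums

    def resample(profile):
        # Fixed-length resampling so objects of different sizes give
        # feature vectors of the same dimension.
        return np.interp(np.linspace(0, 1, bins),
                         np.linspace(0, 1, len(profile)), profile)

    feats = np.concatenate([resample(h_profile), resample(v_profile)])
    return feats / (feats.max() + 1e-9)                # normalise

# Placeholder masks and labels standing in for objects segmented by
# background subtraction (e.g. cv2.createBackgroundSubtractorMOG2).
rng = np.random.default_rng(0)
X = np.array([projection_features(rng.random((60, 30)) > 0.5) for _ in range(40)])
y = rng.integers(0, 2, size=40)                        # 1 = human, 0 = other (dummy)

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0).fit(X, y)
print(clf.predict(X[:5]))
```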
The objective of this study is to determine which of the logistic regression model and the linear discriminant function has the better predictive ability, first using the original data and then using principal components to reduce the dimensionality of the variables. The data come from the 2012 socio-economic household survey of Baghdad province and comprise a sample of 615 observations on 13 variables, 12 of which are explanatory variables, while the dependent variable is the number of workers and unemployed. A comparison of the two methods was conducted, and it showed that the logistic regression model is better than the linear discriminant function …
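A minimal, generic sketch of such a comparison (not the paper's data or settings): logistic regression and linear discriminant analysis are scored by cross-validated accuracy on the original features and again after a principal-component reduction. The synthetic data, component count, and fold count are assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for the survey: 615 observations, 12 explanatory
# variables, binary response (e.g. employed / unemployed)
X, y = make_classification(n_samples=615, n_features=12, n_informative=6,
                           random_state=0)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "linear discriminant": LinearDiscriminantAnalysis(),
    "PCA + logistic regression": make_pipeline(PCA(n_components=5),
                                               LogisticRegression(max_iter=1000)),
    "PCA + linear discriminant": make_pipeline(PCA(n_components=5),
                                               LinearDiscriminantAnalysis()),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()   # mean accuracy over 5 folds
    print(f"{name}: mean CV accuracy = {acc:.3f}")
```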
In this paper, a new modification is proposed to enhance the security level of the Blowfish algorithm by increasing the difficulty of cracking the original message, which makes it safer against unauthorized attacks. Blowfish is a symmetric, variable-length-key, 64-bit block cipher, and it is implemented here using grayscale images of different sizes. Instead of using a single key in the cipher operation, an additional one-byte key (KEY2) is used in the proposed algorithm; it takes part in the Feistel function in the first round of both the encryption and decryption processes. In addition, the proposed modified Blowfish algorithm uses five S-boxes instead of four, and the additional key (KEY2) is selected randomly from the additional S-box…
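A schematic sketch of the core idea only: a toy Feistel structure in which an extra one-byte key (here called key2) is folded into the round function of the first round alone. The round function, subkeys, and mixing below are simplified placeholders, not the actual modified Blowfish with its P-array and five S-boxes.

```python
MASK32 = 0xFFFFFFFF

def toy_f(x, key2=None):
    """Toy stand-in for a Feistel round function; key2 is folded in when given."""
    if key2 is not None:
        x ^= key2 * 0x01010101                      # spread the one-byte key over 32 bits
    return ((x * 2654435761) ^ (x >> 13)) & MASK32

def feistel_encrypt(block64, subkeys, key2):
    left, right = (block64 >> 32) & MASK32, block64 & MASK32
    for rnd, k in enumerate(subkeys):
        left ^= k
        # key2 participates only in the first round, as in the proposed modification
        right ^= toy_f(left, key2 if rnd == 0 else None)
        left, right = right, left
    left, right = right, left                       # undo the final swap
    return (left << 32) | right

def feistel_decrypt(block64, subkeys, key2):
    left, right = (block64 >> 32) & MASK32, block64 & MASK32
    left, right = right, left                       # undo encryption's final un-swap
    for rnd in reversed(range(len(subkeys))):
        f = toy_f(right, key2 if rnd == 0 else None)
        left, right = right ^ subkeys[rnd], left ^ f
    return (left << 32) | right

subkeys = [0x243F6A88, 0x85A308D3, 0x13198A2E, 0x03707344]  # toy "P-array"
key2 = 0x5A                                                  # one-byte extra key
ct = feistel_encrypt(0x0123456789ABCDEF, subkeys, key2)
assert feistel_decrypt(ct, subkeys, key2) == 0x0123456789ABCDEF
```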
In this paper, a comparison is carried out between the regression tree model and negative binomial regression. These models represent two types of statistical methods: the first is the nonparametric regression tree, which aims to divide the data set into subgroups, and the second is the parametric negative binomial regression, which is usually used with medical data, especially with large sample sizes. The methods are compared according to the mean squared error (MSE), using a simulation experiment and taking different sample sizes…
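A minimal sketch of such a simulation-based comparison, under assumed settings (sample size, covariates, dispersion, tree depth) that are not the paper's design: counts are generated from a negative binomial model, then a negative binomial GLM and a regression tree are both scored by test-set MSE.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

# Simulated count data: log-linear mean, negative binomial counts
rng = np.random.default_rng(1)
n, alpha = 1000, 0.5
X = rng.normal(size=(n, 3))
mu = np.exp(0.3 + X @ np.array([0.4, -0.2, 0.6]))
lam = rng.gamma(shape=1 / alpha, scale=mu * alpha)   # Poisson-gamma mixture
y = rng.poisson(lam)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Parametric model: negative binomial GLM
nb = sm.GLM(y_tr, sm.add_constant(X_tr),
            family=sm.families.NegativeBinomial(alpha=alpha)).fit()
mse_nb = mean_squared_error(y_te, nb.predict(sm.add_constant(X_te)))

# Nonparametric model: regression tree
tree = DecisionTreeRegressor(max_depth=4, random_state=0).fit(X_tr, y_tr)
mse_tree = mean_squared_error(y_te, tree.predict(X_te))

print(f"MSE (negative binomial GLM): {mse_nb:.3f}")
print(f"MSE (regression tree):       {mse_tree:.3f}")
```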
Goodness-of-fit tests are intended to verify the null hypothesis that the observations of a sample under study conform to a particular probability distribution. Such cases arise frequently in practical applications and in all fields, especially genetics research, medical research, and life-science research. In 1965, Shapiro and Wilk proposed the goodness-of-fit test with scale parameters …
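An illustrative application of the Shapiro–Wilk test via SciPy (generic usage with simulated data, not the paper's): the test statistic W and p-value are printed for a simulated normal sample and a simulated skewed sample.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
normal_sample = rng.normal(loc=10.0, scale=2.0, size=50)
skewed_sample = rng.exponential(scale=2.0, size=50)

for name, sample in [("normal", normal_sample), ("exponential", skewed_sample)]:
    w_stat, p_value = stats.shapiro(sample)
    verdict = "fail to reject" if p_value > 0.05 else "reject"
    print(f"{name}: W = {w_stat:.3f}, p = {p_value:.4f} -> {verdict} H0 (normality)")
```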
The research deals with the choice among a range of different words in the repeated texts of the verses of the Qur'an, explaining the reason behind this choice and how the context has the greatest impact on it. Each word in the Qur'an has been placed in the most appropriate position and cannot be replaced by an equivalent word, whatever the degree of similarity between them in meaning, because each of these synonymous words ultimately carries its own connotation. The first is a particular connotation that makes replacing the word with an equivalent one impossible, while the second is a general connotation that the word shares with its counterparts in one aspect of …