It is frequently asserted that an advantage of a binary search tree implementation of a set over a linked list implementation is that, for reasonably well balanced binary search trees, the average search time (to discover whether or not a particular element is present in the set) is O(log N) to the base 2, where N is the number of elements in the set (the size of the tree). This paper presents an experiment for measuring the observed binary search tree search time and comparing it with the theoretically expected time. The experiment, which confirmed the hypothesis, was carried out using a Turbo Pascal program with a recursive implementation, together with a statistical method. Search time is estimated by the number of comparisons needed.
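The original experiment was written in Turbo Pascal; as an illustrative stand-in, here is a minimal Python sketch of the same measurement, counting key comparisons per search on a randomly built (and therefore reasonably well balanced, on average) tree. The tree size and key range are arbitrary choices, not values from the paper.

```python
import math
import random

class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def insert(root, key):
    """Standard unbalanced BST insertion (recursive)."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    elif key > root.key:
        root.right = insert(root.right, key)
    return root

def search(root, key):
    """Return (found, comparisons): search time is measured by key comparisons."""
    comparisons, node = 0, root
    while node is not None:
        comparisons += 1
        if key == node.key:
            return True, comparisons
        node = node.left if key < node.key else node.right
    return False, comparisons

keys = random.sample(range(10**6), 1023)   # N = 1023 distinct keys
root = None
for k in keys:
    root = insert(root, k)

avg = sum(search(root, k)[1] for k in keys) / len(keys)
print(f"average comparisons = {avg:.2f}, log2(N) = {math.log2(len(keys)):.2f}")
```

For a randomly built tree, the measured average grows proportionally to log2(N) (with a constant factor of roughly 1.39 for successful searches, since 2 ln N ≈ 1.386 log2 N), which is exactly the O(log N) behaviour the hypothesis asserts.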
In this paper, a suggested method, together with the conventional probability plot (p.p.) method, is proposed for estimating the two parameters (shape and scale) of the Weibull distribution. The estimators were implemented by a simulation technique for small, medium, and large samples of sizes 20, 50, and 100, respectively, and comparisons were carried out between the different methods and sample sizes. Using the mean squared error (MSE) as an indicator, the results show that the suggested method, applied here for the first time (as far as we know), and the studied methods yield MSE results that are extremely close asymptotically when random errors are generated.
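To make the conventional probability-plot estimator and the MSE-based simulation comparison concrete, here is a hedged Python sketch. The true parameter values, replication count, and median-rank plotting positions are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def weibull_pp_estimate(sample):
    """Probability-plot (least-squares) estimator of Weibull shape k and scale lam.

    Uses median ranks p_i = (i - 0.3) / (n + 0.4) and the linearised CDF:
    ln(-ln(1 - p)) = k*ln(x) - k*ln(lam).
    """
    x = np.sort(np.asarray(sample))
    n = len(x)
    p = (np.arange(1, n + 1) - 0.3) / (n + 0.4)
    y = np.log(-np.log(1.0 - p))
    k, b = np.polyfit(np.log(x), y, 1)   # slope = shape, intercept = -k*ln(lam)
    lam = np.exp(-b / k)
    return k, lam

# Monte Carlo comparison by MSE, mirroring the paper's sample sizes.
rng = np.random.default_rng(0)
true_k, true_lam = 1.5, 2.0              # assumed "true" parameters
for n in (20, 50, 100):                  # small, medium, large
    est = np.array([weibull_pp_estimate(true_lam * rng.weibull(true_k, n))
                    for _ in range(2000)])
    mse = ((est - [true_k, true_lam]) ** 2).mean(axis=0)
    print(f"n={n}: MSE(shape)={mse[0]:.4f}, MSE(scale)={mse[1]:.4f}")
```

As expected of a consistent estimator, the printed MSE values shrink as the sample size grows, which is the kind of asymptotic agreement the comparison in the paper reports.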
It is quite noticeable that the initialization of architectural parameters has a great impact on the whole learning process, so knowing the mathematical properties of a dataset helps give a neural network architecture better expressivity and capacity. In this paper, five random samples of the Volve field dataset were taken. A training set was then specified, and the persistent homology of the dataset was calculated to show the impact of data complexity on the selection of a multilayer perceptron regressor (MLPR) architecture. The proposed method provides a well-rounded strategy for computing data complexity: it is a compound algorithm composed of the t-SNE method, an alpha-complexity algorithm, and a persistence barcode.
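As a rough illustration of such a compound pipeline, the sketch below embeds a point cloud with t-SNE and then computes a persistence barcode. It uses a Vietoris–Rips filtration via the `ripser` package as a stand-in for the paper's alpha-complexity step, and random data in place of the Volve samples; both substitutions are assumptions for illustration only.

```python
import numpy as np
from sklearn.manifold import TSNE
from ripser import ripser   # pip install ripser

# Hypothetical stand-in for a random sample of the dataset.
X = np.random.default_rng(0).normal(size=(300, 8))

# Step 1: embed the high-dimensional sample with t-SNE.
emb = TSNE(n_components=2, random_state=0).fit_transform(X)

# Step 2: compute persistent homology (H0, H1) of the embedded point cloud;
# the diagrams are the persistence barcode in (birth, death) form.
dgms = ripser(emb, maxdim=1)['dgms']

# Step 3: a crude complexity score: total persistence (lifetime) of H1
# features, which could inform how much capacity the MLPR needs.
h1 = dgms[1]
total_persistence = float(np.sum(h1[:, 1] - h1[:, 0])) if len(h1) else 0.0
print(f"H1 features: {len(h1)}, total persistence: {total_persistence:.3f}")
```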
There is no doubt that Jane Austen is one of the most studied authors of the late 18th and early 19th centuries. Her female characters have been extensively studied, and they seem to have aroused much interest as manifestations of the conduct of their time. Her heroines have realized that there were many mistakes in the rules of conduct that controlled and restricted their behavior. Thus, they have found no fault in correcting these mistakes by behaving naturally, without acting. Elizabeth Bennet, the heroine of Pride and Prejudice, and Marianne Dashwood of Sense and Sensibility are the chosen examples of that kind of women.
The financial markets are one of the sectors whose data is characterized by continuous movement most of the time; it is constantly changing, so it is difficult to predict its trends. This creates a need for methods, means, and techniques for decision-making, and it pushes investors and analysts in the financial markets to use various methods to predict the movement and direction of the markets. To reach the goal of making decisions about different investments, the support vector machine algorithm and the CART regression tree algorithm are used to classify the stock data in order to determine the direction of market movement.
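A minimal sketch of this classification setup, assuming lagged returns as features and next-day direction as the label (the paper's actual features and data are not specified here; scikit-learn's decision tree implements the CART algorithm):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier   # CART
from sklearn.model_selection import train_test_split

# Hypothetical data: a random-walk price series; 5 lagged returns predict
# whether the next return is up (1) or down (0).
rng = np.random.default_rng(0)
prices = np.cumsum(rng.normal(0, 1, 600)) + 100
returns = np.diff(prices) / prices[:-1]
X = np.column_stack([returns[i:len(returns) - 5 + i] for i in range(5)])
y = (returns[5:] > 0).astype(int)

# No shuffling: respect time order when splitting market data.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, shuffle=False)
for name, model in [("SVM", SVC(kernel="rbf")),
                    ("CART", DecisionTreeClassifier(max_depth=5))]:
    model.fit(X_tr, y_tr)
    print(name, "accuracy:", round(model.score(X_te, y_te), 3))
```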
For many years, reading rate as words correct per minute (WCPM) has been investigated by many researchers as an indicator of learners' level of oral reading speed, accuracy, and comprehension. The aim of this study is to predict levels of WCPM using three machine learning algorithms: Ensemble Classifier (EC), Decision Tree (DT), and K-Nearest Neighbor (KNN). The data for this study were collected from 100 Kurdish EFL students in the 2nd year of the English language department at the University of Duhok in 2021. The outcomes showed that the ensemble classifier (EC) obtained the highest testing accuracy, with a value of 94%. EC also recorded the highest precision, recall, and F1 scores, with values of 0.92.
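A hedged sketch of this three-way comparison using scikit-learn, with a random forest standing in for the unspecified ensemble classifier and synthetic data in place of the 100-student dataset (feature count and class labels are invented for illustration):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier   # one possible EC
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Hypothetical stand-in for the WCPM data: features could be reading
# measures, the target a WCPM level (e.g. low / medium / high).
X, y = make_classification(n_samples=100, n_features=6, n_informative=4,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0, stratify=y)

for name, clf in [("EC", RandomForestClassifier(random_state=0)),
                  ("DT", DecisionTreeClassifier(random_state=0)),
                  ("KNN", KNeighborsClassifier(n_neighbors=5))]:
    clf.fit(X_tr, y_tr)
    print(name)
    print(classification_report(y_te, clf.predict(X_te)))  # precision/recall/F1
```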
A load flow program based on the Newton–Raphson method is developed using MATLAB. The method shows a very fast and efficient rate of convergence, and it is computationally efficient: it requires less computer memory through the use of sparsity techniques and other programming methods that accelerate the run speed to near real time.
The designed program computes the voltage magnitudes and phase angles at each bus of the network under steady-state operating conditions. It also computes the power flow and power losses for all equipment, including transformers and transmission lines, taking into consideration the effects of off-nominal tap and phase-shift transformers, generators, shunt capacitors, and shunt reactors.
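The paper's program is in MATLAB; the sketch below shows, in Python, the generic Newton–Raphson update with a sparse Jacobian solve that such a load-flow solver applies at each iteration to the power-mismatch equations. The toy two-equation system stands in for the actual P/Q mismatches and is an assumption for illustration.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import spsolve

def newton_raphson(f, jac, x0, tol=1e-8, max_iter=20):
    """Newton-Raphson iteration with a sparse Jacobian: the scheme a
    load-flow solver applies to the bus power-mismatch equations."""
    x = x0.astype(float)
    for _ in range(max_iter):
        fx = f(x)
        if np.max(np.abs(fx)) < tol:
            break
        # Solve J(x) * dx = -f(x); sparsity keeps memory and time low.
        x = x + spsolve(csr_matrix(jac(x)), -fx)
    return x

# Toy 2-equation system standing in for the P/Q mismatch equations.
f = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[0] - v[1]])
jac = lambda v: np.array([[2 * v[0], 2 * v[1]], [1.0, -1.0]])
print(newton_raphson(f, jac, np.array([1.0, 0.5])))   # ~ [sqrt(2), sqrt(2)]
```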
In this modern Internet era and the transition to IPv6, routing protocols must adjust to support this transformation. RIPng, EIGRPv6, and OSPFv3 are the dominant IPv6 Interior Gateway Routing Protocols (IGRPs). Selecting the best among the available routing protocols is a critical task, which depends upon the network requirements and the performance parameters of different real-time applications. The primary motivation of this paper is to evaluate the performance of these protocols in real-time applications. The evaluation is based on a number of criteria, including network convergence duration, HTTP page response time, DB query response time, IPv6 traffic dropped, video packet delay variation, and video packet end-to-end delay.
Optimizing the Access Point (AP) deployment has a great role in wireless applications due to the need to provide efficient communication with low deployment costs. Quality of Service (QoS) is a major parameter and objective to be considered, along with AP placement and the overall deployment cost. This study proposes and investigates a multi-level optimization algorithm called the Wireless Optimization Algorithm for Indoor Placement (WOAIP), based on Binary Particle Swarm Optimization (BPSO). WOAIP aims to obtain the optimum multi-floor AP placement with effective coverage, making it more capable of supporting QoS and cost-effectiveness. Five pairs (coverage, AP deployment) of weights, signal thresholds, and received signal strengths are considered.
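WOAIP itself is multi-level and floor-aware; the following is a generic single-level BPSO sketch showing the sigmoid-based binary position update and a toy coverage-versus-cost fitness. All geometry, radii, and weights are invented for illustration, not taken from the study.

```python
import numpy as np

def bpso_ap_placement(fitness_fn, n_sites, n_particles=30, iters=100,
                      w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal binary PSO: bit i of a particle says whether an AP is
    installed at candidate site i; fitness trades coverage against cost."""
    rng = np.random.default_rng(seed)
    pos = rng.integers(0, 2, (n_particles, n_sites))
    vel = np.zeros((n_particles, n_sites))
    pbest, pbest_fit = pos.copy(), np.array([fitness_fn(p) for p in pos])
    gbest = pbest[pbest_fit.argmax()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        # Binary update: each bit is 1 with probability sigmoid(velocity).
        pos = (rng.random(pos.shape) < 1 / (1 + np.exp(-vel))).astype(int)
        fit = np.array([fitness_fn(p) for p in pos])
        better = fit > pbest_fit
        pbest[better], pbest_fit[better] = pos[better], fit[better]
        gbest = pbest[pbest_fit.argmax()].copy()
    return gbest

# Toy fitness: reward covered demand points, penalize each installed AP.
demand = np.random.default_rng(1).random((50, 2))   # points on a unit floor
sites = np.random.default_rng(2).random((12, 2))    # candidate AP sites
def fitness(bits):
    if bits.sum() == 0:
        return -1.0
    d = np.linalg.norm(demand[:, None] - sites[None, bits.astype(bool)], axis=2)
    covered = (d.min(axis=1) < 0.3).sum()           # assumed AP radius 0.3
    return covered - 2.0 * bits.sum()               # assumed cost weight per AP

print(bpso_ap_placement(fitness, n_sites=12))
```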