There are many methods of searching a large amount of data to find one particular piece of information, such as finding the name of a person in a mobile phone's records. Certain ways of organizing data make the search process more efficient; the objective of these methods is to find the element at the least cost (least time). The binary search algorithm is faster than sequential search and other commonly used search algorithms. This research develops the binary search algorithm by using a new structure called Triple, in which data are represented as triples; each triple consists of three locations (1-Top, 2-Left, and 3-Right). Binary search divides the search interval in half at each step, so when searching a list of N elements the maximum number of comparisons (the average-case complexity of the search) is O(log2 n), read "big-Oh of log n" ("the order of magnitude"). In this research the number of comparisons is reduced by a factor of three using the Triple structure, making the maximum number of comparisons O(log2(n)/3 + 1) when searching for a key in a list of N elements.
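The Triple structure itself is not specified in enough detail in this abstract to implement, so the following is a minimal sketch of the classic binary search the paper takes as its baseline; the sample list of names is purely illustrative.

```python
def binary_search(arr, key):
    """Classic binary search: halves the interval each step, so a sorted
    list of n elements needs at most about log2(n) + 1 comparisons."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == key:
            return mid
        elif arr[mid] < key:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1  # key not present

names = ["Ali", "Huda", "Omar", "Sara", "Zaid"]  # illustrative sorted data
print(binary_search(names, "Omar"))  # -> 2
```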
The logistic regression model is one of the most important non-linear regression models; it aims to obtain highly efficient estimators and has taken a more advanced role in statistical analysis as a model well suited to binary data.
Among the problems that appear as a result of using some statistical methods ...
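As a hedged illustration of the binary-data model this abstract refers to, here is a minimal maximum-likelihood logistic regression fitted by gradient ascent; the synthetic data, learning rate, and iteration count are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, iters=2000):
    """Maximum-likelihood logistic regression via gradient ascent.
    y must be binary (0/1); an intercept column is added internally."""
    X = np.column_stack([np.ones(len(X)), X])
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))    # logistic link
        beta += lr * X.T @ (y - p) / len(y)    # gradient of the log-likelihood
    return beta

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                  # synthetic predictors
y = (X @ np.array([1.5, -2.0]) + 0.5 + rng.normal(size=200) > 0).astype(float)
print(fit_logistic(X, y))                      # fitted coefficients
```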
The removal of heavy metal ions from wastewater by sorptive flotation, using Amberlite IR120 as the resin and a flotation column, was investigated. A combined two-stage process is proposed as an alternative for removing heavy metals from aqueous solutions. The first stage is the sorption of heavy metals onto Amberlite IR120, followed by dispersed-air flotation. The sorption of metal ions on the resin was studied in batch mode as a function of contact time, pH, resin dosage, and initial metal concentration. Various parameters such as pH, air flow rate, and surfactant concentration were investigated in the flotation stage. Sodium lauryl sulfate (SLS) and hexadecyltrimethyl ammonium bromide (HTAB) were used as the anionic and cationic surfactants, respectively.
Refractive indices (nD), viscosities (η) and densities (ρ) were measured for the binary mixtures formed by dipropyl amine with 1-octanol, 1-heptanol, 1-hexanol, 1-pentanol and tert-pentyl alcohol at 298.15 K over the entire composition range. The Redlich-Kister function was used to calculate and correlate the refractive index deviations (∆nD), viscosity deviations (ηE), excess molar Gibbs free energy (∆G*E) and excess molar volumes (VmE); the standard errors and coefficients were estimated with this function. The values of ∆nD, ηE, VmE and ∆G*E were plotted against the mole fraction of dipropyl amine. In all cases the obtained ηE, ∆G*E, VmE and ∆nD values were negative at 298.15 K. The effect of carbon atoms ...
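A sketch of how the Redlich-Kister expansion mentioned above is commonly fitted: the excess property is modelled as Y^E = x1·x2·Σ A_k·(x1 − x2)^k with x2 = 1 − x1, and the coefficients and standard error come from least squares. The sample data below are invented for illustration and are not the paper's measurements.

```python
import numpy as np

def redlich_kister_fit(x1, y_excess, order=3):
    """Least-squares fit of an excess property to the Redlich-Kister form
        Y^E = x1 * x2 * sum_k A_k * (x1 - x2)**k,   x2 = 1 - x1.
    Returns the coefficients A_k and the standard error of the fit."""
    x2 = 1.0 - x1
    basis = np.column_stack([x1 * x2 * (x1 - x2) ** k for k in range(order + 1)])
    A, *_ = np.linalg.lstsq(basis, y_excess, rcond=None)
    resid = y_excess - basis @ A
    se = np.sqrt(np.sum(resid ** 2) / max(len(x1) - (order + 1), 1))
    return A, se

# illustrative data: mole fractions and a made-up negative excess property
x1 = np.linspace(0.05, 0.95, 10)
yE = -0.8 * x1 * (1 - x1) * (1 + 0.3 * (2 * x1 - 1))
print(redlich_kister_fit(x1, yE, order=1))
```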
The research aims to diagnose the causes of the phenomenon of marketing deception, which is now widespread in the Iraqi market and involves producers, marketers, consumers, regulators and other institutions, and its impact on areas of consumer protection (product specifications and labelling, price, advertising, packaging), as well as to identify differences in the sample responses according to personal variables. A questionnaire was adopted as the tool to collect data and information through a survey of consumer opinions covering 108 people in shopping centres in the province of Baghdad, in both Karkh and Rusafa. Selected statistical methods were used, represented by the arithmetic mean ...
Abstract
In this work, two metaheuristic algorithms were hybridized. The first is the Invasive Weed Optimization algorithm (IWO), a numerical stochastic optimization algorithm; the second is the Whale Optimization Algorithm (WOA), an algorithm based on swarm and community intelligence. IWO is a nature-inspired algorithm, modelled specifically on the colonizing behavior of weeds, and was first proposed in 2006 by Mehrabian and Lucas. Due to their strength and adaptability, weeds pose a serious threat to cultivated plants, making them a threat to the cultivation process. The behavior of these weeds has been simulated and used in Invasive Weed Optimization ...
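A minimal sketch of the IWO loop as commonly described by Mehrabian and Lucas (2006): fitter weeds spawn more seeds, seed dispersal shrinks over the iterations, and competitive exclusion keeps the colony bounded. All parameter values here are illustrative, and the WOA hybridization step is omitted since the abstract does not detail it.

```python
import numpy as np

def iwo(fitness, dim, bounds, pop_max=20, seeds=(1, 5),
        iters=100, sigma_init=1.0, sigma_final=0.01, n_exp=3):
    """Minimal Invasive Weed Optimization (minimization)."""
    lo, hi = bounds
    weeds = np.random.uniform(lo, hi, (5, dim))        # initial colony
    for t in range(iters):
        # dispersal standard deviation shrinks nonlinearly over iterations
        sigma = ((iters - t) / iters) ** n_exp * (sigma_init - sigma_final) + sigma_final
        f = np.array([fitness(w) for w in weeds])
        worst, best = f.max(), f.min()
        new = []
        for w, fw in zip(weeds, f):
            # fitter weeds spawn more seeds (linear mapping)
            ratio = (worst - fw) / (worst - best + 1e-12)
            n_seeds = int(seeds[0] + ratio * (seeds[1] - seeds[0]))
            for _ in range(n_seeds):
                new.append(np.clip(w + np.random.normal(0, sigma, dim), lo, hi))
        weeds = np.vstack([weeds] + new) if new else weeds
        # competitive exclusion: keep only the fittest pop_max weeds
        f_all = np.array([fitness(w) for w in weeds])
        weeds = weeds[np.argsort(f_all)[:pop_max]]
    return weeds[0], fitness(weeds[0])

best_x, best_f = iwo(lambda x: np.sum(x ** 2), dim=5, bounds=(-5, 5))
print(best_f)  # should approach 0 on this sphere test function
```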
Future generations of wireless networks are expected to rely heavily on unmanned aerial vehicles (UAVs). UAV networks have extraordinary features such as high mobility, frequent topology changes, tolerance to link failure, and coverage-area extension by adding external UAVs. UAV networks provide several advantages for civilian, commercial, and search-and-rescue applications. A realistic mobility model must be used to assess the dependability and effectiveness of UAV protocols and algorithms. In this research paper, the performance of the Gauss-Markov (GM) and Random Waypoint (RWP) mobility models in multi-UAV networks is analyzed and evaluated for a search-and-rescue scenario. Additionally, the two mobility models GM and RWP are described ...
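For concreteness, here is a sketch of the standard Gauss-Markov mobility update, where the tuning parameter alpha controls how strongly each new speed and direction correlates with the previous ones; the parameter values are illustrative, not taken from the paper.

```python
import numpy as np

def gauss_markov(steps, alpha=0.75, mean_speed=15.0, mean_dir=0.0,
                 dt=1.0, speed_std=2.0, dir_std=0.3):
    """One UAV trajectory under the Gauss-Markov mobility model.
    alpha in [0, 1] tunes memory: alpha=1 gives straight-line motion,
    alpha=0 gives memoryless (Brownian-like) motion."""
    x, y = 0.0, 0.0
    s, d = mean_speed, mean_dir
    path = [(x, y)]
    for _ in range(steps):
        # new speed/direction correlate with the previous values
        s = alpha * s + (1 - alpha) * mean_speed \
            + np.sqrt(1 - alpha ** 2) * np.random.normal(0, speed_std)
        d = alpha * d + (1 - alpha) * mean_dir \
            + np.sqrt(1 - alpha ** 2) * np.random.normal(0, dir_std)
        x += s * np.cos(d) * dt
        y += s * np.sin(d) * dt
        path.append((x, y))
    return np.array(path)

trajectory = gauss_markov(steps=100)  # 100 waypoints of one simulated UAV
```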
JPEG is the most popular image compression and encoding technique, widely used in many applications (images, videos and 3D animations). Researchers are therefore keen to develop this massively deployed technique to compress images at higher compression ratios while keeping image quality as high as possible. For this reason, this paper introduces a developed JPEG codec based on a fast DCT that removes most zero coefficients while keeping their positions in the transformed block. Additionally, arithmetic coding is applied rather than Huffman coding. The results showed that the proposed developed JPEG algorithm achieves better image quality than the traditional JPEG technique.
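A rough sketch of the zero-removal idea described above: after the DCT and quantization of an 8x8 block, only the nonzero coefficients and their positions are kept. The flat quantization table is a placeholder, and the arithmetic-coding stage the paper uses is omitted here.

```python
import numpy as np
from scipy.fft import dctn, idctn

Q = 16 * np.ones((8, 8))  # flat quantization table (illustrative placeholder)

def encode_block(block):
    """DCT an 8x8 block, quantize, then keep only the nonzero coefficients
    together with their positions (the entropy-coding stage is omitted)."""
    coeffs = np.round(dctn(block - 128.0, norm='ortho') / Q).astype(int)
    pos = np.flatnonzero(coeffs)          # positions of nonzero coefficients
    return pos, coeffs.ravel()[pos]

def decode_block(pos, vals):
    coeffs = np.zeros(64)
    coeffs[pos] = vals                    # restore nonzeros to their positions
    return idctn(coeffs.reshape(8, 8) * Q, norm='ortho') + 128.0

block = np.random.randint(0, 256, (8, 8)).astype(float)
pos, vals = encode_block(block)
recon = decode_block(pos, vals)           # lossy reconstruction of the block
```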
The Internet provides vital communications between millions of individuals. It is also increasingly used as a commerce tool; thus, security is of high importance for securing communications and protecting vital information. Cryptography algorithms are essential in the field of security. Brute-force attacks are the major attacks on the Data Encryption Standard, which is the main reason an improved structure of the Data Encryption Standard algorithm is needed. This paper proposes a new, improved structure for the Data Encryption Standard to make it secure and immune to attacks. The improved structure was accomplished using the standard Data Encryption Standard with a new way of two-key generation ...
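The abstract does not detail the new two-key generation scheme, so the sketch below shows only the generic 16-round Feistel skeleton that DES is built on, with a toy round function and toy subkeys standing in for DES's real f-function and key schedule.

```python
def feistel_encrypt(block64, round_keys, f):
    """Generic 16-round Feistel skeleton like DES's core.
    block64: 64-bit int; round_keys: list of 16 subkeys; f: round function.
    The paper's modified two-key schedule is not specified in the abstract,
    so the key schedule here is left abstract."""
    left, right = block64 >> 32, block64 & 0xFFFFFFFF
    for k in round_keys:                    # 16 rounds, as in DES
        left, right = right, left ^ f(right, k)
    return (right << 32) | left             # final swap, as in DES

# toy round function and subkeys for demonstration only (NOT real DES)
toy_f = lambda r, k: ((r * 0x9E3779B1) ^ k) & 0xFFFFFFFF
keys = [(i * 0x12345) & 0xFFFFFFFF for i in range(16)]
ct = feistel_encrypt(0x0123456789ABCDEF, keys, toy_f)
```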