Internet paths sharing the same congested link can be identified using several shared congestion detection techniques. The detection technique proposed in this paper builds on the earlier delay correlation with wavelet denoising (DCW) technique, replacing its denoising stage with the Discrete Multiwavelet Transform (DMWT) to separate the queuing delay caused by network congestion from delay caused by other sources of delay variation. The new technique converges 3 to 5 seconds faster than the earlier technique while using roughly half as many probe packets, so it reduces the probing overhead imposed on the network and thereby improves overall network performance.
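A minimal sketch of the delay-correlation idea behind such techniques: denoise two one-way-delay series, then correlate them; high correlation suggests a shared bottleneck. A plain Haar wavelet with soft thresholding stands in for the DMWT stage (a true multiwavelet transform needs a dedicated library), and the names, threshold rule, and cutoff below are illustrative assumptions, not the paper's method.

```python
import numpy as np

def haar_denoise(x, level=2, thresh=None):
    """Haar wavelet soft-threshold denoising (stand-in for the DMWT stage)."""
    x = np.asarray(x, dtype=float)
    n = 2 ** int(np.floor(np.log2(len(x))))   # truncate to a power of two
    approx = x[:n].copy()
    details = []
    for _ in range(level):
        a = (approx[0::2] + approx[1::2]) / np.sqrt(2)
        d = (approx[0::2] - approx[1::2]) / np.sqrt(2)
        details.append(d)
        approx = a
    # universal threshold, noise scale estimated from the finest-scale details
    sigma = np.median(np.abs(details[0])) / 0.6745
    t = thresh if thresh is not None else sigma * np.sqrt(2 * np.log(n))
    details = [np.sign(d) * np.maximum(np.abs(d) - t, 0) for d in details]
    for d in reversed(details):               # inverse Haar transform
        out = np.empty(2 * len(approx))
        out[0::2] = (approx + d) / np.sqrt(2)
        out[1::2] = (approx - d) / np.sqrt(2)
        approx = out
    return approx

def shared_congestion(delays_a, delays_b, cutoff=0.5):
    """Correlate denoised delay series; above `cutoff` => shared bottleneck."""
    a, b = haar_denoise(delays_a), haar_denoise(delays_b)
    n = min(len(a), len(b))
    r = np.corrcoef(a[:n], b[:n])[0, 1]
    return r, r > cutoff
```

With a zero threshold the transform round-trips exactly; with the default threshold the fine-scale noise is suppressed before correlating, which is what lets the correlation reflect shared queuing delay rather than independent jitter.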
Rutting is a very common type of distress in asphalt mixtures. It occurs under heavy, slow-moving traffic loads, and it needs to be predicted to avoid major deformation of the pavement. This paper uses a simple linear viscous method to predict rutting in asphalt mixtures by means of a multi-layer linear computer programme (BISAR). The material properties were derived from the Repeated Load Axial Test (RLAT) and represented by a strain-dependent axial viscosity. The axial viscosity was used in an incremental multi-layer linear viscous analysis to calculate the deformation rate during each increment, and hence the overall development of rutting. The method has been applied to six mixtures at different temperatures
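The incremental stepping described above can be sketched for a single layer as follows. The paper performs this within a multi-layer BISAR analysis; this stand-alone loop only illustrates the stress/viscosity update, and the exponential viscosity-strain law and all parameter values are illustrative stand-ins, not RLAT-derived data.

```python
import math

def rut_depth(stress_kpa, layer_thickness_mm, n_increments, dt_s, eta0, beta):
    """Incremental linear viscous rutting estimate for a single layer.

    Assumed (illustrative) strain-dependent axial viscosity:
    eta(strain) = eta0 * exp(beta * strain).
    """
    strain = 0.0
    for _ in range(n_increments):
        eta = eta0 * math.exp(beta * strain)   # axial viscosity at current strain
        rate = stress_kpa / eta                # linear viscous law: rate = stress / viscosity
        strain += rate * dt_s                  # permanent strain gained this increment
    return strain * layer_thickness_mm         # rut contribution of this layer (mm)
```

Because the viscosity grows with accumulated strain, the deformation rate falls over time, reproducing the decelerating rut development typically seen in repeated load tests.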
The research aims to build a list of digital citizenship axes, with the standards and indicators emanating from them, that should be included in the content of the computer textbook scheduled for second-grade intermediate students in Iraq, and to analyze the above-mentioned book against the same list using the descriptive analytical method (content analysis). The research community and its sample consisted of the content of the computer textbook scheduled for second-year intermediate students in the academic year 2018-2019. The research tool was built in its initial form after reviewing a set of specialized literature and previous studies on topics related to digital citizenship, and the authenticity
The study aims to analyze the content of computer textbooks for the preparatory stage in light of logical thinking. The researcher followed the descriptive analytical research approach (content analysis) and adopted the explicit idea as the unit of analysis. A content analysis tool designed around the mental processes employed during logical thinking was used to obtain the results. The findings revealed that logical thinking skills constituted 52% of the fourth-year preparatory textbook and 47% of the fifth-year preparatory textbook.
Ground penetrating radar (GPR) is a geophysical method that utilizes electromagnetic waves to detect objects below the surface and to record the relative position and shape of archaeological features in 2D and 3D. The GPR method was applied to detect a buried archaeological structure at a study area within the University of Baghdad. 3D interpretation of the GPR data located buried objects at a depth of 1 m. The survey comprised 12 vertical lines and 5 horizontal lines using a 500 MHz antenna.
Statistical control charts are widely used in industry for process and measurement control. In this paper we study the use of the Markov chain approach in calculating the average run length (ARL) of the cumulative sum (CUSUM) control chart for detecting shifts in the process mean, and of exponentially weighted moving average (EWMA) control charts for detecting shifts in the process mean and standard deviation. We also use EWMA charts based on the logarithm of the sample variance for monitoring the process standard deviation when the observations (products selected from the Al-Mamun factory) are independently and identically distributed (iid) normal variables in continuous manufacturing.
Writing in English is one of the essential factors for successful EFL learning. Iraqi students at preparatory schools encounter problems when using their background knowledge to handle the subskills of writing (Burhan, 2013: 164). Therefore, this study aims to investigate fourth-year preparatory school students' problems in English composition writing and to find solutions to these problems.
In this study, we present a new steganography method that depends on quantizing the bands of perceptual color spaces. Four perceptual color spaces are used to test the new method: HSL, HSV, Lab and Luv, where different algorithms for computing the last two color spaces are used. The results reveal the validity of the method for steganography, and an analysis of the effects of quantization and the stegano process on the quality of the cover image and of the perceptual color-space bands is presented.
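A toy illustration of the band-quantization idea in one of the tested spaces (HSL, via Python's standard colorsys module): the lightness band is quantized and one message bit is encoded in the parity of a pixel's quantized level. The band choice, number of levels, and parity scheme are our illustrative assumptions, not the paper's exact algorithm.

```python
import colorsys

def embed_bit(r, g, b, bit, levels=64):
    """Hide one bit in a pixel by quantizing the HSL lightness band.

    Illustrative scheme (not the paper's): the lightness level's parity
    carries the message bit.
    """
    h, l, s = colorsys.rgb_to_hls(r / 255, g / 255, b / 255)
    q = int(l * (levels - 1))          # quantize the lightness band
    if q % 2 != bit:                   # snap level parity to the message bit
        q += 1 if q + 1 < levels else -1
    l2 = q / (levels - 1)
    r2, g2, b2 = colorsys.hls_to_rgb(h, l2, s)
    return round(r2 * 255), round(g2 * 255), round(b2 * 255)

def extract_bit(r, g, b, levels=64):
    """Recover the bit from the parity of the quantized lightness level."""
    _, l, _ = colorsys.rgb_to_hls(r / 255, g / 255, b / 255)
    return int(round(l * (levels - 1))) % 2
```

With 64 levels the lightness step (about 1/63) comfortably exceeds the rounding error introduced by 8-bit RGB storage (about 1/255), so the embedded parity survives the round trip through the cover image.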
Given a binary matrix, finding a maximum set of columns such that the resulting submatrix has the Consecutive Ones Property (C1P) is called the Consecutive Ones Submatrix (C1S) problem. Solution approaches exist for it, but there is room for improvement; moreover, most studies of the problem use exact solution methods. We propose an evolutionary approach to solving the problem. We also consider a problem related to C1S, namely Consecutive Blocks Minimization (CBM). The algorithm is evaluated on real-world and randomly generated matrices of the set-covering type.
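One way such an evolutionary search can be set up is sketched below: an individual is a 0/1 column-selection vector, and its fitness rewards larger feasible column subsets. Note a deliberate simplification, ours rather than the paper's: consecutiveness is checked only in the matrix's given column order, whereas the full C1P allows any column permutation (normally verified with a PQ-tree).

```python
def ones_consecutive(row):
    """True if the 1s in `row` form one contiguous block."""
    s = "".join(map(str, row)).strip("0")
    return "0" not in s

def fitness(matrix, selected):
    """Score a column subset for an evolutionary C1S search.

    Simplified fixed-order check: feasible subsets score their size,
    infeasible ones are penalised.
    """
    cols = [j for j, keep in enumerate(selected) if keep]
    sub = [[row[j] for j in cols] for row in matrix]
    if all(ones_consecutive(r) for r in sub):
        return len(cols)    # feasible: reward keeping more columns
    return -1               # infeasible under this column ordering
```

A standard EA loop (selection, crossover, bit-flip mutation over the selection vector) can then maximise this fitness; the penalty value and the fixed-order check are the parts one would replace with the paper's actual operators and a proper C1P test.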
One of the most interesting problems that has recently attracted many research investigations in protein-protein interaction (PPI) networks is the complex detection problem. Detecting natural divisions in such complex networks is proven to be an extremely hard (NP-hard) problem, for which the field of Evolutionary Algorithms (EAs) has recently shown positive results. The contribution of this work is to introduce a heuristic operator, called protein-complex attraction and repulsion, tailored specifically for the complex detection problem so as to improve the EA's detection ability. The proposed heuristic operator is designed to fine-grain the structure of a complex by dividing it into two more complexes, each being distinguished with a core protein
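A rough sketch of how such a split operator might work (our reading of the description, not the paper's exact definition): the two highest-degree members of a candidate complex seed two new complexes, and each remaining protein is attracted to the core it shares more interactions with.

```python
def split_complex(adj, complex_nodes):
    """Split one candidate complex into two around core proteins.

    `adj` maps each node to its set of neighbours; `complex_nodes` lists the
    members of the complex.  Assumes the complex has at least two members.
    Illustrative attraction-style split, not the paper's operator.
    """
    members = set(complex_nodes)
    degree = {v: len(adj[v] & members) for v in members}   # degree inside the complex
    core_a, core_b = sorted(members, key=degree.get, reverse=True)[:2]
    part_a, part_b = {core_a}, {core_b}
    for v in members - {core_a, core_b}:
        pull_a = len(adj[v] & part_a)      # attraction toward each growing part
        pull_b = len(adj[v] & part_b)
        (part_a if pull_a >= pull_b else part_b).add(v)
    return part_a, part_b
```

Applied to a complex that actually contains two densely connected cores, the operator recovers them; within an EA it would be one mutation-like move whose offspring are then kept or discarded by the usual fitness-based selection.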