Most Internet-tomography problems, such as shared congestion detection, depend on network measurements. Such measurements are usually carried out at multiple locations inside the network and rely on local clocks. These clocks typically skew over time, leaving the measurements unsynchronized and thereby degrading the performance of most techniques. Shared congestion detection has recently become an important issue in many networked applications such as multimedia streaming and peer-to-peer file sharing. One of the most powerful techniques employed in the literature is based on the Discrete Wavelet Transform (DWT) combined with a cross-correlation operation to determine the congestion state. The wavelet transform is used as a de-noising tool to reduce the effects of both clock skew and queuing-delay fluctuations on the congestion-type decision. The classical DWT, however, is not shift-invariant, a property that is particularly useful in signal de-noising problems. Therefore, the Stationary Wavelet Transform (SWT), which possesses the shift-invariance property, is suggested and used instead of the DWT. The modified technique exhibits better performance in terms of the time required to correctly detect the congestion state, especially in the presence of clock skew. The suggested technique is tested using simulations under different environments.
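The decision rule described above, smooth each delay sequence with a shift-invariant transform, then cross-correlate, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the single-level Haar smoothing (a real SWT uses multiple levels with detail-coefficient thresholding), and the 0.8 correlation threshold are all assumptions.

```python
import statistics

def swt_haar_approx(sig):
    # One-level stationary (undecimated) Haar approximation: averaging
    # adjacent samples without downsampling keeps the result
    # shift-invariant, unlike the decimated DWT.
    n = len(sig)
    return [(sig[i] + sig[(i + 1) % n]) / 2 for i in range(n)]

def xcorr_peak(x, y, max_lag=5):
    # Peak normalized cross-correlation over a small range of lags,
    # tolerating small offsets caused by residual clock skew.
    def norm(v):
        m = statistics.mean(v)
        s = statistics.pstdev(v) or 1.0
        return [(a - m) / s for a in v]
    xn, yn = norm(x), norm(y)
    n = len(xn)
    best = -1.0
    for lag in range(-max_lag, max_lag + 1):
        s = sum(xn[i] * yn[i + lag] for i in range(n) if 0 <= i + lag < n)
        best = max(best, s / n)
    return best

def shared_congestion(delays1, delays2, thresh=0.8):
    # Declare shared congestion when the smoothed one-way-delay
    # sequences of the two flows are strongly cross-correlated.
    return xcorr_peak(swt_haar_approx(delays1),
                      swt_haar_approx(delays2)) > thresh
```

Two flows crossing the same bottleneck see correlated delay spikes, so their smoothed sequences correlate strongly; independent paths do not.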
In this paper, the normality set is investigated. The study then highlights some of its concepts, properties, and important results. In addition, it is proved that every operator with a normality set has a non-trivial invariant subspace of .
A great many image-processing systems are used and developed on a daily basis. Those systems require basic operations such as detecting regions of interest, matching those regions, and describing their properties. These operations play a significant role in the decision making needed for subsequent operations, depending on the assigned task. Various algorithms have been introduced over the years to accomplish these tasks. One of the most popular is the Scale Invariant Feature Transform (SIFT). The strength of this algorithm lies in its performance in detection and property description, and that is due to the fact that
The cross-section evaluation for the (α,n) reaction was calculated according to the available International Atomic Energy Agency (IAEA) data and other published experimental data. These cross sections are the most recent data, alongside the well-known international libraries such as ENDF, JENDL, and JEFF. We considered an energy range from threshold to 25 MeV in 1 MeV intervals. The average weighted cross sections for all available experimental and theoretical (JENDL) data and for all the considered isotopes were calculated. The cross section of each element is then calculated from the cross sections of that element's isotopes, taking their abundances into account. A mathematical representative equation for each of the element
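The abundance-weighted averaging described above can be sketched as follows. This is a minimal illustration; the function name and the (abundance, cross-section) input format are assumptions, not the paper's code.

```python
def element_cross_section(isotopes):
    # Elemental cross section as the abundance-weighted average of the
    # isotopic cross sections: sigma_elem = sum_i(a_i * sigma_i) / sum_i(a_i).
    # `isotopes` is a list of (abundance_fraction, cross_section_barns) pairs.
    total_abundance = sum(a for a, _ in isotopes)
    return sum(a * sigma for a, sigma in isotopes) / total_abundance
```

Dividing by the summed abundances keeps the result correct even when the fractions do not sum exactly to 1 (e.g. when trace isotopes are omitted).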
In this paper, an efficient method for compressing color images is presented. It allows progressive transmission and zooming of the image without the need for extra storage. The proposed method is accomplished using a cubic Bezier interpolation (CBI) surface representation over wide areas of the image in order to prune the image components that show large-scale variation. The produced cubic Bezier surface is then subtracted from the image signal to obtain the residue component, and a bi-orthogonal wavelet transform is applied to decompose the residue component. Both scalar quantization and quad-tree coding steps are applied to the produced wavelet sub-bands. Finally, adaptive shift coding is applied to handle the remaining statistical redundancy and attain e
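The bicubic Bezier evaluation underlying the surface-modeling step can be illustrated as follows. This is a hypothetical sketch using the standard Bernstein basis; the paper's actual control-point fitting procedure is not shown, and the function names are assumptions.

```python
def bernstein3(t):
    # Cubic Bernstein basis polynomials; they sum to 1 for any t in [0, 1].
    u = 1.0 - t
    return [u**3, 3*u*u*t, 3*u*t*t, t**3]

def bezier_surface_point(P, u, v):
    # Evaluate a bicubic Bezier patch at (u, v), where P is a 4x4 grid
    # of scalar control values (e.g. intensities of one color channel).
    bu, bv = bernstein3(u), bernstein3(v)
    return sum(bu[i] * bv[j] * P[i][j] for i in range(4) for j in range(4))
```

The residue component is then simply `pixel - bezier_surface_point(P, u, v)` at each sample, which removes the large-scale variation before the wavelet stage.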
The planning, design, and construction of excavations and foundations in soft to very soft clay soils are always difficult. These are problematic soils that cause trouble for the structures built on them because of their low shear strength, high water content, and high compressibility. This work investigates the geotechnical behavior of soft clay treated with tyre ash burnt in air. The investigation comprises physical tests, chemical tests, a consolidation test, compaction tests, a shear test, the California Bearing Ratio (CBR) test, and model tests. These tests were carried out on soil samples prepared from soft clay soil; tyre ash was used in four percentages (2, 4, 6, and 8%). The results of the tests were: the soil samples which
When an embankment is constructed on very soft soil, special construction methods are adopted. One such technique is the piled embankment. Piled (stone column) embankments provide an economic and effective solution to the problem of constructing embankments over soft soils; the method can reduce settlement, construction time, and cost. Stone columns are an effective improvement method for soft soils under light structures such as rail or road embankments. The present work investigates the behavior of embankment models resting on soft soil reinforced with stone columns. Model tests were performed with different spacing distances between the stone columns and two length-to-diameter ratios of the stone columns, in addition to different
Estimating the regular regression model requires several assumptions to be satisfied, such as linearity. One problem occurs when the regression curve is partitioned into two (or more) parts that are then joined by threshold point(s). This situation is regarded as a violation of the linearity assumption. Therefore, the multiphase regression model has received increasing attention as an alternative approach that describes changes in the behavior of the phenomenon through threshold-point estimation. The maximum likelihood estimator (MLE) has been used for both the model and threshold-point estimation. However, the MLE is not resistant to violations such as the existence of outliers or a heavy-tailed error distribution. The main goal of t
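A minimal sketch of threshold-point estimation for a two-phase linear model: grid-search the candidate thresholds and pick the one minimizing the total sum of squared errors of ordinary least-squares fits on the two segments. This is illustrative only, under Gaussian errors it coincides with the MLE idea the abstract mentions, but it is not the paper's robust procedure, and the function names are assumptions.

```python
def ols(xs, ys):
    # Ordinary least squares for one segment; returns (intercept, slope, SSE).
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx if sxx else 0.0
    a = my - b * mx
    sse = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    return a, b, sse

def threshold_fit(xs, ys):
    # Try every admissible split point (>= 2 observations per segment)
    # and return the x-value where the combined two-segment SSE is smallest.
    best = None
    for k in range(2, len(xs) - 1):
        sse = ols(xs[:k], ys[:k])[2] + ols(xs[k:], ys[k:])[2]
        if best is None or sse < best[1]:
            best = (xs[k], sse)
    return best[0]
```

Outliers inflate the SSE of whichever segment contains them, which is exactly why the abstract motivates robust alternatives to this least-squares criterion.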
Accurately localizing the basic components of human faces (i.e., eyebrows, eyes, nose, mouth, etc.) in images is an important step in face-processing techniques such as face tracking, facial expression recognition, and face recognition. However, it is a challenging task due to variations in scale, orientation, pose, facial expression, partial occlusion, and lighting conditions. In the current paper, a scheme comprising three hierarchical stages for facial-component extraction is presented; it works regardless of illumination variance. Adaptive contrast-enhancement methods such as gamma correction and contrast stretching are used to simulate the variance in lighting conditions among images. As testing material
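The two contrast-enhancement mappings mentioned above can be sketched for 8-bit grayscale intensities as follows. These are illustrative pure-Python helpers; the function names and the per-pixel formulation are assumptions, not the paper's implementation.

```python
def gamma_correct(pix, gamma):
    # Power-law (gamma) mapping of an 8-bit intensity: gamma < 1 brightens
    # dark regions, gamma > 1 darkens them, simulating lighting changes.
    return round(255 * (pix / 255) ** gamma)

def contrast_stretch(img):
    # Linearly rescale a list of intensities to span the full [0, 255]
    # range, maximizing usable contrast.
    lo, hi = min(img), max(img)
    if hi == lo:
        return img[:]  # flat image: nothing to stretch
    return [round(255 * (p - lo) / (hi - lo)) for p in img]
```

Applying such mappings to the training/test images lets the scheme be evaluated under simulated illumination variance, as the abstract describes.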
In this study, an efficient compression system is introduced. It is based on a wavelet transform together with two types of 3D surface representation: Cubic Bezier Interpolation (CBI) and 1st-order polynomial approximation. Each is applied at a different scale of the image. CBI is applied over wide areas of the image in order to prune the components that show large-scale variation, while the 1st-order polynomial is applied to small areas of the residue component (i.e., after subtracting the cubic Bezier surface from the image) in order to prune the locally smooth components and obtain better compression gain. Then, the produced cubic Bezier surface is subtracted from the image signal to get the residue component. Then, t