A robust client-side video-bitrate adaptation scheme plays a significant role in maintaining a good video streaming experience. The selected video quality affects the amount of time playback is interrupted due to an empty buffer state. Therefore, to keep video streaming continuous under bandwidth fluctuation, this work considers a video buffer structure based on adapting the video bitrate. Initially, the video buffer structure is formulated as an optimal control-theoretic problem that combines both the video bitrate and the video buffer feedback signals. While keeping the video buffer occupancy from exceeding its limited operating level can provide continuous video streaming, it may also cause video bitrate oscillation. The buffer structure is therefore adjusted by adding two thresholds, as operating points for the overflow and underflow states, to filter the impact of throughput fluctuation on the buffer occupancy level. A bandwidth prediction algorithm is then proposed to enhance the performance of video bitrate adaptation. The algorithm uses the current video buffer level, the video bitrate of the previous segment, and iterative throughput measurements to predict the best video bitrate for the next segment. Simulation results show that reserving a bandwidth margin improves bitrate adaptation under bandwidth variation and thus reduces the risk of video playback freezing. They also show that playback freezing occurs in two cases, when no bandwidth margin is used and when the margin is too high, whereas a moderate margin yields a smooth video bitrate. The proposed scheme is compared with two other schemes, smoothed throughput rate (STR) and buffer-based rate (BBR), in terms of prediction error, QoE preferences, buffer size, and startup delay time, and it outperforms both in attaining smooth video bitrates and continuous video playback.
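As an illustration of this style of buffer- and throughput-driven selection, the following sketch picks the next segment's bitrate from a throughput estimate reduced by a reserved bandwidth margin and adjusted by the current buffer level against two operating thresholds; the bitrate ladder, threshold values, and margin factor are assumptions chosen for illustration, not the values used in this work.

```python
# Illustrative sketch of buffer-aware bitrate selection with a reserved
# bandwidth margin; ladder, thresholds, and margin are assumed values.

BITRATES = [350, 750, 1500, 3000, 6000]   # available bitrates (kbps), assumed ladder

def next_bitrate(throughput_kbps, buffer_s, prev_bitrate,
                 margin=0.2, low_th=10.0, high_th=30.0):
    """Pick the bitrate for the next segment.

    throughput_kbps : iteratively smoothed throughput measurement
    buffer_s        : current buffer occupancy in seconds
    prev_bitrate    : bitrate of the previous segment (kbps)
    margin          : fraction of throughput reserved as a safety margin
    low_th, high_th : underflow / overflow operating thresholds (seconds)
    """
    budget = throughput_kbps * (1.0 - margin)     # reserve a bandwidth margin

    if buffer_s < low_th:                          # near underflow: protect playback
        budget = min(budget, prev_bitrate)         # do not step up while draining
    elif buffer_s > high_th:                       # near overflow: allow a step up
        budget = max(budget, prev_bitrate)

    # highest bitrate that fits within the budget (fall back to the lowest)
    feasible = [b for b in BITRATES if b <= budget]
    candidate = feasible[-1] if feasible else BITRATES[0]

    # limit switching to one ladder step per segment to keep the bitrate smooth
    prev_idx = BITRATES.index(prev_bitrate)
    cand_idx = BITRATES.index(candidate)
    cand_idx = max(prev_idx - 1, min(prev_idx + 1, cand_idx))
    return BITRATES[cand_idx]

# Example: 4 Mbps measured, 20% margin, healthy buffer, previously at 1.5 Mbps
print(next_bitrate(4000, 18.0, 1500))   # -> 3000
```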
Krawtchouk polynomials (KPs) and their moments are promising tools for applications in information theory, coding theory, and signal processing, owing to the special capabilities of KPs in feature extraction and classification. The main challenge in existing KP recurrence algorithms is the numerical error that occurs when computing the coefficients for large polynomial sizes, particularly when the KP parameter (p) deviates from 0.5 toward 0 or 1. To this end, this paper proposes a new recurrence relation for computing the coefficients of KPs of high order. In particular, this paper discusses the development of a new algorithm and presents a new mathematical model for computing the
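For reference, the classical three-term recurrence for the Krawtchouk polynomials K_n(x; p, N), i.e. the relation whose numerical behaviour degrades for large N and for p far from 0.5 and which the proposed relation is intended to improve on, can be sketched as follows; the normalization and function name are illustrative.

```python
import numpy as np

def krawtchouk(N, p):
    """Values K_n(x; p, N) for n, x = 0..N via the classical three-term
    recurrence (un-normalized); this is the standard relation, not the
    new recurrence proposed in the paper."""
    x = np.arange(N + 1, dtype=float)
    K = np.zeros((N + 1, N + 1))
    K[0] = 1.0                         # K_0(x) = 1
    K[1] = 1.0 - x / (p * N)           # K_1(x) = 1 - x/(pN)
    for n in range(1, N):
        # -x K_n = p(N-n) K_{n+1} - [p(N-n) + n(1-p)] K_n + n(1-p) K_{n-1}
        a = p * (N - n)
        b = n * (1.0 - p)
        K[n + 1] = ((a + b - x) * K[n] - b * K[n - 1]) / a
    return K

K = krawtchouk(8, 0.5)
print(K[2, :3])   # K_2(x; 0.5, 8) at x = 0, 1, 2
```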
In many video and image processing applications, the frames are partitioned into blocks, which are extracted and processed sequentially. In this paper, we propose a fast algorithm for calculating features of overlapping image blocks. We assume the features are projections of the block onto separable 2D basis functions (usually orthogonal polynomials), where we benefit from symmetry with respect to the spatial variables. The main idea is based on constructing auxiliary matrices that virtually extend the original image and make it possible to avoid time-consuming computation in loops. These matrices can be pre-calculated, stored, and used repeatedly, since they are independent of the image itself. We validated experimentally th
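As a reference point for what these block features are, the sketch below projects each overlapping block onto a separable 2D orthogonal-polynomial basis with two matrix multiplications per block; it is the straightforward per-block computation, not the auxiliary-matrix acceleration proposed above, and the basis construction and block parameters are assumptions.

```python
import numpy as np

def poly_basis(size, order):
    """Orthonormal discrete polynomial basis (columns) obtained by
    orthogonalizing the monomials 1, t, t^2, ... on `size` samples."""
    t = np.linspace(-1.0, 1.0, size)
    V = np.vander(t, order, increasing=True)   # size x order Vandermonde matrix
    Q, _ = np.linalg.qr(V)                     # orthonormal columns
    return Q

def block_features(image, block, step, order):
    """Projections of every (overlapping) block on a separable 2D basis:
    F = U^T * B * U, computed directly block by block (the baseline the
    fast auxiliary-matrix algorithm would be compared against)."""
    U = poly_basis(block, order)
    H, W = image.shape
    feats = {}
    for r in range(0, H - block + 1, step):
        for c in range(0, W - block + 1, step):
            B = image[r:r + block, c:c + block]
            feats[(r, c)] = U.T @ B @ U        # order x order feature matrix
    return feats

img = np.random.rand(64, 64)
f = block_features(img, block=8, step=4, order=4)   # overlapping 8x8 blocks
print(len(f), f[(0, 0)].shape)                       # 225 (4, 4)
```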
Regression testing, being expensive, calls for optimization. Typically, test-case optimization means selecting a reduced set or subset of test cases, or prioritizing the test cases so that potential faults are detected at an earlier phase. Many former studies relied on heuristic-dependent mechanisms to attain optimality while reducing or prioritizing test cases. Nevertheless, those studies lacked systematic procedures to manage the issue of tied test cases. Moreover, evolutionary algorithms such as genetic algorithms often help in reducing the number of test cases, together with a concurrent decrease in computational runtime. However, when examining fault detection capacity along with other parameters is required, the method falls sh
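To make the prioritization idea concrete, a minimal greedy additional-coverage prioritization, a common heuristic baseline rather than the method studied here, might look as follows; the test names and coverage sets are hypothetical.

```python
def prioritize(coverage):
    """Greedy additional-coverage prioritization: repeatedly pick the test
    covering the most not-yet-covered items; ties are broken by test name,
    which is exactly the kind of tie handling the studies above leave ad hoc."""
    remaining = dict(coverage)
    covered, order = set(), []
    while remaining:
        # best = most new coverage, then lexicographic name as the tie-breaker
        name = max(sorted(remaining), key=lambda t: len(remaining[t] - covered))
        order.append(name)
        covered |= remaining.pop(name)
    return order

# Hypothetical test suite: test name -> set of covered branches
suite = {"t1": {1, 2, 3}, "t2": {3, 4}, "t3": {5}, "t4": {1, 2}}
print(prioritize(suite))   # ['t1', 't2', 't3', 't4']
```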
Recently, Tobit quantile regression (TQR) has emerged as an important tool in statistical analysis. In order to improve parameter estimation in TQR, we propose a Bayesian hierarchical model with a double adaptive elastic net technique and a Bayesian hierarchical model with an adaptive ridge regression technique.
In the double adaptive elastic net technique we assume different penalization parameters for penalizing different regression coefficients in both parameters λ1 and λ2, and likewise in the adaptive ridge regression technique we assume different penalization parameters for penalizing different regression coefficients i
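One generic way to write coefficient-specific penalization of this kind is sketched below in standard elastic net notation; this is an illustrative form of a double adaptive elastic net objective for censored quantile regression, not the exact hierarchical specification proposed here.

```latex
% Illustrative objective with coefficient-specific weights \lambda_{1j}, \lambda_{2j};
% \rho_\tau is the quantile check function and y_i^* the censored (Tobit) response.
\min_{\beta}\; \sum_{i=1}^{n} \rho_{\tau}\!\left(y_i^{*} - x_i^{\top}\beta\right)
  \;+\; \sum_{j=1}^{k} \lambda_{1j}\,\lvert\beta_j\rvert
  \;+\; \sum_{j=1}^{k} \lambda_{2j}\,\beta_j^{2}
```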
Exchange of information through communication channels can be unsafe. Communication media are not secure for sending sensitive information, so it is necessary to protect information from disclosure to unauthorized persons. This research presents a method for information security based on hiding information in a cover image using the least significant bit (LSB) technique, where a text file is first encrypted using a secret sharing scheme. The hiding positions in the cover image are then generated in a random manner, which makes the hidden data difficult to detect by image analysis or statistical analyses. The method thus provides two levels of information security, through encryption of a text file using the secret sha
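A minimal sketch of the embedding and extraction side, assuming a grayscale cover image held as a NumPy array and a key-seeded pseudo-random permutation of pixel positions (the secret-sharing encryption of the text is omitted and the payload bits are assumed to be already encrypted), could look like this:

```python
import numpy as np

def embed_lsb(cover, message_bits, key):
    """Hide message_bits in the LSBs of pixels chosen at key-seeded
    pseudo-random positions, so the hiding order is hard to predict."""
    stego = cover.copy().ravel()
    rng = np.random.default_rng(key)                  # key-dependent generator
    positions = rng.permutation(stego.size)[:len(message_bits)]
    for pos, bit in zip(positions, message_bits):
        stego[pos] = (stego[pos] & 0xFE) | bit        # overwrite the LSB
    return stego.reshape(cover.shape)

def extract_lsb(stego, n_bits, key):
    """Recover the bits using the same key to regenerate the positions."""
    flat = stego.ravel()
    rng = np.random.default_rng(key)
    positions = rng.permutation(flat.size)[:n_bits]
    return [int(flat[pos] & 1) for pos in positions]

cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
bits = [1, 0, 1, 1, 0, 0, 1, 0]                       # already-encrypted payload
stego = embed_lsb(cover, bits, key=2024)
print(extract_lsb(stego, len(bits), key=2024) == bits)  # True
```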
Several problems need to be solved in image compression to make the process practical and more efficient. Much work has been done in the field of lossy image compression based on the wavelet and discrete cosine transform (DCT). In this paper, an efficient image compression scheme is proposed, based on a combined encoding transform scheme; it consists of the following steps: 1) a bi-orthogonal (tap 9/7) wavelet transform to split the image data into sub-bands, 2) a DCT to de-correlate the data, 3) scalar quantization of the combined transform stage's output, followed by mapping to positive values, and 4) LZW encoding to produce the compressed data. The peak signal-to-noise ratio (PSNR), compression ratio (CR), and compression gain (CG) measures were used t
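A compact sketch of the encoder side of such a pipeline, assuming PyWavelets for the bi-orthogonal 9/7 transform and SciPy for the DCT, might look like the following; the quantization step size, the symbol mapping/packing, and the toy LZW stage are simplified assumptions rather than the paper's exact design.

```python
import numpy as np
import pywt                      # pip install PyWavelets
from scipy.fft import dctn

def encode(image, q=16):
    """Sketch of the combined-transform encoder: bior 9/7 wavelet -> DCT per
    sub-band -> scalar quantization -> mapping to non-negative integers -> LZW.
    The step size q and the crude byte packing are simplified assumptions."""
    # 1) bi-orthogonal 9/7 wavelet decomposition into sub-bands
    coeffs = pywt.wavedec2(image.astype(float), 'bior4.4', level=2)
    bands = [coeffs[0]] + [b for lvl in coeffs[1:] for b in lvl]

    symbols = []
    for band in bands:
        c = dctn(band, norm='ortho')                   # 2) DCT to de-correlate
        qc = np.round(c / q).astype(np.int64)          # 3) scalar quantization
        pos = np.where(qc >= 0, 2 * qc, -2 * qc - 1)   #    map to non-negative
        symbols.extend(int(v) % 256 for v in pos.ravel())  # crude byte packing
    return lzw_encode(bytes(symbols))                  # 4) entropy coding

def lzw_encode(data):
    """Minimal dictionary-based LZW encoder over a byte string."""
    table = {bytes([i]): i for i in range(256)}
    w, out = b"", []
    for b in data:
        wc = w + bytes([b])
        if wc in table:
            w = wc
        else:
            out.append(table[w]); table[wc] = len(table); w = bytes([b])
    if w:
        out.append(table[w])
    return out

img = np.random.rand(64, 64) * 255
codes = encode(img)
print(len(codes), "LZW codes for a 64x64 test image")
```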
We explore the transform coefficients of fractal coding and exploit a new method to improve the compression capabilities of these schemes. In most standard encoder/decoder systems, quantization and de-quantization are managed as separate steps; here we introduce a new method in which they are managed simultaneously. Additional compression is achieved by this method while preserving high image quality, as shown later.
Abstract
A pneumatic process sequence (PPS) is used widely in industrial applications. It is common to execute a predetermined PPS to achieve a specific larger task within an industrial application, such as the PPS performed by a pick-and-place industrial robot arm. This sequence may need to change when the required task changes, and usually this requires programmer intervention to modify the sequence's program, which is costly and may take a long time. In this research, a PLC-based PPS control system is designed and implemented, in which the PPS is programmed by demonstration. The PPS can be changed by the user demonstrating the new required sequence, following a simple series of manual steps without h
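As a conceptual illustration only (the actual system is PLC-based, so this is written in Python rather than ladder logic or structured text), programming by demonstration can be pictured as recording each manually triggered actuation as a step and later replaying the stored sequence; all valve and sensor names below are hypothetical.

```python
# Conceptual sketch of sequence programming by demonstration: every manually
# triggered actuation is recorded as a step, then the stored sequence is replayed.

from dataclasses import dataclass
from typing import List

@dataclass
class Step:
    valve: str          # e.g. "cylinder_A_extend" (hypothetical actuator name)
    wait_sensor: str    # sensor that confirms the step has completed

class SequenceController:
    def __init__(self):
        self.sequence: List[Step] = []

    def demonstrate(self, valve: str, wait_sensor: str):
        """Called once per manual step while the operator walks through
        the new sequence; the step is recorded instead of reprogrammed."""
        self.sequence.append(Step(valve, wait_sensor))

    def replay(self, io):
        """Re-execute the demonstrated sequence against an I/O interface
        exposing set_output(name) and wait_input(name)."""
        for step in self.sequence:
            io.set_output(step.valve)
            io.wait_input(step.wait_sensor)

# Demonstration of a simple pick-and-place cycle (hypothetical names)
ctrl = SequenceController()
ctrl.demonstrate("cylinder_A_extend", "A_extended")
ctrl.demonstrate("gripper_close", "part_gripped")
ctrl.demonstrate("cylinder_A_retract", "A_retracted")
print([s.valve for s in ctrl.sequence])
```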
Steganography is the art of hiding the very presence of communication by embedding a secret message into an innocuous-looking cover document, such as digital images, videos, sound files, and other computer files that contain perceptually irrelevant or redundant information, which serve as covers or carriers for the secret messages.
In this paper, a new least significant bit (LSB) non-sequential embedding technique for wave audio files is introduced. To support the immunity of the proposed hiding system, and in order to overcome some weak aspects inherent in the pure implementation of stego-systems, some auxiliary processes are suggested and investigated, including the use of a hidden-text jumping process and a stream ciphering algorithm. Besides, the suggested
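As a rough illustration of the ingredients named above rather than the paper's exact scheme, the payload can be XOR-ed with a key-stream and then written non-sequentially into the LSBs of 16-bit WAV samples at spread-out jump positions; here the key-stream generator is a simple PRNG stand-in for a stream cipher and the jump rule is a fixed stride.

```python
import wave
import numpy as np

def keystream_bits(key, n):
    """Simplified key-stream: bits from a key-seeded PRNG standing in for
    a real stream cipher."""
    return np.random.default_rng(key).integers(0, 2, size=n)

def embed_in_wav(in_path, out_path, payload_bits, key, jump=7):
    """XOR the payload with the key-stream, then write each bit into the LSB
    of every `jump`-th 16-bit PCM sample (a fixed-stride stand-in for the
    hidden-text jumping process); assumes far fewer bits than samples."""
    with wave.open(in_path, "rb") as w:
        params = w.getparams()
        samples = np.frombuffer(w.readframes(w.getnframes()), dtype=np.int16).copy()

    bits = np.asarray(payload_bits) ^ keystream_bits(key, len(payload_bits))
    positions = (np.arange(len(bits)) * jump) % len(samples)   # non-sequential slots
    samples[positions] = (samples[positions] & ~np.int16(1)) | bits.astype(np.int16)

    with wave.open(out_path, "wb") as w:
        w.setparams(params)
        w.writeframes(samples.tobytes())
```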