A robust client-side video-bitrate adaptation scheme plays a significant role in maintaining a good video streaming experience. Video quality affects how long playback stalls because the buffer runs empty. Therefore, to keep video streaming continuous under smooth bandwidth fluctuation, this work considers a video buffer structure based on adapting the video bitrate. The video buffer structure is first formulated as an optimal control-theoretic problem that combines the video bitrate and video buffer feedback signals. While keeping the video buffer occupancy from exceeding its operating limits provides continuous video streaming, it may also cause video bitrate oscillation. The video buffer structure is therefore adjusted by adding two thresholds, as operating points for the overflow and underflow states, to filter the impact of throughput fluctuation on the buffer occupancy level. A bandwidth prediction algorithm is then proposed to enhance the performance of video bitrate adaptation. The algorithm uses the current video buffer level, the video bitrate of the previous segment, and iterative throughput measurements to predict the best video bitrate for the next segment. Simulation results show that reserving a bandwidth margin improves bitrate adaptation under bandwidth variation and thereby reduces the risk of playback freezing: freezing occurs both when no bandwidth margin is used and when the margin is too high, whereas a moderate margin yields a smooth video bitrate. The proposed scheme is compared with two other schemes, smoothed throughput rate (STR) and buffer-based rate (BBR), in terms of prediction error, QoE preferences, buffer size, and startup delay, and it outperforms both in attaining smooth video bitrates and continuous video playback.
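To make the threshold-based idea concrete, here is a minimal sketch of margin-aware bitrate selection between an underflow and an overflow threshold. The bitrate ladder, threshold values, margin fraction, and function names are illustrative assumptions, not values from the paper.

```python
BITRATES = [350, 600, 1000, 2000, 4000]  # available bitrates (kbps), illustrative

def select_bitrate(throughput_kbps, buffer_s, prev_kbps,
                   low_s=10.0, high_s=30.0, margin=0.2):
    # Reserve a fraction of the measured throughput as a safety margin.
    budget = throughput_kbps * (1.0 - margin)
    affordable = [b for b in BITRATES if b <= budget] or [BITRATES[0]]
    if buffer_s < low_s:
        # Below the underflow threshold: drop to the lowest affordable
        # rate so the buffer can refill quickly.
        return affordable[0]
    if buffer_s > high_s:
        # Above the overflow threshold: take the highest rate the
        # margin-adjusted budget allows.
        return affordable[-1]
    # Between the thresholds: hold the previous rate to damp oscillation,
    # unless it no longer fits within the budget.
    return prev_kbps if prev_kbps <= budget else affordable[-1]

# Example: 3 Mbps measured, 20 s of buffer, previously streaming at 2 Mbps.
print(select_bitrate(3000, 20.0, 2000))  # -> 2000 (holds the current rate)
```

The dead band between the two thresholds is what filters throughput fluctuation out of the bitrate decision: small buffer movements inside the band never trigger a switch.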
Dust samples were collected from three areas in Baghdad during the dust storm that occurred on 18 June 2009 to characterize particle size and elemental composition by different techniques. X-ray diffraction detected six minerals: calcite and quartz, present as major components, and dolomite, kaolinite, gypsum, and plagioclase, present as minor components. EDX detected the elements normally present in local soil, except for traces of lead, nickel, and chromium. Particle size analysis with a set of sieves revealed that the majority of the particle distribution lay between 32 and 45 μm. To isolate the aerosol-size fraction (PM10), a buoyancy method of powder in water showed significant amounts of that particulate size. Scherrer's method was applied …
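Since the truncated abstract invokes Scherrer's method, the standard Scherrer relation for estimating crystallite size from XRD peak broadening is given below for reference; the symbols are the conventional ones, not taken from the paper.

```latex
D = \frac{K\,\lambda}{\beta \cos\theta}
```

where $D$ is the mean crystallite size, $K \approx 0.9$ is the shape factor, $\lambda$ is the X-ray wavelength, $\beta$ is the peak full width at half maximum in radians, and $\theta$ is the Bragg angle.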
Wind turbines (WT) are now a major renewable energy resource in the modern world. One of the most significant technologies that uses wind speed (WS) to generate electric power is the horizontal-axis wind turbine. To regulate the output power above the rated WS, the blade pitch angle (BPA) of the WT is controlled and adjusted. This paper proposes and compares three different BPA controllers for a 500-kW WT: a PID controller (PIDC); a fuzzy logic controller (FLC) based on the Mamdani and Sugeno fuzzy inference systems (FIS); and a hybrid fuzzy-PID controller (HFPIDC). Furthermore, a genetic algorithm (GA) and particle swarm optimization (PSO) have been applied and compared in order to identify the optimal …
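To make the PIDC branch concrete, here is a minimal discrete-time pitch-control sketch in Python. The gains, time step, and saturation limits are illustrative assumptions, not the paper's GA/PSO-tuned values.

```python
# Sketch of a discrete PID blade-pitch controller for above-rated operation:
# the pitch angle acts to hold electrical output at the rated power.
class PitchPID:
    def __init__(self, kp, ki, kd, dt, beta_min=0.0, beta_max=90.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.beta_min, self.beta_max = beta_min, beta_max  # pitch limits (deg)
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, power_ref_kw, power_kw):
        err = power_kw - power_ref_kw            # positive error -> pitch up
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        beta = self.kp * err + self.ki * self.integral + self.kd * deriv
        return min(max(beta, self.beta_min), self.beta_max)  # saturate angle

pid = PitchPID(kp=0.5, ki=0.1, kd=0.05, dt=0.1)   # illustrative gains
beta_cmd = pid.step(power_ref_kw=500.0, power_kw=540.0)  # 500-kW rated turbine
```

In the hybrid HFPIDC arrangement, a fuzzy stage would typically reshape the error signal or schedule these gains; GA or PSO then search the gain space offline.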
In this research, kernel (nonparametric) estimation methods were used to estimate binary-response logistic regression, comparing the Nadaraya-Watson method with the local scoring algorithm. The optimal smoothing parameter λ was estimated by cross-validation and generalized cross-validation; the optimal bandwidth λ has a clear effect on the estimation process and plays a key role in smoothing the estimated curve toward the real curve. The goal of using the kernel estimator is to modify the observations so as to obtain estimators with characteristics close to those of the real parameters. The methods were applied to medical data for patients with chronic …
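A minimal sketch of the Nadaraya-Watson estimator with leave-one-out cross-validation for the bandwidth follows; the Gaussian kernel, the bandwidth grid, and the synthetic binary data are illustrative assumptions, not the paper's choices.

```python
import numpy as np

def nw_estimate(x0, x, y, h):
    # Gaussian kernel weights of the sample points around each query x0.
    w = np.exp(-0.5 * ((x0[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

def loo_cv_bandwidth(x, y, grid):
    # Pick the bandwidth minimizing leave-one-out squared prediction error.
    best_h, best_err = grid[0], np.inf
    for h in grid:
        preds = np.array([
            nw_estimate(x[i:i+1], np.delete(x, i), np.delete(y, i), h)[0]
            for i in range(len(x))
        ])
        err = np.mean((y - preds) ** 2)
        if err < best_err:
            best_h, best_err = h, err
    return best_h

rng = np.random.default_rng(0)
x = rng.uniform(0, 3, 80)
y = (x + rng.normal(0, 0.5, 80) > 1.5).astype(float)   # binary response
h = loo_cv_bandwidth(x, y, np.linspace(0.05, 1.0, 20))
p_hat = nw_estimate(np.array([1.5]), x, y, h)          # smoothed P(y=1 | x=1.5)
```

Generalized cross-validation replaces the leave-one-out loop with a trace-based correction of the in-sample error, which is cheaper on large samples.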
The present study aims to reveal the extent of the influence of acquired organizational immunity, through its dimensions (organizational vaccination, organizational learning, organizational memory, and benchmarking), on the application of knowledge management strategies in their two dimensions (codification strategy and personalization strategy), and to clarify the influential relationship between the study variables, given its importance in reducing resistance to change by responding to the requirements of the environment. A set of main and sub-hypotheses emerged from the study, formulated in light of the study's hypothetical scheme, and i…
This research aims to identify the role that forensic accounting plays in the transparency and quality of the financial statements of the Trade Bank of Iraq and the Gulf Commercial Bank in Babylon. The research addresses a problem from which most financial institutions suffer: a lack of transparency and quality in the annually issued financial statements, as well as manipulation and fraud in the financial data, which creates a large gap between these institutions and their stakeholders. In line with the research hypothesis and objectives, a questionnaire was prepared consisting of three axes, the first axis dealing with the demographic distribution …
The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the downhill simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, initial parameter values, and contamination levels. The downhill simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
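As a sketch of likelihood maximization via the downhill simplex (Nelder-Mead) method, the snippet below fits a plain two-parameter Weibull density as a stand-in; the paper's four-parameter compound exponential Weibull-Poisson density would replace `logpdf`. The data and starting values are synthetic.

```python
import numpy as np
from scipy.optimize import minimize

def logpdf(x, shape, scale):
    # Log-density of a two-parameter Weibull (stand-in for the compound model).
    z = x / scale
    return np.log(shape / scale) + (shape - 1) * np.log(z) - z ** shape

def neg_loglik(theta, data):
    shape, scale = theta
    if shape <= 0 or scale <= 0:   # keep the simplex inside the valid region
        return np.inf
    return -np.sum(logpdf(data, shape, scale))

rng = np.random.default_rng(1)
data = rng.weibull(1.8, 500) * 2.0   # synthetic sample, shape 1.8, scale 2
res = minimize(neg_loglik, x0=[1.0, 1.0], args=(data,), method="Nelder-Mead")
shape_hat, scale_hat = res.x
reliability = lambda t: np.exp(-(t / scale_hat) ** shape_hat)  # R(t) estimate
```

The simplex method needs no derivatives, which is what makes it attractive when the compound likelihood is awkward to differentiate or when contamination makes the surface irregular.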
The major objective of this study is to establish a network of ground control points (GCPs) that can be used as a reference for any engineering project. A total station (Nikon Nivo 5.C), an optical level, and a Garmin navigator GPS were used to perform traversing. The traverse measurement used nine points covering the selected area irregularly, near the Civil Engineering Department at Baghdad University, Al-Jadiriya, and an attempt was made to assess the accuracy of the GPS by comparing its data with that obtained from the total station. The average error of this method is 3.326 m, with the highest coefficient of determination (R²) of 0.077 observed in the northing. While in …
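For the comparison metrics quoted above, a minimal sketch of how R² and the average error would be computed between the two instruments follows; the coordinate arrays are illustrative, not the study's observations.

```python
import numpy as np

def r_squared(reference, measured):
    # Coefficient of determination of the measured values against the reference.
    ss_res = np.sum((reference - measured) ** 2)
    ss_tot = np.sum((reference - reference.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

northing_ts = np.array([3685010.12, 3685042.88, 3685075.31])   # total station (m)
northing_gps = np.array([3685013.40, 3685046.10, 3685078.90])  # handheld GPS (m)
print("R^2 (northing):", r_squared(northing_ts, northing_gps))
print("average error (m):", np.mean(np.abs(northing_ts - northing_gps)))
```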
A new series of polymers was synthesized from the starting material bisacodyl (A), [(2-pyridinylmethylene)di-4,1-phenylene diacetate], by reaction with hydrogen bromide; the products were then polymerized by addition polymerization using adipoyl and glutaroyl chloride. The structures of these compounds were characterized by FT-IR, melting points, TLC, X-ray diffraction, DSC, and, for the starting material, 1H-NMR. These compounds were also screened for their antibacterial activity.
The Yemeni Political System: A Study of the Internal Variables