A robust client-side video-bitrate adaptation scheme plays a significant role in maintaining a good video streaming experience. Video quality affects the amount of time playback is stalled due to an empty buffer. Therefore, to keep the video streaming continuously while smoothing bandwidth fluctuations, this work considers a video buffer structure based on adapting the video bitrate. Initially, the video buffer structure is formulated as an optimal control-theoretic problem that combines both the video bitrate and the video buffer feedback signals. While keeping the video buffer occupancy from exceeding its operating limits can provide continuous video streaming, it may also cause video bitrate oscillation. The video buffer structure is therefore adjusted by adding two thresholds as operating points for the overflow and underflow states, filtering the impact of throughput fluctuation on the buffer occupancy level. A bandwidth prediction algorithm is then proposed to enhance the performance of video bitrate adaptation. The algorithm relies on the current video buffer level, the video bitrate of the previous segment, and iterative throughput measurements to predict the best video bitrate for the next segment. Simulation results show that reserving a bandwidth margin improves video bitrate adaptation under bandwidth variation and thereby reduces the risk of playback freezing. The simulations also show that playback freezing occurs in two cases, when no bandwidth margin is used and when the bandwidth margin is too high, whereas a smooth video bitrate is obtained with a moderate margin. The proposed scheme is compared with two other schemes, smoothed throughput rate (STR) and buffer-based rate (BBR), in terms of prediction error, QoE preferences, buffer size, and startup delay time, and it outperforms both in attaining smooth video bitrates and continuous video playback.
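As a rough illustration of this kind of buffer- and throughput-driven selection, the sketch below combines two buffer thresholds, the previous segment's bitrate, and a reserved bandwidth margin. All names, threshold values, and the margin are hypothetical assumptions for illustration, and the harmonic-mean predictor is only a simple stand-in; this is not the paper's exact algorithm.

```python
# Minimal sketch of a buffer- and throughput-aware bitrate selector.
# Bitrate ladder, thresholds, and margin are illustrative assumptions.

BITRATES_KBPS = [350, 600, 1000, 2000, 3000]   # available representations
BUFFER_LOW_S = 10.0    # underflow threshold (seconds of buffered video)
BUFFER_HIGH_S = 30.0   # overflow threshold (seconds of buffered video)
MARGIN = 0.2           # reserved bandwidth margin (20%)

def predict_bandwidth(throughput_samples_kbps):
    """Simple iterative estimate: harmonic mean of recent throughput samples."""
    n = len(throughput_samples_kbps)
    return n / sum(1.0 / t for t in throughput_samples_kbps)

def next_bitrate(buffer_level_s, prev_bitrate_kbps, throughput_samples_kbps):
    """Pick the next segment bitrate from buffer state and predicted bandwidth."""
    predicted = (1.0 - MARGIN) * predict_bandwidth(throughput_samples_kbps)
    sustainable = [b for b in BITRATES_KBPS if b <= predicted]
    if buffer_level_s < BUFFER_LOW_S:
        # Near underflow: drop to the lowest bitrate the prediction supports.
        return sustainable[0] if sustainable else BITRATES_KBPS[0]
    if buffer_level_s > BUFFER_HIGH_S:
        # Near overflow: allow one step up if the prediction supports it.
        higher = [b for b in sustainable if b > prev_bitrate_kbps]
        return higher[0] if higher else prev_bitrate_kbps
    # Between thresholds: highest bitrate the predicted bandwidth sustains.
    return sustainable[-1] if sustainable else BITRATES_KBPS[0]

# Example: 18 s of buffer, previous segment at 1000 kbps, recent throughput samples.
print(next_bitrate(18.0, 1000, [2400, 2100, 2600, 1900]))
```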
This paper reports experimentally obtained conditions for fusion splicing of large-mode-area photonic crystal fibers (PCFs). The physical mechanism of the splice loss and the microhole collapse behavior of the PCF were studied. By controlling the arc power and arc time of a conventional electric-arc fusion splicer (FSM-60S), a minimum splice loss of 0.00 dB was obtained when fusing two conventional single-mode fibers (SMF-28), which have similar mode field diameters. For splicing a PCF (LMA-10) to a conventional single-mode fiber (SMF-28), the loss increased due to the mode field mismatch.
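As background for the mode-field-mismatch loss noted above, a commonly used estimate for the splice loss between two fibers supporting Gaussian fundamental modes with mode field radii \(w_1\) and \(w_2\) (ignoring other loss mechanisms such as incomplete hole collapse or misalignment) is

\[
\alpha_{\text{splice}}\;[\mathrm{dB}] = -20\log_{10}\!\left(\frac{2\,w_1 w_2}{w_1^{2}+w_2^{2}}\right),
\]

which gives 0 dB for identical mode field diameters (the SMF-28 to SMF-28 case) and grows as the mode fields diverge (the LMA-10 to SMF-28 case).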
Polyaromatic hydrocarbons (PAHs) are a group of aromatic compounds that contain at least two rings. These compounds are found naturally in petroleum products and are considered among the most prevalent pollutants in the environment. The lack of microorganisms capable of degrading some PAHs leads to their accumulation in the environment, which usually causes major health problems, as many of these compounds are known carcinogens. Xanthene is a small, three-ring PAH. Many xanthene derivatives are useful dyes used for dyeing wood and in cosmetic articles. However, several studies have shown that these compounds have toxic and carcinogenic effects. The first step of the bacterial degradation of xanthene is conducted by d
It is widely accepted that early diagnosis of Alzheimer's disease (AD) enables patients to gain access to appropriate health care services and would facilitate the development of new therapies. AD starts many years before its clinical manifestation, and a biomarker that provides a measure of the changes in the brain during this period would be useful for early diagnosis of AD. Given the rapid increase in the number of older people suffering from AD, there is a need for an accurate, low-cost, and easy-to-use biomarker that could detect AD in its early stages. Potentially, the electroencephalogram (EEG) can play a vital role in this, but at present no reliable EEG biomarker exists for early diagnosis of AD. The gradual s
Nowadays, cloud computing has attracted the attention of large companies due to its high potential, flexibility, and profitability in providing multiple sources of hardware and software to serve connected users. Given the scale of modern data centers and the dynamic nature of their resource provisioning, effective scheduling techniques are needed to manage these resources while satisfying the goals of both cloud providers and cloud users. Task scheduling in cloud computing is considered an NP-hard problem that cannot be easily solved by classical optimization methods. Thus, both heuristic and meta-heuristic techniques have been utilized to provide optimal or near-optimal solutions within an acceptable time frame for such problems. In th
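As a generic illustration of the heuristic approach mentioned above, the sketch below implements a simple min-min style mapping of independent tasks to VMs by earliest estimated completion time. The function name, task lengths, and VM speeds are hypothetical, and this is not the scheduling scheme proposed in the paper.

```python
# Minimal sketch of a greedy (min-min style) heuristic for mapping independent
# tasks to VMs; purely illustrative of heuristic cloud task scheduling.

def min_min_schedule(task_lengths, vm_speeds):
    """task_lengths: task sizes (e.g., million instructions);
    vm_speeds: VM processing speeds (e.g., MIPS).
    Returns (assignment, makespan), where assignment[i] is the VM index of task i."""
    ready_time = [0.0] * len(vm_speeds)          # when each VM becomes free
    assignment = [None] * len(task_lengths)
    unscheduled = set(range(len(task_lengths)))
    while unscheduled:
        best = None  # (completion_time, task, vm) with the globally earliest finish
        for t in unscheduled:
            for v, speed in enumerate(vm_speeds):
                completion = ready_time[v] + task_lengths[t] / speed
                if best is None or completion < best[0]:
                    best = (completion, t, v)
        completion, t, v = best
        assignment[t] = v          # commit the best (task, VM) pair
        ready_time[v] = completion # that VM is now busy until this time
        unscheduled.remove(t)
    return assignment, max(ready_time)

# Example: five tasks on two VMs of different speeds.
print(min_min_schedule([400, 250, 900, 120, 600], [100.0, 250.0]))
```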
Mass transfer correlations for an iron rotating cylinder electrode in chloride/sulphate solution, under isothermal and controlled heat transfer conditions, were derived. Limiting current density values for the oxygen reduction reaction, obtained from potentiostatic experiments at different bulk temperatures and various turbulent flow rates under isothermal and heat transfer conditions, were used for this derivation. The correlations were analogous to those obtained by Eisenberg et al. and other workers.
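For context, the mass transfer coefficient \(k\) is typically extracted from the measured limiting current density \(i_L\), and rotating-cylinder correlations of the Eisenberg type take the dimensionless form below; the specific coefficients derived in this work are not reproduced here.

\[
k = \frac{i_L}{n\,F\,C_b}, \qquad
Sh = \frac{k\,d}{D} = a\,Re^{\,b}\,Sc^{\,c}, \qquad
Re = \frac{U\,d}{\nu}, \quad Sc = \frac{\nu}{D},
\]

where \(d\) is the cylinder diameter, \(U\) its peripheral velocity, \(\nu\) the kinematic viscosity, \(D\) the diffusivity of dissolved oxygen, \(C_b\) its bulk concentration, \(n\) the number of electrons transferred, and \(F\) Faraday's constant. The classical Eisenberg, Tobias, and Wilke result for rotating cylinders corresponds approximately to \(a = 0.0791\), \(b = 0.70\), \(c = 0.356\).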
Every project management system aims to complete the project within its identified objectives: budget, time, and quality. Achieving the project within the defined deadline requires careful scheduling, which should be prepared early. Due to the unique nature of repetitive construction projects, accounting for time contingency and project uncertainty is necessary for accurate scheduling. The schedule should be integrated and flexible enough to accommodate changes without adversely affecting the construction project's total completion time. Repetitive planning and scheduling methods are more effective and essential. However, they need continuous development because of the evolution of execution methods, essent