Video streaming is the most popular medium used on the internet today. However, it consumes a large share of network resources, accounting for nearly 70% of internet traffic. Interactive media also faces constraints such as increased bandwidth usage and delay. The need for real-time transmission of live video streams motivates the use of fog computing, an intermediary layer between the cloud and the end user. Fog computing alleviates these problems by providing fast real-time response and computational resources close to the client at the network edge. The present paper proposes a priority weighted round robin (PWRR) algorithm for scheduling streaming operations in the fog architecture. PWRR gives preemptive priority to live video streaming requests so that they are delivered with very short response times and real-time communication. Experimental results for PWRR in the proposed architecture show reduced latency and good quality for live video requests under changing bandwidth, while all other client requests are served at the same time.
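The abstract does not spell out how the PWRR queues are organized; a minimal sketch of a priority weighted round robin dispatcher for a fog node, assuming a small set of priority classes with fixed weights and giving live video the top class (the class count, weights and Request fields are illustrative assumptions), might look like this:

```python
# Minimal sketch of a priority weighted round robin (PWRR) dispatcher for a fog node.
# Priority classes, weights and Request fields are assumptions, not details from the paper.
from collections import deque
from dataclasses import dataclass

@dataclass
class Request:
    client_id: str
    payload: str
    live_video: bool            # live streaming requests get the highest priority class

class PWRRScheduler:
    def __init__(self, weights=(4, 2, 1)):
        # one FIFO queue per priority class; class 0 (live video) has the largest weight
        self.weights = weights
        self.queues = [deque() for _ in weights]

    def submit(self, req, priority=None):
        if priority is None:
            priority = 0 if req.live_video else len(self.queues) - 1
        self.queues[priority].append(req)

    def next_batch(self):
        """One round: serve up to `weight` requests from each class, highest class first,
        so live video is dispatched quickly while other clients still make progress."""
        batch = []
        for queue, weight in zip(self.queues, self.weights):
            for _ in range(weight):
                if not queue:
                    break
                batch.append(queue.popleft())
        return batch
```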
The COVID-19 pandemic has profoundly affected the healthcare sector and the productivity of medical staff and doctors. This study employs machine learning to analyze the post-COVID-19 impact on the productivity of medical staff and doctors across various specialties. A cross-sectional study was conducted on 960 participants from different specialties between June 1, 2022, and April 5, 2023. The study collected demographic data, including age, gender, and socioeconomic status, as well as information on participants' sleeping habits and any COVID-19 complications they experienced. The findings indicate a significant decline in the productivity of medical staff and doctors, with an average reduction of 23% during the post-COVID-19 period.
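As an illustration of the kind of analysis described (the abstract names neither the model nor the exact features), a minimal sketch predicting a productivity change from demographic, sleep and complication variables could look like this; the column names, file and choice of a random forest are hypothetical:

```python
# Hedged sketch: regressing productivity change on survey features.
# File name, column names and model choice are illustrative assumptions.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("staff_survey.csv")                        # hypothetical survey export
features = ["age", "gender", "socioeconomic_status", "sleep_hours", "covid_complications"]
X = pd.get_dummies(df[features], drop_first=True)           # encode categorical columns
y = df["productivity_change_pct"]                           # e.g. -23 means a 23% reduction

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print("R^2 on held-out data:", r2_score(y_test, model.predict(X_test)))
```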
The purpose of this work was to study the effects of the Nd:YAG laser on exposed dentinal tubules of human extracted teeth using a scanning electron microscope (SEM). Eighty 2.5 mm-thick slices were cut at the cementoenamel junction from 20 extracted human teeth with an electric saw. A diamond bur was used to remove the cementum layer to expose the dentinal tubules. Each slice was sectioned into four equal quadrants and the specimens were randomly divided into four groups (A to D). Groups B to D were lased for 2 minutes using an Nd:YAG laser at 6 pulses per second at energy outputs of 80, 100 and 120 mJ, respectively. Group A served as control. Under SEM observation, nonlased specimens showed numerous exposed dentinal tubules.
The parameters of the two-parameter gamma distribution in the case of missing data are estimated using two important methods: the maximum likelihood method and the shrinkage method. The former relies on three methods for solving the non-linear maximum likelihood equation from which the maximum likelihood estimators are obtained: the Newton-Raphson, Thom and Sinha methods. The Thom and Sinha methods are adapted by the researcher to suit the case of missing data. Furthermore, the Bowman, Shenton and Lam method, which depends on the three-parameter gamma distribution to obtain the maximum likelihood estimators, has been developed. A comparison between the methods has been made experimentally to find the best method.
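For the complete-data case, the gamma maximum likelihood equations reduce to one nonlinear equation in the shape parameter, which Newton-Raphson solves quickly; a minimal sketch with a Thom-style starting value is given below (the missing-data adaptations described in the paper are not reproduced here):

```python
# Hedged sketch: Newton-Raphson for the two-parameter gamma MLE, complete data only.
import numpy as np
from scipy.special import digamma, polygamma

def gamma_mle_newton(x, tol=1e-10, max_iter=100):
    """Estimate shape (alpha) and scale (beta) of a gamma distribution by
    Newton-Raphson on the likelihood equation log(alpha) - psi(alpha) = s."""
    x = np.asarray(x, dtype=float)
    s = np.log(x.mean()) - np.mean(np.log(x))                     # sufficient statistic
    alpha = (3 - s + np.sqrt((s - 3) ** 2 + 24 * s)) / (12 * s)   # Thom-style starting value
    for _ in range(max_iter):
        f = np.log(alpha) - digamma(alpha) - s                    # score equation for alpha
        fprime = 1.0 / alpha - polygamma(1, alpha)                # derivative (uses trigamma)
        step = f / fprime
        alpha -= step
        if abs(step) < tol:
            break
    beta = x.mean() / alpha                                       # scale from the second equation
    return alpha, beta
```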
Individual average income is considered one of the most widely used criteria for distinguishing between developed and developing countries; for this reason, economic development efforts have concentrated on increasing average national income. Investment expenditure is considered one of the basic foundations of the economic development process, since it expands the productive capacity of the economy and increases national income by amounts greater than the initial expenditure, owing to the interaction between the multiplier and the accelerator. However, the ability of the economic sectors to generate national income from the initial expenditure differs from one sector to another.
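The multiplier-accelerator interaction the abstract refers to is commonly written in the standard textbook (Samuelson) form below; this is a generic specification, not necessarily the one used in the paper:

```latex
% Standard multiplier-accelerator form: c is the marginal propensity to consume
% (multiplier side), v the accelerator coefficient, G_t autonomous spending.
\begin{align*}
  C_t &= c\,Y_{t-1}           && \text{consumption induced by last period's income}\\
  I_t &= v\,(C_t - C_{t-1})   && \text{investment induced by the change in consumption}\\
  Y_t &= C_t + I_t + G_t      && \text{national income identity}
\end{align*}
```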
Determining the causes of the no-parity (infertility) problem is of great interest to doctors and researchers in this field, because it helps them detect the problem early; hence the importance of this paper, which tries to determine the main causes and their influence, helping doctors and researchers identify what increases or decreases the active sperm count, which in turn affects conception. We use the censored regression (Tobit) model to analyze a data set of 150 observations, and the results may be useful to those concerned.
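The abstract does not give the covariates or the censoring point; a minimal sketch of a Tobit (censored regression) fit by maximum likelihood, assuming left-censoring at zero and illustrative data arrays X and y, might look like this:

```python
# Hedged sketch: Tobit (left-censored at 0) regression fit by maximum likelihood.
# The paper's variables and censoring limit are not given; X, y and `lower` are assumptions.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def tobit_negloglik(params, X, y, lower=0.0):
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)                                # keep sigma positive
    xb = X @ beta
    censored = y <= lower
    ll = np.where(
        censored,
        norm.logcdf((lower - xb) / sigma),                   # censored observations
        norm.logpdf((y - xb) / sigma) - np.log(sigma),       # fully observed observations
    )
    return -ll.sum()

def fit_tobit(X, y, lower=0.0):
    X = np.column_stack([np.ones(len(y)), X])                # add intercept column
    start = np.r_[np.zeros(X.shape[1]), 0.0]                 # beta = 0, log(sigma) = 0
    res = minimize(tobit_negloglik, start, args=(X, y, lower), method="BFGS")
    beta, sigma = res.x[:-1], np.exp(res.x[-1])
    return beta, sigma
```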
This research work involves the preparation of nano activated carbon and macro activated carbon from corn seeds with various mixing ratios of potassium hydroxide (1:0, 1:0.2, 1:0.4, 1:0.6, 1:0.8 and 1:1), using thermal and microwave radiation carbonization, to identify the best mixing ratio. The study confirmed that the efficiency and effectiveness of the prepared activated carbon samples increase as the potassium hydroxide ratio increases, for both thermal and microwave radiation carbonization. The external surface area of the samples was studied via the adsorption of methylene blue from aqueous solution, and the internal surface area was measured via the adsorption of iodine from aqueous solution.
This research aims to solve the stock selection problem using a clustering algorithm. The optimal portfolio is formed using the single index model, and the real data consist of stocks from the Iraqi Stock Exchange over the period 1/1/2007 to 31/12/2019. Because the data series have missing values, the two-stage missing value compensation method was used. The knowledge gap is the inability of portfolio models to reduce estimation error, together with the inaccuracy of the cut-off rate and of the Treynor ratio used to combine stocks into the portfolio, which caused a decline in portfolio performance. All these problems required employing a clustering technique to mine the data and regroup it into clusters with similar characteristics so as to improve the portfolio's performance.
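The single index model construction the abstract relies on (Treynor ranking, cut-off rate, optimal weights) can be sketched as below; the column layout, risk-free rate and the Elton-Gruber form of the cut-off rate are assumptions, since the abstract gives no formulas:

```python
# Hedged sketch of single index model portfolio construction (Elton-Gruber style):
# Treynor ratios, cumulative cut-off rate C_i, and optimal weights. Inputs are assumed
# to be a DataFrame of stock returns and a Series of market index returns.
import numpy as np
import pandas as pd

def single_index_portfolio(returns: pd.DataFrame, market: pd.Series, rf: float = 0.0):
    var_m = market.var()
    rows = []
    for stock in returns.columns:
        r = returns[stock]
        beta = r.cov(market) / var_m                          # single index model beta
        resid_var = (r - beta * market).var()                 # unsystematic (residual) risk
        excess = r.mean() - rf
        rows.append((stock, beta, resid_var, excess, excess / beta))  # last item: Treynor ratio
    df = pd.DataFrame(rows, columns=["stock", "beta", "resid_var", "excess", "treynor"])
    df = df.sort_values("treynor", ascending=False).reset_index(drop=True)

    # cumulative cut-off rate C_i; C* is the C of the last stock whose Treynor exceeds C_i
    num = (df["excess"] * df["beta"] / df["resid_var"]).cumsum()
    den = (df["beta"] ** 2 / df["resid_var"]).cumsum()
    df["C"] = var_m * num / (1 + var_m * den)
    selected = df[df["treynor"] > df["C"]]
    c_star = selected["C"].iloc[-1]

    # optimal weights for the selected stocks
    z = (selected["beta"] / selected["resid_var"]) * (selected["treynor"] - c_star)
    return selected.assign(weight=z / z.sum())[["stock", "treynor", "C", "weight"]]
```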