Information pollution is regarded as a major problem facing journalists working in the editing section, as journalistic material is exposed to such pollution on its way through the editing pyramid. This research attempts to define the concept of journalistic information pollution and to identify its causes and sources. The descriptive research method was applied to achieve the research objectives, and a questionnaire was used to collect data. The findings indicate that journalists are aware of the existence of information pollution in journalism and that this pollution has identifiable causes and sources.
This study aimed to find and test biological methods for reducing the accumulation of plastics such as polystyrene (PS) in the environment and to examine the ability of greater wax moth larvae (Galleria mellonella) to consume PS, which is similar in structure to beeswax. Weight loss, morphological changes, FTIR spectroscopy, and GC-MS analysis were performed and showed changes in the chemical properties of the PS due to degradation. In this study, the weight loss was 33% in the PS treated with G. mellonella. FTIR of the PS frass showed the disappearance of the aromatic ring band found in the original PS in the region above 3000 cm-1. The PS frass samples from the wax worm larvae also revealed the creation of a new O-H (alcohol) stretching band.
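A tiny sketch of the percent-weight-loss calculation used to quantify PS degradation; the masses below are hypothetical and only the 33% figure comes from the study.

```python
# Percent weight loss of a PS sample; the masses here are hypothetical,
# only the ~33% figure is reported by the study.
initial_mass_mg = 100.0   # hypothetical starting PS mass
final_mass_mg = 67.0      # hypothetical remaining PS mass
weight_loss_pct = 100.0 * (initial_mass_mg - final_mass_mg) / initial_mass_mg
print(f"weight loss: {weight_loss_pct:.1f}%")
```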
This research deals with a shrinkage method for principal components similar to the one used in multiple regression, the Least Absolute Shrinkage and Selection Operator (LASSO). The goal here is to construct uncorrelated linear combinations from only a subset of explanatory variables that may suffer from multicollinearity, instead of taking the whole number, say (K), of them. The shrinkage forces some coefficients to equal zero by placing a restriction on them through a tuning parameter, say (t), which balances the amounts of bias and variance on one side and does not exceed the acceptable percent of explained variance of these components. This was shown by the MSE criterion in the regression case and by the percent of explained variance
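A minimal illustration of the general mechanism described here, soft-thresholding principal-component loadings with a tuning parameter so that some loadings become exactly zero; this is a sketch of the sparse-PCA idea, not the paper's exact estimator or tuning rule.

```python
# LASSO-style shrinkage applied to principal-component loadings (sparse-PCA
# idea). Illustrative only; the paper's estimator is not reproduced here.
import numpy as np

rng = np.random.default_rng(0)

# Simulated correlated predictors (multicollinearity): K = 6 variables.
n, K = 200, 6
z = rng.normal(size=(n, 2))
X = z @ rng.normal(size=(2, K)) + 0.3 * rng.normal(size=(n, K))
X = (X - X.mean(0)) / X.std(0)

# Ordinary PCA loadings from the standardized data.
_, _, Vt = np.linalg.svd(X, full_matrices=False)

def soft_threshold(v, t):
    """Shrink loadings toward zero; entries smaller than t become exactly 0."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

t = 0.25                                     # hypothetical tuning parameter
V_sparse = soft_threshold(Vt.T, t)
norms = np.linalg.norm(V_sparse, axis=0)
V_sparse[:, norms > 0] /= norms[norms > 0]   # re-normalize nonzero loadings

scores = X @ V_sparse
explained = scores.var(0) / X.var(0).sum()
print("nonzero loadings per component:", (V_sparse != 0).sum(0))
print("percent explained variance:", np.round(100 * explained, 1))
```

Increasing t zeroes out more loadings (fewer variables enter each component) at the cost of explained variance, which is the bias-variance trade-off the abstract refers to.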
A new distribution, the Epsilon Skew Gamma (ESΓ) distribution, first introduced by Abdulah [1], is applied to near-Gamma data. We first redefine the ESΓ distribution, its properties, and its characteristics, then estimate its parameters using maximum likelihood and moment estimators. Finally, we use these estimators to fit the data with the ESΓ distribution.
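A generic sketch of the numerical maximum-likelihood workflow used to fit a gamma-type distribution, with moment estimates as starting values. The ESΓ density itself is defined in the paper and is not reproduced here; the standard gamma is used as a stand-in.

```python
# Generic numerical MLE workflow for a gamma-type distribution; the standard
# gamma stands in for the ESΓ density, which is not reproduced here.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)
data = stats.gamma(a=2.5, scale=1.3).rvs(size=500, random_state=rng)

def neg_log_lik(params, x):
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf
    return -np.sum(stats.gamma.logpdf(x, a=shape, scale=scale))

# Moment estimators (mean = a*scale, variance = a*scale^2) as starting values.
m, v = data.mean(), data.var()
start = np.array([m**2 / v, v / m])

fit = optimize.minimize(neg_log_lik, start, args=(data,), method="Nelder-Mead")
print("moment estimates:", np.round(start, 3))
print("ML estimates:    ", np.round(fit.x, 3))
```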
In this study, we briefly review the ARIMA(p, d, q), EWMA, and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation structure of the data. We consider recursive estimation and prediction algorithms based on Bayes and Kalman filtering (KF) techniques for correlated observations. We investigate the effect on the MSE of these procedures and compare them using generated data.
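A minimal sketch of the recursive Kalman-filter prediction step for a local-level DLM applied to generated autocorrelated data, with a one-step-ahead prediction MSE of the kind used for comparison; the variance settings and the AR(1) generator are illustrative, not taken from the study.

```python
# Recursive Kalman-filter prediction for a local-level DLM on generated
# AR(1) data, reporting a one-step-ahead prediction MSE. Illustrative values.
import numpy as np

rng = np.random.default_rng(2)

# Generate AR(1) observations to mimic an autocorrelated series.
n, phi = 300, 0.8
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + rng.normal()

# Local-level DLM: y_t = mu_t + v_t,  mu_t = mu_{t-1} + w_t.
V, W = 1.0, 0.5            # observation and state noise variances (assumed)
m, C = 0.0, 10.0           # prior mean and variance of the level
pred, errs = np.zeros(n), np.zeros(n)

for t in range(n):
    a, R = m, C + W                      # prior (prediction) for mu_t
    pred[t] = a                          # one-step-ahead forecast of y_t
    errs[t] = y[t] - pred[t]
    Q = R + V                            # forecast variance
    K = R / Q                            # Kalman gain
    m, C = a + K * errs[t], R - K * R    # posterior update

print("one-step prediction MSE:", np.mean(errs[20:] ** 2).round(3))
```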
A study of the Hueckel edge detector using a binary step-edge image is presented. The standard algorithm, as Hueckel presented it in his paper and without any alteration, is adopted. This paper provides a full analysis of the algorithm's efficiency, time consumption, and expected results with respect to the sliding window size and edge direction. An analysis of its behaviour as the sliding window size (disk size) changes is presented. The best result is obtained when the window size equals four pixels.
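A simplified illustration of the disk-window idea behind the Hueckel detector: an ideal step edge is fitted inside circular windows of varying radius on a binary step-edge image and the fit error is compared. This is a stand-in for, not a reproduction of, Hueckel's full basis expansion; the image, noise level, and radii are hypothetical.

```python
# Fit an ideal step edge inside circular (disk) windows of varying size on a
# binary step-edge image; a simplified stand-in for the Hueckel operator.
import numpy as np

rng = np.random.default_rng(4)

# Binary step-edge image (vertical edge at column 32) with a little noise so
# the fit error is not trivially zero.
img = np.zeros((64, 64))
img[:, 32:] = 1.0
img += 0.05 * rng.normal(size=img.shape)

yy, xx = np.mgrid[0:64, 0:64]
cy, cx = 32, 32                          # window centred on the true edge

for radius in (2, 3, 4, 6, 8):
    disk = (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2
    best_err = np.inf
    # Try candidate edge orientations through the window centre and keep the
    # ideal-step fit with the smallest per-pixel squared error inside the disk.
    for theta in np.linspace(0.0, np.pi, 16, endpoint=False):
        side = (np.cos(theta) * (xx - cx) + np.sin(theta) * (yy - cy)) >= 0
        a_mask, b_mask = disk & side, disk & ~side
        if a_mask.sum() == 0 or b_mask.sum() == 0:
            continue
        fit = np.where(side, img[a_mask].mean(), img[b_mask].mean())
        err = np.mean((img[disk] - fit[disk]) ** 2)
        best_err = min(best_err, err)
    print(f"disk radius {radius}: best ideal-step fit error = {best_err:.4f}")
```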
Human health can be negatively impacted by exposure to loud noise, which can harm the auditory system, and traffic noise is the leading cause of noise pollution. This paper studies the problem of noise pollution on the roads of Baghdad, Iraq. Due to the increase in vehicle numbers and modifications to the road network in Baghdad, noise levels have become a serious topic of study. The aim of the paper was therefore to study traffic noise levels and the effect of the traffic stream on them, and to formulate a prediction model that identifies guidelines for designing or developing future roads in the city. Noise levels were then measured based on five variables: the functional classification of roads, traffic flow, vehicle speed,
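A hedged sketch of a traffic-noise prediction model of the general kind the abstract describes, a regression of measured noise level on traffic variables. The variable names, coefficients, and data below are all hypothetical; the paper's actual model form and measurements are not shown here.

```python
# Illustrative regression of noise level (dB(A)) on traffic variables.
# All data and coefficients are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(3)
n = 120
flow = rng.uniform(200, 3000, n)      # vehicles per hour (hypothetical)
speed = rng.uniform(20, 90, n)        # km/h (hypothetical)
heavy = rng.uniform(0, 0.3, n)        # share of heavy vehicles (hypothetical)

# Synthetic noise levels for illustration only.
noise = 55 + 4.5 * np.log10(flow) + 0.08 * speed + 12 * heavy \
        + rng.normal(0, 1.5, n)

X = np.column_stack([np.ones(n), np.log10(flow), speed, heavy])
coef, *_ = np.linalg.lstsq(X, noise, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((noise - pred) ** 2) / np.sum((noise - noise.mean()) ** 2)
print("fitted coefficients:", np.round(coef, 3))
print("R^2 on the synthetic data:", round(r2, 3))
```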
The compound Fe0.5CoxMg0.95-xO, where (x = 0.025, 0.05, 0.075, 0.1), was prepared via the sol-gel technique. The crystalline nature of magnesium oxide was studied by X-ray powder diffraction (XRD) analysis; the crystallite size of the samples, ranging between 16.91 and 19.62 nm, increased, while the lattice constant, within the range 0.5337-0.4738 nm, decreased with increasing cobalt concentration. The morphology of the specimens was studied by scanning electron microscopy (SEM), which shows spherical granules in addition to interconnected chips. The presence of the elements involved in the super
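A worked sketch of the Scherrer relation commonly used to obtain crystallite size from XRD peak broadening, D = Kλ / (β cos θ). The peak width and angle below are hypothetical; the 16.91-19.62 nm values reported above come from the paper's own diffraction data, and the paper may have used a different size-analysis method.

```python
# Scherrer estimate of crystallite size from XRD peak broadening.
# Peak width and position are hypothetical illustrative values.
import numpy as np

K = 0.9                      # shape factor
lam = 0.15406                # Cu K-alpha wavelength in nm
beta_deg = 0.48              # FWHM of the peak in degrees (hypothetical)
two_theta_deg = 42.9         # peak position 2-theta in degrees (hypothetical)

beta = np.deg2rad(beta_deg)
theta = np.deg2rad(two_theta_deg / 2)
D = K * lam / (beta * np.cos(theta))
print(f"estimated crystallite size: {D:.2f} nm")
```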
Diabetes is one of the increasingly common chronic diseases, affecting millions of people around the world. Diabetes diagnosis, prediction, proper treatment, and management are essential. Machine learning-based prediction techniques for diabetes data analysis can help in the early detection and prediction of the disease and its consequences, such as hypo- and hyperglycemia. In this paper, we explored a diabetes dataset collected from the medical records of one thousand Iraqi patients. We applied three classifiers: the multilayer perceptron, KNN, and Random Forest. We conducted two experiments: the first used all 12 features of the dataset, where Random Forest outperformed the others with 98.8% accuracy. The second experiment used only five attributes
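A hedged sketch of the classifier comparison the abstract describes: a multilayer perceptron, KNN, and Random Forest evaluated on a 12-feature dataset. The Iraqi patient records are not available here, so a synthetic placeholder dataset and default-style hyperparameters stand in for the real setup.

```python
# Compare an MLP, KNN, and Random Forest on a synthetic 12-feature dataset
# standing in for the diabetes records described in the abstract.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=1000, n_features=12, n_informative=6,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "MLP": MLPClassifier(max_iter=1000, random_state=0),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    acc = accuracy_score(y_te, model.predict(X_te))
    print(f"{name}: accuracy = {acc:.3f}")
```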