Predicting the network traffic of web pages has received increasing attention in recent years. Modeling traffic helps find strategies for distributing network loads, identifying user behavior and malicious traffic, and predicting future trends. Many statistical and intelligent methods have been studied for predicting web traffic from network-traffic time series. In this paper, the use of machine learning algorithms to model Wikipedia traffic using Google's time series dataset is studied. Two time-series datasets were used for data generalization, building a set of machine learning models (XGBoost, Logistic Regression, Linear Regression, and Random Forest), and comparing model performance using the Symmetric Mean Absolute Percentage Error (SMAPE) and the Mean Absolute Percentage Error (MAPE). The results showed that the network-traffic time series can be modeled and that the Linear Regression model performed best among the tested models for both series.
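For reference, a minimal sketch of the two comparison metrics named above, using their common percentage definitions (the paper may use a slightly different convention, and the example arrays below are placeholders):

```python
import numpy as np

def mape(y_true, y_pred):
    # Mean Absolute Percentage Error, in percent
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

def smape(y_true, y_pred):
    # Symmetric Mean Absolute Percentage Error, in percent
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    denom = (np.abs(y_true) + np.abs(y_pred)) / 2.0
    return 100.0 * np.mean(np.abs(y_true - y_pred) / denom)

# Placeholder example: actual vs. predicted daily page views
actual = [120, 150, 90, 200]
predicted = [110, 160, 100, 190]
print(f"MAPE = {mape(actual, predicted):.2f}%, SMAPE = {smape(actual, predicted):.2f}%")
```

Lower values of either metric indicate a better fit, which is how the four models can be ranked on each series.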
Commercially sold vegetable oils (plant fats) were studied spectroscopically to determine how their properties change when heated more than once; in this study the oil was heated 20 times, and the results show that the plant fat was strongly affected by the heat. Aluminum and stainless steel pots were adopted for heating, and it turns out that the quality of the heating pot affects the recorded spectrum. A change in the registered spectral characteristics of a vegetable fat means a change in the properties of the substance itself, which makes using it for cooking a second time a risk. The samples were re-examined after six months and showed a significant storage risk after heating to 300 °C; the study proved that the structure of the heated vegetable oils changes an
This research sought to present the concept of panel (cross-sectional and time-series) data models, a crucial doubled data structure that captures the effect of change over time and is obtained by repeatedly observing the measured phenomenon in different time periods. Panel data models of different types were defined (fixed, random, and mixed) and compared by studying and analyzing the mathematical relationship between the effect of time and a set of basic variables, which are the main axes on which the research is based. These are represented by the monthly revenue of the working individual and the profits it generates, which represents the response variable, and its relationship to a set of explanatory variables represented by the
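For reference, the textbook forms of the fixed- and random-effects panel models mentioned above (standard definitions, not necessarily the exact specification adopted in the research):

```latex
% Fixed effects: each individual i carries its own intercept \alpha_i
y_{it} = \alpha_i + \mathbf{x}_{it}^{\top}\boldsymbol{\beta} + \varepsilon_{it},
\qquad i = 1,\dots,N,\quad t = 1,\dots,T
% Random effects: the individual effect is a random draw u_i \sim (0, \sigma_u^2),
% assumed independent of the regressors x_{it}
y_{it} = \alpha + \mathbf{x}_{it}^{\top}\boldsymbol{\beta} + u_i + \varepsilon_{it}
```

The mixed model combines the two by treating some effects as fixed and others as random.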
The main aim of this research paper is to investigate the effectiveness and validity of the Meso-Scale Approach (MSA) as a modern technique for modeling plain concrete beams. A simply supported plain concrete beam was subjected to two-point loading to determine its flexural response. Experimentally, a concrete mix was designed and prepared to produce three similar standard concrete prisms for flexural testing; the coarse aggregate used in this mix was crushed aggregate. A numerical Finite Element Analysis (FEA) was conducted on the same concrete beam using meso-scale modeling. The numerical model was constructed as a bi-phasic material consisting of cement mortar and coarse aggregate. The interface between the two c
Image Fusion Using A Convolutional Neural Network
This paper explores VANET topics: architecture, characteristics, security, routing protocols, applications, simulators, and 5G integration. We update, edit, and summarize some of the published data as we analyze each concept. For ease of comprehension and clarity, we present part of the data as tables and figures. This survey also raises issues for potential future research, such as how to integrate VANETs with a 5G cellular network and how to use trust mechanisms to enhance security, scalability, effectiveness, and other VANET features and services. In short, this review may aid academics and developers in choosing the key VANET characteristics for their objectives from a single document.
The main aim of image compression is to reduce image size so that images can be transmitted and stored efficiently; therefore many compression methods have appeared, one of which is the Multilayer Perceptron (MLP), an artificial neural network trained with the Back-Propagation algorithm. If the algorithm depends only on the number of neurons in the hidden layer, this alone will not be enough to reach the desired results, so the standards on which the compression process depends must also be taken into consideration to get the best results. In our research we trained a group of TIFF images of size (256*256) and compressed them by using an MLP for each
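As an illustration of block-based MLP compression, here is a minimal sketch under assumed settings (8x8 blocks, 16 hidden neurons, and random placeholder data instead of the actual 256*256 TIFF images); the hidden-layer activations act as the compressed code:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

block = 8        # 8x8 blocks -> 64 inputs and 64 outputs
hidden = 16      # fewer hidden neurons than inputs gives a 64:16 compression per block

# Placeholder training data: rows are flattened blocks scaled to [0, 1].
# In practice these would be blocks cut from the 256*256 TIFF images.
rng = np.random.default_rng(0)
X = rng.random((1000, block * block))

# Back-propagation-trained MLP fitted as an identity mapping (input == target),
# so the bottleneck hidden layer learns a compact representation of each block.
mlp = MLPRegressor(hidden_layer_sizes=(hidden,), activation="logistic",
                   solver="adam", max_iter=500, random_state=0)
mlp.fit(X, X)

reconstructed = mlp.predict(X)
print("reconstruction MSE:", np.mean((X - reconstructed) ** 2))
```

Compression quality is then judged by how closely the reconstructed blocks match the originals, alongside the achieved compression ratio.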
Designing a sampling plan was and still is one of the most important subjects because it gives the lowest cost compared with other approaches; the lifetime statistical distribution should be known in order to obtain the best estimators for the parameters of the sampling plan and thereby get the best sampling plan.
The research deals with designing a sampling plan when the lifetime distribution follows the Logistic distribution with () as location and shape parameters; using this information helps us obtain the number of groups and the sample size associated with rejecting or accepting the lot.
Experimental results for simulated data show the least number of groups and the sample size needed to reject or accept the lot with a certain probability of
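A minimal sketch of how such a plan can be evaluated, under an assumed convention (g groups of r items each are tested up to a truncation time t0, and the lot is accepted if the total number of failures does not exceed c); the parameter values below are placeholders, not the paper's results:

```python
from math import comb, exp

def logistic_cdf(t, mu, s):
    # P(lifetime <= t) for a Logistic lifetime with location mu and scale s
    return 1.0 / (1.0 + exp(-(t - mu) / s))

def acceptance_probability(g, r, c, t0, mu, s):
    # Probability of accepting the lot: total failures ~ Binomial(g*r, p)
    p = logistic_cdf(t0, mu, s)   # chance a single item fails before the truncation time
    n = g * r                     # total items on test
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

# Placeholder example: 5 groups of 4 items, at most 2 failures allowed by t0 = 1.0
print(acceptance_probability(g=5, r=4, c=2, t0=1.0, mu=2.0, s=0.5))
```

Searching over the number of groups and the group size then yields the smallest design whose acceptance probability meets the required level.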