Predicting the network traffic of web pages has received increasing attention in recent years. Modeling traffic helps in devising strategies for distributing network loads, identifying user behaviors and malicious traffic, and predicting future trends. Many statistical and machine learning methods have been studied for predicting web traffic from network-traffic time series. This paper studies the use of machine learning algorithms to model Wikipedia traffic using the Google time-series dataset. Two time-series datasets were used for data generalization, building a set of machine learning models (XGBoost, logistic regression, linear regression, and random forest), and comparing the performance of the models using the symmetric mean absolute percentage error (SMAPE) and the mean absolute percentage error (MAPE). The results showed that the network-traffic time series can be modeled and that the linear regression model performs best compared to the other models on both series.
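For reference, the two evaluation metrics can be computed as in the minimal sketch below. The abstract does not state which SMAPE variant was used; the sketch assumes the common form with the averaged-magnitude denominator, and the page-view numbers are purely hypothetical.

```python
import numpy as np

def smape(actual, forecast):
    """Symmetric mean absolute percentage error, in percent.

    Uses the common variant 200 * |F - A| / (|A| + |F|), averaged over
    all points; terms where both values are zero contribute 0.
    """
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    denom = np.abs(actual) + np.abs(forecast)
    terms = np.where(denom == 0, 0.0,
                     2.0 * np.abs(forecast - actual) / np.where(denom == 0, 1, denom))
    return 100.0 * terms.mean()

def mape(actual, forecast):
    """Mean absolute percentage error, in percent (undefined when actual == 0)."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

# Hypothetical daily page views vs. a model's predictions.
y_true = np.array([120, 150, 90, 200])
y_pred = np.array([110, 160, 100, 180])
print(smape(y_true, y_pred), mape(y_true, y_pred))
```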
Training programs are an important process that helps provide employees with the skills required to do their jobs efficiently and effectively, so they should be a concern and a focus of all our government organizations. Perhaps the most important reason for selecting this subject (evaluation of training programs directed toward diagnosing the phenomenon of financial and administrative corruption) is the importance of those programs for employees working in general regulatory institutions, and in the Office of the Inspector General of the Ministry of Finance in particular, because of their role in developing employees' skills, experience, and behavior.
Purpose: The purpose of this study was to clarify the basic dimensions through which the organization seeks to entrench scenario practices, as a final result of adopting this philosophy.
Methodology: The researchers adopted a survey of the major literature dealing with this subject in order to provide a theoretical conceptual framework of scenario theory.
The most prominent findings: Scenario formulation succeeds only when it reaches the decision-maker's mind and aims to form correct mental models, which appear in the expansion of managers' perception and are adopted as the basis of the decisions taken.
The chemical properties of chemical compounds and their molecular structures are intimately connected. Topological indices are numerical values associated with chemical molecular graphs that help in understanding the physicochemical properties, chemical reactivity, and biological activity of a chemical compound. This study obtains some topological properties of the second and third dominating David derived (DDD) networks and computes several K-Banhatti polynomials of the second and third types for these networks.
The characteristics of chemical compounds and their molecular structures are inevitably connected. Topological indices are numerical values associated with chemical molecular graphs that contribute to understanding a chemical compound's physical qualities, chemical reactivity, and biological activity. In this study, we have obtained some topological properties of the first dominating David derived (DDD) network and computed several K-Banhatti polynomials of the first type of DDD network.
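For context, the K-Banhatti indices and polynomials computed in these two studies are usually taken in Kulli's standard form; the block below assumes that convention, where the sums run over incident vertex-edge pairs $ue$ and $d_G(e) = d_G(u) + d_G(v) - 2$ for an edge $e = uv$.

```latex
% K-Banhatti indices and polynomials (standard Kulli definitions;
% "ue" ranges over pairs of a vertex u incident with an edge e):
B_1(G)   = \sum_{ue} \bigl[ d_G(u) + d_G(e) \bigr], \qquad
B_2(G)   = \sum_{ue} d_G(u)\, d_G(e),
\\[4pt]
B_1(G,x) = \sum_{ue} x^{\, d_G(u) + d_G(e)}, \qquad
B_2(G,x) = \sum_{ue} x^{\, d_G(u)\, d_G(e)},
\\[4pt]
\text{where } d_G(e) = d_G(u) + d_G(v) - 2 \text{ for } e = uv.
```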
In recent decades, tremendous success has been achieved in the advancement of chemical admixtures for Portland cement concrete. Most efforts have concentrated on improving the properties of concrete and studying the factors that influence these properties. Compressive strength is considered a valuable property and is invariably a vital element of structural design; high early strength development in particular can provide further benefits in concrete production, such as reducing construction time and labor and saving formwork and energy. As a matter of fact, like most properties of concrete, it is influenced by several factors, including the water-cement ratio, the cement type, and the curing methods employed.
Researchers have shown increased interest in recent years in determining the optimal sample size to obtain sufficient accuracy of estimation and high-precision parameters when evaluating a large number of tests in the field of diagnosis at the same time. In this research, two methods were used to determine the optimal sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. The nonlinear logistic regression model is estimated, for the sample size given by each method, in high-dimensional data using artificial intelligence, namely an artificial neural network (ANN), as it gives a high-precision estimate commensurate with the data.
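The abstract does not reproduce the inequality itself; for context, Bennett's inequality in its standard form is sketched below. That the sample-size method inverts a concentration bound of this type for a desired precision is our reading, not a detail stated in the abstract.

```latex
% Bennett's inequality: for independent, zero-mean X_1, ..., X_n with
% |X_i| <= a and total variance v = \sum_i \operatorname{Var}(X_i),
\Pr\!\left( \sum_{i=1}^{n} X_i \ge t \right)
  \le \exp\!\left( -\frac{v}{a^{2}}\, h\!\left( \frac{a\,t}{v} \right) \right),
\qquad h(u) = (1+u)\ln(1+u) - u.
```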
Rheological measurement is one of the basic analytical techniques for diagnosing the properties of polymer fluids to be used in any industry. In this research, polycarbonate was chosen because of its importance in many areas and because it possesses several distinct properties.
Two kinds of rheometer were used over a temperature range of 220 °C to 300 °C to characterize the rheological behavior of molten polycarbonate (Makrolon 2805) through a combination of different investigation techniques. We compared the results of the linear (oscillatory) method with those of the non-linear (steady-state) method; the former provided the storage and loss moduli of the molten polycarbonate, and also presented the Cox-Merz model.
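For reference, the Cox-Merz relation invoked here is the empirical equivalence between the steady-shear viscosity and the magnitude of the complex viscosity, evaluated at matching shear rate and angular frequency:

```latex
% Cox-Merz rule: steady-shear viscosity matches the magnitude of the
% complex viscosity when the shear rate equals the angular frequency.
\eta(\dot{\gamma}) \;\approx\; \bigl| \eta^{*}(\omega) \bigr|_{\omega = \dot{\gamma}},
\qquad
\bigl| \eta^{*}(\omega) \bigr|
  = \frac{\sqrt{G'(\omega)^{2} + G''(\omega)^{2}}}{\omega},
```

where $G'$ and $G''$ are the storage and loss moduli obtained from the oscillatory measurements.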
The aim of this research is to determine the most important and main factors that lead to Preeclampsia, and to find suitable solutions for eliminating and avoiding these factors in order to prevent Preeclampsia. To achieve this, a case-study sample of (40) patients from the Medical City - Oncology Teaching Hospital was used, with data collected by a questionnaire containing (17) candidate causes to be investigated. The statistical package (SPSS) was used to compare the results of the data analysis through two methods, (Radial Basis Function Network) and (Factorial Analysis). Important results were obtained: the two methods determined the same factors that could represent the direct reason which causes Preeclampsia.
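The abstract does not detail the radial basis function network configuration used in SPSS. As a rough illustration of the technique only, the sketch below builds a Gaussian RBF classifier on hypothetical data shaped like the study's (40 patients, 17 questionnaire items); the k-means centers, width heuristic, and logistic readout are our assumptions, not the study's setup.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.random((40, 17))          # hypothetical: 40 patients x 17 questionnaire reasons
y = rng.integers(0, 2, size=40)   # hypothetical: 1 = preeclampsia, 0 = control

# Hidden layer: Gaussian radial basis functions centered on k-means centroids.
centers = KMeans(n_clusters=8, n_init=10, random_state=0).fit(X).cluster_centers_
# Crude global width: mean pairwise distance between centers.
width = np.mean([np.linalg.norm(c1 - c2) for c1 in centers for c2 in centers])
Phi = np.exp(-np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) ** 2
             / (2 * width ** 2))

# Output layer: a simple linear readout on the RBF features.
clf = LogisticRegression(max_iter=1000).fit(Phi, y)
print("training accuracy:", clf.score(Phi, y))
```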
In this paper, we design a fuzzy neural network to solve a fuzzy singularly perturbed Volterra integro-differential equation using a high-performance training algorithm, namely Levenberg-Marquardt (trainlm), and a sigmoid activation function for the hidden units, namely the hyperbolic tangent. A fuzzy trial solution to the fuzzy singularly perturbed Volterra integro-differential equation is written as a sum of two components. The first component meets the fuzzy conditions but does not contain any fuzzy adjustable parameters. The second component is a feed-forward fuzzy neural network with fuzzy adjustable parameters. The proposed method is compared with the analytical solutions.
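The two-component trial solution described above follows the usual neural-network construction. In one common concrete form (assuming, for illustration only, a fuzzy initial condition $\tilde{y}(0) = \tilde{y}_0$), it reads:

```latex
% Trial solution as a sum of two components: the first term enforces the
% (fuzzy) initial condition and has no adjustable parameters, while the
% second term contains the feed-forward network N(x; p) with (fuzzy)
% adjustable parameters p and vanishes at x = 0 by construction.
\tilde{y}_{t}(x) = \tilde{y}_{0} \; + \; x \, N(x;\, p).
```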
The utilization of artificial intelligence techniques has garnered significant interest in recent research due to their pivotal role in enhancing the quality of educational offerings. This study investigated the impact of employing artificial intelligence techniques on improving the quality of educational services, as perceived by students enrolled in the College of Pharmacy at the University of Baghdad. The study sample comprised 379 male and female students. A descriptive-analytical approach was used, with a questionnaire as the primary tool for data collection. The findings indicated that the application of artificial intelligence methods was highly effective and that the educational services provided to students were of exceptional quality.