Moisture damage is one of the most significant forms of distress that deteriorate asphalt pavement and reduce road serviceability. Recently, researchers have increasingly turned to fibers to enhance the performance of asphalt pavement. This research explores the effect of low-cost ceramic fiber, which has high tensile strength and a very high thermal insulation coefficient, on the characteristics of the asphalt mixture by adding it at three different proportions (0.75%, 1.5%, and 2.25%). The Marshall test and the Tensile Strength Ratio (TSR) test were utilized to describe the impact of ceramic fiber on the Marshall characteristics and the moisture susceptibility of the hot mix asphalt. Field Emission Scanning Electron Microscopy (FE-SEM) analysis was used to investigate the microscopic structure of the ceramic fibers, clarify the mechanism behind the improved behavior, and examine the fibers' distribution within the asphalt concrete mixture. The results showed that incorporating ceramic fibers improved the Marshall properties and the asphalt mixture's resistance to moisture damage, with an optimum fiber content of 1.5%, at which Marshall stability increased by 39.04% and the TSR increased by 11.06% compared with the control asphalt mixture.
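As a worked illustration of the two quantities reported above, the sketch below shows how the Marshall-stability gain and the Tensile Strength Ratio could be computed from raw test readings; the numeric inputs are hypothetical placeholders, not the paper's measured data.

```python
# Minimal sketch (not from the paper): deriving the reported percentage gains from
# raw Marshall stability and indirect tensile strength (ITS) readings.
# All numeric values below are hypothetical placeholders.

def tensile_strength_ratio(its_conditioned: float, its_unconditioned: float) -> float:
    """TSR (%) = conditioned ITS / unconditioned ITS * 100."""
    return 100.0 * its_conditioned / its_unconditioned

def percent_gain(modified: float, control: float) -> float:
    """Relative improvement of a modified mix over the control mix, in percent."""
    return 100.0 * (modified - control) / control

# Hypothetical readings for a control mix and a 1.5% ceramic-fiber mix
stability_control, stability_fiber = 10.5, 14.6      # kN
tsr_control = tensile_strength_ratio(7.6, 9.5)       # %
tsr_fiber = tensile_strength_ratio(8.8, 9.9)         # %

print(f"Stability gain: {percent_gain(stability_fiber, stability_control):.1f}%")
print(f"TSR gain:       {percent_gain(tsr_fiber, tsr_control):.1f}%")
```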
Metal cutting processes still represent the largest class of manufacturing operations, and turning is the most commonly employed material removal process. This research focuses on analysis of the thermal field of the oblique machining process. The finite element method (FEM) software DEFORM 3D V10.2 was used together with experimental work carried out with infrared imaging equipment, covering both the hardware measurements and the software simulations. The thermal experiments were conducted on AA6063-T6 using different tool obliquity angles, cutting speeds, and feed rates. The results show that the temperature relatively decreased as tool obliquity increased at the different cutting speeds and feed rates; they also …
Image classification is the process of finding common features in images from various classes and using them to categorize and label the images. The main obstacles in image classification are the abundance of images, the high complexity of the data, and the shortage of labeled data. The cornerstone of image classification is evaluating the convolutional features retrieved from deep learning models and training them with machine learning classifiers. This study proposes a new approach of "hybrid learning" that combines deep learning with machine learning for image classification, based on convolutional feature extraction using the VGG-16 deep learning model and seven class…
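As a rough illustration of the hybrid pipeline described above, the sketch below extracts convolutional features with a pretrained VGG-16 backbone and trains a classical classifier on them; the placeholder data, the 224×224 input size, and the choice of an SVM are assumptions for illustration, not details taken from the study.

```python
# Minimal sketch of the "hybrid learning" idea: convolutional features are extracted
# with a pretrained VGG-16 backbone and then fed to a classical machine-learning
# classifier. The random arrays below stand in for a real dataset loader.
import numpy as np
from tensorflow.keras.applications.vgg16 import VGG16, preprocess_input
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Pretrained VGG-16 without its dense head, used purely as a feature extractor
backbone = VGG16(weights="imagenet", include_top=False, pooling="avg",
                 input_shape=(224, 224, 3))

def extract_features(images: np.ndarray) -> np.ndarray:
    """images: (N, 224, 224, 3) RGB array -> (N, 512) VGG-16 feature vectors."""
    return backbone.predict(preprocess_input(images.astype("float32")), verbose=0)

# Hypothetical placeholder images and labels (a real loader would go here)
rng = np.random.default_rng(0)
x_train = rng.integers(0, 256, size=(20, 224, 224, 3))
y_train = rng.integers(0, 2, size=20)
x_test = rng.integers(0, 256, size=(8, 224, 224, 3))
y_test = rng.integers(0, 2, size=8)

# Any classical classifier can sit on top; an SVM is used here as one example
clf = SVC(kernel="rbf").fit(extract_features(x_train), y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(extract_features(x_test))))
```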
In this paper we estimate the coefficients and the scale parameter in a linear regression model whose residuals follow a type 1 extreme value distribution for largest values. This can be regarded as an improvement on the studies that use smallest values. We study two estimation methods (OLS and MLE), resorting to the Newton-Raphson (NR) and Fisher scoring methods to obtain the MLE estimate because of the difficulty of using the usual approach with MLE. The relative efficiency criterion is considered alongside the statistical inference procedures for the type 1 extreme value regression model for largest values, namely confidence intervals and hypothesis tests for both the scale parameter and the regression coefficients.
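A minimal sketch of the likelihood side of this setup is given below, assuming residuals from the type 1 extreme value (Gumbel, largest values) distribution with location x_i'β and scale σ; a general-purpose quasi-Newton optimizer stands in for the Newton-Raphson / Fisher scoring iterations, and the synthetic data are illustrative only, not the paper's.

```python
# Minimal sketch (assumption, not the authors' code): maximum-likelihood fitting of a
# linear regression whose residuals follow a type 1 extreme value (Gumbel, largest
# values) distribution, with location mu_i = x_i' beta and scale sigma.
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(params, X, y):
    """params = (beta_0, ..., beta_p, log_sigma); Gumbel (largest values) residuals."""
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)                 # keep the scale positive
    z = (y - X @ beta) / sigma
    return np.sum(np.log(sigma) + z + np.exp(-z))

def fit_gumbel_regression(X, y):
    ols_beta, *_ = np.linalg.lstsq(X, y, rcond=None)     # OLS as the starting point
    start = np.append(ols_beta, 0.0)
    result = minimize(neg_log_likelihood, start, args=(X, y), method="BFGS")
    return result.x[:-1], np.exp(result.x[-1])           # beta_hat, sigma_hat

# Synthetic illustration
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y = X @ np.array([2.0, 1.5]) + rng.gumbel(loc=0.0, scale=0.8, size=200)
print(fit_gumbel_regression(X, y))
```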
Cryptosporidiosis mainly causes persistent diarrhea in immunocompromised patients. BALB/c mice were immunosuppressed with dexamethasone, and tissue Th1, Th2, and Th17 cytokine concentrations in the ileum were significantly diminished in both infected and immunosuppressed mice. Levels of IFN-γ, TNF-α, IL-12, IL-6, and IL-17A increased, whereas IL-4 did not, in both ileal and spleen tissue. The levels of the above cytokines were examined in the spleen in order to follow the proliferation of CD4+ T cells during C. parvum infection.
This paper addresses the fact that most organizations today suffer from wasted time, effort, and cost and have difficulty achieving the best performance and competing strongly. The researcher distributed 108 questionnaires to a statistically analyzable sample, selected purposively to consist of general managers, department heads, and division heads. The questionnaire was formulated according to the Likert scale. Personal interviews and observations were used as additional data-collection tools, and a number of statistical methods were used for data analysis, such as simple regression and the Pearson correlation coefficient. One of the most prominent conclusions is that the company has adequate and …
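As a small illustration of the two analysis tools named above (simple regression and the Pearson correlation coefficient), the sketch below applies them to hypothetical Likert-scale scores; the variables and values are invented for illustration and are not the study's data.

```python
# Minimal sketch of the two analysis tools named above, applied to hypothetical
# mean Likert-scale scores per respondent for two study variables.
import numpy as np
from scipy import stats

x = np.array([4.2, 3.8, 4.5, 3.1, 4.0, 3.6, 4.4, 2.9])   # e.g. independent variable
y = np.array([4.0, 3.5, 4.6, 3.0, 3.9, 3.4, 4.3, 3.1])   # e.g. dependent variable

r, p_value = stats.pearsonr(x, y)        # Pearson correlation coefficient
reg = stats.linregress(x, y)             # simple linear regression

print(f"Pearson r = {r:.3f} (p = {p_value:.4f})")
print(f"y = {reg.intercept:.3f} + {reg.slope:.3f} x, R^2 = {reg.rvalue**2:.3f}")
```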
In this paper, the magnetohydrodynamic (MHD) flow of a Williamson fluid with varying temperature and concentration in an inclined channel with variable viscosity has been examined. A perturbation technique in terms of the Weissenberg number has been used to obtain explicit forms for the velocity field. The solutions are discussed for different values of the physical parameters, namely the Darcy parameter, Reynolds number, Peclet number, and magnetic parameter, as shown in the plots.
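For readers unfamiliar with the technique, the general shape of such a regular perturbation expansion in the Weissenberg number We is sketched below; these are not the paper's governing equations, only the standard form of the expansion.

```latex
% Generic sketch (assumption): regular perturbation expansion in the Weissenberg
% number We, of the kind used to obtain explicit velocity-field expressions.
\[
\begin{aligned}
u(y)      &= u_0(y) + \mathrm{We}\,u_1(y) + O(\mathrm{We}^2), \\
\theta(y) &= \theta_0(y) + \mathrm{We}\,\theta_1(y) + O(\mathrm{We}^2),
\qquad \mathrm{We} \ll 1 .
\end{aligned}
\]
```

Substituting such an expansion into the governing equations and collecting powers of We typically yields a Newtonian problem at order We⁰ and a correction problem at order We¹, each of which can be solved in closed form.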
The bi-level programming problem is to minimize or maximize an objective function while another objective function appears within the constraints. This problem has received a great deal of attention in the mathematical programming community due to the proliferation of applications and the use of evolutionary algorithms in addressing this kind of problem. Two non-linear bi-level programming methods are used in this paper. The goal is to reach the optimal solution through simulation, using the Monte Carlo method with different small and large sample sizes. The research found that the branch-and-bound algorithm was preferable for solving the non-linear bi-level programming problem because it produced better results.
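A minimal sketch of the Monte Carlo idea on a toy non-linear bi-level problem is given below; the particular objective functions, bounds, and sample sizes are hypothetical and are not the paper's test problem.

```python
# Minimal sketch (assumption, not the paper's test problem): a Monte Carlo approach to
# a small non-linear bi-level program. The leader's variable x is sampled at random,
# the follower's problem is solved for each sample, and the best pair found is kept.
import numpy as np
from scipy.optimize import minimize_scalar

def follower_best_response(x: float) -> float:
    """Lower-level problem: min_y (y - x)**2 + x*y over y in [0, 2]."""
    res = minimize_scalar(lambda y: (y - x) ** 2 + x * y,
                          bounds=(0.0, 2.0), method="bounded")
    return res.x

def leader_objective(x: float, y: float) -> float:
    """Upper-level objective evaluated at the follower's rational reaction."""
    return (x - 1.0) ** 2 + (y - 1.0) ** 2

def monte_carlo_bilevel(n_samples: int, seed: int = 0):
    rng = np.random.default_rng(seed)
    best = (np.inf, None, None)
    for x in rng.uniform(0.0, 2.0, size=n_samples):   # sample the leader's variable
        y = follower_best_response(x)
        f = leader_objective(x, y)
        if f < best[0]:
            best = (f, x, y)
    return best

for n in (100, 10_000):                               # small vs large sample size
    f, x, y = monte_carlo_bilevel(n)
    print(f"n = {n:>6}: F = {f:.4f}, x = {x:.3f}, y = {y:.3f}")
```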