Rutting has a significant impact on pavement performance, and rut depth is often used as a parameter to assess pavement quality. The Asphalt Institute (AI) design method prescribes a maximum allowable rut depth of 13 mm, whereas the AASHTO design method stipulates a critical serviceability index of 2.5, which is equivalent to an average rut depth of 15 mm. In this research, static and repeated compression tests were performed to evaluate the permanent strain as a function of (1) mix properties (asphalt content and type) and (2) testing temperature. The results indicated that the accumulated plastic strain was higher during the repeated load test than during the static load test; notably, temperature played a major role. The power-law model was used to describe the relationship between the accumulated permanent strain and the number of load repetitions. Furthermore, graphical analysis was performed using VESYS 5W to predict the rut depth of the asphalt concrete layer, and the α and µ parameters affected the predicted rut depth significantly. The results show a substantial difference between the two tests, indicating that the repeated load test is more adequate, useful, and accurate than the static load test for evaluating rut depth.
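As a rough illustration of the power-law relationship mentioned above, the sketch below fits ε_p = a·N^b to hypothetical repeated-load data by linear regression in log-log space; the strain values, repetition counts, and variable names are assumptions, not the study's measurements.

```python
import numpy as np

# Hypothetical repeated-load test data: number of load repetitions N
# and measured accumulated permanent strain (microstrain).
N = np.array([10, 100, 1_000, 10_000, 100_000], dtype=float)
eps_p = np.array([550.0, 900.0, 1500.0, 2400.0, 4000.0])

# The power-law model eps_p = a * N**b is linear in log-log space:
# log(eps_p) = log(a) + b*log(N), so fit a straight line.
b, log_a = np.polyfit(np.log(N), np.log(eps_p), 1)
a = np.exp(log_a)
print(f"a = {a:.1f}, b = {b:.3f}")

# Predicted strain at an unseen number of repetitions.
print(f"predicted strain at N=1e6: {a * 1e6**b:.0f} microstrain")
```

In VESYS, the permanent-deformation parameters are commonly obtained from this fit (typically α = 1 − b and µ = a·b/ε_r, with ε_r the resilient strain), although the exact conversion used in the study is not restated here.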
In 2020, one of the researchers of this paper, in his first study, derived the Modified Weighted Pareto Distribution of Type I using the Azzalini method for weighted distributions; the distribution has three parameters, two for scale and one for shape. That research compared the distribution with two other distributions from the same family, the Standard Pareto Distribution of Type I and the Generalized Pareto Distribution, using the maximum likelihood estimator derived by the researchers for the Modified Weighted Pareto Distribution of Type I. The Monte Carlo method, a simulation technique for generating random sample data, was then used with different sample sizes (n = 10, 30, 50) and in different…
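The sketch below is a minimal Monte Carlo illustration in the spirit of the procedure described above, but it uses the Standard Pareto Distribution of Type I (whose maximum likelihood estimators have a closed form) as a stand-in, since the density of the Modified Weighted Pareto Distribution of Type I is not restated here; the shape and scale values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

def pareto_mle(x):
    """MLE for the Standard Pareto Type I: scale = min(x), shape from log-ratios."""
    scale_hat = x.min()
    shape_hat = len(x) / np.sum(np.log(x / scale_hat))
    return shape_hat, scale_hat

def monte_carlo(shape=3.0, scale=2.0, sizes=(10, 30, 50), reps=1000):
    for n in sizes:
        estimates = []
        for _ in range(reps):
            # numpy's pareto() is Lomax-like; shift and scale to Pareto Type I support.
            x = scale * (1.0 + rng.pareto(shape, size=n))
            estimates.append(pareto_mle(x)[0])
        estimates = np.asarray(estimates)
        mse = np.mean((estimates - shape) ** 2)
        print(f"n={n:3d}: mean shape_hat={estimates.mean():.3f}, MSE={mse:.4f}")

monte_carlo()
```

Running the loop over the three sample sizes mirrors the usual way such simulation comparisons report how estimator bias and MSE shrink as n grows.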
This work presents a comparison between the Convolutional Encoding (CE), Parallel Turbo code, and Low Density Parity Check (LDPC) coding schemes for a Multi-User Single Output (MUSO) Multi-Carrier Code Division Multiple Access (MC-CDMA) system over multipath fading channels. The decoding technique used in the simulation was iterative decoding, since it gives maximum efficiency at higher iterations. The modulation scheme used is Quadrature Amplitude Modulation (QAM). Eight pilot carriers were used to compensate for the channel effect with the Least Squares estimation method. The channel model used is the Long Term Evolution (LTE) channel according to Technical Specification TS 25.101 v2.10 with 5 MHz bandwidth, including the indoor to outdoor/pedestrian channels…
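A minimal sketch of pilot-aided least-squares channel estimation of the kind described above, assuming an OFDM-style grid with 8 equally spaced pilots, a 3-tap illustrative multipath channel, and linear interpolation between pilot subcarriers; none of these values are taken from the paper's simulation.

```python
import numpy as np

rng = np.random.default_rng(0)

n_sub = 64                            # total subcarriers (illustrative value)
pilot_idx = np.arange(0, n_sub, 8)    # 8 equally spaced pilot carriers
pilot_sym = np.ones(len(pilot_idx), dtype=complex)  # known pilot symbols

# Illustrative frequency-selective channel: 3-tap multipath impulse response.
h = np.array([1.0, 0.5 + 0.3j, 0.2], dtype=complex)
H = np.fft.fft(h, n_sub)              # true channel frequency response

# Received pilots = H * pilots + noise.
noise = 0.05 * (rng.standard_normal(len(pilot_idx))
                + 1j * rng.standard_normal(len(pilot_idx)))
rx_pilots = H[pilot_idx] * pilot_sym + noise

# Least-squares estimate at pilot positions, then interpolation over all subcarriers.
H_ls = rx_pilots / pilot_sym
H_hat = np.interp(np.arange(n_sub), pilot_idx, H_ls.real) \
        + 1j * np.interp(np.arange(n_sub), pilot_idx, H_ls.imag)

print("mean estimation error:", np.mean(np.abs(H_hat - H)))
```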
Achieving an accurate and optimal rate of penetration (ROP) is critical for a cost-effective and safe drilling operation. While different techniques have been used to achieve this goal, each approach has limitations, prompting researchers to seek solutions. The objective of this study is to combine the Bourgoyne and Young (BYM) ROP equations with Bagging Tree regression in a southern Iraqi field. Although the BYM equations are widely used to estimate drilling rates, they need more specific drilling parameters to capture the different ROP complexities. The Bagging Tree algorithm, a random forest variant, addresses these limitations by blending domain knowledge…
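The following sketch shows the general shape of a Bagging Tree regression for ROP using scikit-learn's BaggingRegressor on synthetic drilling parameters (WOB, RPM, flow rate, depth); the feature set, data, and hyperparameters are assumptions for illustration and do not reproduce the BYM-based workflow of the study.

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)

# Hypothetical drilling records: weight on bit, rotary speed, flow rate, depth.
n = 500
X = np.column_stack([
    rng.uniform(5, 30, n),       # WOB (klb)
    rng.uniform(60, 180, n),     # RPM
    rng.uniform(400, 900, n),    # flow rate (gpm)
    rng.uniform(1000, 4000, n),  # depth (m)
])
# Synthetic ROP with a nonlinear trend plus noise (a stand-in for field behaviour).
y = 2.0 * X[:, 0] ** 0.8 + 0.1 * X[:, 1] - 0.004 * X[:, 3] + rng.normal(0, 3, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Bagging of decision trees (the default base estimator is a decision tree regressor).
model = BaggingRegressor(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("R2 on held-out data:", round(r2_score(y_te, model.predict(X_te)), 3))
```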
The partial level density (PLD) of pre-equilibrium reactions described by Ericson's formula has been studied using different formulae for the single-particle level density. Its value was taken from the equidistant spacing model (ESM) and the non-equidistant spacing model (non-ESM), and other formulae were derived from the relation between the single-particle level density and the level density parameter. The formulae used in this derivation are the Roher formula, the Egidy formula, the Yukawa formula, and the Thomas–Fermi formula. The partial level density results based on the Thomas–Fermi formula show good agreement with the experimental data.
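For reference, the textbook form of Ericson's partial level density is reproduced below in standard notation (p particles, h holes, single-particle level density g); this is the conventional expression and is assumed rather than quoted from the paper.

```latex
% Ericson's partial level density for p particles and h holes (exciton number n = p + h),
% with single-particle level density g and excitation energy E:
\[
  \rho(p, h, E) \;=\; \frac{g\,(gE)^{\,n-1}}{p!\,h!\,(n-1)!},
  \qquad n = p + h .
\]
% In the equidistant spacing model, g is often related to the level density
% parameter a through a = \pi^{2} g / 6.
```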
Many engineering projects face extensive contractual disputes due to a party's violation of its obligations or responsibilities stated in the contract; these disputes in turn require arbitration or other forms of dispute resolution, which negatively affect the project's outcome. Each contract has its own terms for dispute resolution. Therefore, this paper aims to study the provisions for dispute resolution according to the Iraqi SBDW and the JCT SBC/Q 2016, and also to show the extent of the difference between the two contracts in the application of these provisions. The methodology includes a detailed study of the dispute settlement provisions of both contracts, with a comparative analysis to identify the differences in the application of these provisions.
The Non-Homogeneous Poisson process is one of the statistical subjects that has importance in other sciences and wide application in different areas, such as queues (waiting lines), repairable systems, computer and communication systems, reliability theory, and many others. It is also used to model phenomena that occur in a non-constant way over time (events whose behaviour changes with time).
This research deals with some of the basic concepts related to the Non-Homogeneous Poisson process. Two models of the Non-Homogeneous Poisson process were applied, the power law model and the Musa–Okumoto model, to estimate the…
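A minimal sketch of simulating and estimating the power law model of a Non-Homogeneous Poisson process is given below; the intensity parameters, the truncation time, and the closed-form time-truncated maximum likelihood estimates are standard textbook choices, not the study's data or method.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_power_law_nhpp(beta, theta, T):
    """Event times of an NHPP with mean function m(t) = (t/theta)**beta on [0, T]."""
    times, s = [], 0.0
    while True:
        s += rng.exponential(1.0)        # arrival of a unit-rate homogeneous process
        t = theta * s ** (1.0 / beta)    # map back through the inverse mean function
        if t > T:
            return np.array(times)
        times.append(t)

beta_true, theta_true, T = 1.8, 50.0, 1000.0
t = simulate_power_law_nhpp(beta_true, theta_true, T)

# Maximum likelihood estimates for the time-truncated power law model.
n = len(t)
beta_hat = n / np.sum(np.log(T / t))
theta_hat = T / n ** (1.0 / beta_hat)
print(f"n={n}, beta_hat={beta_hat:.3f}, theta_hat={theta_hat:.2f}")
```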
The logistic regression model is regarded as one of the important regression models and has been among the most interesting subjects in recent studies, as it takes on a more advanced character in the process of statistical analysis.
The ordinary estimation methods fail in dealing with data that contain outlier values, which have an undesirable effect on the results…
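To illustrate the effect the abstract refers to, the sketch below fits an ordinary maximum-likelihood logistic regression with and without a handful of contaminating observations; the data, contamination scheme, and use of scikit-learn are assumptions, and the robust estimator studied in the research is not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)

# Clean logistic data: P(y=1) = 1 / (1 + exp(-(0.5 + 2*x)))
n = 200
x = rng.normal(0, 1, n)
p = 1.0 / (1.0 + np.exp(-(0.5 + 2.0 * x)))
y = rng.binomial(1, p)

# Contaminate with a few extreme-x points carrying deliberately wrong labels (outliers).
x_out = np.concatenate([x, np.full(10, 6.0)])
y_out = np.concatenate([y, np.zeros(10, dtype=int)])

fit_clean = LogisticRegression().fit(x.reshape(-1, 1), y)
fit_dirty = LogisticRegression().fit(x_out.reshape(-1, 1), y_out)

print("slope without outliers:", fit_clean.coef_[0][0])
print("slope with outliers:   ", fit_dirty.coef_[0][0])  # pulled toward zero
```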
Video is represented by a large number of frames synchronized with audio, which makes saving video require more storage, makes its delivery slower, and makes its computational cost expensive. Video summarization provides the information of an entire video in a minimum amount of time. This paper proposes static and dynamic video summarization methods. The proposed static video summarization method includes several steps: extracting frames from the video, keyframe selection, feature extraction and description, matching the feature descriptors with a bag of visual words, and finally saving frames whose features match. The proposed dynamic video summarization method, in general, includes extracting audio from the video and calculating audio features…
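As a rough sketch of the first steps of the static pipeline (frame extraction and keyframe selection), the code below keeps a frame whenever its colour histogram differs enough from the previous keyframe; the video file name, histogram configuration, and threshold are assumptions, and the bag-of-visual-words matching stage is not shown.

```python
import cv2
import numpy as np

def extract_keyframes(video_path, hist_threshold=0.4):
    """Return frames whose colour histogram differs enough from the last keyframe."""
    cap = cv2.VideoCapture(video_path)
    keyframes, prev_hist = [], None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        hist = cv2.calcHist([hsv], [0, 1], None, [32, 32], [0, 180, 0, 256])
        hist = cv2.normalize(hist, hist).flatten()
        # Keep the frame if it is sufficiently different from the previous keyframe.
        if prev_hist is None or cv2.compareHist(prev_hist, hist,
                                                cv2.HISTCMP_BHATTACHARYYA) > hist_threshold:
            keyframes.append(frame)
            prev_hist = hist
    cap.release()
    return keyframes

# Hypothetical usage:
# frames = extract_keyframes("input.mp4")
# for i, f in enumerate(frames):
#     cv2.imwrite(f"keyframe_{i:03d}.jpg", f)
```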
A condensed study was conducted to compare the ordinary estimators, in particular the maximum likelihood estimator and the robust estimator, for estimating the parameters of the mixed model of order one, namely the ARMA(1,1) model.
A simulation study was carried out for varieties of the model using small, moderate, and large sample sizes, and some new results were obtained. MAPE was used as a statistical criterion for comparison.
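A hedged sketch of such a comparison setup is shown below: an ARMA(1,1) series is simulated, fitted by maximum likelihood with statsmodels, and scored by MAPE on held-out points for small, moderate, and large samples; the parameter values and library choice are illustrative, and the robust estimator compared in the study is not reproduced.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.arima_process import ArmaProcess

np.random.seed(5)

def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    return np.mean(np.abs((actual - predicted) / actual)) * 100

# Simulate an ARMA(1,1) series with phi = 0.6 and theta = 0.4 (illustrative values),
# shifted away from zero so that MAPE is well defined.
process = ArmaProcess(ar=[1, -0.6], ma=[1, 0.4])

for n in (30, 100, 300):                          # small, moderate, large samples
    y = process.generate_sample(nsample=n + 20) + 10.0
    train, test = y[:n], y[n:]
    fit = ARIMA(train, order=(1, 0, 1)).fit()     # maximum likelihood estimation
    forecast = fit.forecast(steps=len(test))
    print(f"n={n:3d}: out-of-sample MAPE = {mape(test, forecast):.2f}%")
```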