Predicting vertical stress is useful for controlling geomechanical problems because it allows the formation pore pressure to be computed and the fault regime to be classified. This study provides an in-depth examination of vertical stress prediction using several approaches in the Techlog 2015 software. Gardner's method yields incorrect vertical stress values because it does not start from the surface and relies only on sonic log data. The Amoco, Wendt non-acoustic, Traugott, and average techniques need only the density log as input and treat the observed density as a straight line, which is likewise incorrect for computing vertical stress; the results of these methods show that their extrapolated density is only an average of the real density. The extrapolated-density method has a much better gradient at shallow depth for the vertical stress calculation, while the Miller density method fits the real density very well at great depth. Calculating vertical stress has been crucial for the past 40 years because pore pressure calculations and geomechanical model building use vertical stress as an input, and bulk density may be the strongest predictor of vertical stress. According to these results, the Miller and extrapolated techniques may be the best two methods for determining vertical stress, although the gradient of the extrapolated method is better at shallow depth than that of the Miller method. The extrapolated-density approach produces satisfactory vertical stress values, whereas the Miller values are lower than the extrapolated ones, probably because of the poor gradient of the Miller method at shallow depth. Gardner's approach incorrectly shows minimum values of about 4000 psi at great depth, while the other methods give similar values, which is also incorrect because they use a constant bulk density from the surface down to the depth of interest.
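As a rough illustration of the quantity all of these methods approximate (the study itself gives no code), vertical stress is the depth integral of bulk density times gravitational acceleration, and Gardner's relation estimates bulk density from sonic velocity. The sketch below uses a hypothetical density profile and the commonly quoted Gardner coefficients; it is not the Techlog 2015 workflow.

```python
import numpy as np

# Minimal sketch: overburden (vertical) stress from a bulk-density profile,
# sigma_v(z) = integral of rho(z) * g dz, evaluated with the trapezoidal rule.
# The depth grid and density values below are hypothetical, not field data.

g = 9.81                                   # gravitational acceleration, m/s^2
depth_m = np.linspace(0.0, 3000.0, 301)    # measured depth, m

# Hypothetical density profile (g/cc): a compaction-style trend from ~1.9 to ~2.6
rho_gcc = 2.6 - 0.7 * np.exp(-depth_m / 1500.0)
rho_kgm3 = rho_gcc * 1000.0

# Cumulative trapezoidal integration gives sigma_v in Pa at every depth sample
sigma_v_pa = np.concatenate(([0.0], np.cumsum(
    0.5 * (rho_kgm3[1:] + rho_kgm3[:-1]) * g * np.diff(depth_m))))
sigma_v_psi = sigma_v_pa * 1.450377e-4     # convert Pa -> psi

# Gardner-style density from sonic velocity (rho = a * Vp^b, Vp in ft/s),
# shown only to illustrate why a sonic-only method cannot start at the surface
# where no sonic log exists; a and b are the commonly quoted default constants.
def gardner_density(vp_ft_s, a=0.23, b=0.25):
    return a * vp_ft_s ** b                # g/cc

print(f"sigma_v at {depth_m[-1]:.0f} m: {sigma_v_psi[-1]:.0f} psi")
print(f"Gardner density at Vp = 10000 ft/s: {gardner_density(10000.0):.2f} g/cc")
```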
The researchers studied the transportation problem because of its great importance to the country's economy. This paper, which examines several methods for finding a near-optimal solution, applies these methods to a practical case involving one oil derivative, the benzene (petrol) product. The first purpose of this study is to reduce the total cost of transporting petrol from warehouses in the province of Baghdad to stations in the Karkh and Rusafa districts of the same province. The second is to meet the demand of each station with the required quantity, which depends on the absorptive capacity of the warehouses (the supplied quantities), and through …
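The study's cost, supply, and demand figures are not reproduced in this summary, so the sketch below solves a small hypothetical transportation problem (two warehouses, three stations) as a linear program with SciPy; it only illustrates the structure of the model, not the paper's actual Baghdad data.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: 2 warehouses (supply) and 3 stations (demand); the unit
# transport costs below are made up for illustration, not the study's figures.
cost = np.array([[4.0, 6.0, 9.0],
                 [5.0, 3.0, 7.0]])
supply = np.array([60.0, 40.0])         # warehouse capacities
demand = np.array([30.0, 30.0, 40.0])   # station requirements (balanced problem)

m, n = cost.shape
c = cost.ravel()                        # decision variables x_ij, row-major

# Equality constraints: each warehouse ships exactly its supply,
# each station receives exactly its demand.
A_eq = np.zeros((m + n, m * n))
for i in range(m):
    A_eq[i, i * n:(i + 1) * n] = 1.0    # supply rows
for j in range(n):
    A_eq[m + j, j::n] = 1.0             # demand rows
b_eq = np.concatenate([supply, demand])

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
print("minimum total cost:", res.fun)
print("shipping plan:\n", res.x.reshape(m, n))
```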
Parametric programming is considered a type of sensitivity analysis. This research studies the effect of variations in a linear programming model (the objective function coefficients and the right-hand side) on the optimal solution, for values of the parameter (θ) in the range -5 ≤ θ ≤ 5. The results show that when θ = -5 the objective function equals zero and the decision variables are non-basic, whereas when θ = 5 the objective function value increases and the decision variables are basic, with the exception of X24 and X34. Whenever the parameter value increases, the objective …
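Since the model's actual coefficients are not listed here, the sketch below illustrates the parametric idea on a toy linear program: the objective coefficients and the right-hand side are written as linear functions of θ and the problem is re-solved over -5 ≤ θ ≤ 5. All numbers (c0, c_t, A, b0, b_t) are illustrative, not the study's model.

```python
import numpy as np
from scipy.optimize import linprog

# Toy parametric LP (maximisation written as minimisation of -c @ x):
#   maximise (c0 + theta*c_t) @ x   subject to  A x <= b0 + theta*b_t,  x >= 0
# All numbers are illustrative; the study's own model is not reproduced here.
c0  = np.array([3.0, 2.0])
c_t = np.array([0.6, 0.4])          # objective coefficients vanish at theta = -5
A   = np.array([[1.0, 1.0],
                [2.0, 1.0]])
b0  = np.array([10.0, 15.0])
b_t = np.array([1.0, 1.5])          # how the right-hand side varies with theta

for theta in range(-5, 6):
    c = -(c0 + theta * c_t)         # negate because linprog minimises
    b = b0 + theta * b_t
    res = linprog(c, A_ub=A, b_ub=b, bounds=(0, None), method="highs")
    if res.success:
        print(f"theta={theta:+d}  objective={-res.fun:8.3f}  x={np.round(res.x, 3)}")
    else:
        print(f"theta={theta:+d}  infeasible")
```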
Ondansetron HCl (OND) is a potent antiemetic drug used to control nausea and vomiting associated with cancer chemotherapy. It exhibits only 60-70 % oral bioavailability due to first-pass metabolism and has a relatively short half-life of 3-5 hours. Poor bioavailability not only leads to frequent dosing but also results in very poor patient adherence. Hence, in the present study an approach has been made to develop OND nanoparticles using Eudragit® RS100 and Eudragit® RL100 polymers to control the release of OND for transdermal delivery and to improve patient compliance.
Six formulations of OND nanoparticles were prepared using the nanoprecipitation technique. The particle sizes and zeta potentials were measured …
The Dirichlet process is an important fundamental object in nonparametric Bayesian modelling, applied to a wide range of problems in machine learning, statistics, and bioinformatics, among other fields. This flexible stochastic process models rich data structures with an unknown or evolving number of clusters. It is a valuable tool for encoding the true complexity of real-world data in computer models. Our results show that the Dirichlet process improves, both in distribution density and in signal-to-noise ratio, with larger sample size; achieves a slow decay rate to its base distribution; has improved convergence and stability; and thrives with a Gaussian base distribution, which performs much better than the Gamma distribution. The performance depends …
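For readers unfamiliar with the process, draws from a Dirichlet process are commonly simulated with the stick-breaking construction; the short sketch below samples a truncated DP with concentration parameter alpha and a Gaussian base distribution (one of the two base measures compared above). The parameter values are arbitrary and are not those used in the study.

```python
import numpy as np

def stick_breaking_dp(alpha, base_sampler, n_atoms, rng):
    """Truncated stick-breaking draw from a Dirichlet process:
    weights ~ GEM(alpha), atom locations drawn i.i.d. from the base measure."""
    betas = rng.beta(1.0, alpha, size=n_atoms)
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
    weights = betas * remaining
    atoms = base_sampler(n_atoms)
    return weights, atoms

rng = np.random.default_rng(0)
alpha = 2.0                                              # concentration (arbitrary)
gaussian_base = lambda k: rng.normal(0.0, 1.0, size=k)   # N(0, 1) base measure

weights, atoms = stick_breaking_dp(alpha, gaussian_base, n_atoms=200, rng=rng)

# Sample observations from the discrete random measure and count the clusters used
obs = rng.choice(atoms, size=500, p=weights / weights.sum())
print("distinct atoms among 500 draws:", np.unique(obs).size)
```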
This paper is concerned with estimating the unknown parameters of the generalized Rayleigh distribution (GRD) model based on singly type-I censored samples. The probability density function of the generalized Rayleigh distribution is defined along with its properties. The maximum likelihood method is used to derive point estimates for all unknown parameters, based on an iterative procedure (the Newton-Raphson method), and confidence interval estimates are then derived from the Fisher information matrix. Finally, the paper tests whether the current model (GRD) fits a set of real data and computes the survival and hazard functions for these data.
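As a hedged illustration only: for singly type-I censored data the likelihood combines the GRD density of the observed failures with the survival function at the censoring time. The sketch below writes that log-likelihood for the two-parameter generalized Rayleigh (Burr type X) form F(x) = (1 − e^−(λx)²)^α on simulated data and maximises it with a generic SciPy optimiser instead of the hand-derived Newton-Raphson iteration used in the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Generalized Rayleigh (Burr type X): F(x) = (1 - exp(-(lam*x)**2))**alpha
def neg_log_lik(params, x_obs, t_cens, n_total):
    alpha, lam = params
    if alpha <= 0 or lam <= 0:
        return np.inf
    z = np.exp(-(lam * x_obs) ** 2)
    # density contribution of the observed (uncensored) failures
    log_f = (np.log(2 * alpha) + 2 * np.log(lam) + np.log(x_obs)
             - (lam * x_obs) ** 2 + (alpha - 1) * np.log1p(-z))
    # survival contribution of the (n_total - m) units censored at time t_cens
    log_S = np.log1p(-(1 - np.exp(-(lam * t_cens) ** 2)) ** alpha)
    m = x_obs.size
    return -(log_f.sum() + (n_total - m) * log_S)

# Hypothetical censored sample (not the paper's real data set)
rng = np.random.default_rng(1)
true_alpha, true_lam, n, t_c = 1.5, 0.8, 100, 2.0
u = rng.uniform(size=n)
x = np.sqrt(-np.log(1 - u ** (1 / true_alpha))) / true_lam   # inverse-CDF sampling
x_obs = x[x <= t_c]                                          # type-I censoring at t_c

res = minimize(neg_log_lik, x0=[1.0, 1.0], args=(x_obs, t_c, n),
               method="Nelder-Mead")
print("MLE (alpha, lambda):", res.x)
```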
In this research, the behaviour of and correlation between the sunspot number (SSN) and the solar flux (F10.7) have been studied. Annual data for the years 2008-2017 of solar cycle 24 have been adopted for the investigation in order to obtain the mutual correlation between SSN and F10.7. The results show that the annual correlation between SSN and F10.7 is simple and can be represented by a linear regression equation, and that there is a good fit between the observed data and the SSN and F10.7 values generated using the suggested mutual correlation equation.
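The regression itself takes only a few lines of code; the sketch below fits a straight line F10.7 ≈ a·SSN + b with SciPy and checks the reconstruction error. The arrays are illustrative placeholder values, not the 2008-2017 annual data analysed in the paper.

```python
import numpy as np
from scipy import stats

# Placeholder annual means (NOT the study's 2008-2017 data): SSN and F10.7 (sfu)
ssn  = np.array([5.0, 25.0, 55.0, 80.0, 110.0, 115.0, 90.0, 60.0, 30.0, 15.0])
f107 = np.array([69.0, 81.0, 100.0, 115.0, 135.0, 140.0, 125.0, 105.0, 85.0, 75.0])

fit = stats.linregress(ssn, f107)
print(f"F10.7 = {fit.slope:.3f} * SSN + {fit.intercept:.2f}   (r = {fit.rvalue:.3f})")

# Reconstruct F10.7 from the fitted equation to check the quality of the fit
predicted = fit.slope * ssn + fit.intercept
rmse = np.sqrt(np.mean((predicted - f107) ** 2))
print(f"RMSE of reconstructed F10.7: {rmse:.2f} sfu")
```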
This study estimates the scale parameter, location parameter, and reliability function of the Extreme Value (EXV) distribution by two methods, namely:
- Maximum Likelihood Method (MLE).
- Probability Weighted Moments Method (PWM).
Simulation is used to generate the required samples for estimating the parameters and the reliability function at different sample sizes (n = 10, 25, 50, 100), with given true values for the parameters, and each simulation experiment is replicated (RP = 1000) times …
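The two estimators can be sketched compactly. The code below assumes the Type-I (Gumbel) form of the Extreme Value distribution, generates one simulated sample, and compares the MLE (obtained by numerical optimisation) with the closed-form PWM estimates; the sample size, true parameter values, and the Gumbel assumption are illustrative choices rather than the study's settings.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gumbel_r

rng = np.random.default_rng(2)
true_loc, true_scale, n = 2.0, 1.5, 100          # illustrative true values and size
x = gumbel_r.rvs(loc=true_loc, scale=true_scale, size=n, random_state=rng)

# --- Maximum likelihood: minimise the negative Gumbel log-likelihood ----------
def neg_log_lik(params):
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    return -gumbel_r.logpdf(x, loc=mu, scale=sigma).sum()

mle = minimize(neg_log_lik, x0=[x.mean(), x.std()], method="Nelder-Mead").x

# --- Probability weighted moments (Gumbel closed form) ------------------------
xs = np.sort(x)
i = np.arange(1, n + 1)
b0 = xs.mean()
b1 = np.sum((i - 1) / (n - 1) * xs) / n          # first probability weighted moment
pwm_scale = (2 * b1 - b0) / np.log(2)
pwm_loc = b0 - 0.5772156649 * pwm_scale          # Euler-Mascheroni constant

# Reliability (survival) function R(t) = 1 - F(t) at an example time t
t = 3.0
print("MLE (loc, scale):", np.round(mle, 3),
      " R(3) =", round(gumbel_r.sf(t, loc=mle[0], scale=mle[1]), 3))
print("PWM (loc, scale):", (round(pwm_loc, 3), round(pwm_scale, 3)),
      " R(3) =", round(gumbel_r.sf(t, loc=pwm_loc, scale=pwm_scale), 3))
```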
A new pavement technology has been developed in highway engineering: asphalt pavement production that is less susceptible to oxidation and the consequent damage. Warm mix asphalt (WMA) is produced at a temperature about 10-40 °C lower than hot mix asphalt paving; this is done using one of the methods of producing WMA. Although WMA's performance is rather good according to previous studies, as it is less susceptible to oxidation, it is possible to modify some of its properties using different materials, including polymers. Waste vehicle tires are one type of polymer used because of their flexible properties. The production of HMA, WMA, and WMA modified with proportions of 1, 1.5, and 2% of rubber …
The study investigated the behaviour of asphalt concrete mixes for aggregate gradations according to the Iraqi specification, using the Bailey method designed in an Excel spreadsheet. In blending aggregates of varying gradations (coarse and fine aggregate), the Bailey method is a systematic methodology that relies on aggregate interlock as the backbone of the framework and on a controlled gradation to complete the blends. Six types of gradation designed according to the Bailey method are considered in this study. Two courses were prepared, an asphalt concrete wearing course and an asphalt concrete binder course, with Nominal Maximum Aggregate Sizes (NMAS) of 19 and 12.5 mm, respectively. The total number of specimens was 240 for both layers (15 samples …
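As a hedged sketch of the arithmetic such a spreadsheet performs: the Bailey method evaluates a blend through control sieves derived from the NMAS and three aggregate ratios (CA, FAc, FAf). The gradation below is hypothetical and the control-sieve choices are simplified; the study's own spreadsheet and the Iraqi-specification gradations are not reproduced.

```python
# Minimal sketch of the Bailey aggregate-ratio arithmetic for a 12.5 mm NMAS blend.
# The gradation (% passing by sieve size, mm) is hypothetical, not the study's data.
gradation = {19.0: 100, 12.5: 95, 9.5: 83, 4.75: 55, 2.36: 34,
             1.18: 24, 0.6: 17, 0.3: 12, 0.15: 8, 0.075: 5.5}

nmas = 12.5
half_sieve = 6.25                 # NMAS / 2, interpolated between 9.5 and 4.75 mm
pcs, scs, tcs = 2.36, 0.6, 0.15   # control sieves commonly used for 12.5 mm NMAS

def passing(size_mm):
    """Linear interpolation of % passing between the two nearest listed sieves."""
    sizes = sorted(gradation)                         # ascending sieve sizes
    for lo, hi in zip(sizes, sizes[1:]):
        if lo <= size_mm <= hi:
            frac = (size_mm - lo) / (hi - lo)
            return gradation[lo] + frac * (gradation[hi] - gradation[lo])
    return gradation[sizes[0]] if size_mm < sizes[0] else 100.0

p_half, p_pcs, p_scs, p_tcs = map(passing, (half_sieve, pcs, scs, tcs))

ca_ratio  = (p_half - p_pcs) / (100.0 - p_half)   # coarse-aggregate ratio
fac_ratio = p_scs / p_pcs                         # fine aggregate, coarse portion
faf_ratio = p_tcs / p_scs                         # fine aggregate, fine portion

print(f"CA = {ca_ratio:.2f},  FAc = {fac_ratio:.2f},  FAf = {faf_ratio:.2f}")
```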