The settlement evaluation of jet grouted columns (JGC) in soft soils is a problematic matter because it is influenced by a number of factors, such as soil type, the mixing efficiency between soil and grouting material, nozzle energy, water flow rate, and rotation and lifting speeds. Most design methods for jet-grouted columns are based on experience. In this study, prototype single and group jet grouting models (single, 1×2, and 2×2), each with a total length of 2000 mm, a diameter of 150 mm, and a clear spacing of 3D, were constructed in soft clay and subjected to vertical axial loads. Furthermore, different theoretical methods were used to estimate the JGC settlement. Pile load settlement analysis of the jet grouted columns showed average settlement values of 0.41, 0.663, and 1.5 mm for the single, group (1×2), and group (2×2) jet grouted columns, respectively. The theoretical methods, by contrast, gave higher settlement values of 2.0, 3.48, and 5.24 mm for the single, group (1×2), and group (2×2) jet grouted columns compared with the results obtained from the field pile load test data, and they are therefore not recommended for soft clay. On the other hand, Fuller and Hoy's, Hansen's 90%, and Butler and Hoy's methods may be considered faithful interpretation methods for the single and group (1×2 and 2×2) JGC.
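As a rough illustration of one of the interpretation methods named above, the following is a minimal sketch of Brinch Hansen's 90% criterion applied to a load-settlement curve: the failure load Qu is taken as the load whose settlement is twice the settlement recorded at 0.9 Qu. The load-settlement values, grid size, and units are illustrative placeholders, not the study's field data.

```python
import numpy as np

def hansen_90_failure_load(loads, settlements, n_grid=2000):
    """Return Qu such that settlement(Qu) ≈ 2 * settlement(0.9 * Qu)."""
    s = lambda q: np.interp(q, loads, settlements)   # piecewise-linear load-settlement curve
    candidates = np.linspace(loads[1], loads[-1], n_grid)
    residual = np.abs(s(candidates) - 2.0 * s(0.9 * candidates))
    return candidates[np.argmin(residual)]

# Hypothetical load-settlement record (kN, mm), not the study's test data
loads = np.array([0, 20, 40, 60, 80, 100, 110, 120], dtype=float)
settl = np.array([0.0, 0.3, 0.7, 1.2, 2.0, 3.5, 6.0, 14.0])
print("Hansen 90% failure load ≈", round(hansen_90_failure_load(loads, settl), 1), "kN")
```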
Recently, wireless communication environments with high speeds and low complexity have become increasingly essential. Free-space optics (FSO) has emerged as a promising solution for providing direct connections between devices in such high-spectrum wireless setups. However, FSO communications are susceptible to weather-induced signal fluctuations, leading to fading and signal weakness at the receiver. To mitigate these challenges, several mathematical models have been proposed to describe the transition from weak to strong atmospheric turbulence, including the Rayleigh, lognormal, Málaga, Nakagami-m, K-distribution, Weibull, Negative-Exponential, Inverse-Gaussian, Gamma-Gamma (G-G), and Fisher-Snedecor F distributions. This paper extensive
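As a small illustration of one of the turbulence models listed above, the following sketch evaluates the Gamma-Gamma (G-G) irradiance pdf numerically; the α and β values are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.special import gamma, kv

def gamma_gamma_pdf(I, alpha, beta):
    """Gamma-Gamma pdf of the normalized irradiance I under atmospheric turbulence."""
    coeff = 2.0 * (alpha * beta) ** ((alpha + beta) / 2.0) / (gamma(alpha) * gamma(beta))
    return coeff * I ** ((alpha + beta) / 2.0 - 1.0) * kv(alpha - beta, 2.0 * np.sqrt(alpha * beta * I))

I = np.linspace(0.01, 3.0, 300)
pdf = gamma_gamma_pdf(I, alpha=4.0, beta=1.9)   # example moderate-turbulence parameters
print("pdf mass on the grid ≈", np.trapz(pdf, I))  # should be close to 1
```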
Classification of imbalanced data is an important issue. Many algorithms have been developed for classification, such as Back Propagation (BP) neural networks, decision trees, Bayesian networks, etc., and have been used repeatedly in many fields. These algorithms suffer from the problem of imbalanced data, where some classes have far more instances than others. Imbalanced data result in poor performance and a bias toward one class over the other classes. In this paper, we propose three techniques based on the Over-Sampling (O.S.) technique for processing an imbalanced dataset, redistributing it, and converting it into a balanced dataset. These techniques are (Improved Synthetic Minority Over-Sampling Technique (Improved SMOTE), Border
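For context on the over-sampling family that the proposed techniques build on, here is a minimal sketch of the core SMOTE interpolation step; it shows the plain algorithm, not the paper's improved variants, and the neighbour count and toy data are illustrative.

```python
import numpy as np

def smote(minority, n_new, k=5, rng=np.random.default_rng(0)):
    """Return n_new synthetic samples built from a (n, d) minority-class array."""
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(minority))
        x = minority[i]
        # k nearest minority neighbours of x (excluding x itself)
        d = np.linalg.norm(minority - x, axis=1)
        neighbours = np.argsort(d)[1:k + 1]
        x_nn = minority[rng.choice(neighbours)]
        lam = rng.random()                       # interpolation factor in [0, 1)
        synthetic.append(x + lam * (x_nn - x))
    return np.array(synthetic)

minority = np.random.default_rng(1).normal(size=(20, 2))   # toy minority class
print(smote(minority, n_new=10).shape)                     # (10, 2)
```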
Human detection is a main problem of interest in video-based monitoring. In this paper, artificial neural networks, namely the multilayer perceptron (MLP) and radial basis function (RBF) networks, are used to detect humans among different objects in a sequence of frames (images) using a classification approach. The classification is based on the shape of the object rather than the contents of the frame. Initially, background subtraction is used to extract the objects of interest from the frame; then statistical and geometric features are obtained from the vertical and horizontal projections of the detected objects to represent the shape of each object. Following this step, two ty
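A minimal sketch of the projection-based shape description mentioned above: compute the vertical and horizontal projections of a binary foreground mask and summarize them with simple statistical and geometric features. The feature set and the toy mask are illustrative assumptions, not the paper's exact feature list.

```python
import numpy as np

def projection_features(mask):
    """Return simple statistics of the row/column projections of a binary object mask."""
    v_proj = mask.sum(axis=0)   # vertical projection: foreground pixels per column
    h_proj = mask.sum(axis=1)   # horizontal projection: foreground pixels per row
    h, w = mask.shape
    return {
        "aspect_ratio": h / w,
        "v_mean": v_proj.mean(), "v_std": v_proj.std(),
        "h_mean": h_proj.mean(), "h_std": h_proj.std(),
        "fill_ratio": mask.mean(),          # fraction of the bounding box covered
    }

mask = np.zeros((60, 20), dtype=np.uint8)
mask[5:55, 6:14] = 1                         # crude upright "human-like" blob
print(projection_features(mask))
```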
In this paper, we compare different parametric, nonparametric, and semiparametric estimators for the partial linear regression model: the parametric approach represented by ordinary least squares (OLS) and the nonparametric methods represented by the cubic smoothing spline estimator and the Nadaraya-Watson estimator. We study three nonparametric regression models with sample sizes n = 40, 60, 100 and variances σ² = 0.5, 1, 1.5. The results for the first model show that the Nadaraya-Watson (N.W.) estimator for the partial linear regression model (PLM) is the best, followed by the cubic smoothing spline estimator for the PLM, while the results of the second and third models show that the best estimator is the cubic smoothing spline (C.S.S.), followed by the N.W. estimator for the PLM, the
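For reference, a minimal sketch of the Nadaraya-Watson estimator used for the nonparametric component; the Gaussian kernel, bandwidth, and toy data are illustrative choices rather than the study's simulation settings.

```python
import numpy as np

def nadaraya_watson(x_grid, x, y, h=0.3):
    """m_hat(x0) = sum_i K((x0 - x_i)/h) y_i / sum_i K((x0 - x_i)/h)."""
    u = (x_grid[:, None] - x[None, :]) / h
    K = np.exp(-0.5 * u ** 2)                 # Gaussian kernel (normalization cancels)
    return (K * y).sum(axis=1) / K.sum(axis=1)

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 60))
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.5, size=x.size)   # toy regression data
grid = np.linspace(0, 1, 100)
print(nadaraya_watson(grid, x, y)[:5])
```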
In this paper, we derive an estimator of the reliability function for the two-parameter Laplace distribution using the Bayes method with a squared-error loss function, Jeffreys' formula, and the conditional probability of the observed random variable. The main objective of this study is to assess the efficiency of the derived Bayes estimator compared with the maximum likelihood and moment estimators of this function, using Monte Carlo simulation under different Laplace distribution parameters and sample sizes. The results show that the Bayes estimator is more efficient than the maximum likelihood and moment estimators for all sample sizes.
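A minimal sketch of the Monte Carlo comparison framework, assuming the usual Laplace reliability R(t) = P(X > t); only the maximum likelihood plug-in estimator is shown here, and the paper's Bayes and moment estimators would slot into the same loop. Parameter values are illustrative.

```python
import numpy as np

def reliability(t, mu, b):
    """Exact Laplace reliability R(t) = P(X > t)."""
    z = (t - mu) / b
    return 1 - 0.5 * np.exp(z) if t < mu else 0.5 * np.exp(-z)

def mle_reliability(sample, t):
    mu_hat = np.median(sample)                   # MLE of the location parameter
    b_hat = np.mean(np.abs(sample - mu_hat))     # MLE of the scale parameter
    return reliability(t, mu_hat, b_hat)

rng = np.random.default_rng(0)
mu, b, t, n, reps = 0.0, 1.0, 1.0, 50, 2000      # hypothetical settings
true_R = reliability(t, mu, b)
est = [mle_reliability(rng.laplace(mu, b, n), t) for _ in range(reps)]
print("true R(t) =", true_R, " MLE MSE ≈", np.mean((np.array(est) - true_R) ** 2))
```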
The current study aims to compare the estimates of the Rasch model's parameters for missing and complete data under various methods of handling missing data. To achieve this aim, the researcher followed these steps: preparing Philip Carter's spatial ability test, which consists of (20) items, for a group of (250) sixth scientific stage students in the Baghdad Education Directorates of Al-Rusafa (1st, 2nd, and 3rd) for the academic year (2018-2019). The researcher then relied on the one-parameter (Rasch) model to analyze the data and used the Bilog-MG3 software to check the hypotheses and the fit of the data to the model. In addition
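For readers unfamiliar with the one-parameter model referred to above, here is a minimal sketch of the Rasch item response function; the ability and difficulty values are hypothetical.

```python
import numpy as np

def rasch_prob(theta, b):
    """P(correct | theta, b) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

thetas = np.array([-1.0, 0.0, 1.5])           # hypothetical person abilities
item_b = np.array([-0.5, 0.0, 0.8, 1.2])      # hypothetical difficulties of 4 items
print(np.round(rasch_prob(thetas[:, None], item_b[None, :]), 3))   # 3 x 4 probability table
```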
... Show MoreThe logistic regression model regarded as the important regression Models ,where of the most interesting subjects in recent studies due to taking character more advanced in the process of statistical analysis .
Ordinary estimation methods fail in dealing with data that contain outlier values, which have an undesirable effect on the results.
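As a baseline illustration of the "ordinary" estimation the text refers to, the following is a minimal sketch of maximum likelihood logistic regression fitted by iteratively reweighted least squares (IRLS); the synthetic data and iteration count are illustrative, and no robust correction is included.

```python
import numpy as np

def logistic_irls(X, y, n_iter=25):
    """Newton/IRLS maximum likelihood fit of beta in P(y=1|x) = sigmoid(X beta)."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1 - p)
        # Newton step: beta += (X' W X)^-1 X' (y - p)
        beta += np.linalg.solve((X * W[:, None]).T @ X, X.T @ (y - p))
    return beta

rng = np.random.default_rng(0)
X = np.c_[np.ones(200), rng.normal(size=(200, 2))]
true_beta = np.array([0.3, 1.5, -2.0])
y = (rng.random(200) < 1 / (1 + np.exp(-X @ true_beta))).astype(float)
print(np.round(logistic_irls(X, y), 2))   # should be near true_beta
```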
Many dynamic processes in different sciences are described by differential equation models. These models explain the change in the behavior of the studied process over time by linking the behavior of the process with its derivatives. Such models often contain constant and time-varying parameters that vary according to the nature of the process under study. In this work, we estimate the constant and time-varying parameters sequentially in several stages. In the first stage, the state variables and their derivatives are estimated by the penalized splines (P-splines) method. In the second stage, pseudo least squares is used to estimate the constant parameters. For the third stage, the rem
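A minimal sketch of the first-stage idea, assuming a standard Eilers-Marx style P-spline (B-spline basis with a second-order difference penalty) for recovering a state variable and its derivative from noisy observations; the knot count, penalty weight, and toy process are illustrative.

```python
import numpy as np
from scipy.interpolate import BSpline

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 100)
y = np.exp(-2 * t) + rng.normal(scale=0.02, size=t.size)    # noisy toy decay process

k, n_basis = 3, 20
knots = np.concatenate(([0] * k, np.linspace(0, 1, n_basis - k + 1), [1] * k))
B = BSpline(knots, np.eye(n_basis), k)(t)                   # B-spline design matrix (100, 20)

D = np.diff(np.eye(n_basis), n=2, axis=0)                   # 2nd-order difference penalty matrix
lam = 1.0                                                   # illustrative penalty weight
coef = np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ y)

state = BSpline(knots, coef, k)                             # smoothed state estimate
print("x(0.5) ≈", state(0.5), " dx/dt(0.5) ≈", state.derivative()(0.5))
```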
This paper compares the accuracy of HF propagation prediction programs for HF circuit links between Iraq and different points worldwide during August 2018, when solar cycle 24 (which started in 2009 and ends in 2020) was at minimum activity, and also identifies the best communication mode used. Prediction programs such as the Voice of America Coverage Analysis Program (VOACAP) and ITU Recommendation P.533 (REC533) were used to generate HF circuit link parameters such as the Maximum Usable Frequency (MUF) and the Frequency of Transmission (FOT). Based on the predicted parameters (data), real radio contacts were made using an Icom IC-7100 transceiver with 100 W RF