This study aims to numerically simulate the flow of a salt wedge using computational fluid dynamics (CFD). The accuracy of the numerical model was assessed against published laboratory data: twelve CFD model runs were conducted under the same laboratory conditions. The results showed that the propagation of the salt wedge is inversely proportional to the applied freshwater discharge and to the bed slope of the flume; the maximum propagation is obtained at the lowest discharge and the mildest flume slope. The comparison between the published laboratory results and the numerical simulation shows good agreement: the relative error ranges from 0 to 16% with an average of 2%, and the root mean square error is 0.18. Accordingly, the CFD software is well suited to simulating the propagation of the salt wedge.
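For reference, the error measures quoted above can be reproduced with a few lines of NumPy; the intrusion-length values below are purely illustrative placeholders, not the study's data.

```python
import numpy as np

def relative_error(observed, predicted):
    """Per-run relative error |predicted - observed| / observed."""
    observed, predicted = np.asarray(observed, float), np.asarray(predicted, float)
    return np.abs(predicted - observed) / observed

def rmse(observed, predicted):
    """Root mean square error between laboratory and simulated values."""
    observed, predicted = np.asarray(observed, float), np.asarray(predicted, float)
    return np.sqrt(np.mean((predicted - observed) ** 2))

# Illustrative intrusion lengths in metres (placeholders, not the study's data).
lab = [2.10, 1.65, 1.20, 0.95]
cfd = [2.05, 1.70, 1.25, 0.90]
err = relative_error(lab, cfd)
print(f"relative error: {err.min():.1%}-{err.max():.1%}, mean {err.mean():.1%}, "
      f"RMSE {rmse(lab, cfd):.3f} m")
```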
The question of estimation has attracted great interest in engineering, statistical applications, and various applied and human sciences; the methods it provides help to identify many random processes accurately.
In this paper, the reliability function, the hazard (risk) function, and the distribution parameters were estimated using two methods: the method of moments and the maximum likelihood method. An empirical study was conducted using simulation to compare the methods and show which of them performs best in practical application, based on observations generated from the Rayleigh logarithmic (RL) distribution with sample sizes
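Since the abstract does not reproduce the Rayleigh logarithmic density, here is a minimal simulation sketch of the comparison it describes, using the standard Rayleigh distribution as a stand-in; the true parameter value, sample size, and replication count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def mle_sigma(x):
    # Maximum likelihood estimator of the Rayleigh scale parameter.
    return np.sqrt(np.sum(x**2) / (2 * len(x)))

def moment_sigma(x):
    # Method-of-moments estimator, from E[X] = sigma * sqrt(pi/2).
    return np.mean(x) * np.sqrt(2 / np.pi)

sigma_true, n, reps = 1.5, 30, 5000
mse = {"MLE": 0.0, "Moments": 0.0}
for _ in range(reps):
    x = rng.rayleigh(scale=sigma_true, size=n)
    mse["MLE"] += (mle_sigma(x) - sigma_true) ** 2
    mse["Moments"] += (moment_sigma(x) - sigma_true) ** 2
for name, total in mse.items():
    print(f"{name}: MSE = {total / reps:.5f}")
```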
This deals with the estimation of the reliability function and one shape parameter of the two-parameter Burr–XII distribution, when the other shape parameter is known (taking the values 0.5, 1, 1.5) and the initial value of the estimated parameter is 1, while different sample sizes (n = 10, 20, 30, 50) are used. The results rest on an empirical study: simulation experiments are applied to compare the four methods of estimation, as well as to compute the reliability function. The mean square error results indicate that the Jackknife estimator is better than the other three estimators, for all sample sizes and parameter values.
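A hedged sketch of the quantities involved, assuming the usual two-parameter Burr–XII reliability R(t) = (1 + t^c)^(-k), with c treated as the known shape parameter and k as the estimated one; the jackknife shown is the generic bias-corrected form built on the closed-form MLE of k, not necessarily the paper's exact estimator, and all numbers are illustrative.

```python
import numpy as np

def burr_reliability(t, c, k):
    """Two-parameter Burr-XII reliability: R(t) = (1 + t^c)^(-k)."""
    return (1.0 + t**c) ** (-k)

def mle_k(t, c):
    """Closed-form MLE of the shape parameter k when c is known."""
    return len(t) / np.sum(np.log1p(t**c))

def jackknife_k(t, c):
    """Bias-corrected jackknife estimate built on the MLE of k."""
    n = len(t)
    full = mle_k(t, c)
    loo = np.array([mle_k(np.delete(t, i), c) for i in range(n)])
    return n * full - (n - 1) * loo.mean()

rng = np.random.default_rng(1)
c, k, n = 1.0, 1.5, 30                         # illustrative values
u = rng.uniform(size=n)
t = ((1 - u) ** (-1 / k) - 1) ** (1 / c)       # inverse-CDF sampling of Burr-XII
k_hat = jackknife_k(t, c)
print(f"k_hat = {k_hat:.3f}, R(1) = {burr_reliability(1.0, c, k_hat):.3f}")
```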
In this research, the focus was placed on estimating the parameters of the Hypoexponential distribution using the maximum likelihood method and a genetic algorithm. More than one criterion, including the MSE, was adopted for comparison using the simulation method.
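A toy illustration of the idea, assuming a two-phase Hypoexponential density f(t) = λ1·λ2/(λ2−λ1)·(e^(−λ1·t) − e^(−λ2·t)) and a deliberately simple genetic algorithm (truncation selection plus Gaussian mutation); population size, generation count, and data are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(2)

def loglik(lams, t):
    """Log-likelihood of a two-phase hypoexponential sample."""
    l1, l2 = lams
    if l1 <= 0 or l2 <= 0 or np.isclose(l1, l2):
        return -np.inf
    dens = l1 * l2 / (l2 - l1) * (np.exp(-l1 * t) - np.exp(-l2 * t))
    if np.any(dens <= 0):
        return -np.inf
    return np.sum(np.log(dens))

def ga_mle(t, pop_size=60, gens=200, sigma=0.1):
    """Toy genetic algorithm: keep the best quarter, mutate, repeat."""
    pop = rng.uniform(0.05, 5.0, size=(pop_size, 2))
    for _ in range(gens):
        fit = np.array([loglik(p, t) for p in pop])
        elite = pop[np.argsort(fit)[-pop_size // 4:]]
        children = elite[rng.integers(len(elite), size=pop_size)]
        pop = np.abs(children + rng.normal(0, sigma, children.shape))
        pop[0] = elite[-1]                     # elitism: keep the best unchanged
    fit = np.array([loglik(p, t) for p in pop])
    return pop[np.argmax(fit)]

# Illustrative data: sums of two exponentials with rates 1.0 and 3.0.
t = rng.exponential(1 / 1.0, 500) + rng.exponential(1 / 3.0, 500)
print(ga_mle(t))
```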
This research discusses the comparison between the partial least squares regression model and tree regression. These models cover two types of statistical methods: the first, "parametric statistics", is represented by partial least squares, which is adopted when the number of variables is greater than the number of observations and also when the number of observations is larger than the number of variables; the second, "nonparametric statistics", is represented by tree regression, which divides the data in a hierarchical way. The regression models were estimated for both approaches and then compared, the comparison being made according to the mean square error.
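A minimal sketch of such a comparison using scikit-learn's PLSRegression and DecisionTreeRegressor with a mean-square-error criterion; the data set and hyper-parameters are illustrative assumptions, not the study's.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(3)

# Illustrative data: 40 observations and 10 predictors.
X = rng.normal(size=(40, 10))
y = X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.5, size=40)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "PLS regression": PLSRegression(n_components=3).fit(X_tr, y_tr),
    "Tree regression": DecisionTreeRegressor(max_depth=4, random_state=0).fit(X_tr, y_tr),
}
for name, model in models.items():
    mse = mean_squared_error(y_te, np.ravel(model.predict(X_te)))
    print(f"{name}: test MSE = {mse:.3f}")
```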
In this paper a new method is proposed to perform N-Radon orthogonal frequency division multiplexing (OFDM), which is equivalent in spectral efficiency to 4-quadrature amplitude modulation (QAM), 16-QAM, 64-QAM, 256-QAM, etc. This non-conventional method is proposed in order to reduce the constellation energy and increase spectral efficiency. The proposed method gives a significant improvement in bit error rate performance, and keeps bandwidth efficiency and spectrum shape as good as conventional Fast Fourier Transform based OFDM. The new structure was tested and compared with conventional OFDM for additive white Gaussian noise, flat, and multipath selective fading channels. Simulation tests were generated for different channels
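The Radon-based mapping itself is not reproduced here, but the conventional FFT-based OFDM baseline against which it is compared can be sketched as follows, with 4-QAM over an AWGN channel and illustrative parameter choices.

```python
import numpy as np

rng = np.random.default_rng(4)

n_sub, n_sym = 64, 200                  # subcarriers, OFDM symbols (illustrative)
bits = rng.integers(0, 2, size=(n_sym, n_sub, 2))

# Gray-mapped 4-QAM (QPSK): each bit pair -> one unit-energy complex symbol.
sym = ((2 * bits[..., 0] - 1) + 1j * (2 * bits[..., 1] - 1)) / np.sqrt(2)

tx = np.fft.ifft(sym, axis=1)           # conventional IFFT-based OFDM modulator
snr_db = 10
# Per-sample noise variance chosen so the per-subcarrier SNR equals snr_db
# after the receiver FFT (the IFFT scales the signal power by 1/n_sub).
noise_var = 10 ** (-snr_db / 10) / n_sub
noise = np.sqrt(noise_var / 2) * (rng.normal(size=tx.shape) + 1j * rng.normal(size=tx.shape))
rx = np.fft.fft(tx + noise, axis=1)     # demodulate back to subcarrier symbols

bits_hat = np.stack([(rx.real > 0).astype(int), (rx.imag > 0).astype(int)], axis=-1)
print(f"AWGN BER at {snr_db} dB: {np.mean(bits_hat != bits):.5f}")
```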
To improve processor efficiency in recent multiprocessor systems, cache memories are used to access data instead of main memory, which reduces access latency. In such systems, when different caches are installed in different processors in a shared-memory architecture, difficulties appear when consistency must be maintained between the cache memories of the different processors, so a cache coherency protocol is very important in such systems. MSI, MESI, MOSI, MOESI, etc. are the well-known protocols for solving the cache coherency problem. In this research we have proposed integrating two states of the MESI cache coherence protocol, Exclusive and Modified, which responds to a read request
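For context, a minimal sketch of the standard MESI transitions for a single cache line, the baseline that the proposal modifies; the merged Exclusive/Modified behaviour proposed in the paper is not implemented here.

```python
from enum import Enum

class State(Enum):
    MODIFIED = "M"
    EXCLUSIVE = "E"
    SHARED = "S"
    INVALID = "I"

def on_processor_read(state, others_have_copy):
    """Local read: a miss fetches the line as Exclusive or Shared."""
    if state is State.INVALID:
        return State.SHARED if others_have_copy else State.EXCLUSIVE
    return state                        # M, E, S all satisfy the read locally

def on_processor_write(state):
    """Local write: always ends in Modified (other copies are invalidated)."""
    return State.MODIFIED

def on_bus_read(state):
    """Another cache reads the line: M/E downgrade to Shared (M writes back)."""
    return State.SHARED if state in (State.MODIFIED, State.EXCLUSIVE) else state

def on_bus_write(state):
    """Another cache writes the line: our copy is invalidated."""
    return State.INVALID

s = State.INVALID
s = on_processor_read(s, others_have_copy=False)   # -> EXCLUSIVE
s = on_processor_write(s)                           # -> MODIFIED (silent upgrade)
print(s)
```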
The deep drawing process used to produce a square cup is very complex because many process parameters control it, and it is therefore associated with many defects such as earing, wrinkling, and fracture. The aim of this study is to examine the effect of some process parameters and determine the values that give the best result; the thickness distributions and cup depths were used to estimate the effect of the parameters on the cup numerically, with experimental verification only for the conditions that gave the best numerical predictions, in order to reduce the time, effort, and cost of producing a square cup with fewer defects experimentally. The numerical analysis is used to study
In this research, the semiparametric Bayesian method is compared with the classical method to estimate the reliability function of three systems: a k-out-of-n system, a series system, and a parallel system. Each system consists of three components: the first is the parametric component, whose failure times are exponentially distributed, whereas the second and third are nonparametric components whose reliability estimates depend on the kernel method, using two methods to estimate the bandwidth parameter h, and on the Kaplan–Meier method. To indicate the better method for estimating the system reliability function, it has be
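A brief sketch of two ingredients named in the abstract, the Kaplan–Meier reliability estimate and the combination rules for series, parallel, and k-out-of-n structures, assuming independent components and distinct failure times; all numbers are illustrative.

```python
import numpy as np
from itertools import product

def kaplan_meier(times, events):
    """Kaplan-Meier reliability S(t) after each observation (1 = failure, 0 = censored)."""
    order = np.argsort(times)
    times, events = np.asarray(times)[order], np.asarray(events)[order]
    at_risk, s, curve = len(times), 1.0, []
    for t, d in zip(times, events):
        if d:
            s *= 1.0 - 1.0 / at_risk    # step down only at observed failures
        curve.append((float(t), s))
        at_risk -= 1                    # failures and censorings both leave the risk set
    return curve

def series(r):
    return float(np.prod(r))

def parallel(r):
    return 1.0 - float(np.prod(1.0 - np.asarray(r)))

def k_out_of_n(r, k):
    """Probability that at least k of the independent components work."""
    r = np.asarray(r, dtype=float)
    return sum(np.prod([ri if s else 1 - ri for ri, s in zip(r, states)])
               for states in product([0, 1], repeat=len(r)) if sum(states) >= k)

r = [0.90, 0.85, 0.95]                  # illustrative component reliabilities
print(series(r), parallel(r), k_out_of_n(r, 2))
```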
This paper deals with the modeling of a preventive maintenance strategy applied to a single-unit system subject to random failures. According to this policy, the system is subjected to imperfect periodic preventive maintenance restoring it to 'as good as new' with probability p and leaving it at state 'as bad as old' with probability q. Imperfect repairs are performed following failures occurring between consecutive preventive maintenance actions, i.e., the times between failures follow a decreasing quasi-renewal process with parameter a. Considering the average durations of the preventive and corrective maintenance actions a
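A Monte Carlo sketch of this policy, assuming the Wang–Pham form of a quasi-renewal process (the k-th inter-failure time scales an exponential base time by a**k, decreasing for a < 1); the PM period, probability p, and maintenance durations are illustrative assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(5)

def simulate_history(T_pm, p, a, mean_ttf, horizon, pm_dur=0.05, cm_dur=0.2):
    """One simulated history of the imperfect periodic-PM policy.

    Failures follow a decreasing quasi-renewal process: the k-th time
    between failures is a**k times an exponential base time.  Every T_pm
    time units a PM action restores the unit to 'as good as new' with
    probability p, and leaves it 'as bad as old' otherwise.  Returns the
    number of failures and the accumulated maintenance downtime.
    """
    t, k, failures, downtime = 0.0, 0, 0, 0.0
    next_pm = T_pm
    next_fail = t + (a ** k) * rng.exponential(mean_ttf)
    while t < horizon:
        if next_fail < next_pm:          # failure -> imperfect corrective repair
            t = next_fail
            failures += 1
            downtime += cm_dur
            k += 1                       # degradation deepens: shorter next gap
        else:                            # scheduled preventive maintenance
            t = next_pm
            next_pm += T_pm
            downtime += pm_dur
            if rng.random() < p:
                k = 0                    # restored to 'as good as new'
        next_fail = t + (a ** k) * rng.exponential(mean_ttf)
    return failures, downtime

runs = [simulate_history(T_pm=1.0, p=0.7, a=0.9, mean_ttf=2.0, horizon=50.0)
        for _ in range(2000)]
fails, down = np.mean(runs, axis=0)
print(f"mean failures: {fails:.2f}, mean downtime: {down:.2f}")
```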