In the present research, the nuclear deformation of the Ne, Mg, Si, S, Ar, and Kr even–even isotopes has been investigated within the framework of the Hartree–Fock–Bogoliubov (HFB) method with the SLy4 Skyrme parameterization. In particular, the deformed shapes arising from the collective motion of the nucleons, through the coupling between the single-particle motion and the potential surface, have been studied. Furthermore, the binding energy, the single-particle nuclear density distributions, the corresponding nuclear radii, and the quadrupole deformation parameter have also been calculated and compared with the available experimental data. From the outcome of our investigation, it is possible to conclude that the deformation effects cannot be neglected in a characterization of the structure of the neutron-rich nuclei. The coupling between the single-particle motion and the potential surface shows that changes in the interactions between the nucleons drive the evolution of the nuclear surface and lead to variations in the shape of the potential.
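As an illustration of how the quadrupole deformation parameter mentioned above is typically extracted from a mean-field result, the following minimal sketch converts an intrinsic charge quadrupole moment Q0 into β2 using one common convention, β2 = √(5π)·Q0 / (3·Z·R0²) with R0 = 1.2·A^(1/3) fm; the moment value, the nucleus, and the convention itself are illustrative assumptions, not quantities taken from the paper.

```python
import numpy as np

def beta2_from_q0(q0_efm2, Z, A, r0=1.2):
    """Convert an intrinsic charge quadrupole moment Q0 (e fm^2) into the
    dimensionless deformation beta2 with the common relation
    beta2 = sqrt(5*pi) * Q0 / (3 * Z * R0**2), R0 = r0 * A**(1/3) fm.
    Conventions differ between codes; this is one illustrative choice."""
    R0 = r0 * A ** (1.0 / 3.0)
    return np.sqrt(5.0 * np.pi) * q0_efm2 / (3.0 * Z * R0 ** 2)

# Hypothetical example: a prolate Q0 of 60 e fm^2 for 24Mg (Z = 12, A = 24)
print(f"beta2 ~ {beta2_from_q0(60.0, Z=12, A=24):.3f}")
```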
In this article, a new efficient approach is presented for solving a class of partial differential equations, namely (2+1)-dimensional nonlinear and nonhomogeneous differential equations. The procedure of the new approach is proposed for solving important types of differential equations and obtaining accurate analytic solutions, i.e., exact solutions. The effectiveness of the suggested approach, based on its properties, is compared with other approaches that have been used to solve this type of differential equation, such as the Adomian decomposition method, the homotopy perturbation method, the homotopy analysis method, and the variational iteration method. The advantage of the present method is illustrated by several examples.
In this paper, one of the machine scheduling problems is studied: the problem of scheduling a number of products (n jobs) on one (single) machine with a multi-criteria objective function. The criteria are the completion time, the tardiness, the earliness, and the late work, combined into the total objective Σ(Ci + Ti + Ei + Vi). The branch and bound (BAB) method is used as the main method for solving the problem, where four upper bounds and one lower bound are proposed and a number of dominance rules are considered to reduce the number of branches in the search tree. The genetic algorithm (GA) and particle swarm optimization (PSO) are used to obtain two of the upper bounds. The computational results are calculated by coding (progr
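As a concrete illustration of the four criteria named above, the sketch below evaluates them for a given job sequence on a single machine, using the standard single-machine definitions (tardiness Ti = max(Ci − di, 0), earliness Ei = max(di − Ci, 0), late work Vi = min(Ti, pi)); the processing times and due dates are hypothetical, and this is not the authors' code.

```python
def multicriteria_objective(sequence, p, d):
    """Sum of completion time, tardiness, earliness and late work for a
    job sequence on a single machine (standard definitions)."""
    t, total = 0, 0
    for j in sequence:
        t += p[j]                    # completion time C_j
        T = max(t - d[j], 0)         # tardiness T_j
        E = max(d[j] - t, 0)         # earliness E_j
        V = min(T, p[j])             # late work V_j
        total += t + T + E + V
    return total

# Hypothetical 4-job instance: processing times p and due dates d
p = {1: 3, 2: 5, 3: 2, 4: 4}
d = {1: 4, 2: 9, 3: 6, 4: 10}
print(multicriteria_objective([1, 3, 2, 4], p, d))
```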
The research compared two methods for estimating the four parameters of the compound exponential Weibull–Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes and initial parameter values and under different levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution in cases of both natural and contaminated data.
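To make the comparison above concrete, the following sketch shows how a Downhill Simplex (Nelder–Mead) search can maximize a likelihood numerically. Since the abstract does not give the density of the compound exponential Weibull–Poisson distribution, a plain two-parameter Weibull log-likelihood is used as a stand-in, and the data are simulated; none of this reproduces the paper's exact setting.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

# Simulated data (stand-in for the real sample)
data = weibull_min.rvs(c=1.5, scale=2.0, size=200, random_state=0)

def neg_loglik(params):
    """Negative log-likelihood of a two-parameter Weibull; replace with the
    compound exponential Weibull-Poisson density for the paper's setting."""
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf
    return -np.sum(weibull_min.logpdf(data, c=shape, scale=scale))

# Downhill Simplex (Nelder-Mead) search from a rough starting point
result = minimize(neg_loglik, x0=[1.0, 1.0], method="Nelder-Mead")
print(result.x)   # estimated (shape, scale)
```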
In the present work, an image compression method has been modified by combining the Absolute Moment Block Truncation Coding (AMBTC) algorithm with VQ-based image coding. At the beginning, the AMBTC algorithm, based on a Weber's-law condition, is used to distinguish low-detail and high-detail blocks in the original image. The coder transmits only the mean of a low-detail block (i.e., a uniform block such as background) on the channel, instead of transmitting the two reconstruction mean values and the bit map for this block. A high-detail block, in contrast, is coded by the proposed fast encoding algorithm for vector quantization based on the Triangular Inequality Theorem (TIE), and the coder then transmits the two reconstruction mean values (i.e., H and L)
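For reference, a minimal sketch of the AMBTC step described above: for each block it computes the block mean, the bit map, and the two reconstruction values H and L (the means of the pixels above and below the block mean). The block size and sample values are illustrative; the Weber's-law classification and the TIE-based VQ stage of the proposed coder are not included.

```python
import numpy as np

def ambtc_encode(block):
    """AMBTC for one image block: returns (L, H, bitmap), where the bitmap
    marks pixels >= the block mean, H is the mean of those pixels and
    L is the mean of the remaining ones."""
    m = block.mean()
    bitmap = block >= m
    high, low = block[bitmap], block[~bitmap]
    H = high.mean() if high.size else m
    L = low.mean() if low.size else m
    return L, H, bitmap

def ambtc_decode(L, H, bitmap):
    """Reconstruct the block from the two means and the bit map."""
    return np.where(bitmap, H, L)

# Illustrative 4x4 block with a flat region and an edge
block = np.array([[10, 12, 200, 210],
                  [11, 13, 205, 220],
                  [ 9, 14, 198, 215],
                  [12, 15, 202, 208]], dtype=float)
L, H, bitmap = ambtc_encode(block)
print(L, H)
print(ambtc_decode(L, H, bitmap))
```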
The article is devoted to the possibility of using a linguistically oriented methodology for Arab students in teaching Russian as a foreign language. The term "linguistically oriented methodology", proposed by V. N. Vagner, is substantiated, and, on the basis of the provisions of this methodology, the language being studied (Russian) is compared with the students' native language (Arabic).
In this paper, Azzalini's method is used to find a weighted distribution derived from the standard Pareto distribution of type I (SPDTI) by inserting the shape parameter (θ) resulting from this method, so as to cover the interval (0, 1], which is neglected by the standard distribution. Thus, the proposed distribution is a modification of the Pareto distribution of the first type in which the random variable also takes values in the interval (0, 1]. The properties of the modified weighted Pareto distribution of type I (MWPDTI), such as the probability density function, cumulative distribution function, reliability function, moments, and hazard function, are found. The behaviour of the probability density function of the MWPDTI distrib
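The abstract does not give the explicit MWPDTI density, so the sketch below only illustrates the general weighted-distribution construction behind Azzalini-type modifications: a base Pareto Type I density f is multiplied by a weight built from its own CDF evaluated at θx and renormalized numerically. The weight, the parameter values, and the normalization range are illustrative assumptions, not the paper's exact distribution.

```python
import numpy as np
from scipy.stats import pareto
from scipy.integrate import quad

alpha, theta = 3.0, 2.0          # illustrative shape parameters
base = pareto(b=alpha)           # standard Pareto Type I with x_m = 1

def unnormalized(x):
    """Azzalini-type weighting: base density times base CDF at theta*x."""
    return base.pdf(x) * base.cdf(theta * x)

# Normalizing constant so the weighted function integrates to one
C, _ = quad(unnormalized, 1.0, np.inf)

def weighted_pdf(x):
    return unnormalized(x) / C

print(quad(weighted_pdf, 1.0, np.inf)[0])   # ~1.0 sanity check
```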
In this paper, two local search algorithms, the genetic algorithm (GA) and particle swarm optimization (PSO), are used for scheduling a number of products (n jobs) on a single machine to minimize a multi-objective function composed of the total completion time, total tardiness, total earliness, and total late work. A branch and bound (BAB) method is used for comparing the results for n jobs ranging from 5 to 18. The results show that the two algorithms find optimal and near-optimal solutions in reasonable times.
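As a rough illustration of how a GA can search job sequences for this kind of objective, the sketch below evolves permutations with order crossover and swap mutation; the instance data, population settings, and operators are illustrative choices, not those of the paper, and the PSO and BAB components are not shown.

```python
import random

def objective(seq, p, d):
    """Total completion time + tardiness + earliness + late work."""
    t, total = 0, 0
    for j in seq:
        t += p[j]
        T = max(t - d[j], 0)
        total += t + T + max(d[j] - t, 0) + min(T, p[j])
    return total

def order_crossover(a, b):
    """Order crossover (OX) for job permutations."""
    n = len(a)
    i, j = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[i:j] = a[i:j]
    fill = [g for g in b if g not in child]
    for k in range(n):
        if child[k] is None:
            child[k] = fill.pop(0)
    return child

def ga(p, d, pop_size=30, generations=200, mut_rate=0.2):
    jobs = list(p)
    pop = [random.sample(jobs, len(jobs)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda s: objective(s, p, d))
        nxt = pop[:2]                              # keep the two best (elitism)
        while len(nxt) < pop_size:
            a, b = random.sample(pop[:10], 2)      # pick parents among the best
            child = order_crossover(a, b)
            if random.random() < mut_rate:         # swap mutation
                i, j = random.sample(range(len(child)), 2)
                child[i], child[j] = child[j], child[i]
            nxt.append(child)
        pop = nxt
    best = min(pop, key=lambda s: objective(s, p, d))
    return best, objective(best, p, d)

# Hypothetical 6-job instance
p = {1: 3, 2: 5, 3: 2, 4: 4, 5: 6, 6: 1}
d = {1: 4, 2: 9, 3: 6, 4: 10, 5: 14, 6: 3}
print(ga(p, d))
```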
In this study, different methods were used for estimating the location parameter and scale parameter of the extreme value distribution, such as maximum likelihood estimation (MLE), method of moments estimation (ME), and approximation estimators based on percentiles, known as White's method of estimation, as the extreme value distribution is one of the exponential distributions. Ordinary least squares estimation (OLS), weighted least squares estimation (WLS), ridge regression estimation (Rig), and adjusted ridge regression estimation (ARig) were also used. Two parameters for the expected value of the percentile as estimation for the distribution f
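For the first two estimators mentioned above, the sketch below fits a Gumbel (type I extreme value) distribution to simulated data: maximum likelihood via scipy, and the standard moment estimators (scale = s·√6/π, location = mean − γ·scale, with γ the Euler–Mascheroni constant). The simulated sample and true parameter values are illustrative; the regression-based estimators are not shown.

```python
import numpy as np
from scipy.stats import gumbel_r

# Simulated sample from a Gumbel distribution (illustrative parameters)
data = gumbel_r.rvs(loc=5.0, scale=2.0, size=500, random_state=0)

# Maximum likelihood estimates of the location and scale parameters
loc_mle, scale_mle = gumbel_r.fit(data)

# Method-of-moments estimates: scale = s*sqrt(6)/pi, loc = mean - gamma*scale
gamma = 0.5772156649015329          # Euler-Mascheroni constant
scale_mom = np.std(data, ddof=1) * np.sqrt(6) / np.pi
loc_mom = np.mean(data) - gamma * scale_mom

print("MLE:    ", loc_mle, scale_mle)
print("Moments:", loc_mom, scale_mom)
```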
The main problem when dealing with fuzzy data variables is that a model representing the data cannot be formed through the Fuzzy Least Squares Estimator (FLSE) method, which gives false estimates and invalidates the method when the problem of multicollinearity is present. To overcome this problem, the Fuzzy Bridge Regression Estimator (FBRE) method was relied upon to estimate a fuzzy linear regression model with triangular fuzzy numbers. Moreover, the problem of multicollinearity in the fuzzy data can be detected by using the Variance Inflation Factor when the input variables of the model are crisp while the output variable and parameters are fuzzy. The results were compared usin
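The crisp part of the multicollinearity diagnostic described above can be illustrated with the usual Variance Inflation Factor, VIF_j = 1 / (1 − R²_j), where R²_j comes from regressing the j-th input on the remaining ones. The sketch below computes it for hypothetical crisp inputs; it does not implement the fuzzy FBRE estimator itself.

```python
import numpy as np

def vif(X):
    """Variance Inflation Factor for each column of the design matrix X:
    VIF_j = 1 / (1 - R^2_j), with R^2_j from regressing column j on the
    remaining columns (with an intercept)."""
    n, k = X.shape
    out = []
    for j in range(k):
        y = X[:, j]
        Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1.0 - resid @ resid / np.sum((y - y.mean()) ** 2)
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

# Hypothetical crisp inputs with two nearly collinear columns
rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = x1 + rng.normal(scale=0.05, size=100)   # close to x1 -> high VIF
x3 = rng.normal(size=100)
print(vif(np.column_stack([x1, x2, x3])))
```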