In the present research, the nuclear deformation of the Ne, Mg, Si, S, Ar, and Kr even–even isotopes has been investigated within the framework of the Hartree–Fock–Bogoliubov method with the SLy4 Skyrme parameterization. In particular, the deformed shapes arising from the collective motion of nucleons, through the coupling between single-particle motion and the potential surface, have been studied. Furthermore, the binding energies, single-particle nuclear density distributions, corresponding nuclear radii, and quadrupole deformation parameters have also been calculated and compared with the available experimental data. From the outcome of our investigation, it is possible to conclude that deformation effects cannot be neglected in a characterization of the structure of neutron-rich nuclei. The relation between the single-particle motion and the potential surface indicates that changes in the interactions between the nucleons drive the evolution of the nuclear surface and lead to variations in the potential shape.
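Since the abstract reports quadrupole deformation parameters, a minimal sketch of how β₂ is typically extracted from a computed intrinsic quadrupole moment is given below, assuming the common first-order convention Q₀ = 3/√(5π) · Z R₀² β₂ with R₀ = 1.2 A^(1/3) fm; the function name and the numerical values in the example are purely illustrative and are not results of the paper.

```python
import math

def beta2_from_q0(q0_fm2: float, Z: int, A: int, r0: float = 1.2) -> float:
    """Estimate the quadrupole deformation parameter beta_2 from an
    intrinsic charge quadrupole moment Q0 (in fm^2), using the first-order
    relation Q0 = 3/sqrt(5*pi) * Z * R0^2 * beta_2 with R0 = r0 * A**(1/3) fm
    (one common convention; higher-order corrections are ignored)."""
    R0 = r0 * A ** (1.0 / 3.0)
    return math.sqrt(5.0 * math.pi) * q0_fm2 / (3.0 * Z * R0 ** 2)

# Example with a hypothetical Q0 of 66 fm^2 for 24Mg (Z=12, A=24)
print(f"beta_2 = {beta2_from_q0(66.0, Z=12, A=24):.2f}")
```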
Growth curves for children are among the most commonly used tools to assess the general welfare of a society, children being one of the pillars of its development; through these tools we can track a child's growth physiology. Centile lines are among the important tools used to build these curves: they give an accurate interpretation of the data and respond to the explanatory variable, age. To build standard growth curves we use the body mass index (BMI) as the index. The LMSP method is used to find the centile lines; it depends on four curves representing the median, the coefficient of variation, the skewness, and the kurtosis, which are obtained by modeling the four parameters as nonparametric smoothing functions of the explanatory variable.
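Because the abstract builds on the LMS family of centile formulas, a hedged sketch follows; it shows Cole's basic LMS centile formula (the LMSP variant adds a fourth, kurtosis curve, which is omitted here), and the parameter values in the example call are hypothetical.

```python
import math
from scipy.stats import norm

def lms_centile(alpha: float, L: float, M: float, S: float) -> float:
    """Centile of order alpha (e.g. 0.97 for the 97th centile) from the
    LMS parameters at a given age:
        C_alpha = M * (1 + L*S*z_alpha)**(1/L)   for L != 0
        C_alpha = M * exp(S*z_alpha)             for L == 0
    where z_alpha is the standard-normal quantile."""
    z = norm.ppf(alpha)
    if abs(L) < 1e-12:
        return M * math.exp(S * z)
    return M * (1.0 + L * S * z) ** (1.0 / L)

# Hypothetical LMS values for one age group: L=-1.6, M=16.9, S=0.11
print(round(lms_centile(0.97, -1.6, 16.9, 0.11), 2))
```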
Many fuzzy clustering methods are based only on within-cluster scatter with a compactness measure. This paper explains a fuzzy clustering method that depends on both within-cluster scatter with a compactness measure and between-cluster scatter with a separation measure, called fuzzy compactness and separation (FCS). Fuzzy linear discriminant analysis (FLDA) is based on the within-cluster scatter matrix and the between-cluster scatter matrix; the two fuzzy scatter matrices in the objective function ensure compactness between data elements and cluster centers. Testing for the optimal number of clusters using a cluster-validation method is discussed, and an illustrative example is then applied.
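As a rough illustration of the kind of objective the abstract describes, the sketch below evaluates a compactness term (memberships weighting distances to cluster centers) minus a separation term (distances of the centers from the grand mean); the exact weighting used by FCS may differ, so the function name, the parameter eta, and the formulation here are assumptions for illustration only.

```python
import numpy as np

def fcs_like_objective(X, centers, U, m=2.0, eta=0.5):
    """Sketch of a compactness-and-separation objective:
    compactness = sum_ij u_ij^m ||x_j - a_i||^2   (within-cluster scatter)
    separation  = sum_ij u_ij^m ||a_i - x_bar||^2 (between-cluster scatter)
    objective   = compactness - eta * separation
    X: (n, d) data, centers: (c, d), U: (c, n) fuzzy memberships."""
    x_bar = X.mean(axis=0)
    d_within = ((X[None, :, :] - centers[:, None, :]) ** 2).sum(axis=2)  # (c, n)
    d_between = ((centers - x_bar) ** 2).sum(axis=1, keepdims=True)      # (c, 1)
    Um = U ** m
    return (Um * d_within).sum() - eta * (Um * d_between).sum()
```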
The usage of remote sensing techniques in managing and monitoring environmental areas is increasing due to the improvement of the sensors carried by the Earth-observation satellites. The resolution merge process is used to combine a high-resolution single-band image with a low-resolution multi-band image to produce one image that is high in both spatial and spectral resolution. In this work, different merging methods were tested to evaluate their enhancement capabilities for extracting different environmental areas; principal component analysis (PCA), Brovey, the modified Intensity-Hue-Saturation (IHS) method, and the high-pass filter method were tested and subjected to visual and statistical comparison for evaluation.
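Of the fusion methods named, the Brovey transform has a simple closed form; a minimal numpy sketch is shown below, assuming the multispectral bands have already been resampled to the panchromatic grid. This is generic pan-sharpening code, not the specific processing chain of the study.

```python
import numpy as np

def brovey_fusion(ms: np.ndarray, pan: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """Brovey transform pan-sharpening sketch.
    ms : (bands, H, W) multispectral image resampled to the panchromatic grid.
    pan: (H, W) high-resolution panchromatic band.
    Each fused band = MS_band / sum(MS_bands) * PAN."""
    total = ms.sum(axis=0) + eps          # avoid division by zero
    return ms / total * pan[None, :, :]
```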
In this article, the inverse source problem is determined for a hyperbolic partial differential equation under a flux-tension condition at the left end of the string, where an extra measurement is considered. The approximate solution is obtained by splitting the problem and applying the finite difference method (FDM). Moreover, this problem is ill-posed: the recovered force becomes unstable once noise is added to the additional condition. To stabilize the solution, a regularization matrix is considered, and error estimates between the regularized solution and the exact solution are proved. The numerical results show that the method is efficient and stable.
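The stabilization step described here is a regularization of an ill-posed linear system; a generic Tikhonov sketch is given below as an illustration of that idea, with the matrix A, data b, parameter lam, and regularization matrix L standing in for the discretized operators of the paper rather than reproducing them.

```python
import numpy as np

def tikhonov_solve(A, b, lam, L=None):
    """Solve min_x ||A x - b||^2 + lam * ||L x||^2, the standard
    Tikhonov-regularized least-squares problem, via the normal equations
    (A^T A + lam * L^T L) x = A^T b.  L defaults to the identity."""
    n = A.shape[1]
    if L is None:
        L = np.eye(n)
    return np.linalg.solve(A.T @ A + lam * (L.T @ L), A.T @ b)
```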
Segmentation of real-world images is considered one of the most challenging tasks in the computer vision field due to several issues associated with this kind of image, such as high interference between object foreground and background, complicated objects, and cases where the pixel intensities of the object and the background are almost similar. This research introduces a modified adaptive segmentation process with image contrast stretching, namely gamma stretching, to improve the segmentation problem. The iterative segmentation process based on the proposed criteria gives the segmentation process the flexibility to find the suitable region of interest. In addition, the use of gamma stretching helps in separating the object from its background.
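Gamma stretching itself is a standard power-law transform; the short sketch below shows one common way it is applied to an 8-bit grayscale image. The gamma value is a hypothetical choice, and the study's adaptive criteria for selecting it are not reproduced here.

```python
import numpy as np

def gamma_stretch(image: np.ndarray, gamma: float = 0.5) -> np.ndarray:
    """Gamma (power-law) contrast stretching for an 8-bit grayscale image:
    normalise to [0, 1], raise to the power gamma, rescale to [0, 255].
    gamma < 1 brightens dark regions; gamma > 1 darkens bright ones."""
    norm = image.astype(np.float64) / 255.0
    return np.clip(255.0 * norm ** gamma, 0, 255).astype(np.uint8)
```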
In this paper, we present a branch and bound (B&B) algorithm for scheduling n jobs on a single machine to minimize the sum of total completion time, total tardiness, total earliness, number of tardy jobs, and total late work with unequal release dates. We propose six heuristic methods to compute an upper bound. To obtain a lower bound (LB) for this problem, we modify an LB selected from the literature, combined with Moore's algorithm and Lawler's algorithm. Some dominance rules are suggested, and two special cases are derived. Computational experience showed that the proposed B&B algorithm was effective in solving problems with up to 16 jobs, and that the upper bounds and the lower bound were effective in restricting the search tree.
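Since the lower bound leans on Moore's algorithm, a textbook sketch of the Moore-Hodgson rule (minimizing the number of tardy jobs on a single machine, ignoring release dates) is given below; the job data in the example are hypothetical, and the authors' modified bound is not reproduced.

```python
def moore_hodgson(jobs):
    """Moore-Hodgson rule: minimise the number of tardy jobs on a single
    machine.  jobs is a list of (processing_time, due_date) tuples.
    Returns (on_time_sequence, tardy_jobs)."""
    order = sorted(jobs, key=lambda j: j[1])        # earliest-due-date order
    on_time, tardy, t = [], [], 0
    for p, d in order:
        on_time.append((p, d))
        t += p
        if t > d:                                   # current job would be tardy
            longest = max(on_time, key=lambda j: j[0])
            on_time.remove(longest)                 # drop the longest job so far
            tardy.append(longest)
            t -= longest[0]
    return on_time, tardy

# Example with hypothetical jobs (p_i, d_i)
print(moore_hodgson([(4, 5), (3, 6), (2, 8), (5, 9)]))
```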
The majority of systems dealing with natural language processing (NLP) and artificial intelligence (AI) can assist in making automated and automatically supported decisions. However, these systems may face challenges and difficulties, or find it confusing, to identify the information required for eliciting a decision by extracting or summarizing relevant information from large text documents or colossal content. When these documents are obtained online, for instance from social networking or social media sites, the textual content increases remarkably. The main objective of the present study is to conduct a survey and show the latest developments in the implementation of text-mining techniques.
In this article, a new efficient approach is presented to solve a type of partial differential equation, namely nonlinear, nonhomogeneous (2+1)-dimensional differential equations. The procedure of the new approach is suggested for solving important types of differential equations and obtaining accurate analytic solutions, i.e., exact solutions. The effectiveness of the suggested approach, based on its properties, is compared with other approaches that have been used to solve this type of differential equation, such as the Adomian decomposition method, the homotopy perturbation method, the homotopy analysis method, and the variational iteration method. The advantage of the present method is illustrated by some examples.
In this paper, one of the machine scheduling problems is studied: the problem of scheduling a number of products (n jobs) on one (single) machine with a multi-criteria objective function composed of the completion time, the tardiness, the earliness, and the late work. The branch and bound (BAB) method is used as the main method for solving the problem, where four upper bounds and one lower bound are proposed and a number of dominance rules are considered to reduce the number of branches in the search tree. The genetic algorithm (GA) and particle swarm optimization (PSO) are used to obtain two of the upper bounds. The computational results are calculated by coding (programming) the proposed methods.
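The abstract does not spell out how the four criteria are combined, so the sketch below simply evaluates each of them for a candidate job sequence, using the usual definition of late work as min(p_i, max(0, C_i - d_i)); this is the kind of evaluation an upper-bound heuristic such as the GA or PSO would call, not the authors' exact objective.

```python
def schedule_costs(sequence):
    """Evaluate the four criteria for a given job sequence on one machine.
    sequence: list of (processing_time, due_date) in processing order.
    Returns (total_completion, total_tardiness, total_earliness, total_late_work)."""
    t = 0
    C = T = E = V = 0
    for p, d in sequence:
        t += p                           # completion time of this job
        C += t
        T += max(0, t - d)               # tardiness
        E += max(0, d - t)               # earliness
        V += min(p, max(0, t - d))       # late work
    return C, T, E, V

# Example with hypothetical jobs (p_i, d_i)
print(schedule_costs([(3, 6), (2, 8), (4, 5), (5, 9)]))
```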
The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes and initial parameter values and under different levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution in cases of both natural and contaminated data.
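As the abstract does not give the compound density, the sketch below only illustrates the fitting workflow: maximizing a likelihood with the Downhill Simplex (Nelder-Mead) routine in SciPy, using a plain two-parameter Weibull as a stand-in for the compound exponential Weibull-Poisson model; the sample and starting values are synthetic.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

def neg_log_likelihood(params, data):
    """Negative log-likelihood of a two-parameter Weibull (shape c, scale s),
    used as a stand-in for the compound density, which the abstract does not give."""
    c, s = params
    if c <= 0 or s <= 0:
        return np.inf
    return -weibull_min.logpdf(data, c, scale=s).sum()

rng = np.random.default_rng(0)
sample = weibull_min.rvs(1.5, scale=2.0, size=200, random_state=rng)

# Downhill Simplex (Nelder-Mead) maximisation of the likelihood
fit = minimize(neg_log_likelihood, x0=[1.0, 1.0], args=(sample,),
               method="Nelder-Mead")
print(fit.x)   # estimated (shape, scale)
```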