The key to deriving an explicit statistical formula for a physically specified continuum is to consider a derivative expression that identifies the definitive configuration of the continuum itself. This statistical formula should reflect the whole distribution on which the considered continuum most likely depends; reaching it, however, requires a mathematically and physically tedious path. The procedure in the present research is to establish, modify, and implement an optimized combination of the Airy stress function for elastically deformed media and multi-canonical joint probability density functions for multivariate distribution completion, so that the developed distribution gives a detailed illustration of the yield probability distribution along a cantilever beam subjected to a linearly distributed load. This combined approach clarifies the intensity of the stresses exerted on the beam, standardizes the stress terms and their effects, and converts them into a more meaningful depiction of a probability distribution.
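Purely as an illustration of the kind of computation involved, the following is a minimal sketch assuming an Euler-Bernoulli cantilever of rectangular cross-section under a triangular (linearly distributed) load and a normally distributed yield strength; the beam parameters and the normal yield model are assumptions for illustration, not the paper's multi-canonical joint-density formulation.

```python
import numpy as np
from scipy import stats

# Hypothetical cantilever parameters (not from the paper): length L, width b,
# height h, and a linearly distributed load growing from 0 at the free end
# to w0 at the fixed end.
L, b, h, w0 = 2.0, 0.05, 0.1, 5e3        # m, m, m, N/m
I = b * h**3 / 12.0                      # second moment of area

def bending_stress(x):
    """Peak bending stress (Pa) at distance x from the free end.
    For the triangular load w(x) = w0*x/L, the bending moment at the
    section x is M(x) = w0*x**3 / (6*L)."""
    M = w0 * x**3 / (6.0 * L)
    return M * (h / 2.0) / I

# Treat the yield strength as a normal random variable (assumed values).
sigma_y = stats.norm(loc=250e6, scale=20e6)   # mean 250 MPa, std 20 MPa

x = np.linspace(0.0, L, 50)
p_yield = sigma_y.cdf(bending_stress(x))      # P(yield strength < applied stress)

for xi, pi in zip(x[::10], p_yield[::10]):
    print(f"x = {xi:4.2f} m   P(yield) = {pi:.3e}")
```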
The essential objective of this paper is to introduce new notions of fibrewise topological spaces over D, named upper perfect, lower perfect, multi-perfect, fibrewise upper perfect, fibrewise lower perfect, and fibrewise multi-perfect topological spaces. Several related concepts are also introduced and studied: filter base, contact point, rigid, multi-rigid, fibrewise upper weakly closed, fibrewise lower weakly closed, fibrewise multi-weakly closed set, almost upper perfect, almost lower perfect, almost multi-perfect, fibrewise almost upper perfect, fibrewise almost lower perfect, fibrewise almost multi-perfect, and upper* continuous fibrewise upper* topol…
In this paper, we use four classification methods to classify objects and compare among them: K-Nearest Neighbours (KNN), Stochastic Gradient Descent learning (SGD), Logistic Regression (LR), and Multi-Layer Perceptron (MLP). We used the MCOCO dataset for object classification and detection; the dataset images were randomly divided into training and testing sets at a ratio of 7:3, respectively. The randomly selected training and testing images were converted from colour to grey level, enhanced using the histogram equalization method, and resized to 20 x 20. Principal component analysis (PCA) was used for feature extraction, and finally the four classification methods were applied.
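A minimal sketch of such a pipeline is shown below using scikit-learn; the built-in digits dataset stands in for the MCOCO images, and the COCO-specific preprocessing (grayscale conversion, histogram equalization, 20 x 20 resize) is assumed to happen upstream, so the accuracies reported in the paper are not reproduced.

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import SGDClassifier, LogisticRegression
from sklearn.neural_network import MLPClassifier

# Stand-in data: sklearn's digits images replace the MCOCO preparation
# (grayscale conversion, histogram equalisation, 20x20 resize) described
# in the abstract, which is not reproduced here.
X, y = load_digits(return_X_y=True)

# 70/30 train/test split, as in the paper.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

classifiers = {
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "SGD": SGDClassifier(max_iter=1000, random_state=0),
    "LR":  LogisticRegression(max_iter=1000),
    "MLP": MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0),
}

for name, clf in classifiers.items():
    # PCA feature extraction feeding each classifier, as in the paper.
    model = make_pipeline(StandardScaler(), PCA(n_components=30), clf)
    model.fit(X_tr, y_tr)
    print(f"{name}: test accuracy = {model.score(X_te, y_te):.3f}")
```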
In this paper, two local search algorithms (a genetic algorithm and particle swarm optimization) are used to schedule a number of products (n jobs) on a single machine so as to minimize a multi-objective function, defined as the sum of total completion time, total tardiness, total earliness, and total late work. A branch and bound (BAB) method is used to compare the results for n jobs ranging from 5 to 18. The results show that the two algorithms find the optimal or near-optimal solutions in an appropriate time.
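As a hedged illustration of the combined objective only (not the paper's GA, PSO, or BAB implementation), the sketch below evaluates the sum of total completion time, tardiness, earliness, and late work for a job sequence on a made-up five-job instance and enumerates all sequences.

```python
from itertools import permutations

# Hypothetical instance: processing times p and due dates d are not from the paper.
p = [4, 2, 6, 3, 5]
d = [6, 4, 14, 9, 12]

def objective(seq):
    """Sum of total completion time, total tardiness, total earliness and
    total late work for the jobs scheduled in the order `seq`."""
    t = 0
    total_C = total_T = total_E = total_V = 0
    for j in seq:
        t += p[j]                        # completion time of job j
        tardiness = max(0, t - d[j])
        earliness = max(0, d[j] - t)
        late_work = min(tardiness, p[j]) # processing done after the due date
        total_C += t
        total_T += tardiness
        total_E += earliness
        total_V += late_work
    return total_C + total_T + total_E + total_V

# For this tiny instance, full enumeration stands in for GA/PSO/BAB search.
best = min(permutations(range(len(p))), key=objective)
print("best sequence:", best, "objective value:", objective(best))
```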
In this article, a partially ordered relation is constructed in geodesic spaces by the betweenness property. A monotone sequence is generated in the domain of a monotone inward mapping, it is proved that a monotone inward contraction mapping is a monotone Caristi inward mapping, the general fixed points for such mappings are discussed, and a multivalued version of these results is also introduced.
In this paper three techniques for image compression are implemented. The proposed techniques consist of a three-dimensional (3-D) two-level discrete wavelet transform (DWT), a 3-D two-level discrete multi-wavelet transform (DMWT), and a 3-D two-level hybrid (wavelet-multiwavelet) transform technique. Daubechies and Haar wavelets are used in the discrete wavelet transform, and Critically Sampled preprocessing is used in the discrete multi-wavelet transform. The aim is to increase the compression ratio (CR) as the level of the 3-D transformation increases, so the compression ratio is measured for each level. To obtain good compression, image data properties were measured, such as the image entropy (He) and the percent root-…
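A minimal sketch of the 3-D two-level DWT step is given below using the PyWavelets library with the Haar wavelet; the synthetic volume and the threshold-based compression-ratio estimate are assumptions for illustration, and the DMWT and hybrid variants, as well as the paper's exact CR definition, are not reproduced.

```python
import numpy as np
import pywt

# Synthetic 3-D "image" volume as a stand-in for the test images used in the paper.
rng = np.random.default_rng(0)
volume = rng.normal(size=(32, 32, 32)).cumsum(axis=0)   # mildly correlated data

# 3-D two-level discrete wavelet transform with the Haar wavelet
# (a Daubechies wavelet such as 'db4' can be substituted directly).
coeffs = pywt.wavedecn(volume, wavelet='haar', level=2)
arr, slices = pywt.coeffs_to_array(coeffs)

# Crude compression-ratio estimate: discard coefficients below a threshold
# and compare the number of retained values with the original sample count.
threshold = 0.1 * np.abs(arr).max()
kept = np.count_nonzero(np.abs(arr) > threshold)
cr = volume.size / kept
print(f"retained coefficients: {kept}, approximate CR = {cr:.2f}")

# Image entropy He of the original volume, one of the data properties
# measured in the paper (computed here from a 256-bin histogram).
hist, _ = np.histogram(volume, bins=256)
prob = hist[hist > 0] / volume.size
He = -np.sum(prob * np.log2(prob))
print(f"entropy He = {He:.3f} bits")
```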
The basic concepts of some near open subgraphs, near rough, near exact, and near fuzzy graphs are introduced and sufficiently illustrated. The Gm-closure space induced by closure operators is used to generalize the basic rough graph concepts. We introduce near exactness and near roughness by applying the near concepts to improve the accuracy of the definability of graphs. We give a new definition of a membership function to find near interior, near boundary, and near exterior vertices. Moreover, proved results, examples, and counterexamples are provided. The Gm-closure structure suggested in this paper opens the way for applying a rich amount of topological facts and methods in the process of granular computing.
The main aim of this paper is to use the notion introduced in [1] to offer new classes of separation axioms in ideal spaces. We also offer new types of notions of convergence in ideal spaces via this set. Relations among the several types of separation axioms offered are explained.
A study to find the optimum separator pressures of separation stations has been performed. Stage separation of oil and gas is accomplished with a series of separators operating at sequentially reduced pressures. Liquid is discharged from a higher-pressure separator into the lower-pressure separator. The set of working separator pressures that yields maximum recovery of liquid hydrocarbon from the well fluid is the optimum set of pressures, which is the target of this work.
A computer model is used to find the optimum separator pressures. The model employs the Peng-Robinson equation of state (Peng and Robinson 1976) for volatile oil. The application of t…
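The sketch below illustrates the optimization loop only: a grid search over sequentially reduced stage pressures, with a hypothetical placeholder function `liquid_recovery` standing in for the Peng-Robinson flash calculations, which are not reproduced here; the pressure ranges and the surrogate recovery surface are assumed values.

```python
import itertools

def liquid_recovery(p1, p2, p3):
    """Placeholder for the Peng-Robinson flash calculations: returns the
    stock-tank liquid recovered per unit feed for a three-stage separation
    at pressures p1 > p2 > p3 (psia). A real implementation would flash the
    well-stream composition at each stage."""
    # Hypothetical smooth surrogate, used only so the sketch runs.
    return 1.0 - 1e-6 * ((p1 - 600) ** 2 + (p2 - 180) ** 2 + (p3 - 60) ** 2)

# Candidate pressures for the first, second, and third (lowest-pressure) stages.
stage1 = range(300, 1001, 50)
stage2 = range(100, 401, 25)
stage3 = range(30, 121, 10)

best = max(
    (combo for combo in itertools.product(stage1, stage2, stage3)
     if combo[0] > combo[1] > combo[2]),          # sequentially reduced pressures
    key=lambda combo: liquid_recovery(*combo),
)
print("optimum separator pressures (psia):", best)
```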
Many of the key stream generators used in practice are LFSR-based, in the sense that they produce the key stream according to a rule y = C(L(x)), where L(x) denotes an internal linear bit stream produced by a small number of parallel linear feedback shift registers (LFSRs), and C denotes some nonlinear compression function. In this paper we combine the output sequences from the linear feedback shift registers with the sequences from a nonlinear key generator to obtain a final, very strong key sequence.
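As a toy illustration of the y = C(L(x)) structure (not the generator proposed in this paper), the sketch below clocks three small LFSRs in parallel and compresses their outputs with a Geffe-style nonlinear combiner; the register lengths, taps, and combining function are assumed choices.

```python
def lfsr(init_state, taps):
    """Fibonacci LFSR generator: yields one output bit per clock.
    `init_state` is a list of bits and `taps` index the bits XORed
    together to form the feedback."""
    state = list(init_state)
    while True:
        out = state[-1]
        fb = 0
        for t in taps:
            fb ^= state[t]
        yield out
        state = [fb] + state[:-1]

# Three toy registers standing in for the parallel LFSRs that produce L(x).
r1 = lfsr([1, 0, 1, 1, 0], taps=[0, 2])              # length 5
r2 = lfsr([1, 1, 0, 1, 0, 0, 1], taps=[0, 1])        # length 7
r3 = lfsr([0, 1, 1, 0, 1, 0, 1, 1, 0], taps=[0, 4])  # length 9

def keystream(n):
    """Nonlinear compression C applied to the LFSR outputs:
    a Geffe-style combiner z = (a AND b) XOR (NOT a AND c)."""
    z = []
    for _ in range(n):
        a, b, c = next(r1), next(r2), next(r3)
        z.append((a & b) ^ ((a ^ 1) & c))
    return z

print("keystream bits:", keystream(24))
```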