In this paper, we introduce and discuss an extended subclass $\mathcal{A}_p^*(\lambda,\alpha,\gamma)$ of meromorphic multivalent functions involving the Ruscheweyh derivative operator. Coefficient inequalities, distortion theorems, and a closure theorem for this subclass are obtained.
The selection process better ensures that those chosen reach high levels than other approaches do, and the problem of this research arose from the following questions: what are the variables that must be considered in the first preliminary selections for mini basketball? What is the proper test that suits this category? And are there reference standards that can be relied upon? The aims of this research are to identify the variables for mini basketball and their tests as indicators for the preliminary selection of the mini basketball category at ages (9-12) years, and to set standards (modified standard scores by the following method) for the test results of some of these variables for the research sample. The researchers also relied on (16)
In this paper, we propose new types of non-convex functions called strongly --vex functions and semi strongly --vex functions. We study some properties of these proposed functions. As an application of these functions in optimization problems, we discuss some optimality properties of the generalized nonlinear optimization problem in which a strongly --vex function or a semi strongly --vex function is used as the objective function.
In this paper we study and design two feedforward neural networks. The first approach uses a radial basis function network, and the second uses a wavelet basis function network, to approximate the mapping from the input to the output space. The trained networks are then used within a conjugate gradient algorithm to estimate the output. These neural networks are then applied to solve differential equations. Results of applying these algorithms to several examples are presented.
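The abstract leaves the implementation unspecified; as a minimal sketch of the first approach, a radial basis function network can be fitted to input-output samples by linear least squares over fixed Gaussian features. The target function, the centers, and the width below are illustrative assumptions, not the authors' choices.

```python
import numpy as np

def rbf_design_matrix(x, centers, width):
    """Gaussian RBF features: phi_j(x) = exp(-(x - c_j)^2 / (2*width^2))."""
    return np.exp(-(x[:, None] - centers[None, :])**2 / (2.0 * width**2))

# Target mapping to approximate (illustrative choice, not from the paper).
f = lambda x: np.sin(2.0 * np.pi * x)

# Training data sampled on [0, 1].
x_train = np.linspace(0.0, 1.0, 50)
y_train = f(x_train)

# Fixed centers and width; the paper may choose these differently.
centers = np.linspace(0.0, 1.0, 10)
width = 0.1

# Solve for the output-layer weights by linear least squares.
Phi = rbf_design_matrix(x_train, centers, width)
weights, *_ = np.linalg.lstsq(Phi, y_train, rcond=None)

# Evaluate the trained network on new inputs.
x_test = np.linspace(0.0, 1.0, 200)
y_pred = rbf_design_matrix(x_test, centers, width) @ weights
print("max abs error:", np.max(np.abs(y_pred - f(x_test))))
```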
The aim of this paper is to approximate multidimensional functions using the type of feedforward neural network (FFNN) called greedy radial basis function neural networks (GRBFNNs). We also introduce a modification to the greedy algorithm used to train greedy radial basis function neural networks. An error bound is derived in Sobolev space. Finally, a comparison was made between the three algorithms: the modified greedy algorithm, the backpropagation algorithm, and the algorithm whose results were published in [16].
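For illustration only, a plain greedy scheme (not the authors' modified algorithm) adds one Gaussian center at a time, keeping the candidate that most reduces the residual after refitting the output weights; the two-dimensional target function, the width, and the candidate set below are all assumed.

```python
import numpy as np

def gaussian_rbf(x, c, width):
    """Single Gaussian basis function centered at c."""
    return np.exp(-np.sum((x - c)**2, axis=-1) / (2.0 * width**2))

def greedy_rbf_fit(x, y, n_basis, width):
    """Greedily add centers drawn from the training points, refitting each step."""
    centers = []
    for _ in range(n_basis):
        best_err, best_c = np.inf, None
        for c in x:  # candidate centers are the training inputs themselves
            trial = centers + [c]
            Phi = np.stack([gaussian_rbf(x, ci, width) for ci in trial], axis=1)
            w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
            err = np.linalg.norm(Phi @ w - y)
            if err < best_err:
                best_err, best_c = err, c
        centers.append(best_c)
    Phi = np.stack([gaussian_rbf(x, ci, width) for ci in centers], axis=1)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return np.array(centers), w

# Two-dimensional illustrative target.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=(200, 2))
y = np.sin(np.pi * x[:, 0]) * np.cos(np.pi * x[:, 1])

centers, w = greedy_rbf_fit(x, y, n_basis=15, width=0.4)
Phi = np.stack([gaussian_rbf(x, c, 0.4) for c in centers], axis=1)
print("training RMSE:", np.sqrt(np.mean((Phi @ w - y)**2)))
```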
In the present paper, we obtain some differential subordination and superordination results, by using generalized operators, for a certain subclass of analytic functions in the open unit disk. We also derive some sandwich results.
In this research, optical absorption data (the imaginary part of the dielectric function $\varepsilon_2$ as a function of photon energy $E$) were re-analyzed for three samples of a-Si:H thin films using derivative methods, in order to investigate the ambiguity that accompanies the interpretation of the optical data of these films and to obtain the optical energy gap ($E_g$) and the factor $r$, which is connected with the density-of-states distribution near the mobility edge, directly, without the pre-assumption about the factor $r$ usually made in traditional methods such as the Tauc plot. The derivative method was used for two choices of the factor $q$ (which is connected with the dependence of the dipole matrix element on the photon energy) for
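The abstract does not reproduce the derivative method itself, but the standard reasoning behind such analyses is: if the band edge follows $\varepsilon_2 = C(E - E_g)^r$, then $d\ln\varepsilon_2/dE = r/(E - E_g)$, so the reciprocal of this derivative is linear in $E$, and a straight-line fit recovers both $E_g$ and $r$ at once, with no pre-assumed $r$ as in a Tauc plot. A minimal sketch on synthetic data (model form and parameter values assumed for illustration):

```python
import numpy as np

# Synthetic illustration: assume the common band-edge form
# eps2(E) = C * (E - Eg)**r above the gap (this model and the
# parameter values are assumptions, not the paper's data).
Eg_true, r_true, C = 1.7, 2.0, 5.0
E = np.linspace(1.8, 2.6, 100)               # photon energies in eV
eps2 = C * (E - Eg_true)**r_true

# Derivative method: for eps2 = C*(E - Eg)^r,
#   d(ln eps2)/dE = r / (E - Eg),
# so its reciprocal is linear in E: (E - Eg)/r.
dlog = np.gradient(np.log(eps2), E)
y = 1.0 / dlog

# A straight-line fit gives slope = 1/r and intercept = -Eg/r,
# yielding r and Eg directly.
slope, intercept = np.polyfit(E, y, 1)
r_est = 1.0 / slope
Eg_est = -intercept * r_est
print(f"r ~ {r_est:.2f}, Eg ~ {Eg_est:.2f} eV")
```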