We wrote this paper to introduce new types of perfectly supra continuous functions. We also introduce new types of supra continuous, supra open, and supra closed functions.
The major aim of this paper is to study a certain class of meromorphic univalent functions. We obtain several results for this class, including coefficient estimates, distortion and growth theorems, and radii of starlikeness and convexity, in addition to results on the Hadamard product, convex combinations, closure theorems, integral operators, and neighborhoods.
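For context, classes of meromorphic univalent functions of this kind are usually defined on the punctured unit disk by a series with a simple pole at the origin, and the Hadamard product referenced in the abstract acts coefficient-wise. The sketch below records this standard setup; the precise class studied in the paper may impose additional conditions on the coefficients.

```latex
% Standard setup for meromorphic univalent function classes (a general
% sketch; the class in this paper may add further coefficient conditions).
\[
  f(z) = \frac{1}{z} + \sum_{n=1}^{\infty} a_n z^{n},
  \qquad z \in U^{*} = \{ z \in \mathbb{C} : 0 < |z| < 1 \},
\]
\[
  (f * g)(z) = \frac{1}{z} + \sum_{n=1}^{\infty} a_n b_n z^{n},
  \qquad \text{where } g(z) = \frac{1}{z} + \sum_{n=1}^{\infty} b_n z^{n}
  \quad \text{(Hadamard product)}.
\]
```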
In this paper, we use our differential operator to generalize many earlier differential operators studied by other researchers. We also introduce a new subclass of starlike functions and investigate some of its interesting properties.
Objective: To determine the effectiveness of hypothermia on renal function in patients undergoing coronary artery bypass graft (CABG) surgery.
Methodology: A purposive (non-probability) sample of 50 patients undergoing isolated coronary artery bypass graft surgery was consecutively admitted to the surgical ward; the patients were followed up intraoperatively, in the Intensive Care Unit (ICU), and postoperatively in the surgical ward. Postoperative renal function tests (glomerular filtration rate (GFR), calculated using the Cockcroft-Gault formula, and serum creatinine level) were determined during the first postoperative week, and postoperative renal function was classified on the basis of the peak serum creatinine level and the decline of the glomerular filtration rate.
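The abstract states that GFR was estimated with the Cockcroft-Gault formula. A minimal Python sketch of that calculation is shown below; the function name and the example values are illustrative and are not patient data from the study.

```python
def cockcroft_gault_crcl(age_years, weight_kg, serum_creatinine_mg_dl, female):
    """Estimated creatinine clearance (mL/min) by the Cockcroft-Gault formula.

    CrCl = ((140 - age) * weight) / (72 * serum creatinine in mg/dL),
    multiplied by 0.85 for female patients.
    """
    crcl = ((140 - age_years) * weight_kg) / (72.0 * serum_creatinine_mg_dl)
    if female:
        crcl *= 0.85
    return crcl

# Illustrative example (hypothetical values, not study data):
print(round(cockcroft_gault_crcl(age_years=62, weight_kg=80,
                                 serum_creatinine_mg_dl=1.2, female=False), 1))
```

The 0.85 correction factor for female patients is part of the standard formula; the Cockcroft-Gault estimate of creatinine clearance is commonly used as a proxy for GFR, as in this abstract.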
In this study, the authors introduce and investigate several new subclasses of the family of meromorphically multivalent -star-like functions in the punctured unit disk, making use of several higher-order -derivatives. Many interesting properties and characteristics are derived systematically for each of these newly defined function classes; among them are distortion theorems and radius problems. A number of coefficient inequalities for functions belonging to these subclasses are studied and discussed, and a suitable membership condition is established. The numerous results presented in this study are related to the previous works on this topic.
Some researchers are interested in using the flexible and applicable properties of quadratic functions as activation functions for feedforward neural networks (FNNs). We study the essential approximation rate of any Lebesgue-integrable monotone function by a neural network with quadratic activation functions. The simultaneous degree of essential approximation is also studied. Both estimates are shown to be within the second order of the modulus of smoothness.
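As a rough illustration of the kind of network considered here (not the authors' construction or their approximation-rate argument), the Python sketch below fits a one-hidden-layer feedforward network whose activation is the quadratic map t ↦ t² to a monotone target on [0, 1] by plain gradient descent. The target function, hidden width, learning rate, and iteration count are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Monotone target function (an arbitrary illustrative choice).
def target(x):
    return np.sqrt(x)

x = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
y = target(x)

# One hidden layer of width m with quadratic activation t -> t**2.
m = 16
W1 = rng.normal(scale=0.5, size=(1, m))
b1 = rng.normal(scale=0.5, size=(1, m))
W2 = rng.normal(scale=0.5, size=(m, 1))
b2 = np.zeros((1, 1))

lr = 5e-3
for step in range(10_000):
    # Forward pass with quadratic activation.
    z = x @ W1 + b1
    h = z ** 2
    pred = h @ W2 + b2

    err = pred - y
    loss = np.mean(err ** 2)

    # Backpropagation; note d(h)/d(z) = 2 z for the quadratic activation.
    g_pred = 2.0 * err / len(x)
    g_W2 = h.T @ g_pred
    g_b2 = g_pred.sum(axis=0, keepdims=True)
    g_h = g_pred @ W2.T
    g_z = g_h * 2.0 * z
    g_W1 = x.T @ g_z
    g_b1 = g_z.sum(axis=0, keepdims=True)

    W1 -= lr * g_W1
    b1 -= lr * g_b1
    W2 -= lr * g_W2
    b2 -= lr * g_b2

print(f"final mean-squared error: {loss:.6f}")
```

The sketch only demonstrates a quadratic-activation FNN approximating a monotone function; the approximation-rate bounds in terms of the second-order modulus of smoothness are theoretical results and are not reproduced here.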
The selection process aims to ensure that those chosen reach higher levels than would be achieved by other methods. The problem of this research arose from the following questions: which limit variables must be considered in the first preliminary selection for mini-basketball? which tests are appropriate for this category? and are there reference standards that can be relied upon? The research aims to identify the limit variables for mini-basketball and their tests as indicators for the preliminary selection of the mini-basketball category at ages 9-12 years, and to establish standards (modified standard scores using the follow-up method) for the results of the tests of some limit variables for the research sample. The researchers relied on (16)
In this paper, we propose new types of non-convex functions called strongly --vex functions and semi-strongly --vex functions. We study some properties of these proposed functions. As an application of these functions in optimization, we discuss some optimality properties of the generalized nonlinear optimization problem whose objective function is a strongly --vex function or a semi-strongly --vex function.
Some inclusion relations and their properties are investigated for -valent functions involving the generalized Srivastava-Attiya operator, using the principle of strong differential subordination.
In this paper we study and design two feedforward neural networks. The first approach uses a radial basis function network, and the second uses a wavelet basis function network, to approximate the mapping from the input space to the output space. The trained networks are then used in a conjugate gradient algorithm to estimate the output. These neural networks are then applied to solve differential equations. Results of applying these algorithms to several examples are presented.
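As a minimal illustration of the first approach (a radial basis function network approximating an input-output mapping), the Python sketch below fixes Gaussian centres and a shared width and fits only the output weights by least squares. The target mapping, centre grid, and width are illustrative assumptions; the conjugate gradient stage, the wavelet basis function network, and the differential equation examples from the paper are not reproduced.

```python
import numpy as np

def rbf_design_matrix(x, centers, width):
    """Gaussian RBF features phi_j(x) = exp(-(x - c_j)^2 / (2 * width^2))."""
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * width ** 2))

# Target mapping to approximate (illustrative choice).
target = lambda x: np.sin(2.0 * np.pi * x)

x_train = np.linspace(0.0, 1.0, 50)
y_train = target(x_train)

centers = np.linspace(0.0, 1.0, 10)   # RBF centres on a uniform grid
width = 0.1                           # shared Gaussian width

# Fit the output-layer weights by linear least squares (the hidden layer
# is kept fixed in this sketch).
Phi = rbf_design_matrix(x_train, centers, width)
w, *_ = np.linalg.lstsq(Phi, y_train, rcond=None)

# Evaluate the trained network on a finer grid.
x_test = np.linspace(0.0, 1.0, 200)
y_hat = rbf_design_matrix(x_test, centers, width) @ w
print("max abs error:", np.abs(y_hat - target(x_test)).max())
```

Fixing the centres and solving a linear least-squares problem for the output weights is one common way to train an RBF network; the paper may instead train all parameters, and its networks feed into a conjugate gradient procedure that this sketch does not attempt to show.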