The aim of this paper is to approximate multidimensional functions using a type of feedforward neural network (FFNN) known as the greedy radial basis function neural network (GRBFNN). We also introduce a modification to the greedy algorithm used to train greedy radial basis function neural networks. An error bound is derived in Sobolev space. Finally, a comparison is made between three algorithms: the modified greedy algorithm, the backpropagation algorithm, and the result published in [16].
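As a minimal sketch of the greedy-selection idea behind RBF network training (not the paper's modified algorithm), the snippet below repeatedly adds the Gaussian centre most correlated with the current residual and refits the coefficients by least squares; the target function, candidate centres, and width `eps` are illustrative assumptions.

```python
import numpy as np

def gaussian(x, c, eps=2.0):
    # Gaussian radial basis function centred at c (illustrative width eps)
    return np.exp(-(eps * (x - c)) ** 2)

def greedy_rbf_fit(x, y, candidates, n_terms=5):
    centres, residual = [], y.copy()
    for _ in range(n_terms):
        # pick the candidate centre whose basis function best matches the residual
        scores = [abs(gaussian(x, c) @ residual) for c in candidates]
        centres.append(candidates[int(np.argmax(scores))])
        # refit all coefficients jointly by least squares
        Phi = np.column_stack([gaussian(x, c) for c in centres])
        coeffs, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        residual = y - Phi @ coeffs
    return centres, coeffs

x = np.linspace(-1, 1, 200)
y = np.sin(3 * x) + 0.5 * x                      # toy target function
centres, coeffs = greedy_rbf_fit(x, y, candidates=np.linspace(-1, 1, 41))
approx = sum(a * gaussian(x, c) for a, c in zip(coeffs, centres))
print(len(centres), "centres, residual norm:", np.linalg.norm(y - approx))
```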
Maximum values of the one-particle radial electronic density distribution have been calculated using the Hartree-Fock (HF) wave function with data published by [A. Sarsa et al., Atomic Data and Nuclear Data Tables 88 (2004) 163–202] for the K and L shells of some Be-like ions. The results confirm that a linear behavior governs the increase of the maximum points of the one-particle radial electronic density distribution for the K and L shells across these Be-like ions. This linear behavior can be described by the nth-term formula of an arithmetic sequence, which can be used to calculate the maximum radial electronic density distribution for any ion within the Be-like ions for Z < 20.
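For reference, the nth-term formula of an arithmetic sequence mentioned above has the generic form below; the symbols are placeholders, not the fitted values reported in the study.

```latex
% nth term of an arithmetic sequence (generic form):
% a_n : maximum of the radial density distribution for the n-th ion in the series
% a_1 : value for the first ion considered, d : common difference (assumed constant)
\[
  a_n = a_1 + (n - 1)\,d
\]
```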
When optimizing the performance of neural network-based chatbots, choosing the optimizer is one of the most important decisions. Optimizers control the adjustment of model parameters such as weights and biases to minimize a loss function during training. Adaptive optimizers such as ADAM have become a standard choice and are widely used because their parameter-update magnitudes are invariant to variations in gradient scale, but they often pose generalization problems. Alternatively, Stochastic Gradient Descent (SGD) with Momentum and ADAMW, an extension of ADAM, offer several advantages. This study compares and examines the effects of these optimizers on the chatbot CST dataset. The effectiveness of each optimizer is evaluated.
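As a minimal sketch of how the three optimizers compared here can be instantiated (assuming a PyTorch model; the placeholder `model`, learning rates, momentum, and weight decay below are illustrative, not the study's settings):

```python
import torch

# Placeholder model standing in for the chatbot network (illustrative only).
model = torch.nn.Linear(128, 64)

# SGD with Momentum: classic update with a velocity term.
sgd_momentum = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

# ADAM: adaptive per-parameter step sizes from first/second moment estimates.
adam = torch.optim.Adam(model.parameters(), lr=0.001)

# ADAMW: ADAM with decoupled weight decay, often generalizing better.
adamw = torch.optim.AdamW(model.parameters(), lr=0.001, weight_decay=0.01)
```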
In this paper, some basic notions and facts in b-modular spaces, analogous to those in modular spaces as a type of generalization, are given, for example, the concepts of convergence, best approximation, and uniform convexity. Two results on the relation between semi-compactness and approximation are then proved and used to establish a theorem on the existence of a best approximation for a semi-compact subset of a b-modular space.
The selection process aims to ensure that those chosen reach higher levels than would be attained by other means. The problem of this research arose from the following questions: which limit variables must be considered in the first preliminary selections for mini-basketball? Which tests suit this category? And are there any reference standards that can be relied upon? The aims of this research are to identify the limit variables for mini-basketball and their tests as indicators for preliminary selection in the mini-basketball category, ages (9-12) years, and to specify standards (modified standard scores using the follow-up method) for the test results of some limit variables for the research sample. The researchers also relied on (16)
The present article examines groundwater quality for drinking purposes in Baghdad City based on the Water Quality Index (WQI). The data were collected from the Ministry of Water Resources of Baghdad and represent water samples drawn from 114 wells on the Al-Karkh and Al-Rusafa sides of Baghdad City. To determine the WQI, four water parameters were considered: (i) pH, (ii) chloride (Cl), (iii) sulfate (SO4), and (iv) total dissolved solids (TDS). According to the computed WQI, the distribution of the groundwater samples over the quality classes excellent, good, poor, very poor, and unfit for human drinking was found to be
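A minimal sketch of one common weighted-arithmetic WQI formulation for the four parameters named above; the permissible limits, the simple 1/S weighting, and the sample values are textbook-style assumptions, not necessarily the article's method or data.

```python
# Permissible limits (S) and one illustrative well sample (C); concentrations in mg/L, pH unitless.
standards = {"pH": 8.5, "Cl": 250.0, "SO4": 250.0, "TDS": 1000.0}
sample = {"pH": 7.6, "Cl": 310.0, "SO4": 190.0, "TDS": 850.0}

def wqi(sample, standards, ideal_ph=7.0):
    # Weight each parameter inversely to its permissible limit, then normalize.
    weights = {p: 1.0 / s for p, s in standards.items()}
    total_w = sum(weights.values())
    index = 0.0
    for p, s in standards.items():
        c = sample[p]
        if p == "pH":
            # pH is rated relative to the ideal value 7.0 rather than zero.
            q = 100.0 * (c - ideal_ph) / (s - ideal_ph)
        else:
            q = 100.0 * c / s
        index += weights[p] * q
    return index / total_w

print(f"WQI = {wqi(sample, standards):.1f}")  # larger values indicate poorer quality
```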
In this paper we introduce several estimators for the binwidth of histogram estimators. We use a simulation technique to compare these estimators. In most cases, the results showed that the rule-of-thumb estimator is better than the other estimators.
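For context, a minimal sketch of two standard binwidth rules, the normal-reference "rule of thumb" and the Freedman-Diaconis rule; these are textbook formulas and not necessarily the exact estimators compared in the paper.

```python
import numpy as np

def rule_of_thumb_binwidth(x):
    # Normal-reference rule: h = 3.49 * sigma * n^(-1/3)
    x = np.asarray(x)
    return 3.49 * x.std(ddof=1) * x.size ** (-1 / 3)

def freedman_diaconis_binwidth(x):
    # Freedman-Diaconis rule: h = 2 * IQR * n^(-1/3), robust to outliers
    x = np.asarray(x)
    iqr = np.subtract(*np.percentile(x, [75, 25]))
    return 2.0 * iqr * x.size ** (-1 / 3)

rng = np.random.default_rng(0)
data = rng.normal(size=500)                      # simulated sample
print(rule_of_thumb_binwidth(data), freedman_diaconis_binwidth(data))
```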
Deconstruction theory is a theory that appeared after structuralism, and it tends, through some key principles, to reach the purposive and main meaning of the text by means of different perspectives. In other words, deconstruction is a critical literary theory and a contemporary philosophical approach that work together to reach the exact concept of the text, and this is achieved through reading and analyzing the text. Therefore, deconstruction has specified some principles so as to reach the exact meaning of the text through these different principles.
Introduction:
Deconstruction theory is a theory that emerged after structuralism and seeks to
Fluorescent proteins (FPs) have revolutionised the life sciences, but the chromophore maturation mechanism is still not fully understood. Here we photochemically trap maturation at a crucial stage and structurally characterise the intermediate.
The choice of binary pseudonoise (PN) sequences with specific properties, such as long period, high complexity, randomness, and minimum cross- and auto-correlation, is essential for some communication systems. In this research a nonlinear PN generator is introduced. It consists of a combination of basic components such as the Linear Feedback Shift Register (LFSR) and a ?-element, which is a type of R×R crossbar switch. The period and complexity of the sequences generated by the proposed generator are computed, and the randomness properties of these sequences are measured with well-known randomness tests.
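As a minimal sketch of the LFSR building block mentioned above (the register length and feedback taps here are generic examples, not the configuration of the proposed generator):

```python
def lfsr_sequence(taps, state, length):
    """Generate `length` output bits from a Fibonacci-style LFSR.

    taps  : stage positions (1-indexed from the output end) XORed to form the feedback bit
    state : initial register contents as a list of bits, most significant stage first
    """
    state = list(state)
    out = []
    for _ in range(length):
        out.append(state[-1])                 # output the last stage
        fb = 0
        for t in taps:
            fb ^= state[len(state) - t]       # XOR the tapped stages
        state = [fb] + state[:-1]             # shift right, insert feedback bit
    return out

# A 4-stage maximal-length LFSR (taps 4 and 3) has period 2**4 - 1 = 15.
bits = lfsr_sequence(taps=[4, 3], state=[1, 0, 0, 1], length=30)
print(bits)
```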