Numerous integral and local topological parameters of the electron density for the significant metal-metal and metal-ligand bonding interactions in the trinuclear tetrahydrido cluster [(Cp*Ir)(CpRu)2(μ3-H)(μ-H)3] (1) (Cp = η5-C5Me5, Cp* = η5-C5Me4Et) were calculated and interpreted using the quantum theory of atoms in molecules (QTAIM). Bond critical point properties such as the delocalization indices δ(A, B), the electron density ρ(r), the local kinetic energy density G(r), the Laplacian of the electron density ∇2ρ(r), the local energy density H(r), the local potential energy density V(r), and the ellipticity ε(r) are compared with data from earlier studies of organometallic systems. These results make it possible to compare the topological properties of different atom-atom interactions. In the core of the heterometallic tetrahydrido cluster, the Ru2IrH4 fragment, the calculations showed that there are no bond critical points (BCPs) or corresponding bond paths (BPs) between the Ru-Ru and Ru-Ir atoms. The distribution of electron density is determined by the positions of the bridging hydride ligands coordinated to Ru-Ru and Ru-Ir, which significantly affects the bonds between these transition metal atoms. On the other hand, the results confirm that the cluster under study contains a 7c-11e bonding interaction delocalized over the M3H4 core, as shown by the non-negligible calculated delocalization indices. The Ru-H and Ir-H bonds in this cluster show small positive values of ρ(b), small positive values of the Laplacian ∇2ρ(b), and small positive values of the total energy density H(b), which is typical of open-shell interactions. The topological data for the bonding interactions between the Ir and Ru metal atoms and the C atoms of the cyclopentadienyl (Cp) ring ligands are also similar; they show properties very close to those of open-shell interactions in the QTAIM classification.
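For reference, the local energy densities named above are linked by the standard QTAIM relations used when classifying interactions; the following is a notational reminder in atomic units, not a result from these calculations:

```latex
% Local energy density and local virial relation (atomic units)
H(\mathbf{r}) = G(\mathbf{r}) + V(\mathbf{r}), \qquad
\tfrac{1}{4}\nabla^{2}\rho(\mathbf{r}) = 2G(\mathbf{r}) + V(\mathbf{r})
```

It is the interplay of the signs of ∇2ρ(b) and H(b) at the bond critical point that underlies the closed-shell versus open-shell classification discussed in the abstract.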
Survival analysis is one of the modern methods of analysis based on the fact that the dependent variable represents the time until the event of interest in the study. There are many survival models that deal with the impact of explanatory factors on the likelihood of survival, and among the most important and common is the model proposed by David Cox. It consists of two components, one a parametric function that does not depend on survival time and the other a nonparametric function that does depend on survival time, which is why the Cox model is defined as a semiparametric model. The set of parametric models that depend on the parameters of the time-to-event distribution, such as …
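For concreteness, the Cox model mentioned above can be written in its familiar semiparametric form; the notation here is the generic textbook one (h0 is the nonparametric baseline hazard, x the covariate vector, β the regression coefficients), not taken from this paper:

```latex
h(t \mid x) = h_0(t)\,\exp\!\big(\beta_1 x_1 + \beta_2 x_2 + \cdots + \beta_p x_p\big)
            = h_0(t)\,\exp\!\big(\beta^{\top} x\big)
```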
In this study, we compare the autoregressive approximation methods (Yule-Walker equations, least squares, forward-backward least squares, and Burg's geometric and harmonic methods) to determine the optimal approximation to time series generated from a first-order non-invertible moving average process and a fractionally integrated noise process, with several values of d (d = 0.15, 0.25, 0.35, 0.45) and different sample sizes (small, medium, large) for the two processes. We rely on the figure-of-merit function proposed by Shibata in 1980 to determine the theoretical optimal order according to the minimum …
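As a reminder of the first of the fitting methods compared, the Yule-Walker equations for an AR(p) approximation relate the autoregressive coefficients φj to the autocovariances γ(k); this is the generic textbook form, with notation assumed here:

```latex
\gamma(k) = \sum_{j=1}^{p} \phi_j\,\gamma(k-j), \qquad k = 1, 2, \ldots, p
```

The coefficients φ1, …, φp are obtained by solving this linear system after replacing γ(k) with the sample autocovariances.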
Many of the key stream generators used in practice are LFSR-based, in the sense that they produce the key stream according to a rule y = C(L(x)), where L(x) denotes an internal linear bit stream produced by a small number of parallel linear feedback shift registers (LFSRs) and C denotes some nonlinear compression function. In this paper we combine the output sequences from the linear feedback shift registers with the sequences from a nonlinear key generator to obtain a final, very strong key sequence.
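A minimal sketch of the y = C(L(x)) construction described above, assuming toy register lengths, tap positions, and a simple majority function as the nonlinear compression C (all chosen here only for illustration, not taken from the paper):

```python
# Toy illustration of an LFSR-based keystream y = C(L(x)):
# several LFSRs run in parallel (L) and a nonlinear function (C)
# compresses their output bits into one keystream bit per step.

def lfsr(state, taps):
    """Fibonacci-style LFSR: yield one output bit per clock forever."""
    while True:
        out = state[-1]
        fb = 0
        for t in taps:            # feedback = XOR of tapped positions
            fb ^= state[t]
        state = [fb] + state[:-1]  # shift register, insert feedback bit
        yield out

def keystream(n_bits):
    # Register lengths and tap positions below are illustrative only.
    regs = [
        lfsr([1, 0, 1, 1, 0], taps=[0, 2]),
        lfsr([0, 1, 1, 0, 1, 1, 1], taps=[0, 3]),
        lfsr([1, 1, 0, 1, 0, 0, 1, 0, 1], taps=[0, 4]),
    ]
    for _ in range(n_bits):
        bits = [next(r) for r in regs]
        # Nonlinear compression C: majority vote of the three LFSR bits.
        yield 1 if sum(bits) >= 2 else 0

if __name__ == "__main__":
    print("".join(str(b) for b in keystream(64)))
```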
A study to find the optimum separator pressures of separation stations has been performed. Stage separation of oil and gas is accomplished with a series of separators operating at sequentially reduced pressures, with liquid discharged from a higher-pressure separator into the lower-pressure separator. The set of operating separator pressures that yields maximum recovery of liquid hydrocarbon from the well fluid is the optimum set of pressures, which is the target of this work.
A computer model is used to find the optimum separator pressures. The model employs the Peng-Robinson equation of state (Peng and Robinson, 1976) for volatile oil. The application of t…
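For reference, the Peng-Robinson equation of state used by the model has the standard 1976 form, with the usual critical-property correlations (symbols as conventionally defined, not specific to this work):

```latex
P = \frac{RT}{V_m - b} - \frac{a\,\alpha(T)}{V_m^{2} + 2bV_m - b^{2}}, \qquad
a = 0.45724\,\frac{R^{2}T_c^{2}}{P_c}, \quad
b = 0.07780\,\frac{RT_c}{P_c},
```
```latex
\alpha(T) = \Big[1 + \kappa\big(1 - \sqrt{T/T_c}\big)\Big]^{2}, \qquad
\kappa = 0.37464 + 1.54226\,\omega - 0.26992\,\omega^{2}
```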
A novel robust finite time disturbance observer (RFTDO) based on an independent output finite time composite control (FTCC) scheme is proposed for air-conditioning system temperature and humidity regulation. The variable air volume (VAV) system is represented by two first-order mathematical models for the temperature and humidity dynamics. In the temperature loop, a temperature RFTDO (RFTDO-T) and a temperature FTCC (FTCC-T) are designed to estimate and reject the lumped disturbances of the temperature subsystem. In the humidity loop, a robust output FTCC for humidity (FTCC-H) and a humidity RFTDO (RFTDO-H) are also designed to estimate and reject the lumped disturbances of the humidity subsystem. Based on Lyapunov theory, …
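A generic form of the two first-order loop models with lumped disturbances described above; the symbols (aT, bT, dT and their humidity counterparts) are placeholders assumed here for illustration, not the paper's notation:

```latex
\dot{T}(t) = -a_T\,T(t) + b_T\,u_T(t) + d_T(t), \qquad
\dot{W}(t) = -a_W\,W(t) + b_W\,u_W(t) + d_W(t)
```

Here uT and uW are the control inputs of the temperature and humidity loops, and dT and dW are the lumped disturbances that the RFTDO-T and RFTDO-H observers estimate and the FTCC controllers reject.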
This paper presents a study of gravitational lens time delays for a general family of lensing potentials, including the popular singular isothermal elliptical potential (SIEP) and the singular isothermal elliptical density distribution (SIED), but allowing general angular structure. The first section gives an introduction to the selected observations of gravitationally lensed systems. Section two then shows that the time delays for the singular isothermal elliptical potential (SIEP) and singular isothermal elliptical density distribution (SIED) have a remarkably simple and elegant form, and that the result for Hubble constant estimation actually holds for a general family of potentials when the analytic results are combined with data for the time delays …
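As background for the time-delay discussion, the delay between two images A and B follows from the standard Fermat potential of lensing theory (general relation, not the paper's specific SIEP/SIED result; β is the source position, ψ the scaled lensing potential, and the D's angular diameter distances):

```latex
\Delta t_{AB} = \frac{1+z_l}{c}\,\frac{D_l D_s}{D_{ls}}
\big[\phi(\boldsymbol{\theta}_A) - \phi(\boldsymbol{\theta}_B)\big], \qquad
\phi(\boldsymbol{\theta}) = \frac{(\boldsymbol{\theta} - \boldsymbol{\beta})^{2}}{2} - \psi(\boldsymbol{\theta})
```

The Hubble constant enters through the angular diameter distances, which scale as 1/H0, which is why measured time delays constrain H0.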
In this paper, the survival function has been estimated for patients with lung cancer using different parametric estimation methods, based on a sample of complete real data describing the survival times of patients with lung cancer, from the diagnosis of the disease or entry to the hospital, over a period of two years (from the start of 2012 to the end of 2013). Comparisons between the mentioned estimation methods have been performed using the statistical indicator mean squared error, concluding that estimating the survival function for lung cancer using the pre-test single-stage shrinkage estimator method was the best.
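The comparison criterion mentioned above is the usual mean squared error of an estimated survival function; the following is the generic definition with assumed notation, not the paper's specific estimator:

```latex
S(t) = P(T > t), \qquad
\mathrm{MSE}\big(\hat{S}(t)\big) = E\Big[\big(\hat{S}(t) - S(t)\big)^{2}\Big]
```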
This paper is concerned with introducing and studying the first new approximation operators using a mixed degree system and the second new approximation operators using a mixed degree system, which are the core concepts of this paper. In addition, the approximations of graphs using the first lower and first upper operators are less accurate than the approximations obtained by using the second lower and second upper operators, since the first accuracy is less than the second accuracy. For this reason, we study in detail the properties of the second lower and second upper operators in this paper. Furthermore, we summarize the results for the properties of the second lower and second upper approximation operators when the graph G is arbitrary, serial 1, serial 2, reflexive, symmetric, tra…
We introduce and discuss a recent type of fibrewise topological spaces, namely fibrewise bitopological spaces. We also introduce the concepts of fibrewise closed bitopological spaces, fibrewise open bitopological spaces, fibrewise locally sliceable bitopological spaces, and fibrewise locally sectionable bitopological spaces. Furthermore, we state and prove several propositions concerning these concepts.
In this paper, we used four classification methods to classify objects and compared these methods: K-Nearest Neighbors (KNN), Stochastic Gradient Descent learning (SGD), the Logistic Regression algorithm (LR), and the Multi-Layer Perceptron (MLP). We used the MCOCO dataset for classification and detection of objects; the dataset images were randomly divided into training and testing datasets at a ratio of 7:3, respectively. After randomly selecting the training and testing dataset images, the color images were converted to gray level, these gray images were then enhanced using the histogram equalization method, and each dataset image was resized to 20 x 20. Principal component analysis (PCA) was used for feature extraction, and finally the four classification methods …
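A minimal sketch of the pipeline described above (flattened 20 x 20 grayscale features, a 7:3 split, PCA feature extraction, and the four classifiers), using scikit-learn with illustrative parameter choices; the dataset-specific loading and histogram equalization are assumed and replaced here by placeholder arrays:

```python
# Sketch of the classification pipeline: PCA features + four classifiers.
# X is assumed to hold flattened 20x20 grayscale images (n_samples, 400)
# after histogram equalization; y holds the object-class labels.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import SGDClassifier, LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.random((1000, 400))          # placeholder for real image features
y = rng.integers(0, 5, size=1000)    # placeholder for real labels

# 7:3 random split of the samples, as in the study.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# PCA for feature extraction (number of components chosen for illustration).
pca = PCA(n_components=50).fit(X_tr)
X_tr_p, X_te_p = pca.transform(X_tr), pca.transform(X_te)

# The four classifiers compared in the paper, with default-style settings.
classifiers = {
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "SGD": SGDClassifier(max_iter=1000),
    "LR": LogisticRegression(max_iter=1000),
    "MLP": MLPClassifier(hidden_layer_sizes=(100,), max_iter=500),
}
for name, clf in classifiers.items():
    clf.fit(X_tr_p, y_tr)
    print(name, accuracy_score(y_te, clf.predict(X_te_p)))
```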