Artificial intelligence algorithms have been applied in many scientific fields in recent years. We propose employing the flower pollination algorithm in the environmental field to obtain the best estimate of the semi-parametric regression function when measurement errors are present in both the explanatory variables and the dependent variable; such errors arise frequently in chemistry, the biological sciences, medicine, and epidemiological studies, where an exact measurement is often unavailable. The regression function of the semi-parametric model is estimated by estimating its parametric and non-parametric components. The parametric component is estimated with instrumental-variable methods (Wald's method, Bartlett's method, and Durbin's method), while the non-parametric component is estimated with kernel smoothing (Nadaraya-Watson), k-nearest-neighbour smoothing, and median smoothing. The flower pollination algorithm was then employed to build the ecological model and estimate the semi-parametric regression function with measurement errors in the explanatory and dependent variables, and the resulting models were compared using the mean squared error (MSE) to select the best model for the environmental setting.
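As an illustration of the non-parametric component, the Nadaraya-Watson estimator weights each observation by a kernel of its distance to the query point. The following is a minimal sketch with a Gaussian kernel; the synthetic data, bandwidth, and variable names are illustrative, not those of the study.

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_query, bandwidth):
    """Nadaraya-Watson kernel regression with a Gaussian kernel."""
    # Scaled distances between every query point and every training point
    u = (x_query[:, None] - x_train[None, :]) / bandwidth
    weights = np.exp(-0.5 * u**2)                 # Gaussian kernel weights
    return (weights @ y_train) / weights.sum(axis=1)

# Noisy sine curve as a stand-in for the environmental measurements
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 2.0 * np.pi, 200))
y = np.sin(x) + rng.normal(0.0, 0.2, 200)

y_hat = nadaraya_watson(x, y, x, bandwidth=0.3)
mse = np.mean((y_hat - np.sin(x)) ** 2)           # MSE against the true curve
```

The bandwidth plays the role of the smoothing parameter that the comparison by MSE would tune in practice.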
In this paper, we propose an approach to estimate the induced potential generated by swift heavy ions traversing a ZnO thin film via an energy loss function (ELF). This induced potential is related to the projectile charge density ρq(k) and is described by the extended Drude dielectric function. At zero momentum transfer, the resulting ELF exhibits good agreement with previously reported results. The ELF obtained from the extended Drude model displays realistic behaviour over the Bethe ridge. It is observed that the induced potential depends on the heavy-ion velocity and charge state q. Further, the numerical results show that the induced potential for neutral H as projectile dominates when the heavy-ion velocity is less
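The extended Drude model used in the paper is more elaborate, but the basic shape of an energy loss function Im(-1/ε(ω)) can be illustrated with the simple Drude dielectric function; the plasma energy and damping below are illustrative values, not fitted ZnO parameters.

```python
import numpy as np

def drude_elf(omega, omega_p, gamma):
    """Energy loss function Im(-1/eps) for a simple Drude dielectric."""
    eps = 1.0 - omega_p**2 / (omega * (omega + 1j * gamma))
    return (-1.0 / eps).imag

# Illustrative plasma energy and damping (eV), not material constants
w = np.linspace(0.1, 40.0, 4000)
elf = drude_elf(w, omega_p=18.0, gamma=4.0)
peak = w[np.argmax(elf)]        # the ELF peaks near the plasmon energy
```

For small damping the peak sits essentially at the plasma energy, which is why the ELF is a convenient handle on the collective response entering the induced potential.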
Abstract
The phenomenon of extreme values (maximum or rare values) is an important one, and two sampling techniques are used to deal with it: the peaks-over-threshold (POT) technique and the annual-maxima (AM) technique. The extreme value (Gumbel) distribution is fitted to the AM sample, and the generalized Pareto and exponential distributions to the POT sample. The cross-entropy algorithm was applied in two of its variants: the first estimates using order statistics, and the second using order statistics together with the likelihood ratio; a third method is proposed by the researcher. The MSE of the estimated parameters and of the probability density function for each of the distributions were compared.
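The two sampling designs can be sketched with SciPy's distribution fitters: annual maxima fitted with a Gumbel distribution, and threshold exceedances with a generalized Pareto. The synthetic daily series below is an illustrative stand-in for real hydrological or environmental records.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
daily = rng.gumbel(loc=10.0, scale=2.0, size=(50, 365))  # 50 "years" of daily values

# Annual-maxima (AM) sample -> Gumbel fit
am = daily.max(axis=1)
loc, scale = stats.gumbel_r.fit(am)

# Peaks-over-threshold (POT) sample -> generalized Pareto fit on the exceedances
threshold = np.quantile(daily, 0.99)
exceed = daily[daily > threshold] - threshold
shape, gp_loc, gp_scale = stats.genpareto.fit(exceed, floc=0.0)
```

Comparing the two fitted tails via the MSE of their estimated parameters mirrors the comparison described in the abstract.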
Background: Powerlifters and bodybuilders, in particular, use anabolic androgenic steroids (AAS): as many as 55 percent of elite powerlifters have admitted using these agents. In contrast to the numerous documented toxic and hormonal effects of AAS, their impact on the structure and function of the left ventricle (LV) is not yet fully understood.
ABSTRACT
The critical buckling temperature of angle-ply laminated plates is obtained using a higher-order displacement field. This displacement field, used by Mantari et al., is based on a constant "m" chosen to give results closest to three-dimensional (3-D) elasticity theory. Equations of motion for angle-ply plates based on the higher-order theory are derived through Hamilton's principle and solved using a Navier-type solution to obtain the critical buckling temperature of simply supported laminated plates. Changing the (α2/α1) ratio, the number of layers, the aspect ratio, and the E1/E2 ratio for thick and thin plates, and their effect on the thermal
This paper addresses how to estimate values at unmeasured spatial points when the number of spatial sample points is small, which is unfavourable for estimation: the larger the data set, the better the estimates at unmeasured points and the smaller their estimation variance. The idea of this paper is therefore to take advantage of secondary (auxiliary) data that are strongly correlated with the primary (basic) data in order to estimate the individual unmeasured points and to measure the estimation variance. The co-kriging technique is used in this field to build spatial predictions, and we then applied this idea to real data.
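Co-kriging extends kriging by bringing in a cross-correlated auxiliary variable; its single-variable building block can be sketched as simple kriging. The exponential covariance model and its parameters below are illustrative assumptions, not those estimated in the paper.

```python
import numpy as np

def simple_kriging(coords, values, query, sill=1.0, corr_range=2.0):
    """Simple kriging with an exponential covariance C(h) = sill*exp(-h/range)."""
    def cov(a, b):
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
        return sill * np.exp(-d / corr_range)
    C = cov(coords, coords)      # covariance among sampled locations
    c = cov(coords, query)       # covariance to the prediction locations
    w = np.linalg.solve(C, c)    # kriging weights, one column per query point
    mean = values.mean()
    return mean + w.T @ (values - mean)

# Four samples at the corners of a unit square (illustrative data)
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
vals = np.array([1.0, 2.0, 3.0, 4.0])
pred = simple_kriging(pts, vals, np.array([[0.0, 0.0], [0.5, 0.5]]))
```

Note that the predictor interpolates exactly at a sampled location, while the centre point receives the symmetric average; co-kriging would enlarge `C` and `c` with cross-covariances to the auxiliary variable.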
In this work, the performance of the receiver in a quantum cryptography system based on the BB84 protocol is assessed by calculating its quantum bit error rate (QBER). To apply this performance test, an optical setup was arranged and a circuit was designed and implemented to calculate the QBER. This electronic circuit measures the number of counts per second generated by the avalanche photodiodes in the receiver, and these counts are used to calculate the QBER, which indicates the receiver's performance. A minimum QBER of 6% was obtained with an avalanche photodiode excess voltage of 2 V and a laser diode power of 3.16 nW at an avalanche photodiode temperature of -10 °C.
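The QBER itself is simply the fraction of sifted bits received in error; a minimal sketch, with hypothetical count values rather than the paper's measured ones:

```python
def qber(error_counts, total_counts):
    """Quantum bit error rate: erroneous counts divided by total sifted counts."""
    if total_counts == 0:
        raise ValueError("no detected counts")
    return error_counts / total_counts

# Hypothetical counts-per-second figures from the detection circuit
rate = qber(error_counts=60, total_counts=1000)   # 0.06, i.e. 6%
```

In BB84, a QBER well below the ~11% security threshold, as here, indicates the channel can still yield a secure key after error correction and privacy amplification.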
Achieving reliable operation under deep-submicrometre noise sources, including crosstalk noise, at low-voltage operation is a major challenge for network-on-chip links. In this paper, we propose a coding scheme that simultaneously mitigates crosstalk effects on signal delay and detects up to seven random errors through wire duplication and simple parity checks calculated over the rows and columns of the two-dimensional data. This high error-detection capability enables a reduction of the operating voltage on the wires, leading to energy savings. The results show that the proposed scheme reduces energy consumption by up to 53% compared to other schemes at iso-reliability, despite the increase in the overhead number of wires.
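The row/column parity idea can be sketched as follows; this toy version shows detection only and omits the wire duplication and the specific codeword layout of the proposed scheme.

```python
import numpy as np

def encode_2d_parity(bits_2d):
    """Compute even-parity bits over each row and each column of a 2-D block."""
    rows = bits_2d.sum(axis=1) % 2
    cols = bits_2d.sum(axis=0) % 2
    return bits_2d, rows, cols

def detect_errors(bits_2d, rows, cols):
    """True if the received block disagrees with its row/column parities."""
    return bool(np.any(bits_2d.sum(axis=1) % 2 != rows) or
                np.any(bits_2d.sum(axis=0) % 2 != cols))

data = np.array([[1, 0, 1, 1],
                 [0, 1, 1, 0],
                 [1, 1, 0, 0]])
block, r, c = encode_2d_parity(data)

corrupted = block.copy()
corrupted[1, 2] ^= 1          # flip one bit "in transit"
```

A single flipped bit disturbs both its row parity and its column parity, which is what gives the two-dimensional check its detection strength at low cost.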
Error control schemes have become a necessity in network-on-chip (NoC) design to improve reliability, as on-chip interconnect errors increase with the continuous shrinking of geometry. Accordingly, many researchers have tried to present multi-bit error-correcting coding schemes that achieve high error-correction capability with the simplest possible design to minimize area and power consumption. A recent scheme, Multi-bit Error Correcting Coding with Reduced Link Bandwidth (MECCRLB), showed a huge reduction in area and power consumption compared to a well-known scheme, namely, Hamming product code (HPC) with Type-II HARQ. Moreover, its authors showed that the proposed scheme can correct 11 random errors, which is considered a high error-correction capability.
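For reference, the single-error-correcting building block that a Hamming product code composes can be sketched with the classic systematic Hamming(7,4) code; this is a textbook illustration, not the MECCRLB scheme itself.

```python
import numpy as np

# Systematic Hamming(7,4): generator G and parity-check matrix H over GF(2)
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(msg4):
    return (msg4 @ G) % 2

def decode(recv7):
    s = (H @ recv7) % 2
    if s.any():
        # the syndrome equals the column of H at the flipped position
        err = int(np.flatnonzero((H.T == s).all(axis=1))[0])
        recv7 = recv7.copy()
        recv7[err] ^= 1
    return recv7[:4]          # systematic: first four bits carry the message

msg = np.array([1, 0, 1, 1])
cw = encode(msg)
cw_err = cw.copy()
cw_err[5] ^= 1                # single random bit flip on the link
```

A product code arranges such codewords in rows and columns of a matrix, which is how HPC-style schemes reach multi-bit correction from a single-error-correcting component.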