As the bit rate of fiber-optic transmission systems is increased beyond a certain value, the system suffers from an important random phenomenon called polarization mode dispersion (PMD). This phenomenon contributes to pulse broadening, power reduction, timing jitter, and shape distortion. Timing jitter means that the pulse center shifts to the left or right, so timing jitter leads to interference between neighboring pulses. On the other hand, increasing the bit period to compensate prevents transmission at high rates. In this paper, an accurate mathematical analysis for increasing the transmission rate, accounting for all the physical random variables that determine the transmission rate, is presented. New mathematical expressions are then derived for pulse power, peak power, timing jitter, pulse width, and power penalty. On the basis of these formulas, one can choose operating values that reduce or prevent the effects of polarization mode dispersion.
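The paper's derived expressions are not reproduced in the abstract; for orientation, the standard first-order PMD relations below show the quantities involved (textbook forms, not the paper's new formulas; the constant A is receiver-dependent):

```latex
% D_{PMD}: fiber PMD coefficient, L: fiber length, T: bit period,
% \gamma: power-splitting ratio between the two principal states of
% polarization, A: receiver-dependent constant.
\langle \Delta\tau \rangle = D_{\mathrm{PMD}} \sqrt{L}
  \qquad \text{(mean differential group delay)}
\sigma_{\mathrm{out}}^{2} = \sigma_{\mathrm{in}}^{2}
  + \gamma(1-\gamma)\,\Delta\tau^{2}
  \qquad \text{(broadened RMS pulse width)}
\varepsilon_{\mathrm{dB}} \approx A\,\gamma(1-\gamma)
  \left(\frac{\Delta\tau}{T}\right)^{2}
  \qquad \text{(first-order power penalty)}
```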
The Dirichlet process is a fundamental object in nonparametric Bayesian modelling, applied to a wide range of problems in machine learning, statistics, and bioinformatics, among other fields. This flexible stochastic process models rich data structures with an unknown or evolving number of clusters, and it is a valuable tool for encoding the true complexity of real-world data in computer models. Our results show that the Dirichlet process improves, both in distribution density and in signal-to-noise ratio, with larger sample sizes; exhibits a slow decay rate toward its base distribution; shows improved convergence and stability; and performs markedly better with a Gaussian base distribution than with a Gamma base distribution.
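As an illustration of the construction behind such results, the sketch below draws a truncated stick-breaking sample from a Dirichlet process with a Gaussian base distribution. The truncation level, concentration parameter, and base-distribution parameters are illustrative assumptions, not values from the study.

```python
import numpy as np

def stick_breaking_dp(alpha, base_sampler, truncation, rng):
    """Truncated stick-breaking sample from DP(alpha, G0).

    Returns atom locations (drawn i.i.d. from the base distribution G0)
    and their stick-breaking weights, which sum to ~1 for a large
    truncation level.
    """
    betas = rng.beta(1.0, alpha, size=truncation)            # stick fractions
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
    weights = betas * remaining                               # pi_k
    atoms = base_sampler(truncation)                          # theta_k ~ G0
    return atoms, weights

rng = np.random.default_rng(0)
# Gaussian base distribution G0 = N(0, 1) -- illustrative choice.
atoms, weights = stick_breaking_dp(
    alpha=2.0,
    base_sampler=lambda k: rng.normal(0.0, 1.0, size=k),
    truncation=100,
    rng=rng,
)
# Draw observations from the resulting discrete random measure.
draws = rng.choice(atoms, size=500, p=weights / weights.sum())
```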
The adsorption of acid fuchsin dye (AFD) on Zeolite 5A is investigated using batch-scale experiments arranged according to a statistical design. Adsorption isotherms, kinetics, and thermodynamics were examined. Results showed a maximum adsorption capacity of 93.68751 mg/g for the zeolite. The experimental data were found to fit the Langmuir isotherm and pseudo-second-order kinetics, with a maximum removal of about 95%. Thermodynamic analysis showed that the adsorption is endothermic. The most influential operating variables were optimized, and a model equation for the predicted efficiency was proposed.
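For reference, the two models the data were fitted to have the standard forms below (textbook expressions; the fitted parameter values from the study are not reproduced):

```latex
% Langmuir isotherm: q_e is the equilibrium uptake (mg/g), q_m the
% maximum monolayer capacity, K_L the Langmuir constant, C_e the
% equilibrium dye concentration.
q_e = \frac{q_m K_L C_e}{1 + K_L C_e}
% Pseudo-second-order kinetics (linearized): q_t is the uptake at
% time t and k_2 the rate constant.
\frac{t}{q_t} = \frac{1}{k_2 q_e^{2}} + \frac{t}{q_e}
```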
This work defines new generalized gamma and beta functions involving the recently proposed seven-parameter Mittag-Leffler function, followed by a review of all related special cases. In addition, the necessary investigations are carried out for the new generalized beta function, including its Mellin transform, differential formulas, integral representations, and essential summation relations. Furthermore, a statistical application of the new generalized beta function is presented.
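The paper's seven-parameter kernel is not reproduced here; as orientation, such generalizations typically start from the classical Euler integral and the Mittag-Leffler series, with a Mittag-Leffler-type kernel inserted into the integrand. The extension pattern shown last is an assumption about the construction, not the paper's definition:

```latex
% Classical Euler beta function and two-parameter Mittag-Leffler series.
B(x, y) = \int_{0}^{1} t^{x-1} (1 - t)^{y-1}\, dt,
  \qquad \Re(x), \Re(y) > 0,
E_{\alpha,\beta}(z) = \sum_{k=0}^{\infty} \frac{z^{k}}{\Gamma(\alpha k + \beta)}.
% A typical generalization inserts a Mittag-Leffler-type kernel into
% the Euler integral (parameters illustrative):
B^{(\cdot)}(x, y; p) = \int_{0}^{1} t^{x-1} (1 - t)^{y-1}
  E_{\alpha,\beta}\!\left(-\frac{p}{t(1-t)}\right) dt.
```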
To evaluate indoor-design data securely, this research introduces a Cyber-Neutrosophic Model that combines AES-256 encryption, role-based access control, and real-time anomaly detection. The model quantifies the degrees of unpredictability, insecurity, and variance present in its features while providing reliable data security. The study's final results are consistent with the Cyber-Neutrosophic Model analysis, and the cybersecurity layer helped mitigate attacks. Notably, the anomaly detection achieved response times of less than 2.5 seconds, demonstrating that the model can maintain integrity while preserving privacy.
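A minimal sketch of the neutrosophic side of such a model, assuming single-valued neutrosophic triples and a common score function for ranking them; the feature names and values are hypothetical, not taken from the study:

```python
from dataclasses import dataclass

@dataclass
class NeutrosophicValue:
    """Single-valued neutrosophic triple: degrees of truth (T),
    indeterminacy (I), and falsity (F), each in [0, 1]."""
    t: float
    i: float
    f: float

    def score(self) -> float:
        # Common score function for ranking single-valued
        # neutrosophic numbers: higher means more reliable.
        return (2.0 + self.t - self.i - self.f) / 3.0

# Hypothetical feature assessments -- illustrative values only.
features = {
    "access_pattern": NeutrosophicValue(t=0.85, i=0.10, f=0.05),
    "sensor_feed":    NeutrosophicValue(t=0.60, i=0.30, f=0.15),
}
for name, val in features.items():
    print(f"{name}: score={val.score():.3f}")
```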
The logistic regression model is one of the oldest and most common regression models. It is a statistical method used to describe and estimate the relationship between a dependent random variable and explanatory random variables. Several methods are used to estimate this model, including the bootstrap method, an estimation technique based on resampling with replacement: each bootstrap sample consists of n elements drawn at random, with replacement, from the N original observations. It is a computational method used to quantify the accuracy of statistical estimates, and for this reason it was used here to obtain more accurate estimates.
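A minimal sketch of this procedure, assuming synthetic data and scikit-learn's LogisticRegression (the paper's actual data and estimator settings are not reproduced):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Synthetic data standing in for the study's observations.
N = 200
X = rng.normal(size=(N, 2))
y = (X @ np.array([1.5, -2.0]) + rng.logistic(size=N) > 0).astype(int)

def bootstrap_coefs(X, y, n_boot=1000):
    """Refit the model on samples of size N drawn with replacement."""
    coefs = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(y), size=len(y))  # sample with replacement
        model = LogisticRegression().fit(X[idx], y[idx])
        coefs.append(model.coef_.ravel())
    return np.array(coefs)

coefs = bootstrap_coefs(X, y)
print("bootstrap coefficient means:", coefs.mean(axis=0))
print("bootstrap standard errors:  ", coefs.std(axis=0, ddof=1))
```

The spread of the refitted coefficients across bootstrap samples serves as the accuracy measure the abstract describes.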
This research aims at a number of objectives, including developing the tax examination process and raising its efficiency without relying on the comprehensive-examination method, by using statistical methods in tax examination; and discussing the most important concepts related to the statistical methods used in tax examination, showing their importance and how they are applied. The research is an applied study in the General Commission of Taxes. To achieve its objectives, the research used the descriptive (analytical) approach on the theoretical side and, on the practical side, applied statistical methods to a sample of the final accounts of a contracting company (limited) and the pharmaceutical industry.
There is a need to detect and investigate the causes of marsh pollution, to evaluate it accurately in a statistical study, and to submit the results to the competent authorities. To achieve this goal, factor analysis was applied to a sample of marsh-water pollutant measurements: electrical conductivity (EC), power of hydrogen (pH), temperature (T), turbidity (TU), total dissolved solids (TDS), and dissolved oxygen (DO). A sample of 44 sites was drawn and examined in the laboratories of the Iraqi Ministry of Environment, and the results were obtained with the SPSS program. The most important recommendation was to increase the pumping of additional water into the marshes.
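A minimal sketch of this kind of analysis in Python rather than SPSS, assuming the six measured variables above and randomly generated placeholder data (the study's actual laboratory measurements are not reproduced):

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(7)
variables = ["EC", "pH", "T", "TU", "TDS", "DO"]

# Placeholder measurements for 44 sites -- random stand-ins, not the
# Ministry of Environment laboratory data.
X = rng.normal(size=(44, len(variables)))
X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize before factoring

fa = FactorAnalysis(n_components=2, random_state=0).fit(X)
for var, loadings in zip(variables, fa.components_.T):
    print(f"{var}: factor loadings = {np.round(loadings, 3)}")
```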