This work evaluates the quality of direct-detection and coherent optical communication systems linking two computers, where system quality is expressed by the signal-to-noise ratio (SNR) and the bit error rate (BER). The first part of the work implements a direct optical fiber communication system and measures its quality; the second part implements both homodyne and heterodyne coherent optical fiber communication systems and measures their quality. A 1310 nm laser diode with its drive circuit is used in the transmitter, a 62.11 km single-mode optical fiber is selected as the transmission medium, and a PIN photodetector is used in the receiver. An optical D-coupler combines the optical signal coming from the transmitter laser source with the signal of a local-oscillator laser at 1310/1550 nm to obtain coherent detection. Results show that for direct detection the SNR and BER are 28.5 dB and 9.64×10⁻⁸, respectively, while for homodyne and heterodyne coherent detection the SNR values are 94.36 dB and 97.71 dB and the BER values are 1.32×10⁻²² and 2.43×10⁻²³ at the maximum fiber length of 62.11 km. These results show that homodyne and heterodyne detection outperform direct detection, yielding a larger output SNR and a lower BER for the received signal.
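The abstract does not state how BER was derived from SNR, so the following is only a generic sketch of the standard dB-to-linear conversion and the Gaussian Q-factor BER relation commonly used for binary optical signalling; the function names and the assumption BER = ½·erfc(Q/√2) are illustrative, not the paper's method.

```python
import math

def db_to_linear(snr_db):
    """Convert an SNR given in dB to a linear power ratio."""
    return 10 ** (snr_db / 10)

def ber_from_q(q):
    """BER for binary signalling in Gaussian noise: BER = 0.5*erfc(Q/sqrt(2))."""
    return 0.5 * math.erfc(q / math.sqrt(2))
```

For example, a Q-factor of 6 corresponds to a BER on the order of 10⁻⁹ under this relation.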
In this paper we study the selection of cognitive elements and criteria of the inflectional structure of the Russian and Arabic languages in the process of speech communication. The phonetic-physiological principle is the main parameter by which the elements and criteria of cognitive activity are distinguished in the present study, and on the basis of this parameter we select the criteria and elements under investigation. The first criterion is semantic: it reflects the correspondence of the elements of thinking to sound combinations in the studied languages and allows us to distinguish the second, morphonological criterion, which depends on the phonetic changes of these combinations occurring in the process of speech activity
Canonical correlation analysis is one of the common methods for analyzing data and determining the relationship between two sets of variables under study, as it depends on analyzing the variance matrix or the correlation matrix. Researchers resort to many methods to estimate the canonical correlation (CC); some are biased by outliers, while others are resistant to those values. In addition, there are criteria that assess the efficiency of the estimation methods.
In our research, we dealt with robust estimation methods that depend on the correlation matrix in the analysis process to obtain a robust canonical correlation coefficient, namely the method of Biwe
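To illustrate the correlation-matrix route to the canonical correlation that the abstract describes, here is a minimal sketch of the classical (non-robust) estimator; a robust variant would replace `np.corrcoef` with a robust correlation estimate. The function name and data shapes are assumptions for illustration.

```python
import numpy as np

def first_canonical_correlation(X, Y):
    """First canonical correlation between X (n x p) and Y (n x q),
    computed from the blocks of the joint correlation matrix via the
    eigenproblem of Rxx^{-1} Rxy Ryy^{-1} Ryx."""
    n, p = X.shape
    R = np.corrcoef(np.hstack([X, Y]), rowvar=False)
    Rxx, Rxy = R[:p, :p], R[:p, p:]
    Ryx, Ryy = R[p:, :p], R[p:, p:]
    M = np.linalg.solve(Rxx, Rxy) @ np.linalg.solve(Ryy, Ryx)
    eigvals = np.linalg.eigvals(M)
    # eigenvalues are the squared canonical correlations
    return float(np.sqrt(np.max(eigvals.real)))
```

On strongly related sets of variables the estimate approaches 1; outliers in the data distort `np.corrcoef`, which is what motivates the robust alternatives the abstract studies.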
Some experiments require assessing their usefulness in order to decide whether to continue providing them. This is done through the fuzzy regression discontinuity model, where the Epanechnikov kernel and the triangular kernel were used to estimate the model by generating data from a Monte Carlo experiment and comparing the results obtained. It was found that the Epanechnikov kernel has the smallest mean squared error.
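The two kernels compared above have standard closed forms; a minimal sketch of both weight functions (vectorized over an array of scaled distances `u`):

```python
import numpy as np

def epanechnikov(u):
    """Epanechnikov kernel: K(u) = 0.75*(1 - u^2) for |u| <= 1, else 0."""
    u = np.asarray(u, dtype=float)
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u**2), 0.0)

def triangular(u):
    """Triangular kernel: K(u) = 1 - |u| for |u| <= 1, else 0."""
    u = np.asarray(u, dtype=float)
    return np.where(np.abs(u) <= 1, 1 - np.abs(u), 0.0)
```

Both integrate to 1 over [-1, 1]; the Epanechnikov kernel is the one with minimum asymptotic mean integrated squared error among second-order kernels, which is consistent with the comparison reported above.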
The Arab order is passing through a moment of acute political disruption and discord, with countless differences among its components and parts, owing to the suffering it has endured, and still endures, from internal disturbances caused by a lack of cohesion in intra-Arab cultural and historical relations and a lack of interaction among these components. As a result, narrow policies and interests have become the focus to an unprecedented degree, joint Arab action has been reduced to a minimum, and attention has turned to the special interests of individual regimes.
This concerns the internal suffering; externally, the Arab regional order is absent from any major influence on international decisions because of the courtesies which were ca
The present study aims at investigating classroom verbal and nonverbal communication in departments of English language. An observation checklist has been constructed and divided into several domains, each containing a number of items for investigating classroom communication. Face validity and the reliability coefficient have been computed. The checklist has been applied to 86 instructors at the Colleges of Education and Arts, Departments of English Language, at the Universities of ThiQar, Basrah, and Maysan. One-sample t-test and two-independent-samples t-test formulas have been used. Final results reveal that college instructors use verbal communication inside their classrooms, while nonverbal communication has not been employed by
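The two test statistics named above are standard; a minimal sketch with `scipy.stats`, using hypothetical ratings in place of the study's checklist data (the numbers below are illustrative only):

```python
from scipy import stats

# Hypothetical ratings standing in for the study's checklist scores.
verbal = [3.1, 2.9, 3.0, 3.2, 2.8]       # one group's ratings
nonverbal = [1.1, 0.9, 1.0, 1.2, 0.8]    # a second group's ratings

# One-sample t-test: does the group mean differ from a reference value?
t1, p1 = stats.ttest_1samp(verbal, popmean=3.0)

# Two-independent-samples t-test: do the two group means differ?
t2, p2 = stats.ttest_ind(verbal, nonverbal)
```

The one-sample test compares one group against a fixed benchmark; the two-sample test compares the two groups directly, matching the two formulas the study reports using.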
The transfer function model is one of the basic concepts in time series analysis and is used in the case of multivariate time series. The design of this model depends on the data and other information available in the series, so the representation of the transfer function model depends on the representation of the data. In this research, the transfer function has been estimated using nonparametric methods, represented by local linear regression and the cubic smoothing spline method, and a semiparametric method, represented by the semiparametric single-index model, with four proposals. The goal of this research is to compare the capabilities of the above-mentioned m
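One of the nonparametric smoothers named above, the cubic smoothing spline, is available directly in SciPy; the series below is a hypothetical stand-in for the transfer-function data, and the smoothing factor `s=0.5` is an illustrative choice, not a value from the research.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Hypothetical noisy series standing in for the transfer-function data.
x = np.linspace(0, 10, 50)
y = np.sin(x) + 0.1 * np.cos(3 * x)

# k=3 gives a cubic spline; s bounds the residual sum of squares,
# trading fidelity against smoothness.
spline = UnivariateSpline(x, y, k=3, s=0.5)
fitted = spline(x)
```

Smaller `s` tracks the data more closely; larger `s` produces a smoother curve, which is the same bias-variance trade-off governed by the bandwidth in local linear regression.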
The partial level density (PLD) of pre-equilibrium reactions described by Ericson's formula has been studied using different formulae for the single-particle level density. The parameter was taken from the equidistant spacing model (ESM) and the non-equidistant spacing model (non-ESM), and further formulae were derived from the relation between the single-particle level density and the level density parameter. The formulae used in this derivation are the Roher formula, the Egidy formula, the Yukawa formula, and the Thomas-Fermi formula. The partial level density results based on the Thomas-Fermi formula show good agreement with the experimental data.
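Ericson's formula referenced above has the standard textbook form (with g the single-particle level density, p the number of particles, h the number of holes, and E the excitation energy); this is the generic expression, not necessarily the exact variant used in the study:

```latex
\rho(p,h,E) = \frac{g\,(gE)^{p+h-1}}{p!\,h!\,(p+h-1)!}
```

The different formulae for g (ESM, non-ESM, Roher, Egidy, Yukawa, Thomas-Fermi) thus enter the PLD directly through the prefactor and the power of gE.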
Encryption translates data into another shape or symbol so that only people with access to the secret key or password can read it. Encrypted data are generally referred to as ciphertext, while unencrypted data are known as plaintext. Entropy can be used as a measure of the number of bits needed to code the data of an image: as the pixel values within an image are dispersed across more gray levels, the entropy increases. The aim of this research is to compare the CAST-128 method with a proposed adaptive key against the RSA encryption method for video frames, to determine which method is more accurate and yields the highest entropy. The first method is achieved by applying the "CAST-128" and
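The entropy measure described above is the Shannon entropy of the image's gray-level histogram; a minimal sketch for 8-bit images (the function name is illustrative):

```python
import numpy as np

def image_entropy(pixels, levels=256):
    """Shannon entropy in bits/pixel of an 8-bit image's gray-level histogram."""
    hist = np.bincount(np.asarray(pixels, dtype=np.uint8).ravel(),
                       minlength=levels)
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty levels (0*log 0 := 0)
    return float(-np.sum(p * np.log2(p)))
```

A constant image gives 0 bits, while a perfectly uniform spread over all 256 gray levels gives the maximum of 8 bits, which is why higher entropy indicates a cipher image closer to random.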