The demand for single-photon sources in quantum key distribution (QKD) systems has necessitated the use of weak coherent pulses (WCPs) characterized by a Poissonian photon-number distribution. Ensuring security against eavesdropping attacks requires keeping the mean photon number (µ) small and known to the legitimate parties. However, accurately determining µ is challenging because of discrepancies between theoretical calculations and practical implementation. This paper presents two experiments. In the first, µ was calculated theoretically for WCPs generated with a set of optical filters. In the second, the WCPs were generated with a variable attenuator, and µ was estimated from the photons detected by a BB84 detection setup. The second experiment provides a more accurate estimate of µ because it uses single-photon detectors with high timing resolution and low dark counts, together with a time-to-digital converter with a bin size of 81 ps.
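As a back-of-the-envelope illustration of how µ can be recovered from detector clicks (this is not the paper's procedure; the detector efficiency, dark-count probability, and count figures below are assumed purely for illustration), a Poissonian source yields a per-pulse click probability p_click = 1 − (1 − p_dark)·e^(−ηµ), which can be inverted for µ:

```python
import math

def estimate_mu(clicks, pulses, efficiency, dark_prob=0.0):
    """Estimate the mean photon number mu of a Poissonian WCP source
    from single-photon-detector click statistics.

    For a Poisson source with mean mu, a detector with efficiency
    `efficiency` and per-pulse dark-count probability `dark_prob`
    fires with probability
        p_click = 1 - (1 - dark_prob) * exp(-efficiency * mu),
    so mu follows from the measured click fraction.
    """
    p_click = clicks / pulses
    p_signal = 1.0 - (1.0 - p_click) / (1.0 - dark_prob)
    return -math.log(1.0 - p_signal) / efficiency

# Hypothetical numbers for illustration only:
mu = estimate_mu(clicks=9_800, pulses=1_000_000, efficiency=0.10, dark_prob=1e-6)
print(f"estimated mu = {mu:.3f}")   # about 0.1, a typical QKD operating point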
The first aim of this paper was to evaluate the push-out bond strength of the gutta-percha coating of Thermafil and GuttaCore obturators and compare it with that of the gutta-percha used to coat an experimental hydroxyapatite/polyethylene (HA/PE) obturator. The second aim was to assess the thickness of gutta-percha around the carriers of GuttaCore and HA/PE obturators using microcomputed tomography (micro-CT).
Confocal microscope imaging has become popular in biotechnology laboratories. Confocal imaging uses fluorescence optics: laser light is focused onto a specific spot at a defined depth in the sample. A considerable number of images are produced routinely in the course of research, and these images require unbiased quantification methods for meaningful analysis. Increasing efforts to tie reimbursement to outcomes will likely increase the need for objective data in analyzing confocal microscope images in the coming years. Visual quantification of confocal images by the naked eye is an essential but often underreported outcome measure because of the time required for manual counting and …
Computer models are used in the study of electrocardiography to provide insight into physiological phenomena that are difficult to measure in the lab or in a clinical environment.
The electrocardiogram is an important clinical tool because it changes characteristically in a number of pathological conditions, so many illnesses can be detected from this measurement. By simulating the electrical activity of the heart, one obtains a quantitative relationship between the electrocardiogram and different anomalies.
Because of the inhomogeneous fibrous structure of the heart and the irregular geometry of the body, the finite element method is used to study the electrical properties of the heart.
This work describes …
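To make the finite element idea concrete, here is a minimal one-dimensional sketch that solves a steady conduction problem, −d/dx(σ dφ/dx) = f, with linear elements. The constant conductivity, source term, and mesh size are arbitrary illustrative choices; the paper's actual heart/torso model is three-dimensional and inhomogeneous.

```python
import numpy as np

# Minimal 1D finite element sketch for a volume-conductor-type problem:
# solve -d/dx( sigma * dphi/dx ) = f on [0, 1] with phi(0) = phi(1) = 0.
n_el = 50                      # number of linear elements
n_nodes = n_el + 1
h = 1.0 / n_el                 # uniform element length
sigma = 1.0                    # constant conductivity (assumed)
f = 1.0                        # constant source term (assumed)

K = np.zeros((n_nodes, n_nodes))   # global stiffness matrix
F = np.zeros(n_nodes)              # global load vector

ke = (sigma / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])  # element stiffness
fe = f * h / 2.0 * np.ones(2)                            # element load

for e in range(n_el):              # assemble element contributions
    idx = [e, e + 1]
    K[np.ix_(idx, idx)] += ke
    F[idx] += fe

# Apply homogeneous Dirichlet conditions by reducing to interior nodes.
phi = np.zeros(n_nodes)
phi[1:-1] = np.linalg.solve(K[1:-1, 1:-1], F[1:-1])

# Exact solution for this test case: phi = f/(2*sigma) * x * (1 - x).
x = np.linspace(0.0, 1.0, n_nodes)
print("max error:", np.abs(phi - f / (2 * sigma) * x * (1 - x)).max())
```

Real cardiac simulations replace this scalar conductivity with an anisotropic tensor aligned to the fiber direction, but the assembly-and-solve structure is the same.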
Among the more than 200 human papillomavirus (HPV) genotypes, low-oncogenic-risk genotypes have been associated with a variety of oral, oropharyngeal, and nasopharyngeal benign tumors, as well as non-neoplastic polyposis, papillomas, and adenoid hypertrophy. This prospective case-control study aims to determine the rate of DNA detection of HPV genotype 6/11 in nasopharyngeal adeno-tonsillar tissues from a group of pediatric patients who underwent adenoidectomy for adenoid hypertrophy. A total of 60 specimens were enrolled: 40 nasopharyngeal adeno-tonsillar tissues from patients with adenoid hypertrophy and 20 normal nasal tissue specimens as controls.
Acceptance sampling plans for the generalized exponential distribution, when the life test is truncated at a pre-determined time, are provided in this article. The two parameters (α, λ) (the shape and scale parameters, respectively) are estimated by LSE and WLSE, and the best estimators for various sample sizes are used to find the ratio of the true mean life to the pre-determined time, and to find the smallest sample size required to ensure the producer's risk with a pre-fixed probability (1 − P*). The results of the estimation and of the sampling plans are provided in tables.
Key words: Generalized Exponential Distribution, Acceptance Sampling Plan, Consumer's and Producer's Risks
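As a sketch of the sampling-plan logic (the distribution parameters, truncation time, and acceptance number below are illustrative values, not figures from the paper's tables), the smallest sample size n can be found by increasing n until the probability of accepting a lot, given failure probability p = F(t0) at the truncation time, drops to at most 1 − P*:

```python
from math import comb, exp

def ge_cdf(t, alpha, lam):
    """CDF of the generalized exponential distribution:
    F(t) = (1 - exp(-lam * t)) ** alpha for t > 0."""
    return (1.0 - exp(-lam * t)) ** alpha

def min_sample_size(alpha, lam, t0, c, p_star):
    """Smallest n such that a lot whose items fail before the
    truncation time t0 with probability p = F(t0) is accepted with
    probability at most 1 - p_star, for acceptance number c.

    The lot is accepted if at most c failures occur among n items:
        P(accept) = sum_{i=0..c} C(n, i) p^i (1 - p)^(n - i).
    """
    p = ge_cdf(t0, alpha, lam)
    n = c + 1
    while True:
        p_accept = sum(comb(n, i) * p**i * (1 - p)**(n - i)
                       for i in range(c + 1))
        if p_accept <= 1.0 - p_star:
            return n
        n += 1

# Hypothetical plan for illustration only:
print(min_sample_size(alpha=2.0, lam=1.0, t0=0.5, c=2, p_star=0.95))
```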
The objective of this study was to introduce a recursive least squares (RLS) parameter estimator enhanced by a neural network (NN) to reduce the bit error rate (BER) during channel estimation of a multiple-input multiple-output orthogonal frequency-division multiplexing (MIMO-OFDM) system over a Rayleigh multipath fading channel. Recursive least squares is an efficient approach to neural network training: first, the neural network estimator learns to adapt to the channel variations, then it estimates the channel frequency response. Simulation results show that the proposed method outperforms the conventional least squares (LS) method and the original RLS, and it is more robust …
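For reference, a conventional complex-valued RLS tap estimator over a simulated multipath channel might look like the sketch below. This is the baseline RLS recursion only, not the paper's NN-enhanced scheme, and the tap count, forgetting factor, and noise level are assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated FIR multipath channel and QPSK pilot sequence (illustrative).
n_taps = 4
h_true = (rng.standard_normal(n_taps) + 1j * rng.standard_normal(n_taps)) / np.sqrt(2)
n_pilots = 200
pilots = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], n_pilots) / np.sqrt(2)
noise = 0.05 * (rng.standard_normal(n_pilots) + 1j * rng.standard_normal(n_pilots))
received = np.convolve(pilots, h_true)[:n_pilots] + noise

lam = 0.99                                # forgetting factor
w = np.zeros(n_taps, dtype=complex)       # channel tap estimates
P = np.eye(n_taps, dtype=complex) * 1e3   # inverse correlation matrix

x = np.zeros(n_taps, dtype=complex)       # sliding regressor of recent pilots
for k in range(n_pilots):
    x = np.roll(x, 1)
    x[0] = pilots[k]
    # Standard complex RLS update equations:
    g = P @ x.conj() / (lam + x @ P @ x.conj())   # gain vector
    e = received[k] - w @ x                        # a-priori error
    w = w + g * e                                  # update tap estimates
    P = (P - np.outer(g, x @ P)) / lam             # update inverse correlation

print("tap estimation error:", np.linalg.norm(w - h_true))
```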
In this paper, wavelets were used to study multivariate fractional Brownian motion through the deviations of the random process, in order to obtain an efficient estimate of the Hurst exponent. Simulation experiments showed that the proposed estimator performs efficiently. The estimation exploits the stationarity of the detail coefficients of the wavelet transform, whose variance exhibits power-law behavior across scales. Two wavelet filters (Haar and db5) were used to minimize the mean square error of the model.
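A minimal sketch of the scale-variance regression described above, assuming the PyWavelets library and using an ordinary Brownian path (H = 0.5) as the test signal; the decomposition depth and filter choice below are illustrative, not the paper's settings:

```python
import numpy as np
import pywt

def hurst_wavelet(x, wavelet="haar", max_level=6):
    """Estimate the Hurst exponent of a fractional-Brownian-like signal
    from the power-law decay of wavelet detail-coefficient variances.

    For fBm with Hurst exponent H, Var(d_j) ~ 2**(j * (2H + 1)), so a
    regression of log2-variance against scale j has slope 2H + 1.
    """
    coeffs = pywt.wavedec(x, wavelet, level=max_level)
    details = coeffs[1:]                      # detail coefficients, coarsest first
    scales = np.arange(len(details), 0, -1)   # scale index j, finest level is j = 1
    log_var = np.array([np.log2(np.var(d)) for d in details])
    slope, _ = np.polyfit(scales, log_var, 1)
    return (slope - 1.0) / 2.0

# Sanity check on ordinary Brownian motion, where H should be near 0.5:
rng = np.random.default_rng(1)
bm = np.cumsum(rng.standard_normal(2**14))
print("estimated H:", hurst_wavelet(bm))           # Haar filter
print("estimated H:", hurst_wavelet(bm, "db5"))    # db5 filter, as in the paper
```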