The demand for single-photon sources in quantum key distribution (QKD) systems has necessitated the use of weak coherent pulses (WCPs), characterized by a Poissonian photon-number distribution. Security against eavesdropping attacks requires that the mean photon number (µ) be kept small and known to the legitimate partners. However, accurately determining µ is challenging because of discrepancies between theoretical calculations and practical implementation. This paper presents two experiments. In the first, µ is calculated theoretically for WCPs generated with several filters. In the second, the WCPs are generated with a variable attenuator, and µ is estimated from the photons detected by the BB84 detection setup. The second experiment provides a more accurate estimate of µ because it uses single-photon detectors with high timing resolution and low dark counts, together with a time-to-digital converter with a bin size of 81 ps.
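The security argument above rests on the Poissonian statistics of WCPs: for small µ, most pulses are empty, a few carry one photon, and only a tiny fraction carry the multi-photon states an eavesdropper can exploit. The following sketch (not the paper's code; the value µ = 0.1 is illustrative) computes these probabilities directly from the Poisson formula P(n) = e^(−µ) µⁿ / n!:

```python
from math import exp, factorial

def poisson_pn(mu: float, n: int) -> float:
    """Probability that a weak coherent pulse with mean photon number mu
    contains exactly n photons (Poissonian distribution)."""
    return exp(-mu) * mu**n / factorial(n)

mu = 0.1                      # illustrative mean photon number, not the paper's value
p0 = poisson_pn(mu, 0)        # empty pulse
p1 = poisson_pn(mu, 1)        # desired single-photon pulse
p_multi = 1.0 - p0 - p1       # insecure multi-photon pulse
```

For µ = 0.1 the multi-photon fraction is roughly two orders of magnitude below the single-photon fraction, which is why keeping µ small (and accurately known) matters for security.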
Three soil isolates were obtained and classified as Bacillus: the spore-forming G3, together with G12 and G27. NTG treatment killed part of the cells of the three isolates to varying degrees; the treatment also induced mutations conferring resistance to streptomycin and rifampicin, as well as double mutations.
The primary function of commercial banks is converting liquid liabilities (such as deposits) into illiquid assets (loans) and liquid assets (cash and cash equivalents) in a balanced manner that preserves the rights of both depositors and the bank, rather than converting a very large proportion of liquid liabilities into liquid assets. This follows from their role as depository institutions intermediating between supply and demand. Therefore, high indicators of bank liquidity and solvency may, to some extent, give a misleading picture of the status of commercial banks in terms of the strength of their balance sheets and
In this paper, we study a non-parametric model in which the response variable has missing data (non-response) under the MCAR missing-data mechanism. We propose kernel-based non-parametric single imputation of the missing values and compare it with nearest-neighbour imputation through simulation over several models and cases of sample size, variance, and rate of missing data.
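The two imputation strategies compared above can be sketched as follows. This is an illustrative implementation under stated assumptions, not the paper's simulation code: kernel-based single imputation is taken to mean a Nadaraya-Watson estimate at each covariate whose response is missing, with a Gaussian kernel and bandwidth h, while the nearest-neighbour method copies the response of the closest observed covariate; the model, bandwidth, and missing rate are all hypothetical choices:

```python
import numpy as np

def gaussian_kernel(u):
    # Gaussian kernel; the normalizing constant cancels in the ratio below
    return np.exp(-0.5 * u ** 2)

def kernel_impute(x_obs, y_obs, x_miss, h=0.1):
    """Impute each missing response by the Nadaraya-Watson kernel estimate
    at its covariate: sum(w*y) / sum(w) with w = K((x_obs - x0)/h)."""
    out = []
    for x0 in x_miss:
        w = gaussian_kernel((x_obs - x0) / h)
        out.append(float(np.sum(w * y_obs) / np.sum(w)))
    return np.array(out)

def nn_impute(x_obs, y_obs, x_miss):
    """Impute each missing response by copying the response of the
    nearest observed covariate (nearest-neighbour imputation)."""
    return np.array([y_obs[np.argmin(np.abs(x_obs - x0))] for x0 in x_miss])

# Toy MCAR simulation: missingness is independent of both x and y
rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.1, 200)
miss = rng.random(200) < 0.2          # ~20% missing completely at random
x_obs, y_obs = x[~miss], y[~miss]
y_hat_kernel = kernel_impute(x_obs, y_obs, x[miss])
y_hat_nn = nn_impute(x_obs, y_obs, x[miss])
```

Comparing the imputed values against the true regression function at the missing points is then the basis for the simulation study described in the abstract.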
Graphite coated electrodes (GCE) based on molecularly imprinted polymers were fabricated for the selective potentiometric determination of risperidone (Ris). The molecularly imprinted (MIP) and non-imprinted (NIP) polymers were synthesized by bulk polymerization using Ris as a template, acrylic acid (AA) and acrylamide (AAm) as monomers, ethylene glycol dimethacrylate (EGDMA) as a cross-linker, and benzoyl peroxide (BPO) as an initiator. The imprinted and non-imprinted membranes were prepared using dioctyl phthalate (DOP) and dibutyl phthalate (DBP) as plasticizers in a PVC matrix. The membranes were coated on graphite electrodes. The MIP electrodes using
Grabisch and Labreuche have proposed a generalization of capacities, called bi-capacities. Recently, a new approach for studying bi-capacities through a notion of ternary-element sets was proposed by the author. In this paper, we present several results based on our approach, including the bipolar Möbius transform, the importance index, and the interaction index of bi-capacities.
A new modified differential evolution algorithm, DE-BEA, is proposed to improve the reliability of the standard DE/current-to-rand/1/bin by implementing a new mutation scheme inspired by the bacterial evolutionary algorithm (BEA). The crossover and selection schemes of the DE method are also modified to fit the new DE-BEA mechanism. The new scheme diversifies the population by applying to all individuals a segment-based scheme that generates multiple copies (clones) of each individual one by one and applies the BEA segment-wise mechanism. These new steps are embedded in the DE/current-to-rand/1/bin scheme. The performance of the new algorithm has been compared with several DE variants over eighteen benchmark functions including sever
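The baseline the abstract refers to can be made concrete. The sketch below shows the standard DE/current-to-rand/1 donor vector, plus an illustrative BEA-style step that makes clones of an individual and perturbs one random segment of each; this is a hedged reading of the scheme described in the abstract, not the authors' implementation, and all parameter values (F, K, number of clones, segment length) are assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

def de_current_to_rand_1(pop, i, F=0.5, K=0.5):
    """Donor vector for individual i under the standard DE/current-to-rand/1
    scheme: x_i + K*(x_r1 - x_i) + F*(x_r2 - x_r3), with r1, r2, r3 distinct
    indices different from i."""
    candidates = [j for j in range(len(pop)) if j != i]
    r1, r2, r3 = rng.choice(candidates, size=3, replace=False)
    return pop[i] + K * (pop[r1] - pop[i]) + F * (pop[r2] - pop[r3])

def bea_segment_clones(individual, n_clones=3, seg_len=1, scale=0.3):
    """Illustrative BEA-style step: create clones of an individual, each with
    one randomly chosen segment perturbed (the paper's exact operator may differ)."""
    clones = np.tile(individual, (n_clones, 1))
    for c in clones:
        s = rng.integers(0, len(individual) - seg_len + 1)
        c[s:s + seg_len] += rng.normal(0.0, scale, seg_len)
    return clones

# Toy usage: 10 individuals in 3 dimensions
pop = rng.uniform(-5.0, 5.0, size=(10, 3))
donor = de_current_to_rand_1(pop, 0)
clones = bea_segment_clones(pop[0])
```

In a full DE loop the donor would then pass through crossover and greedy selection against the parent, which are the stages the paper also modifies.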
Image compression is a serious issue in computer storage and transmission; it makes efficient use of the redundancy embedded within an image itself and, in addition, may exploit the limitations of human vision or perception to discard imperceivable information. Polynomial coding is a modern image compression technique based on a modelling concept that effectively removes the spatial redundancy embedded within the image; it is composed of two parts, the mathematical model and the residual. In this paper, a two-stage technique is proposed: the first stage utilizes the lossy predictor model along with multiresolution base and thresholding techniques; the latter incorporates the near lossless com
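The model-plus-residual decomposition at the heart of polynomial coding can be illustrated with a common first-order block model, a0 + a1·(x − xc) + a2·(y − yc), fitted per block by least squares; what remains after subtracting the fitted surface is the residual to be coded. This is a generic sketch of the modelling concept, not the paper's predictor (the block size and model order are assumptions):

```python
import numpy as np

def polynomial_model_block(block):
    """Fit the first-order polynomial model a0 + a1*(x - xc) + a2*(y - yc)
    to one image block by least squares; return (coefficients, residual).
    Coordinates are centred so a0 equals the block mean."""
    h, w = block.shape
    yy, xx = np.mgrid[0:h, 0:w]
    xc, yc = (w - 1) / 2.0, (h - 1) / 2.0
    A = np.column_stack([np.ones(block.size),
                         (xx - xc).ravel(),
                         (yy - yc).ravel()])
    coeffs, *_ = np.linalg.lstsq(A, block.ravel().astype(float), rcond=None)
    model = (A @ coeffs).reshape(h, w)
    return coeffs, block - model

# A perfectly planar 4x4 block is captured exactly: the residual vanishes
yy, xx = np.mgrid[0:4, 0:4]
plane = 3.0 + 1.0 * xx + 2.0 * yy
coeffs, residual = polynomial_model_block(plane)
```

On real image blocks the residual is small but nonzero, and quantizing or thresholding it is what makes the scheme lossy or near-lossless.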