The demand for single-photon sources in quantum key distribution (QKD) systems has led to the widespread use of weak coherent pulses (WCPs), whose photon number follows a Poissonian distribution. Security against eavesdropping attacks requires that the mean photon number (µ) be kept small and known to the legitimate parties. However, accurately determining µ is challenging because of discrepancies between theoretical calculations and practical implementation. This paper presents two experiments. In the first, µ is calculated theoretically for WCPs generated with several filters. In the second, the WCPs are generated with a variable attenuator and µ is estimated from the photons registered by a BB84 detection setup. The second experiment provides an accurate estimate of µ because it uses single-photon detectors with high timing resolution and low dark counts, together with a time-to-digital converter with a bin size of 81 ps.
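As a rough illustration of the Poissonian statistics the abstract relies on: for a WCP with mean photon number µ, the probability of a multi-photon pulse, which is the security-relevant quantity, is P(n ≥ 2) = 1 − e^(−µ)(1 + µ). The sketch below (not the authors' code; generic Poisson arithmetic only) shows why µ must stay small.

```python
import math

def poisson_pmf(n, mu):
    """Probability that a weak coherent pulse contains exactly n photons."""
    return math.exp(-mu) * mu**n / math.factorial(n)

def multiphoton_prob(mu):
    """P(n >= 2): pulses with more than one photon, which are vulnerable
    to photon-number-splitting attacks."""
    return 1.0 - math.exp(-mu) * (1.0 + mu)

for mu in (0.1, 0.5, 1.0):
    print(f"mu={mu}: P(n=1)={poisson_pmf(1, mu):.4f}, "
          f"P(n>=2)={multiphoton_prob(mu):.4f}")
```

At µ = 0.1, fewer than 0.5% of pulses are multi-photon, which is why typical QKD systems operate at such low mean photon numbers.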
In this research, we study the inverse Gompertz (IG) distribution and estimate its survival function. The survival function was estimated using three methods (maximum likelihood, least squares, and percentile estimators), and the best method was selected. The least squares method proved best for estimating the survival function, since it yielded the lowest integrated mean squared error (IMSE) for all sample sizes.
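A minimal sketch of the two ingredients the comparison needs, under an assumed parameterization of the inverse Gompertz distribution (S(x) = 1 − exp(−(η/b)(e^(b/x) − 1)); the paper's parameterization may differ), with the least squares criterion written against the usual i/(n+1) plotting positions:

```python
import math

def ig_survival(x, eta, b):
    """Survival function of the inverse Gompertz distribution, x > 0.
    Assumed parameterization; conventions for (eta, b) vary in the literature."""
    return 1.0 - math.exp(-(eta / b) * (math.exp(b / x) - 1.0))

def ls_objective(sample, eta, b):
    """Least squares criterion: squared distance between the model CDF at the
    order statistics and the plotting positions i/(n+1)."""
    xs = sorted(sample)
    n = len(xs)
    return sum(((1.0 - ig_survival(x, eta, b)) - i / (n + 1)) ** 2
               for i, x in enumerate(xs, start=1))
```

The least squares estimator minimizes `ls_objective` over (η, b); the IMSE comparison in the paper would then average the squared error of the fitted survival curve over simulated samples.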
The two-neutron halo nuclei 17B, 11Li, and 8He were investigated using a two-body nucleon density distribution (2BNDD) with the two-frequency shell model (TFSM). The two valence neutrons are treated as occupying a pure (1d5/2) state in 17B and a pure (1p1/2) state in 11Li and 8He. For the tested nuclei, an efficient 2BNDD operator for a point-nucleon system, folded with two-body correlation functions, was used to investigate nuclear matter density distributions, root-mean-square (rms) radii, and elastic electron scattering form factors. In the nucleon-nucleon forces the correlation took account of
This research deals with a shrinkage method for principal components similar to the one used in multiple regression, the "Least Absolute Shrinkage and Selection Operator" (LASSO). The goal is to construct uncorrelated linear combinations from only a subset of the explanatory variables, which may suffer from multicollinearity, instead of taking all (K) of them. The shrinkage forces some coefficients to be exactly zero by restricting them with a "tuning parameter" (t), which balances bias against variance on one side while keeping the percentage of variance explained by these components above an acceptable level. This is demonstrated by the MSE criterion in the regression case and by the percent explained variance
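A minimal sketch of the general idea, not the paper's exact procedure: compute principal component loadings by SVD, then apply LASSO-style soft-thresholding with tuning parameter t, which zeroes small loadings and so selects a subset of variables. Function names and the choice of renormalization are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    """LASSO-style shrinkage: move each loading toward zero by t,
    setting loadings smaller than t exactly to zero."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def shrunken_components(X, t, k=2):
    """PCA via SVD on centered data, then soft-threshold the loadings of the
    first k components and renormalize the nonzero loading vectors."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = soft_threshold(Vt[:k].T, t)       # columns are component loadings
    norms = np.linalg.norm(V, axis=0)
    norms[norms == 0] = 1.0               # avoid dividing a zeroed component
    return V / norms
```

Larger t produces sparser loadings (fewer variables enter each component) at the cost of explained variance, which is exactly the bias/variance trade-off the tuning parameter controls.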
In this work, a single-layer antireflection coating of MgO was prepared by pulsed laser deposition (PLD) on glass substrates at two thicknesses (90 and 100 nm) and annealed at a temperature of 500 K.
The structural (X-ray diffraction) and optical properties were determined. The optical reflectance was computed with the aid of MATLAB over the visible and near-infrared region. The results show that the best optical performance of the antireflection coating was obtained at 700 laser shots for the 90 nm nanostructured single layer, with low reflection at a wavelength of 550 nm.
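The reflectance computation can be sketched with the standard single-layer characteristic-matrix method at normal incidence. This is not the authors' MATLAB code; the refractive indices (MgO ≈ 1.74, glass ≈ 1.52) are textbook values assumed here, and dispersion is neglected.

```python
import numpy as np

def reflectance(wl_nm, d_nm, n_film=1.74, n_sub=1.52, n0=1.0):
    """Normal-incidence reflectance of one homogeneous film on a substrate
    via the thin-film characteristic matrix (indices assumed constant)."""
    delta = 2 * np.pi * n_film * d_nm / wl_nm          # phase thickness
    M = np.array([[np.cos(delta), 1j * np.sin(delta) / n_film],
                  [1j * n_film * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, n_sub])                  # stack admittance terms
    r = (n0 * B - C) / (n0 * B + C)                    # amplitude reflection
    return abs(r) ** 2
```

Sweeping `wl_nm` over 400-1100 nm for `d_nm` of 90 and 100 reproduces the kind of reflectance-versus-wavelength curves the paper evaluates.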
Zernike moments have been widely used in shape-based image retrieval studies because of their powerful shape representation. However, their strengths and weaknesses have not been clearly highlighted in previous studies, so this representational power could not be fully exploited. In this paper, a method that fully captures the shape representation properties of Zernike moments is implemented and tested on a single object in binary and grey-level images. The proposed method works by determining the boundary of the shape object and then resizing the object shape to the boundary of the image. Three case studies were made. Case 1 is the Zernike moments implementation on the original shape object image. In Case 2, the centroid of the s
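The boundary-determination and resizing step described above can be sketched as follows: find the bounding box of the object's nonzero pixels and scale the crop back to the full frame with nearest-neighbour indexing. This is an illustrative reconstruction, not the paper's implementation.

```python
import numpy as np

def fit_to_frame(img):
    """Crop a single object to its bounding box, then scale it back to the
    full image frame (nearest-neighbour), normalizing the shape's scale
    before moment computation."""
    ys, xs = np.nonzero(img)
    crop = img[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
    H, W = img.shape
    h, w = crop.shape
    rows = np.arange(H) * h // H       # nearest-neighbour row indices
    cols = np.arange(W) * w // W       # nearest-neighbour column indices
    return crop[np.ix_(rows, cols)]
```

Scale-normalizing the object this way is what lets the subsequent Zernike moments act as scale-invariant descriptors.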
High frequency (HF) communications play an important role in long-distance wireless communications. This band is more important than VHF and UHF because HF signals can cover longer distances in a single hop. It has a low operating cost because it offers over-the-horizon communication without repeaters, and it can therefore serve as a backup for satellite communications in emergencies. One of the main problems in HF communications is predicting the propagation direction and the frequency of optimum transmission (FOT) to be used at a given time. This paper introduces a new technique based on an Oblique Ionosonde Station (OIS) that overcomes this problem at low cost and in an easier way. This technique uses the
Single Point Incremental Forming (SPIF) is a sheet-forming technique based on layered-manufacturing principles. The sheet is deformed locally through horizontal slices. The moving locus of the forming tool (the toolpath) through these slices, building up the finished part, is executed by CNC technology. The toolpath is created directly from the CAD model of the final product. The forming tool is a ball-end tool that moves along the toolpath while the edges of the sheet are clamped rigidly in a fixture.
This paper presents an investigation of the thinning distribution of conical shapes produced by incremental forming, together with a finite element validation to evaluate the limits of the p
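A toolpath of the kind described, horizontal circular contours stepped down slice by slice, can be sketched for a cone as below. The geometry (wall angle measured from the sheet plane, fixed step-down) is an illustrative assumption, not the paper's toolpath strategy.

```python
import math

def conical_toolpath(r_top, wall_angle_deg, step_down, depth, pts_per_rev=90):
    """Z-level toolpath for a cone: one circular contour per horizontal slice,
    with the radius shrinking as depth increases according to the wall angle."""
    path = []
    tan_a = math.tan(math.radians(wall_angle_deg))
    z = 0.0
    while z <= depth + 1e-9:
        r = r_top - z / tan_a          # radius of the contour at this depth
        if r <= 0:
            break
        for k in range(pts_per_rev):
            th = 2 * math.pi * k / pts_per_rev
            path.append((r * math.cos(th), r * math.sin(th), -z))
        z += step_down
    return path
```

Each (x, y, z) triple would become a linear move in the CNC program; steeper wall angles shrink the radius more slowly per slice and are where thinning becomes critical.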
Recommender systems are tools for making sense of the huge amount of data available in the internet world. Collaborative filtering (CF) is one of the knowledge-discovery methods used successfully in recommendation systems. Memory-based collaborative filtering focuses on using facts about existing users to predict new items for the target user. Similarity measures are the core operations in collaborative filtering, and prediction accuracy depends mostly on the similarity calculations. In this study, a combination of weighted parameters and traditional similarity measures is used to calculate the relationships among users over the MovieLens rating matrix. The advantages and disadvantages of each measure are identified. From the study, a n
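One standard way to combine a weighting parameter with a traditional similarity measure, shown here as a generic sketch rather than the study's exact scheme, is significance weighting: Pearson correlation over co-rated items, damped when two users share few ratings (the threshold `gamma` is an illustrative choice).

```python
import math

def pearson_sim(u, v):
    """Pearson correlation over the items two users co-rated (dict: item -> rating)."""
    common = set(u) & set(v)
    if len(common) < 2:
        return 0.0
    mu = sum(u[i] for i in common) / len(common)
    mv = sum(v[i] for i in common) / len(common)
    num = sum((u[i] - mu) * (v[i] - mv) for i in common)
    du = math.sqrt(sum((u[i] - mu) ** 2 for i in common))
    dv = math.sqrt(sum((v[i] - mv) ** 2 for i in common))
    return num / (du * dv) if du and dv else 0.0

def weighted_sim(u, v, gamma=50):
    """Significance weighting: scale the similarity down when the number of
    co-rated items is below gamma, so sparse overlaps are trusted less."""
    n = len(set(u) & set(v))
    return (min(n, gamma) / gamma) * pearson_sim(u, v)
```

On a sparse rating matrix like MovieLens, this damping is what prevents a perfect correlation over two or three shared movies from dominating the neighbourhood selection.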