Single-photon detection is the most crucial factor determining the performance of quantum key distribution (QKD) systems. In this paper, a simulator with time-domain visualizers and configurable parameters, using a continuous-time simulation approach, is presented for modeling and investigating the performance of single-photon detectors operating in Geiger mode at a wavelength of 830 nm. The widely used C30921S silicon avalanche photodiode was modeled in terms of its avalanche pulse and the effect of experimental conditions such as excess voltage, temperature, and average photon number on the photon detection efficiency, dark count rate, and afterpulse probability. This work shows a general, repeatable modeling process for significant perform…
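As an illustration of the kind of statistics such a simulator produces, the sketch below Monte-Carlo models a Geiger-mode detector with a photon detection efficiency, a per-gate dark count probability, and an afterpulse probability. All parameter values and function names are illustrative assumptions, not the paper's C30921S model.

```python
import numpy as np

# Minimal Monte Carlo sketch of a gated Geiger-mode single-photon detector.
# Parameter values are illustrative placeholders, not measured C30921S figures.
rng = np.random.default_rng(0)

def simulate_spad(n_gates, mean_photons=0.1, pde=0.45,
                  dark_prob=1e-4, afterpulse_prob=0.02):
    """Return a boolean click record for n_gates detection gates."""
    clicks = np.zeros(n_gates, dtype=bool)
    prev_click = False
    for i in range(n_gates):
        photons = rng.poisson(mean_photons)            # incident photons in this gate
        detected = rng.random(photons) < pde           # each photon fires with prob. PDE
        dark = rng.random() < dark_prob                # thermally generated dark count
        afterpulse = prev_click and rng.random() < afterpulse_prob  # trapped-carrier afterpulse
        clicks[i] = detected.any() or dark or afterpulse
        prev_click = clicks[i]
    return clicks

clicks = simulate_spad(100_000)
print("click rate per gate:", clicks.mean())
```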
Nanofluids are proven to be efficient agents for wettability alteration in subsurface applications, including enhanced oil recovery (EOR). Nanofluids can also be used for CO2-storage applications, where CO2-wet rocks can be rendered strongly water-wet; however, no attention has been given to this aspect in the past. Thus, in this work we present contact angle (θ) measurements for the CO2/brine/calcite system as a function of pressure (0.1 MPa, 5 MPa, 10 MPa, 15 MPa, and 20 MPa), temperature (23 °C, 50 °C, and 70 °C), and salinity (0, 5, 10, 15, and 20% NaCl) before and after nano-treatment, to assess the wettability alteration efficiency. Moreover, the effect of treatment pressure and temperature, treatment fluid concentration (SiO2 wt%), and…
Massive multiple-input multiple-output (MaMi) systems have attracted much research attention during the last few years. This is because MaMi systems are able to achieve a remarkable improvement in data rate and thus meet the immense, ongoing traffic demands of future wireless networks. To date, the downlink training sequence (DTS) for frequency division duplex (FDD) MaMi communication systems has been designed based on the idealistic assumption of white noise environments. However, it is essential and more practical to consider colored noise environments when designing an efficient DTS for channel estimation. To this end, this paper proposes a new DTS design by exploring the joint use of spatial channel and n…
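To show why the noise covariance matters alongside the spatial channel covariance, the sketch below estimates a downlink channel with an LMMSE estimator under colored noise. The dimensions, covariances, and the random placeholder training matrix are assumptions for illustration, not the DTS construction proposed in the paper.

```python
import numpy as np

# Minimal LMMSE channel-estimation sketch under colored (non-white) noise.
# All matrices below are illustrative assumptions, not the proposed DTS design.
rng = np.random.default_rng(1)
M, T = 8, 16                                   # BS antennas, training length

R_h = np.eye(M)                                # spatial channel covariance (assumed known)
A = rng.normal(size=(T, T))
R_n = A @ A.T / T + 0.1 * np.eye(T)            # colored noise covariance (assumed known)

S = rng.normal(size=(T, M)) / np.sqrt(M)       # placeholder training sequence
h = rng.multivariate_normal(np.zeros(M), R_h)  # true channel realisation
n = rng.multivariate_normal(np.zeros(T), R_n)  # colored noise realisation
y = S @ h + n                                  # received training observations

# The LMMSE estimator uses the channel and noise covariances jointly.
h_hat = R_h @ S.T @ np.linalg.solve(S @ R_h @ S.T + R_n, y)
print("estimation MSE:", np.mean((h - h_hat) ** 2))
```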
Audit evidence represents the means of reconciling the financial data shown in the financial statements with the auditor's level of assurance about those statements. Accordingly, the auditor tries to obtain the greatest quantity of such evidence, and the most persuasive of it; but this can be difficult when the internal control system is weak, or when the auditor holds evidence that is persuasive but not conclusive. This research therefore examines the relation between the quantity of evidence and the level of assurance and persuasive force it provides. The research assumes that obtaining sufficient evidence reduces errors, improves the audit process, and avoids risk. The research…
Fuzzy logic is used to solve the load flow and contingency analysis problems, decreasing computing time and making it a better choice than the traditional methods. The proposed method is very accurate with outstanding computation time, which makes the fuzzy load flow (FLF) suitable for real-time application to small- as well as large-scale power systems. In addition, the FLF is able to efficiently solve the load flow problem of ill-conditioned power systems and contingency analysis. The FLF method using a Gaussian membership function requires fewer iterations and less computing time than the FLF method using a triangular membership function. Using a sparsity technique for the input Ybus sparse matrix data gi…
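For reference, the sketch below defines the two membership-function shapes the abstract compares, Gaussian and triangular. The centre and width values are arbitrary illustrations, not the fuzzy sets actually used in the FLF method.

```python
import numpy as np

# Gaussian vs. triangular membership functions (illustrative parameters only).
def gaussian_mf(x, c, sigma):
    """Gaussian membership: smooth and nonzero everywhere."""
    return np.exp(-0.5 * ((x - c) / sigma) ** 2)

def triangular_mf(x, a, c, b):
    """Triangular membership: linear rise from a to c, linear fall from c to b."""
    return np.clip(np.minimum((x - a) / (c - a), (b - x) / (b - c)), 0.0, 1.0)

x = np.linspace(-1.0, 1.0, 9)          # e.g. a normalised bus power mismatch
print(gaussian_mf(x, c=0.0, sigma=0.3))
print(triangular_mf(x, a=-0.5, c=0.0, b=0.5))
```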
Nowadays, internet security is a critical concern, and one of the most difficult research issues in network security is intrusion detection, the fight against external threats. Intrusion detection is a method of securing computers and data networks that are already in use. To boost the efficacy of intrusion detection systems, machine learning and deep learning are widely deployed. While existing work on intrusion detection systems based on data mining and machine learning is effective, it detects intrusions by training static batch classifiers without considering the time-varying characteristics of a regular data stream. Real-world problems, on the other hand, rarely fit into models with such constraints. Furthermor…
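The sketch below illustrates the streaming alternative this motivates: a classifier updated incrementally on mini-batches instead of being retrained as a static batch model. The synthetic traffic features, the toy attack rule, and the choice of model are assumptions, not the system described in the abstract.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

# Incremental (streaming) intrusion detection sketch with synthetic data.
rng = np.random.default_rng(2)
clf = SGDClassifier()                          # linear model trained online
classes = np.array([0, 1])                     # 0 = normal traffic, 1 = attack

for batch in range(100):                       # mini-batches arriving over time
    X = rng.normal(size=(64, 10))              # 10 toy traffic features per flow
    y = (X[:, 0] + 0.5 * X[:, 1] > 0.8).astype(int)  # made-up attack rule
    clf.partial_fit(X, y, classes=classes)     # update without full retraining

X_new = rng.normal(size=(5, 10))
print(clf.predict(X_new))                      # classify newly arriving flows
```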
With the increasing demand for remote sensing approaches such as aerial photography, satellite imagery, and LiDAR in archaeological applications, there is still a limited number of studies assessing the differences between remote sensing methods in extracting new archaeological finds. Therefore, this work aims to critically compare two types of fine-scale remotely sensed data: LiDAR and Unmanned Aerial Vehicle (UAV)-derived Structure from Motion (SfM) photogrammetry. To achieve this, aerial imagery and airborne LiDAR datasets of Chun Castle were acquired, processed, analyzed, and interpreted. Chun Castle is one of the most remarkable ancient sites in Cornwall (Southwest England) that had not been surveyed and explored…
Background: Many types of instruments and techniques are used in the instrumentation of the root canal system. These instruments and techniques may extrude debris beyond the apical foramen and may cause post-instrumentation complications. The aim of this study was to evaluate the amount of apically extruded debris produced by four types of nickel-titanium instruments (WaveOne, TRUShape 3D conforming files, Hyflex CM, and One Shape files) during endodontic instrumentation. Materials and methods: Forty freshly extracted human mandibular second premolars with straight canals and a single apex were collected for this study. All teeth were cut to similar lengths. Pre-weighed glass vials were used as collecting containers. Samples were randoml…
The region-based association analysis has been proposed to capture the collective behavior of sets of variants by testing the association of each set, rather than individual variants, with the disease. Such an analysis typically involves a list of unphased multiple-locus genotypes with potentially sparse frequencies in cases and controls. To tackle the problem of sparse distributions, a two-stage approach was proposed in the literature: in the first stage, haplotypes are computationally inferred from genotypes, followed by a haplotype coclassification; in the second stage, the association analysis is performed on the inferred haplotype groups. If a haplotype is unevenly distributed between the case and control samples, this haplotype is labeled…
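As a minimal sketch of the second-stage group-level test, the example below runs a chi-square test of independence on haplotype-group counts in cases versus controls. The counts and group labels are made-up illustrative numbers, not data from the study.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Chi-square association test on inferred haplotype groups (toy counts).
#                    group A  group B  group C
cases    = np.array([120,     45,      15])
controls = np.array([ 90,     70,      20])

table = np.vstack([cases, controls])          # 2 x 3 contingency table
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4g}")
```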