As an important resource, entangled light sources have been used in developing quantum information technologies such as quantum key distribution (QKD). Few experiments have implemented entanglement-based deterministic QKD protocols, since the security of existing protocols may be compromised in lossy channels. In this work, we report a loss-tolerant deterministic QKD experiment that follows a modified "Ping-Pong" (PP) protocol. The experimental results demonstrate for the first time that a secure deterministic QKD session can be carried out over a channel with an optical loss of 9 dB, based on a telecom-band entangled photon source. This exhibits a conceivable prospect of utilizing entangled light sources in real-life fiber-based quantum communications.
Messages are an ancient means of exchanging information between people, and many ways have been devised to send them with some degree of security. Encryption and steganography are the oldest approaches to message security, but many problems remain in key generation, key distribution, the choice of a suitable cover image, and other areas. In this paper we present a proposed algorithm to exchange secure messages without any encryption or cover image for hiding. Our proposed algorithm depends on two copies of the same collection image set (CIS), one on the sender side and the other on the receiver side, which constantly exchange messages between them.
To send a message, the sender converts the text to ASCII c
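The abstract is truncated before the algorithm's details, but the shared-collection idea can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the paper's exact algorithm: it assumes each ASCII code maps to one entry of the shared CIS, so the sender transmits image references instead of ciphertext, and the receiver inverts the shared mapping. All names (`make_cis`, the `img_NNN.png` scheme) are illustrative assumptions.

```python
# Hypothetical sketch of the shared-CIS exchange: both sides hold the same
# Collection Image Set, so only image references travel over the channel.
# The index scheme below is an assumption, not the paper's algorithm.

def make_cis():
    # Stand-in for the shared image collection: one entry per ASCII code.
    return {code: f"img_{code:03d}.png" for code in range(128)}

def encode(message, cis):
    # Sender: ASCII code of each character -> image reference from the CIS.
    return [cis[ord(ch)] for ch in message]

def decode(refs, cis):
    # Receiver: invert the shared mapping to recover the text.
    inverse = {img: code for code, img in cis.items()}
    return "".join(chr(inverse[r]) for r in refs)

cis = make_cis()
refs = encode("Hi", cis)
print(refs)               # ['img_072.png', 'img_105.png']
print(decode(refs, cis))  # Hi
```

Security here rests entirely on the secrecy of the shared collection, which is why both copies must stay synchronized between sender and receiver.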
Finding similarities between texts is important in many areas, such as information retrieval, automated article scoring, and short-answer categorization. Evaluating short answers is not an easy task due to the variability of natural language. Methods for calculating the similarity between texts depend on semantic or grammatical aspects. This paper discusses a method for evaluating short answers using semantic networks to represent the typical (correct) answer and the students' answers. The semantic network of nodes and relationships represents the text (the answers). Moreover, grammatical aspects are captured by measuring the similarity of parts of speech between the answers. In addition, finding hierarchical relationships between nodes in netwo
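The paper's network construction is not fully specified in this excerpt, so the following is only an illustrative sketch under stated assumptions: each answer is modeled as a set of (node, relation, node) triples, and similarity is a weighted Jaccard overlap of the node sets and of the triples themselves. The weights and the triple representation are assumptions for illustration.

```python
# Illustrative sketch: compare a model (correct) answer and a student answer
# represented as semantic-network triples. Weights and representation are
# assumptions, not the paper's exact method.

def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

def similarity(model_triples, student_triples, w_nodes=0.5, w_edges=0.5):
    # Node overlap captures shared concepts; edge overlap captures shared
    # relationships between those concepts.
    model_nodes = {n for s, _, o in model_triples for n in (s, o)}
    student_nodes = {n for s, _, o in student_triples for n in (s, o)}
    return (w_nodes * jaccard(model_nodes, student_nodes)
            + w_edges * jaccard(set(model_triples), set(student_triples)))

model = [("heart", "pumps", "blood"), ("blood", "carries", "oxygen")]
student = [("heart", "pumps", "blood")]
print(round(similarity(model, student), 2))  # partial credit, between 0 and 1
```

A fuller system would also weigh part-of-speech agreement and hierarchical (is-a) relations between nodes, as the abstract indicates.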
Digital forensics is the part of forensic science that covers crime related to computers and other digital devices. Academic studies have been interested in digital forensics for some time now. Researchers aim to establish a discipline built on scientific structures and to define a model that reflects their observations. This paper suggests a model to improve the whole investigation process and obtain accurate and complete evidence, and it adopts securing the digital evidence with cryptographic algorithms so that reliable evidence can be presented in a court of law. The paper also presents the main and basic concepts of the frameworks and models used in digital forensics investigation.
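One ingredient the model relies on, securing evidence cryptographically, can be sketched minimally. The abstract does not name the algorithms used, so SHA-256 is chosen here only as a common, standard choice: the digest recorded at acquisition time lets any later tampering be detected.

```python
# Minimal sketch of evidence-integrity protection with a cryptographic hash.
# SHA-256 is an illustrative choice; the paper's exact algorithms are not
# specified in this excerpt.
import hashlib

def evidence_digest(data: bytes) -> str:
    # Record this digest at acquisition time, e.g. in the chain-of-custody log.
    return hashlib.sha256(data).hexdigest()

original = b"disk image bytes ..."
digest = evidence_digest(original)

# Any later modification changes the digest, flagging the evidence as altered.
print(evidence_digest(original) == digest)         # True
print(evidence_digest(original + b"x") == digest)  # False
```

In practice the digest itself would be signed or stored out-of-band, since a hash alone only detects modification, not who made it.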
The corrosion of metals is of great economic importance: estimates show that a quarter of the iron and steel produced is destroyed in this way. Rubber lining has been used for severe corrosion protection because NR and certain synthetic rubbers have a basic resistance to very corrosive chemicals, particularly acids. The present work includes producing ebonite from both natural and synthetic rubbers; the following materials were therefore chosen to produce ebonite rubber: a) natural rubber (NR); b) styrene butadiene rubber (SBR); c) nitrile rubber (NBR); d) neoprene rubber (CR) [WRT]. The best ebonite vulcanizates are obtained in the presence of 30 pphr sulfur, with carbon black as reinforcing filler. The relation between
Merging images is one of the most important technologies in remote sensing applications and geographic information systems. In this study, a simulation process using a camera was carried out to fuse images, resizing them with interpolation methods (nearest-neighbor, bilinear, and bicubic). Statistical techniques were used as an efficient merging approach in the image integration process, employing different models, namely Local Mean Matching (LMM) and Regression Variable Substitution (RVS), and spatial-frequency techniques were applied, including the high-pass filter additive method (HPFA). Statistical measures were then used to check the quality of the merged images. This was carried out by calculating the correlation a
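The HPFA idea named in the abstract can be sketched with NumPy: extract the high-frequency detail of the high-resolution image by subtracting a blurred copy, then add that detail to the already-resized low-resolution band. The box-blur kernel size and the unweighted additive injection are illustrative choices, not the study's exact parameters.

```python
# Hedged sketch of high-pass filter additive (HPFA) fusion. Kernel size and
# weighting are illustrative assumptions.
import numpy as np

def box_blur(img, k=3):
    # Simple k x k mean filter with edge padding (low-pass).
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def hpfa_fuse(high_res, low_res_resized):
    detail = high_res - box_blur(high_res)  # high-pass component
    return low_res_resized + detail         # additive injection

rng = np.random.default_rng(0)
pan = rng.random((8, 8))     # stand-in high-resolution image
ms = box_blur(pan)           # stand-in for a resized low-resolution band
fused = hpfa_fuse(pan, ms)
print(fused.shape)           # (8, 8)
```

Quality would then be judged with statistical measures such as the correlation between the fused result and the reference, as the study describes.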
The transition of customers from one telecom operator to another has a direct impact on a company's growth and revenue. Traditional classification algorithms fail to predict churn effectively. This research introduces a deep learning model for predicting which customers plan to leave for another operator. The model works on a high-dimensional, large-scale data set. Its performance was measured against other classification algorithms, such as Gaussian Naive Bayes, Random Forest, and Decision Tree, in predicting churn. The evaluation was performed based on accuracy, precision, recall, F-measure, Area Under the Curve (AUC), and the Receiver Operating Characteristic (ROC) curve. The proposed deep learning model performs better than othe
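The evaluation metrics the abstract lists can be computed from scratch for a binary churn label (1 = churned). This sketch only illustrates the metric definitions; the predictions below are made up and are not the paper's results.

```python
# From-scratch accuracy / precision / recall / F-measure for binary churn
# labels (1 = churned). Example data is illustrative only.

def confusion(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return tp, fp, fn, tn

def metrics(y_true, y_pred):
    tp, fp, fn, tn = confusion(y_true, y_pred)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return accuracy, precision, recall, f1

y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]
print(metrics(y_true, y_pred))  # (0.75, 0.75, 0.75, 0.75)
```

AUC additionally requires ranking predictions by score rather than by hard label, which is why it is reported alongside the ROC curve.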
Today, with the increasing use of social media, many researchers have become interested in topic extraction from Twitter. Twitter text is short, unstructured, and messy, which makes it difficult to find topics in tweets. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from large documents such as articles and books, and they are often less efficient when applied to short-text content like Twitter. Luckily, Twitter has many features that represent the interaction between users, and tweets carry rich user-generated hashtags as keywords. In this paper, we exploit the hashtag feature to improve the topics learned
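A common remedy for short-text sparsity, consistent with the abstract's idea, is to pool tweets sharing a hashtag into one pseudo-document so that a topic model such as LDA sees longer texts. The pooling step below is a sketch under that assumption; the paper's exact pipeline may differ.

```python
# Sketch of hashtag pooling: tweets sharing a hashtag are concatenated into
# one pseudo-document before topic modeling. Example tweets are made up.
import re
from collections import defaultdict

def pool_by_hashtag(tweets):
    pools = defaultdict(list)
    for tweet in tweets:
        for tag in re.findall(r"#(\w+)", tweet.lower()):
            pools[tag].append(tweet)
    # Each hashtag's tweets become one pseudo-document for LDA/LSA.
    return {tag: " ".join(ts) for tag, ts in pools.items()}

tweets = [
    "battery life is great #phone",
    "camera quality impressed me #phone #camera",
    "low light shots #camera",
]
docs = pool_by_hashtag(tweets)
print(sorted(docs))  # ['camera', 'phone']
```

A tweet with several hashtags joins several pools, which mirrors how hashtags connect conversations across users.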
In this research, the natural frequency of a cracked simply supported beam (with cracks at several positions and different depths) is investigated analytically, experimentally, and numerically with the ANSYS program, and the results are compared. The beam is made of iron with dimensions L*W*H = 0.84*0.02*0.02 m, density = 7680 kg/m3, and E = 200 GPa. A comparison was made between the numerical results from ANSYS and the experimental results, where the largest error is about 7.2% at a crack position of 42 cm and a crack depth of 6 mm. Between the Rayleigh method and the experimental results, the largest error is about 6.4% for the same crack position and depth. From these error percentages it can be concluded that the Rayleigh method gives
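As a cross-check on the uncracked baseline, the first natural frequency of a simply supported Euler-Bernoulli beam is f1 = (pi / (2 L^2)) * sqrt(E I / (rho A)), which can be evaluated directly from the properties given in the abstract. This is only the intact-beam reference value; the cracked cases require the Rayleigh or finite-element treatment the paper describes.

```python
# First natural frequency of the intact simply supported beam, using the
# abstract's data: E = 200 GPa, rho = 7680 kg/m^3, 0.84 x 0.02 x 0.02 m.
import math

L, b, h = 0.84, 0.02, 0.02    # length, width, height (m)
E, rho = 200e9, 7680          # Young's modulus (Pa), density (kg/m^3)

I = b * h**3 / 12             # second moment of area (m^4)
A = b * h                     # cross-section area (m^2)

omega1 = (math.pi / L) ** 2 * math.sqrt(E * I / (rho * A))  # rad/s
f1 = omega1 / (2 * math.pi)                                 # Hz
print(round(f1, 1))  # ~65.6 Hz
```

A crack reduces the local stiffness, so the measured and computed cracked-beam frequencies fall below this baseline, which is what the comparison in the paper quantifies.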
... Show MoreObjectives: To find out the effect of l-hydroxyphenazine (1-HP) on viability of T-lymphocytes and the reflects of this
effect on experimental hyadatidosis on hydatid cyst protoscoleces infectivity in vivo.
Methodology: Four groups of white male /ه/mice were experimentally infected with four concentrations of (1-HP)
with challenge dose of 2000 protoscoleces /1 ml with negative (9.8.5) and positive (P.H.A) control groups.
Results: It has been found that the higher concentrations (75,100) 1101/111 of the (1-HP) causes significant
decrement in the lymphocytes viability in comparison with negative and positive control groups. (060.01).
Recommendations: The study recommended using concentrations lower than 25 pmole Iml which
Active learning is a teaching method in which students actively participate in activities, exercises, and projects within a rich and diverse educational environment. The teacher encourages students to take responsibility for their own learning under scientific and pedagogical supervision, and motivates them to achieve ambitious educational goals that focus on developing an integrated personality for today's students and tomorrow's leaders. It is important to understand the impact of two proposed strategies based on active learning on the academic performance of first intermediate class students in computer subjects and on their social intelligence. The research sample was intentionally selected, consis