Password authentication is a popular approach to system security and an important procedure for granting users access to system resources. This paper describes a password authentication method based on the Modified Bidirectional Associative Memory (MBAM) algorithm for both graphical and textual passwords, aiming for greater efficiency in speed and accuracy. Across 100 tests, the method achieved 100% accuracy in authenticating users with both graphical and textual passwords.
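As background, the MBAM approach builds on the classical Bidirectional Associative Memory (BAM). The sketch below shows only a standard Kosko-style BAM in Python; the paper's specific modification, password encoding, and data are not reproduced here, and the example patterns are hypothetical.

# Minimal sketch of a standard Bidirectional Associative Memory (BAM).
# The "Modified" BAM (MBAM) of the paper is not detailed here; this shows
# only the classical formulation it builds on.
import numpy as np

def train_bam(x_patterns, y_patterns):
    """Build the BAM weight matrix W = sum_i x_i y_i^T from bipolar patterns."""
    W = np.zeros((x_patterns.shape[1], y_patterns.shape[1]))
    for x, y in zip(x_patterns, y_patterns):
        W += np.outer(x, y)
    return W

def recall(W, x, steps=10):
    """Bidirectional recall: iterate x -> y -> x until the pair stabilizes."""
    sign = lambda v: np.where(v >= 0, 1, -1)
    for _ in range(steps):
        y = sign(x @ W)
        x_new = sign(W @ y)
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x, y

# Hypothetical example: associate toy bipolar password codes with user identity codes.
X = np.array([[1, -1, 1, -1, 1, 1], [-1, 1, -1, 1, -1, -1]])   # password codes
Y = np.array([[1, 1, -1], [-1, -1, 1]])                        # user identity codes
W = train_bam(X, Y)
_, y = recall(W, X[0])
print(y)   # recovers the identity code paired with the first password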
Introduction: Carrier-based gutta-percha is an effective method of root canal obturation that creates a three-dimensional filling; however, retrieval of the plastic carrier is relatively difficult, particularly with smaller sizes. The purpose of this study was to develop composite carriers consisting of polyethylene (PE), hydroxyapatite (HA), and strontium oxide (SrO) for carrier-based root canal obturation. Methods: Composite fibers of HA, PE, and SrO were fabricated in the shape of a carrier for delivering gutta-percha (GP) using a melt-extrusion process. The fibers were characterized using infrared spectroscopy, and their thermal properties were determined using differential scanning calorimetry. The elastic modulus and tensile strength were determined …
The United Nations (UN) is considered one of the most important organizations at the international level. It has accomplished multiple tasks and roles concerning the many different issues and events that have affected both developing and developed countries. It has established a series of procedures and laws that have helped end the wars and conflicts that plagued some countries and persisted for long periods in the past. Moreover, it has improved the level of international relations between a number of countries affected by the problems and incidents that took place between them. It has relied on finding solutions and treatments for humanitarian problems such as the preservation of the environment and the prevention of the spread of epidemics and diseases …
Despite its wide use in microbial culture studies, the one-factor-at-a-time method fails to find the true optimum because it does not take into account the interactions between the optimized parameters. Therefore, to find the true optimum conditions, the one-factor-at-a-time method must be repeated over many sequential experimental runs, which is extremely time-consuming and expensive when many variables are involved. This work is an attempt to enhance bioactive yellow pigment production by Streptomyces thinghirensis based on a statistical design. The yellow pigment demonstrated inhibitory effects against Escherichia coli and Staphylococcus aureus and was characterized by UV-vis spectroscopy, which showed a lambda maximum of …
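For illustration only, one common statistical design is the two-level full factorial, which, unlike one-factor-at-a-time runs, allows interaction effects between factors to be estimated. The factor names and levels below are hypothetical and are not taken from this study.

# Illustrative sketch: a two-level full factorial design for three hypothetical
# factors. Every combination of levels is run, so interaction effects can be
# estimated, unlike in one-factor-at-a-time experimentation.
from itertools import product

factors = {
    "carbon_source_g_per_L": (5, 15),
    "temperature_C": (25, 35),
    "pH": (6.0, 8.0),
}

design = [dict(zip(factors, levels)) for levels in product(*factors.values())]
for run_id, run in enumerate(design, start=1):
    print(run_id, run)   # 2^3 = 8 runs covering every combination of levels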
In this work, an enhanced photonic crystal fiber (PCF) sensor based on surface plasmon resonance (SPR), using a side-polished structure for the detection of toxic arsenic ions in water, was designed and implemented. The SPR response is obtained by polishing the side of the PCF and coating the polished area with an Au film. According to the experimental findings, the proposed sensor exhibits a clear SPR effect. The estimated signal-to-noise ratio (SNR), sensitivity (S), resolution (R), and figure of merit (FOM) are approximately: SNR = 0.0125, S = 11.11 μm/RIU, R = 1.8 × 10⁻⁴, and FOM = 13.88 for the Single-mode Fiber–Photonic Crystal Fiber–Single-mode Fiber (SMF-PCF-SMF) …
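For context, the sketch below computes these figures of merit using definitions that are standard in the SPR-sensing literature (sensitivity = peak shift / refractive-index change, SNR = peak shift / FWHM, resolution = Δn × minimum detectable wavelength shift / peak shift, FOM = sensitivity / FWHM). The input values are placeholders chosen to be roughly consistent with the numbers quoted above; the paper's actual measurements and definitions are not reproduced here.

# Sketch of commonly used SPR sensing figures of merit; definitions and inputs
# are illustrative placeholders, not the paper's raw data.
def spr_metrics(peak_shift_um, delta_n, fwhm_um, min_detectable_shift_um):
    """Compute sensitivity, SNR, resolution and FOM from a resonance shift."""
    sensitivity = peak_shift_um / delta_n                              # um/RIU
    snr = peak_shift_um / fwhm_um                                      # dimensionless
    resolution = delta_n * min_detectable_shift_um / peak_shift_um     # RIU
    fom = sensitivity / fwhm_um                                        # 1/RIU
    return sensitivity, snr, resolution, fom

S, SNR, R, FOM = spr_metrics(peak_shift_um=0.01, delta_n=9e-4,
                             fwhm_um=0.8, min_detectable_shift_um=2e-3)
print(f"S={S:.2f} um/RIU, SNR={SNR:.4f}, R={R:.1e} RIU, FOM={FOM:.2f}")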
Within the framework of big data, energy issues are highly significant. Despite the significance of energy, theoretical studies focusing primarily on energy issues in big data analytics in relation to computational intelligence algorithms are scarce. The purpose of this study is to explore the theoretical aspects of energy issues in big data analytics in relation to computational intelligence algorithms, since this is critical for exploring the empirical aspects of big data. In this chapter, we present a theoretical study of energy issues related to applications of computational intelligence algorithms in big data analytics. This work highlights that big data analytics using computational intelligence algorithms generates a very high amount …
This paper aims to build a modern vision for Islamic banks that ensures sustainability and growth, and to highlight the positive Iraqi steps in the Islamic banking sector. To build this vision, several scientific research approaches were adopted (quantitative, descriptive-analytical, and descriptive). The research community comprised all Iraqi private commercial banks, including Islamic banks. The research samples varied according to the diversity of the methods and the availability of data. A questionnaire was constructed and administered, and its internal and external validity were measured. Fifty questionnaires were distributed to Iraqi academics specialized in Islamic banking. All distributed forms were subjected to a thorough analysis …
The aim of the research is to identify the effect of instructional design according to the Kagan structure on first-year intermediate school students' skills in generating information in mathematics. In accordance with the research objectives, the researcher followed the experimental method, adopting an experimental design with two equivalent groups and a post-test measuring skills in generating information. Accordingly, the researcher formulated two main null hypotheses: there are no statistically significant differences at the (0.05) level of significance between the average scores of the experimental group, who studied the material according to the Kagan structure, and the …
In this article, we develop a new loss function as a modification of the linear exponential (LINEX) loss function, obtained by weighting the LINEX function. We derive estimators of the scale parameter, the reliability function, and the hazard function based on upper record values from the Lomax distribution (LD). To study the small-sample performance of the proposed loss function, a Monte Carlo simulation is used to compare the maximum likelihood estimator, the Bayesian estimator under the LINEX loss function, and the Bayesian estimator under the squared error (SE) loss function. The results show that the modified method is best for estimating the scale parameter, reliability function, and hazard function.
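For reference, the standard (unweighted) LINEX loss that the weighted version builds on, and its Bayes estimator, can be written as follows; the specific weighting used in the article is not reproduced here.

$$ L(\Delta) = b\left(e^{a\Delta} - a\Delta - 1\right), \qquad \Delta = \hat{\theta} - \theta, \; a \neq 0, \; b > 0, $$

and the Bayes estimator of $\theta$ under this loss is

$$ \hat{\theta}_{\mathrm{LINEX}} = -\frac{1}{a}\,\ln \mathbb{E}\!\left[e^{-a\theta} \mid \underline{x}\right], $$

provided the posterior expectation exists; under squared error loss the Bayes estimator is simply the posterior mean $\mathbb{E}[\theta \mid \underline{x}]$.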
There has been a growing interest in the use of chaotic techniques for enabling secure communication in recent years. This need has been motivated by the emergence of a number of wireless services that require the channel to provide low bit error rates (BER) together with security against eavesdropping, whose aim is to steal or distort the information being conveyed. Optical wireless systems (basically free space optic systems, FSO) are no exception to this trend. Thus, there is an urgent need for techniques that can secure privileged information against unauthorized eavesdroppers while simultaneously protecting it against channel-induced perturbations and errors. Conventional cryptographic techniques are not designed …
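Purely as an illustration of what a chaotic technique for securing data can look like (this is not the scheme studied in this work, and all parameters below are hypothetical), a toy keystream can be generated from a logistic map and XORed with the payload.

# Illustrative only: a toy chaotic keystream cipher based on the logistic map.
# It is not cryptographically secure; it merely shows how a chaotic sequence
# shared between transmitter and receiver can mask the transmitted bytes.
def logistic_keystream(x0, r, n_bytes):
    """Generate n_bytes of keystream by iterating x -> r*x*(1-x)."""
    x = x0
    stream = bytearray()
    for _ in range(n_bytes):
        x = r * x * (1.0 - x)
        stream.append(int(x * 256) & 0xFF)
    return bytes(stream)

def chaotic_xor(data, x0=0.61, r=3.99):
    """XOR the data with the chaotic keystream; the (x0, r) pair acts as the key."""
    ks = logistic_keystream(x0, r, len(data))
    return bytes(b ^ k for b, k in zip(data, ks))

msg = b"optical wireless payload"
cipher = chaotic_xor(msg)
assert chaotic_xor(cipher) == msg   # same parameters (the shared key) recover the message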