A substantial concern in exchanging confidential messages over the internet is transmitting information safely. For example, consumers and producers of digital products are keen to know that those products are genuine and can be distinguished from counterfeit ones. Cryptography can be defined as the technique of embedding data in an image, audio, or video file in a way that meets the safety requirements. Steganography is a branch of data-concealment science that aims to reach a desired level of security in the exchange of private, undisclosed commercial and military data. This research offers a novel steganography technique based on hiding data inside the clusters produced by fuzzy clustering. The approach employs Fuzzy C-Means (FCM) clustering to find robust image regions for hiding skin-texture features of mice. The steganography itself was implemented using the Least Significant Bit (LSB) method.
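The LSB embedding step named above can be sketched in a few lines. This is a minimal illustration only: it assumes an 8-bit grayscale pixel block standing in for one FCM-selected region (the clustering step itself is not shown), and the pixel values and message bits are hypothetical.

```python
import numpy as np

def embed_lsb(pixels, message_bits):
    """Embed a bit string into the least significant bits of pixel values."""
    out = pixels.copy()
    for i, bit in enumerate(message_bits):
        out[i] = (out[i] & 0xFE) | int(bit)  # clear the LSB, then set it to the message bit
    return out

def extract_lsb(pixels, n_bits):
    """Recover the first n_bits message bits from the pixel LSBs."""
    return ''.join(str(p & 1) for p in pixels[:n_bits])

# Hypothetical 8-bit grayscale pixel block (e.g., one FCM-selected region)
region = np.array([200, 201, 198, 197, 202, 199, 196, 203], dtype=np.uint8)
bits = '10110010'
stego = embed_lsb(region, bits)
recovered = extract_lsb(stego, len(bits))
```

Because only the lowest bit of each pixel changes, no pixel value moves by more than 1, which is what makes LSB embedding visually imperceptible.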
Abstract
Objectives: The main objective of this study is to determine the extent to which nursing incivility influences psychological well-being among nurses in southeastern Iraq.
Methods: In this descriptive correlational study, a convenience sample of 250 nurses working in three government hospitals in Missan province in southern Iraq was surveyed using the Nursing Incivility Scale (NIS) and Ryff's Psychological Well-Being (PWB) scale from November 2021 to July 2022. A multivariate multiple regression analysis was performed to assess the multivariate effect of workplace incivility on nurses' psychological well-being.
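As one hedged sketch of what a multivariate multiple regression computes, the snippet below fits a single predictor against two outcome columns at once. The data are simulated, not the study's: the two outcome columns merely stand in for PWB subscales, and the coefficient values are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 250                                                   # sample size mirroring the 250 nurses (illustrative)
incivility = rng.normal(3.0, 0.8, size=(n, 1))            # hypothetical NIS scores
X = np.hstack([np.ones((n, 1)), incivility])              # intercept column + predictor
true_B = np.array([[5.0, 4.5],                            # assumed intercepts for two PWB subscales
                   [-0.6, -0.4]])                         # assumed (negative) incivility effects
Y = X @ true_B + rng.normal(0, 0.3, size=(n, 2))          # two well-being outcomes modeled jointly

B_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)             # one OLS fit, multiple outcome columns
```

The point of the multivariate form is that all outcome columns share one design matrix, so the coefficient matrix `B_hat` captures the predictor's effect on every well-being dimension simultaneously.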
Results: The study results show a
The research aims to identify the effect of a training program based on integrating futuristic thinking skills with classroom interaction patterns on mathematics teachers, in order to equip their students with creative-solution skills. The research sample consisted of 31 teachers (15 in the experimental group and 16 in the control group). The researcher developed a 39-item measure of academic self-efficacy; its validity, reliability, difficulty coefficient, and discriminatory power were estimated. To analyze the findings, the researcher adopted the Mann-Whitney (U) test and the effect size, and the findings were as follows: there is a statistically significant difference at the significance leve
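The Mann-Whitney (U) statistic and a rank-biserial effect size, as used above, can be computed directly from the two groups' scores. The scores below are hypothetical stand-ins, not the study's data; in practice `scipy.stats.mannwhitneyu` would also supply a p-value.

```python
def mann_whitney_u(sample_a, sample_b):
    """U statistic for sample_a: pairs (x, y) with x > y count 1, ties count 0.5."""
    u = 0.0
    for x in sample_a:
        for y in sample_b:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

# Hypothetical post-test scores (illustrative values only)
experimental = [34, 36, 35, 38, 37, 36, 39, 35, 38, 37, 36, 38, 35, 37, 36]  # n = 15
control = [28, 30, 29, 27, 31, 30, 28, 29, 27, 30, 29, 28, 31, 30, 29, 28]   # n = 16

u = mann_whitney_u(experimental, control)
# Rank-biserial correlation as an effect size: r = 2U / (n1 * n2) - 1, ranging over [-1, 1]
effect_size = 2 * u / (len(experimental) * len(control)) - 1
```

In this contrived example every experimental score exceeds every control score, so U equals n1 * n2 and the rank-biserial correlation reaches its maximum of 1.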
Abstract
The net profit reported in the annual financial statements of companies listed in financial markets is considered one of the sources of information that users of accounting information rely on when making their investment decisions. At the same time, it is relied upon in calculating the bonuses (incentives) granted to management; company management therefore manipulates those figures in order to increase the bonuses tied to earnings. These practices are called earnings-management practices. Such manipulation of earnings figures by management misleads the users of financial statements who depend on reported earnings in their deci
Despite its wide use in microbial cultures, the one-factor-at-a-time method fails to find the true optimum, because it does not account for interactions between the optimized parameters. Finding the true optimum conditions therefore requires repeating the one-factor-at-a-time method over many sequential experimental runs, which is extremely time-consuming and expensive when many variables are involved. This work is an attempt to enhance bioactive yellow-pigment production by Streptomyces thinghirensis based on a statistical design. The yellow pigment demonstrated inhibitory effects against Escherichia coli and Staphylococcus aureus and was characterized by UV-vis spectroscopy, which showed a lambda maximum of
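The contrast drawn above can be made concrete with a full-factorial run list, which enumerates every combination of factor levels so that interactions become observable. The factor names and levels below are hypothetical examples for a pigment-production medium, not the factors the study actually optimized.

```python
from itertools import product

# Hypothetical culture factors and levels (illustrative only)
factors = {
    'glucose_g_per_L': [5, 10, 20],
    'pH':              [6.0, 7.0, 8.0],
    'temperature_C':   [25, 30, 35],
}

# Full-factorial design: every combination of levels, 3 * 3 * 3 = 27 runs.
# One-factor-at-a-time would vary a single factor per run and miss interactions.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
```

Statistical designs such as fractional factorials or response-surface designs then prune this grid to far fewer runs while still estimating the interaction terms.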
In this article, we develop a new loss function as a modification of the linear exponential (LINEX) loss function, obtained by weighting the LINEX function. We derive the scale parameter, reliability, and hazard functions based on upper record values of the Lomax distribution (LD). To study the small-sample performance of the proposed loss function via Monte Carlo simulation, we compare the maximum likelihood estimator, the Bayesian estimator under the LINEX loss function, and the Bayesian estimator under the squared error (SE) loss function. The results show that the modified method is the best for estimating the scale parameter, reliability, and hazard functions.
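For orientation, the standard LINEX loss has the form b * (exp(a * delta) - a * delta - 1), where delta is the estimation error; the paper's weighted variant is not reproduced here. The sketch below only illustrates the asymmetry that motivates LINEX over squared error.

```python
import math

def linex_loss(delta, a=1.0, b=1.0):
    """Standard LINEX loss: b * (exp(a*delta) - a*delta - 1); asymmetric when a != 0."""
    return b * (math.exp(a * delta) - a * delta - 1)

def squared_error_loss(delta):
    """Symmetric SE loss, for comparison."""
    return delta ** 2

# With a > 0, overestimation (delta > 0) is penalized more than underestimation:
over = linex_loss(0.5)    # estimation error of +0.5
under = linex_loss(-0.5)  # estimation error of -0.5
# Under LINEX loss, the Bayes estimator of theta is -(1/a) * ln E[exp(-a * theta)],
# whereas under SE loss it is the posterior mean.
```

Squared error charges the same price for errors of equal magnitude in either direction; LINEX does not, which is why the direction of estimation error matters for the reliability and hazard estimates.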
The nuclear structure of the 20,22Ne isotopes has been studied via the shell model with Skyrme-Hartree-Fock calculations. In particular, transitions to the low-lying positive- and negative-parity excited states have been investigated within three shell-model spaces: sd for positive-parity states, and the spsdpf large-basis (no-core) and zbme model spaces for negative-parity states. Excitation energies, reduced transition probabilities, and elastic and inelastic form factors were estimated and compared with the available experimental data. The Skyrme interaction was used to generate a one-body potential in the Hartree-Fock calculations for each selected excited state, which was then used to calculate the single-particle matrix elements. Skyrme interac
Abstract
Knowing the magnitude of residual stresses and finding technological solutions to minimize and control them during production is an important task, because the large deformations that occur in single point incremental forming (SPIF) induce highly non-uniform residual stresses. In this paper, we propose a method for multilayer single point incremental forming in which the thickness of the top plate is varied (0.5, 0.7, 0.9 mm) and a lubricant or interlayer material (polymer, grease, grease with graphite, MoS2) is placed between the two plates, in order to determine the effect of this method and its parameters on the residual stresses in the bottom plates. We also compare these results for the
An intelligent software defined network (ISDN) based on an intelligent controller can manage and control the network remarkably well. In this article, a methodology is proposed to estimate the packet flow at the sensing plane of a software defined network-Internet of Things, based on a partial recurrent spike neural network (PRSNN) congestion controller that predicts the packet flow one step ahead and thus reduces the congestion that may occur. That is, the proposed model (spike ISDN-IoT) is enhanced with a congestion controller, which acts as a proactive controller in the model. In addition, we propose another intelligent clustering controller based on an artificial neural network, which operates as a reactive co
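The proactive idea described above, namely forecasting the next packet-flow value and acting before congestion arrives, can be sketched with a much simpler predictor than the paper's PRSNN. The snippet below substitutes exponential smoothing purely for illustration; the flow values and congestion threshold are hypothetical.

```python
def one_step_forecast(flows, alpha=0.5):
    """One-step-ahead forecast of packet flow via exponential smoothing
    (a deliberately simple stand-in for a learned recurrent predictor)."""
    level = flows[0]
    for x in flows[1:]:
        level = alpha * x + (1 - alpha) * level  # blend new observation into the running level
    return level

flows = [100, 120, 110, 130, 125]   # hypothetical packets/s observed at the sensing plane
forecast = one_step_forecast(flows)
threshold = 128                      # hypothetical congestion threshold
proactive_action = forecast > threshold  # e.g., reroute or reprovision before congestion occurs
```

The controller's proactivity comes entirely from acting on `forecast` rather than on the already-congested current measurement; the PRSNN in the paper plays the role of a far more expressive forecaster in the same loop.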