A medical-service platform is a mobile application through which patients receive doctors' diagnoses based on information gleaned from medical images. The content of these diagnostic results must not be illegitimately altered during transmission and must be returned to the correct patient. In this paper, we present a solution to these problems using blind, reversible, and fragile watermarking based on authentication of the host image. In our proposed algorithm, the binary version of the Bose-Chaudhuri-Hocquenghem (BCH) code of the patient medical report (PMR) and the binary patient medical image (PMI), combined by a fuzzy exclusive-or (F-XoR), are used to produce the patient's unique mark under a secret sharing scheme (SSS). The patient's unique mark is later embedded as a watermark into the host PMI using a blind watermarking algorithm based on singular value decomposition (SVD); applying SVD to blind image watermarking is itself a new approach that we propose here. Our algorithm preserves PMI content authentication during transmission and ties PMR ownership to the patient, so that the associated diagnosis is subsequently transmitted to the correct patient via a mobile telemedicine application. Experimental performance is high compared with previous results: the recovered watermarks demonstrate promising tamper-detection and self-recovery capability, with a PSNR of 30 dB and an NC value of 0.99.
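The SVD embedding step named in this abstract can be illustrated with a minimal sketch. Note this is the classical additive SVD scheme (which requires the host at extraction, i.e. it is non-blind), standing in for the authors' blind variant; the array sizes, scaling factor `alpha`, and function names are illustrative assumptions.

```python
import numpy as np

def embed_svd_watermark(host, watermark, alpha=0.05):
    """Embed a watermark into the singular values of a host image.

    Classical additive SVD embedding (illustrative, not the paper's
    exact blind scheme): the scaled watermark perturbs the host's
    singular-value matrix, which is re-decomposed; the perturbed
    singular values are recombined with the host's U and V factors.
    """
    U, S, Vt = np.linalg.svd(host.astype(float), full_matrices=False)
    Sw = np.diag(S) + alpha * watermark.astype(float)
    Uw, S1, Vtw = np.linalg.svd(Sw, full_matrices=False)
    watermarked = U @ np.diag(S1) @ Vt
    return watermarked, (Uw, Vtw)  # (Uw, Vtw) act as the extraction key

def extract_svd_watermark(watermarked, host, keys, alpha=0.05):
    """Recover the watermark; needs the original host (non-blind)."""
    Uw, Vtw = keys
    _, S, _ = np.linalg.svd(host.astype(float), full_matrices=False)
    _, S1, _ = np.linalg.svd(watermarked.astype(float), full_matrices=False)
    Sw = Uw @ np.diag(S1) @ Vtw
    return (Sw - np.diag(S)) / alpha
```

A blind scheme, as the abstract claims, would remove the dependence on `host` at extraction time; this sketch only shows the singular-value arithmetic.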
XML is being incorporated into the foundation of e-business data applications. This paper addresses the problem of the freeform information stored in any organization and shows how XML, using this new approach, makes the search operation efficient and time-saving. The paper introduces a new solution and methodology developed to capture and manage such unstructured freeform (multi-) information, based on XML schema technologies, neural-network concepts, and an object-oriented relational database, in order to provide a practical solution for efficiently managing a multi-freeform information system.
Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and various methods have been developed to analyze this type of data.
In this research, the focus was on compiling and analyzing these data, since cluster analysis plays an important role in identifying and grouping co-expressed sub-profiles over time, which are then fitted with a nonparametric smoothing cubic B-spline model. This model provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope; it is also more flexible and can capture more complex patterns and fluctuations in the data.
The longitudinal balanced data profile was compiled into subgroup
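The smoothing cubic B-spline fit described above can be sketched with SciPy; the sample profile and the smoothing factor below are illustrative assumptions, not the study's data.

```python
import numpy as np
from scipy.interpolate import splrep, splev

# Illustrative longitudinal profile: a noisy sinusoid observed over time.
t = np.linspace(0.0, 10.0, 50)
rng = np.random.default_rng(1)
y = np.sin(t) + 0.1 * rng.standard_normal(t.size)

# k=3 requests a cubic B-spline; s > 0 enables smoothing (s=0 interpolates).
tck = splrep(t, y, k=3, s=1.0)

fit = splev(t, tck)           # smoothed curve
slope = splev(t, tck, der=1)  # continuous first derivative
curv = splev(t, tck, der=2)   # continuous second derivative
```

The continuous first and second derivatives the abstract highlights fall out of `splev(..., der=1)` and `der=2` because a cubic spline is twice continuously differentiable.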
ABSTRACT Background: Iraqi hospitals have witnessed numerous violent incidents against medical staff working in emergency departments, ranging from verbal to physical violence. The high frequency of these attacks has urged Iraqi doctors to migrate. Aim of study: To identify the prevalence of workplace violence against medical staff and to study the risk factors related to workplace violence. Materials and methods: A descriptive cross-sectional study was carried out among a sample of 300 medical
This paper analyses the relationship between selected macroeconomic variables and gross domestic product (GDP) in Saudi Arabia for the period 1993-2019. Specifically, it measures the effects of the interest rate, oil price, inflation rate, budget deficit and money supply on the GDP of Saudi Arabia. The method employed in this paper is based on a descriptive analysis approach and an ARDL model through the bounds-testing approach to cointegration. The results reveal that the budget deficit, oil price and money supply have positive, significant effects on GDP, while the other variables turned out to be insignificant, with no effect on GDP. The findings suggest that both fiscal and monetary policies should be fo
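An ARDL regression in its simplest form can be sketched as an ARDL(1,1) least-squares fit. The data below are simulated, and this toy fit only stands in for the paper's full ARDL and bounds-testing procedure (for which statsmodels' `ARDL` and `UECM` classes are commonly used); the function name is an illustrative assumption.

```python
import numpy as np

def fit_ardl_11(y, x):
    """Fit ARDL(1,1): y_t = c + a*y_{t-1} + b0*x_t + b1*x_{t-1} + e_t.

    Returns the least-squares estimates [c, a, b0, b1].
    """
    target = y[1:]  # lose one observation to the lag
    design = np.column_stack([np.ones(len(target)), y[:-1], x[1:], x[:-1]])
    coef, *_ = np.linalg.lstsq(design, target, rcond=None)
    return coef
```

In the paper's setting, y would be GDP and x one of the macroeconomic regressors; the bounds test then checks whether a long-run (cointegrating) relationship exists between them.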
The research aimed to explore the effect of applying conflict-management strategies by management to resolve conflict between and within the conflicting parties at (IGEC) in order to increase worker productivity. To collect data, 110 questionnaires were distributed among managers and heads of departments at all managerial levels; 102 completed questionnaires were returned, 5 of which were disqualified from statistical analysis, so only 97 were taken into consideration, representing 93% of the returned number.
The SPSS program, supported by a group of statistical tools, was used for analysis purposes, such as the Cronbach's alpha test to assure the validity and stability of the t
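The Cronbach's alpha reliability test mentioned above can be computed directly from the standard formula; this is a generic sketch, not the authors' SPSS output, and the sample scores are illustrative.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (respondents x items) score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    """
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances / total_variance)
```

Values near 1 indicate internally consistent questionnaire items; 0.7 is a commonly used acceptability threshold.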
Digital tampering identification, which detects picture modification, is a significant area of image-analysis research. Over the last five years, this area has advanced to exceptional precision using machine-learning and deep-learning-based strategies. Synthesis- and reinforcement-based learning techniques must now evolve to keep pace with the research. However, before doing any experimentation, a scientist must first comprehend the current state of the art in that domain. Diverse approaches, their associated outcomes, and analysis lay the groundwork for successful experimentation and superior results. Before starting experiments, universal image-forensics approaches must be thoroughly researched. As a result, this review of variou
In this research we study the variance component model, one of the most important models widely used in data analysis. This model is a type of multilevel model and is considered a linear model. There are three types of linear variance component models: the fixed-effect, the random-effect, and the mixed-effect linear variance component model. In this paper we examine the mixed-effect linear variance component model with a one-way random effect; the mixed model is a mixture of fixed and random effects in the same model, containing the parameter (μ) and the treatment effect (τi), which has
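The one-way model described above, y_ij = μ + τ_i + e_ij, can be sketched with the classical ANOVA (method-of-moments) estimator for balanced data; the simulated data and function name are illustrative assumptions, not the paper's method or results.

```python
import numpy as np

def oneway_random_effects(y):
    """ANOVA variance-component estimates for balanced one-way data.

    y: (groups, replicates) array following y_ij = mu + tau_i + e_ij.
    Returns (mu_hat, sigma2_tau_hat, sigma2_e_hat).
    """
    a, n = y.shape
    grand = y.mean()
    group_means = y.mean(axis=1)
    msb = n * ((group_means - grand) ** 2).sum() / (a - 1)        # between-group mean square
    msw = ((y - group_means[:, None]) ** 2).sum() / (a * (n - 1))  # within-group mean square
    sigma2_e = msw
    sigma2_tau = max((msb - msw) / n, 0.0)  # truncate at 0 if negative
    return grand, sigma2_tau, sigma2_e
```

Here μ is estimated by the grand mean and the random treatment effect τ_i contributes the between-group variance component σ²_τ.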
Face recognition is a crucial biometric technology used in various security and identification applications. Ensuring accuracy and reliability in facial-recognition systems requires robust feature extraction and secure processing methods. This study presents an accurate facial-recognition model using a feature-extraction approach within a cloud environment. First, the facial images undergo preprocessing, including grayscale conversion, histogram equalization, Viola-Jones face detection, and resizing. Then, features are extracted using a hybrid approach that combines Linear Discriminant Analysis (LDA) and the Gray-Level Co-occurrence Matrix (GLCM). The extracted features are encrypted using the Data Encryption Standard (DES) for security
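The first two preprocessing steps in this pipeline (grayscale conversion and histogram equalization) can be sketched in plain NumPy. The BT.601 luma weights and the CDF-based mapping are standard techniques, not this study's specific implementation; Viola-Jones detection, resizing, LDA/GLCM extraction, and DES encryption are omitted here.

```python
import numpy as np

def to_grayscale(rgb):
    """Convert an (H, W, 3) uint8 RGB image to grayscale (BT.601 luma weights)."""
    weights = np.array([0.299, 0.587, 0.114])
    return (rgb.astype(float) @ weights).astype(np.uint8)

def equalize_histogram(gray):
    """Spread a uint8 grayscale image's intensities via its normalized CDF.

    Each intensity is mapped through a lookup table built from the
    cumulative histogram, stretching the dynamic range to [0, 255].
    """
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum().astype(float)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())
    lut = np.round(cdf * 255.0).astype(np.uint8)
    return lut[gray]
```

In the full pipeline the equalized image would then be passed to a face detector and resized before LDA/GLCM feature extraction.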
The research derives its importance from motivating the behavioural side of employees to apply modern technology at work, given its great importance in increasing the efficiency and excellence of employees' performance. The research was based on two main hypotheses to show the relationship and impact between the variables, through the adoption of a questionnaire to collect research data from (50) administrators working at different levels, supported by personal interviews and field visits. The collected data were subjected to statistical analysis using the statistical program SPSS (Statistical Package for the Social Sciences) to reach
The evolution of the Internet of Things (IoT) has connected billions of heterogeneous physical devices that improve the quality of human life by collecting data from their environment. However, this creates the need to store huge volumes of data with large storage and high computational capabilities; cloud computing can be used to store this big data. The data of IoT devices is transferred using two types of protocols: Message Queuing Telemetry Transport (MQTT) and Hypertext Transfer Protocol (HTTP). This paper aims to build a high-performance and more reliable system through efficient use of resources. Thus, load balancing in cloud computing is used to dynamically distribute the workload across nodes to avoid overloading any individual r
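A dynamic load-balancing policy of the kind described can be sketched as a least-loaded dispatcher. The class name and the in-memory job counters are illustrative assumptions; a real cloud deployment would track node load through monitoring rather than a local counter.

```python
import heapq
import itertools

class LeastLoadedBalancer:
    """Send each incoming request to the node with the fewest active jobs."""

    def __init__(self, nodes):
        self._tie = itertools.count()  # stable tie-breaker between equal loads
        self._heap = [(0, next(self._tie), node) for node in nodes]
        heapq.heapify(self._heap)

    def dispatch(self):
        """Pick the least-loaded node and record one more active job on it."""
        load, _, node = heapq.heappop(self._heap)
        heapq.heappush(self._heap, (load + 1, next(self._tie), node))
        return node
```

With equal job durations this degenerates to round-robin; its advantage appears when jobs finish at different rates and completed jobs decrement a node's count, steering new work away from busy nodes.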