Cloud-based Electronic Health Records (EHRs) have seen substantial growth in recent years, especially for remote patient monitoring. Researchers are investigating the use of Healthcare 4.0 in smart cities, which relies on Internet of Things (IoT) devices and cloud computing to access medical processes remotely. Healthcare 4.0 centres on the systematic gathering, merging, transmission, sharing, and retention of medical information at regular intervals. Protecting patients' confidential and private information poses several challenges, chiefly preventing unauthorized intrusion by attackers. It is therefore essential to protect patient medical data that is stored, accessed, and shared in the cloud from unauthorized access or compromise, even by the authorized components of e-healthcare systems. Many cryptographic methods have been devised to offer secure storage, exchange, and access to medical data in cloud service provider (CSP) environments. Traditional methods, however, have not harmoniously integrated the essential components of an EHR security solution: efficient computation, server-side verification, user-side verification, independence from a trusted third party, and strong security. Recently, security solutions based on blockchain technology have attracted considerable interest because they safeguard data storage and exchange while using few computational resources. Early research concentrated almost exclusively on blockchain as used in Bitcoin; the emphasis has since shifted to the secure management of healthcare records with blockchain technology. This study offers a thorough examination of modern blockchain-based methods for protecting medical data, with and without cloud computing, evaluates several blockchain-based strategies, and presents a comprehensive analysis of research gaps, open issues, and a future roadmap that supports the progress of new Healthcare 4.0 technologies.
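A toy sketch of the core blockchain idea these EHR solutions build on: each block stores a hash of a record together with the previous block's hash, so tampering with any stored record breaks the chain. This is an illustration only, not a scheme from any of the surveyed papers; the record fields and structure are invented for the example.

```python
# Toy hash chain for tamper-evident medical records (illustrative only).
import hashlib
import json
import time

def make_block(record: dict, prev_hash: str) -> dict:
    """Create a block whose hash covers the record, timestamp, and previous hash."""
    body = {"record": record, "prev_hash": prev_hash, "timestamp": time.time()}
    body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return body

def verify(block: dict) -> bool:
    """Recompute the block hash and compare it with the stored value."""
    body = {k: v for k, v in block.items() if k != "hash"}
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() == block["hash"]

genesis = make_block({"note": "genesis"}, prev_hash="0" * 64)
block1 = make_block({"patient_id": "P-001", "visit": "2024-01-05"}, genesis["hash"])

print(verify(genesis), verify(block1))  # True True while the chain is untampered
```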
Surgical site infections are the second most common type of adverse event in hospitalized patients. Surgical antibiotic prophylaxis refers to the use of preoperative and postoperative antibiotics to decrease the incidence of postoperative wound infections. The objective of this study was to evaluate the antibiotic administration pattern for surgical antibiotic prophylaxis and the adherence to the American Society of Health-System Pharmacists surgical antibiotic prophylaxis guideline in the Medical City Teaching Hospitals, Baghdad. The medical records of one hundred patients who underwent elective surgical procedures were reviewed, and adherence to the recommendations of the American Society of Health-System Pharmacists guideline was assessed …
Today, the role of cloud computing in our day-to-day lives is very prominent. The cloud computing paradigm makes it possible to provide resources on demand, and cloud computing has changed the way organizations manage resources thanks to its robustness, low cost, and pervasive nature. Data security is usually realized using methods such as encryption. However, data privacy is another important challenge that should be considered when transporting, storing, and analyzing data in the public cloud. In this paper, a new method is proposed to track malicious users who use their private key to decrypt data in a system, share it with others, and cause the leakage of system information. Security policies are also considered to be …
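As a rough illustration of how decryption by individual private keys can be made traceable, the sketch below wraps one data-encryption key separately for each user and writes every unwrap to an audit log, so a decryption event can be attributed to the key holder who performed it. This is not the scheme proposed in the paper; the Fernet-based key wrapping, user names, and audit-log format are assumptions made purely for the example.

```python
# Per-user key wrapping with a decryption audit trail (illustrative sketch).
from cryptography.fernet import Fernet

audit_log = []

# One data-encryption key protects the record; each user gets a personal
# key-encryption key that wraps (encrypts) the data key.
data_key = Fernet.generate_key()
record = Fernet(data_key).encrypt(b"patient vitals: 120/80, pulse 72")

user_keys = {"alice": Fernet.generate_key(), "bob": Fernet.generate_key()}
wrapped_keys = {u: Fernet(k).encrypt(data_key) for u, k in user_keys.items()}

def decrypt_as(user: str) -> bytes:
    """Unwrap the data key with the user's own key and log who decrypted."""
    unwrapped = Fernet(user_keys[user]).decrypt(wrapped_keys[user])
    audit_log.append(f"{user} unwrapped the data key")
    return Fernet(unwrapped).decrypt(record)

print(decrypt_as("alice"))
print(audit_log)  # evidence trail that can be used to trace information leakage
```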
Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data into the graph concept frame (CFG). As is well known, DBSCAN groups records of the same kind into clusters, and any point that falls outside the behavior of the clusters is treated as noise or an anomaly; the algorithm can therefore detect abnormal points that lie beyond a set distance threshold (extreme values). However, not all anomalies are of this kind: some records are not distant from a specific group but simply do not occur repeatedly, and are considered abnormal with respect to the known groups. The analysis showed that DBSCAN using the …
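The baseline behaviour described above, treating points DBSCAN cannot assign to any cluster as anomalies, can be sketched as follows. The data and the eps and min_samples values are illustrative, and the graph concept frame (CFG) extension proposed in the paper is not reproduced here.

```python
# Baseline DBSCAN anomaly flagging: noise points (label -1) are anomalies.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
normal = rng.normal(loc=0.0, scale=1.0, size=(200, 2))   # one dense cluster
outliers = rng.uniform(low=-8, high=8, size=(10, 2))     # scattered points
X = np.vstack([normal, outliers])

labels = DBSCAN(eps=0.7, min_samples=5).fit_predict(X)
anomalies = X[labels == -1]   # points DBSCAN could not assign to any cluster
print(f"{len(anomalies)} points flagged as anomalies out of {len(X)}")
```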
... Show MoreObjectives: The study objectives are to determine the impact of education program upon the academic nurses'
practice concerning documentation of nursing sheets, and to find out the relationship between nurses knowledge
and their demographic characteristics, which include age, sex, and years of experience in medical and surgical
wards.
Methodology: A quasi- experimental study was carried out at the medical and surgical wards in teaching
hospitals in Sulaimani governorate from the beginning of March up to June 2007٠
To reach the objectives of the study anon-probability (purposive) sample of (25) academic nurses who work in
the medical and surgical wards in teaching hospitals.
The data were collected through the use
The research aimed at measuring the compatibility of Big Data with the organizational ambidexterity dimensions of the Asiacell mobile telecommunications company in Iraq, in order to determine the possibility of adopting the Big Data triple as an approach to achieving organizational ambidexterity. The study adopted the descriptive analytical approach to collect and analyze the data gathered with a questionnaire tool developed on the Likert scale, after a comprehensive review of the literature related to the two basic study dimensions; the data were then subjected to many statistical treatments in accordance with …
The estimation of the reliability function depends on the accuracy of the data used to estimate the parameters of the probability distribution. Because some data are skewed, estimating the parameters and calculating the reliability function in the presence of skewness requires a distribution flexible enough to deal with such data. This is the case for the data of Diyala Company for Electrical Industries, where a positive skew was observed in the data collected from the Power and Machinery Department; this calls for a distribution suited to those data and for methods that accommodate the problem and lead to accurate estimates of the reliability function …
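As a minimal sketch of the idea, the snippet below fits a flexible right-skewed lifetime distribution (a two-parameter Weibull, chosen here only as an example, not necessarily the distribution used in the study) to simulated failure data and evaluates the reliability function R(t) = 1 - F(t); the lifetimes are synthetic, not the company's data.

```python
# Fit a right-skewed lifetime distribution and evaluate R(t) = 1 - F(t).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
lifetimes = rng.weibull(a=1.5, size=100) * 500.0   # simulated, positively skewed

# Maximum-likelihood fit of a two-parameter Weibull (location fixed at 0).
shape, loc, scale = stats.weibull_min.fit(lifetimes, floc=0)

t = np.array([100.0, 300.0, 600.0])
reliability = stats.weibull_min.sf(t, shape, loc=loc, scale=scale)  # survival function
for ti, ri in zip(t, reliability):
    print(f"R({ti:.0f}) = {ri:.3f}")
```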
The most significant task in oil exploration is determining the reservoir facies, which are based mostly on the primary features of rocks. Porosity, water saturation, and shale volume, together with the sonic log and bulk density, are the types of input data utilized in the Interactive Petrophysics software to compute rock facies. These data are used to create 15 clusters and four groups of rock facies. Furthermore, accurate matching between core and well-log data is established by the neural network technique. In the current study, to evaluate the applicability of the cluster analysis approach, the rock facies results derived from cluster analysis of 29 wells were utilized to redistribute the petrophysical properties for six units of the Mishrif …
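A simplified sketch of the clustering step is given below: well-log samples (porosity, water saturation, shale volume, sonic, bulk density) are standardized and grouped into 15 clusters. K-means is used here purely for illustration; the study performs its cluster analysis in Interactive Petrophysics, and the log values below are synthetic.

```python
# Group synthetic well-log samples into 15 facies clusters.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
# Columns: porosity, water saturation, shale volume, sonic, bulk density
logs = rng.random((500, 5))

scaled = StandardScaler().fit_transform(logs)          # put all logs on one scale
clusters = KMeans(n_clusters=15, n_init=10, random_state=0).fit_predict(scaled)

# In the study the 15 clusters are then merged into four facies groups;
# here we simply report how many samples fall into each cluster.
print(np.bincount(clusters))
```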
A resume is the first impression between you and a potential employer, so its importance should never be underestimated. Selecting the right candidates for a job within a company can be a daunting task for recruiters when they have to review hundreds of resumes. To reduce time and effort, we can use NLTK and Natural Language Processing (NLP) techniques to extract essential data from a resume. NLTK is a free, open-source, community-driven project and the leading platform for building Python programs that work with human language data. To select the best resume according to the company's requirements, an algorithm such as KNN is used. To be selected from hundreds of resumes, your resume must be one of the best …
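The matching step can be sketched roughly as follows, using TF-IDF features and a nearest-neighbour search; scikit-learn is used here in place of NLTK preprocessing, and the job description and resumes are invented placeholders rather than data from the study.

```python
# Rank resumes against a job description with TF-IDF + nearest neighbours.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import NearestNeighbors

job_description = "Python developer with NLP and machine learning experience"
resumes = [
    "Experienced Python developer, built NLP pipelines with NLTK and spaCy",
    "Accountant with ten years of bookkeeping and auditing experience",
    "Machine learning engineer focusing on text classification and NLP",
]

vectorizer = TfidfVectorizer(stop_words="english")
resume_vectors = vectorizer.fit_transform(resumes)
job_vector = vectorizer.transform([job_description])

# Find the two resumes closest to the job description by cosine distance.
knn = NearestNeighbors(n_neighbors=2, metric="cosine").fit(resume_vectors)
distances, indices = knn.kneighbors(job_vector)
for rank, (idx, dist) in enumerate(zip(indices[0], distances[0]), start=1):
    print(f"Rank {rank}: resume {idx} (cosine distance {dist:.2f})")
```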