Cloud-based Electronic Health Records (EHRs) have seen a substantial increase in usage in recent years, especially for remote patient monitoring. Researchers are interested in investigating the use of Healthcare 4.0 in smart cities, which involves using Internet of Things (IoT) devices and cloud computing to access medical processes remotely. Healthcare 4.0 focuses on the systematic gathering, merging, transmission, sharing, and retention of medical information at regular intervals. Protecting the confidential and private information of patients raises several challenges in thwarting illegal intrusion by hackers. It is therefore essential to protect patient medical data that is stored, accessed, and shared on the cloud from unauthorized access or compromise by the authorized components of E-healthcare systems. A multitude of cryptographic methodologies have been devised to offer safe storage, exchange, and access to medical data in cloud service provider (CSP) environments. Traditional methods have not achieved a harmonious integration of the essential components of EHR security solutions, such as efficient computation, service-side verification, user-side verification, independence from a trusted third party, and strong security. Recently, security solutions based on blockchain technology have attracted considerable interest; they are highly effective in safeguarding data storage and exchange while using little computational resources. Early research focused almost exclusively on one blockchain platform, namely Bitcoin, whereas the present emphasis is on the secure management of healthcare records through blockchain technology more broadly. This study offers a thorough examination of modern blockchain-based methods for protecting medical data, whether or not cloud computing is utilized, and evaluates several strategies that make use of blockchain. The study presents a comprehensive analysis of research gaps, open issues, and a future roadmap that contributes to the progress of new Healthcare 4.0 technologies, as demonstrated by research investigations.
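The tamper-evidence idea behind blockchain-based EHR protection can be illustrated with a minimal hash-chain sketch in Python. This is a generic illustration only, not any specific scheme from the surveyed literature; the record fields and class names are made up for the example.

```python
# A minimal, illustrative sketch: EHR entries are chained by cryptographic
# hashes so that any later modification of a stored record is detectable.
import hashlib
import json
import time


def block_hash(block: dict) -> str:
    """Deterministically hash a block's contents with SHA-256."""
    payload = json.dumps(block, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()


class RecordChain:
    def __init__(self):
        genesis = {"index": 0, "timestamp": 0.0, "record": None, "prev_hash": "0" * 64}
        self.blocks = [genesis]

    def append_record(self, record: dict) -> None:
        prev = self.blocks[-1]
        block = {
            "index": prev["index"] + 1,
            "timestamp": time.time(),
            "record": record,           # in practice a hash/pointer, not raw patient data
            "prev_hash": block_hash(prev),
        }
        self.blocks.append(block)

    def verify(self) -> bool:
        """Recompute the hash links; False means some block was tampered with."""
        for prev, cur in zip(self.blocks, self.blocks[1:]):
            if cur["prev_hash"] != block_hash(prev):
                return False
        return True


chain = RecordChain()
chain.append_record({"patient_id": "P-001", "reading": {"heart_rate": 72}})
chain.append_record({"patient_id": "P-001", "reading": {"heart_rate": 95}})
print(chain.verify())                                      # True
chain.blocks[1]["record"]["reading"]["heart_rate"] = 60    # simulated tampering
print(chain.verify())                                      # False
```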
Agent-based modeling is currently used extensively to analyze complex systems. This growth has been supported by its ability to convey distinct levels of interaction within a complex, detailed environment. At the same time, agent-based models tend to become progressively more complex, so powerful modeling and simulation techniques are needed to address this rise in complexity. In recent years, a number of platforms for developing agent-based models have been developed. In practice, most agent-based models present a discrete representation of the environment and a single level of interaction, and only rarely are two or three levels considered. The key issue is that modellers' work in these areas is not assisted by simulation platforms.
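For readers unfamiliar with the approach, the sketch below shows a minimal agent-based model in Python: agents on a discrete grid with a single level of neighbour interaction. It is purely illustrative and not tied to any platform evaluated in the study; the grid size, agent count, and interaction rule are arbitrary choices.

```python
# A minimal agent-based model: agents move on a discrete grid and interact
# with nearby agents at one level of interaction (a local majority rule).
import random

GRID_SIZE = 20
N_AGENTS = 30
STEPS = 50


class Agent:
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.state = random.choice([0, 1])   # e.g. "inactive" / "active"

    def step(self, agents):
        # One level of interaction: adopt the rounded average state of close neighbours.
        neighbours = [a for a in agents
                      if a is not self and abs(a.x - self.x) <= 1 and abs(a.y - self.y) <= 1]
        if neighbours:
            self.state = round(sum(a.state for a in neighbours) / len(neighbours))
        # Random walk on the discrete environment (wrapping at the edges).
        self.x = (self.x + random.choice([-1, 0, 1])) % GRID_SIZE
        self.y = (self.y + random.choice([-1, 0, 1])) % GRID_SIZE


agents = [Agent(random.randrange(GRID_SIZE), random.randrange(GRID_SIZE))
          for _ in range(N_AGENTS)]
for _ in range(STEPS):
    for agent in agents:
        agent.step(agents)
print("active agents:", sum(a.state for a in agents))
```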
This paper presents a combination of enhancement techniques for fingerprint images affected by different types of noise. These techniques were applied to improve image quality and achieve acceptable image contrast. The proposed method included five enhancement techniques: Normalization, Histogram Equalization, Binarization, Skeletonization, and Fusion. The Normalization process standardized the pixel intensities, which facilitated the subsequent image enhancement stages. The Histogram Equalization technique then increased the contrast of the images. Furthermore, the Binarization and Skeletonization techniques were implemented to differentiate between the ridge and valley structures and to obtain one-pixel-wide ridge lines.
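A hedged sketch of such a pipeline, using OpenCV and scikit-image, is given below. The file name, thresholding choices, and the simple weighted-blend fusion step are assumptions for illustration, not the authors' exact settings.

```python
# Illustrative fingerprint enhancement pipeline: normalization, histogram
# equalization, binarization, skeletonization, and a simple fusion stage.
import cv2
import numpy as np
from skimage.morphology import skeletonize

img = cv2.imread("fingerprint.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input file

# 1. Normalization: map pixel intensities to a standard range.
normalized = cv2.normalize(img, None, 0, 255, cv2.NORM_MINMAX)

# 2. Histogram equalization: increase global contrast.
equalized = cv2.equalizeHist(normalized)

# 3. Binarization: separate ridge (foreground) from valley (background), Otsu threshold.
_, binary = cv2.threshold(equalized, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)

# 4. Skeletonization: thin ridges down to one-pixel-wide lines.
skeleton = skeletonize(binary > 0).astype(np.uint8) * 255

# 5. Fusion (illustrative): blend the equalized image with the ridge skeleton.
fused = cv2.addWeighted(equalized, 0.5, skeleton, 0.5, 0)

cv2.imwrite("fingerprint_enhanced.png", fused)
```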
This study investigated the feasibility of using crushed glass solid waste in water filtration using a pilot plant constructed at the Al-Wathba water treatment plant in Baghdad. Different depths and grain sizes of crushed glass were used as mono and dual media with sand and porcelanite in the filtration process. The mathematical model of Tufenkji and Elimelech was used to evaluate the initial collection efficiency η of these filters. The results indicated that the collection efficiency varied inversely with the filtration rate. For the mono-media filters, the theoretical values ηth were higher than the practical values ηprac calculated from the experimental work. In the glass filter, ηprac was obtained by multiplying ηth by a factor
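The sketch below computes the initial single-collector contact efficiency η0 from the Tufenkji-Elimelech correlation as it is commonly quoted; the coefficients, exponents, and the example fluid and particle properties are illustrative and should be checked against the original correlation and the study's own data.

```python
# Hedged sketch of the Tufenkji-Elimelech contact efficiency correlation.
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K


def collection_efficiency(d_p, d_c, U, porosity, temperature=293.15,
                          mu=1.0e-3, rho_p=1050.0, rho_f=998.0, hamaker=1.0e-20):
    """Initial single-collector contact efficiency eta_0.

    d_p, d_c : particle and collector (media grain) diameters, m
    U        : approach (filtration) velocity, m/s
    porosity : bed porosity (dimensionless)
    """
    a_p = d_p / 2.0
    # Happel flow-model porosity parameter.
    gamma = (1.0 - porosity) ** (1.0 / 3.0)
    A_s = 2.0 * (1.0 - gamma**5) / (2.0 - 3.0*gamma + 3.0*gamma**5 - 2.0*gamma**6)

    # Dimensionless groups (definitions as commonly used with this correlation).
    D_inf = K_B * temperature / (3.0 * math.pi * mu * d_p)        # Brownian diffusivity
    N_R = d_p / d_c                                               # aspect ratio
    N_Pe = U * d_c / D_inf                                        # Peclet number
    N_vdW = hamaker / (K_B * temperature)                         # van der Waals number
    N_A = hamaker / (12.0 * math.pi * mu * a_p**2 * U)            # attraction number
    N_G = 2.0 * a_p**2 * (rho_p - rho_f) * 9.81 / (9.0 * mu * U)  # gravity number

    diffusion = 2.4 * A_s**(1.0/3.0) * N_R**-0.081 * N_Pe**-0.715 * N_vdW**0.052
    interception = 0.55 * A_s * N_R**1.675 * N_A**0.125
    sedimentation = 0.22 * N_R**-0.24 * N_G**1.11 * N_vdW**0.053
    return diffusion + interception + sedimentation


# Example (illustrative values): 1 micron particles, 0.7 mm glass grains, 5 m/h rate.
print(collection_efficiency(d_p=1e-6, d_c=0.7e-3, U=5.0/3600, porosity=0.4))
```

Running the example at higher filtration velocities gives smaller η0, consistent with the inverse relationship between collection efficiency and filtration rate reported above.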
With the growth of mobile phones, the short message service (SMS) has become an essential text communication service. However, the low cost and ease of use of SMS have led to an increase in SMS spam. In this paper, the characteristics of SMS spam are studied and a set of features is introduced to get rid of SMS spam. In addition, the problem of SMS spam detection is addressed as a clustering analysis that requires a metaheuristic algorithm to find the clustering structures. Three differential evolution variants, namely DE/rand/1, jDE/rand/1, and jDE/best/1, are adopted for solving the SMS spam problem. Experimental results illustrate that jDE/best/1 produces the best results over the other variants in terms of accuracy, false-positive rate, and false-negative rate.
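The core DE/rand/1 operator can be sketched as below, here applied to evolving cluster centroids over toy feature vectors. The fitness function, population size, and the control parameters F and CR are assumptions for illustration rather than the paper's configuration.

```python
# DE/rand/1 with binomial crossover, used to search for k cluster centroids.
import numpy as np

rng = np.random.default_rng(0)


def fitness(candidate, data, k):
    """Sum of squared distances from each vector to its nearest centroid."""
    centroids = candidate.reshape(k, data.shape[1])
    d = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
    return np.sum(d.min(axis=1) ** 2)


def de_rand_1(data, k, pop_size=20, F=0.5, CR=0.9, generations=100):
    dim = k * data.shape[1]
    low, high = data.min(), data.max()
    pop = rng.uniform(low, high, size=(pop_size, dim))
    scores = np.array([fitness(p, data, k) for p in pop])

    for _ in range(generations):
        for i in range(pop_size):
            # DE/rand/1 mutation: three distinct random individuals, none equal to i.
            r1, r2, r3 = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            mutant = pop[r1] + F * (pop[r2] - pop[r3])
            # Binomial crossover, forcing at least one gene from the mutant.
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            # Greedy selection: keep the trial only if it improves the fitness.
            trial_score = fitness(trial, data, k)
            if trial_score < scores[i]:
                pop[i], scores[i] = trial, trial_score
    return pop[np.argmin(scores)].reshape(k, data.shape[1])


# Toy feature vectors standing in for SMS feature representations (spam vs. ham).
data = np.vstack([rng.normal(0, 1, (50, 5)), rng.normal(4, 1, (50, 5))])
print(de_rand_1(data, k=2))
```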
In this study, we have created a new Arabic dataset annotated according to Ekman's basic emotions (Anger, Disgust, Fear, Happiness, Sadness, and Surprise). The dataset is composed of Facebook posts written in the Iraqi dialect. We evaluated the quality of this dataset using four external judges, which resulted in an average inter-annotation agreement of 0.751. We then explored six different supervised machine learning methods to test the new dataset. We used the standard Weka classifiers ZeroR, J48, Naïve Bayes, Multinomial Naïve Bayes for Text, and SMO. We also used a further compression-based classifier, called PPM, that is not included in Weka. Our study reveals that the PPM classifier significantly outperforms other classifiers such as SVM and Naïve Bayes.
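The compression-based classification idea behind PPM can be sketched as follows, using zlib as a stand-in compressor (an assumption, since the snippet does not specify the PPM implementation) and tiny English corpora in place of the Iraqi-dialect posts.

```python
# Compression-based classification: assign the label whose training corpus
# "explains" the message best, i.e. whose compressed size grows the least
# when the message is appended.
import zlib


def compressed_size(text: str) -> int:
    return len(zlib.compress(text.encode("utf-8"), level=9))


def classify(message: str, corpora: dict) -> str:
    best_label, best_cost = None, float("inf")
    for label, corpus in corpora.items():
        cost = compressed_size(corpus + " " + message) - compressed_size(corpus)
        if cost < best_cost:
            best_label, best_cost = label, cost
    return best_label


# Illustrative English stand-ins for the annotated Facebook posts.
corpora = {
    "Happiness": "what a wonderful day so happy great news congratulations",
    "Anger": "this is outrageous I am furious terrible unacceptable angry",
}
print(classify("so happy today, great news!", corpora))   # expected: Happiness
```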
In this paper, we estimate the coefficients and the scale parameter in a linear regression model whose residuals follow the type 1 extreme value distribution for largest values. This can be regarded as an improvement over studies based on the smallest values. We study two estimation methods (OLS and MLE), resorting to the Newton-Raphson (NR) and Fisher scoring methods to obtain the MLE estimates because of the difficulty of using the usual approach with MLE. The relative efficiency criterion is considered, alongside the statistical inference procedures for the type 1 extreme value regression model for largest values. Confidence intervals and hypothesis tests for both the scale parameter and the regression coefficients
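A hedged sketch of the MLE step is shown below for a regression with type 1 extreme value (Gumbel, largest-value) errors. scipy's BFGS optimizer is used as a convenient stand-in for the Newton-Raphson and Fisher scoring iterations described in the paper, and the simulated data and starting values are illustrative.

```python
# MLE for a linear regression with Gumbel (largest-value) errors.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gumbel_r

rng = np.random.default_rng(0)

# Simulated data (illustrative): y = 2 + 3*x + Gumbel(0, scale=1.5) noise.
n = 200
X = np.column_stack([np.ones(n), rng.uniform(0, 5, n)])
beta_true, scale_true = np.array([2.0, 3.0]), 1.5
y = X @ beta_true + gumbel_r.rvs(scale=scale_true, size=n, random_state=0)


def neg_log_likelihood(params):
    """Negative Gumbel (largest-value) log-likelihood:
    l = -n*log(sigma) - sum(z_i) - sum(exp(-z_i)), with z_i = (y_i - x_i'beta)/sigma.
    The scale enters as log(sigma) so the optimizer works unconstrained."""
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)
    z = (y - X @ beta) / sigma
    return n * log_sigma + np.sum(z + np.exp(-z))


# OLS estimates as starting values for the iterative MLE search.
start = np.concatenate([np.linalg.lstsq(X, y, rcond=None)[0], [0.0]])
result = minimize(neg_log_likelihood, start, method="BFGS")
beta_hat, sigma_hat = result.x[:-1], np.exp(result.x[-1])
print("beta:", beta_hat, "scale:", sigma_hat)
```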