The dramatic decrease in the cost of genome sequencing over the last two decades has led to an abundance of genomic data, which has been used in research on the discovery of genetic diseases and the development of medicines. At the same time, the large storage footprint of a genome (2–3 GB) has made it one of the most important sources of big data, prompting genetic research centers to take advantage of the cloud and its services for storing and managing this data. The cloud, however, is a shared storage environment, which leaves data stored in it vulnerable to unwanted tampering or disclosure and raises serious concerns about protecting such data from tampering and unauthorized searches. Alongside techniques such as differential privacy and garbled circuits, cryptography is considered one of the key solutions to this problem, enabling secure queries and computation over the data. This paper introduces the most important challenges related to maintaining privacy and security and pairs each problem with appropriate proposed or applied solutions, with the aim of fueling researchers' future interest in developing more effective privacy-preserving methods for genomic data.
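One of the techniques named above, differential privacy, can be illustrated with the classic Laplace mechanism. The sketch below is a minimal, hypothetical example, not taken from any scheme surveyed here: the record format, variant name, and epsilon value are all assumptions. A count query over genomic records is released with noise calibrated to the query's sensitivity.

```python
import random

def laplace_noise(scale):
    # The difference of two iid exponentials with mean `scale`
    # is distributed as Laplace(0, scale).
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def private_count(records, predicate, epsilon):
    """Release a count query under epsilon-differential privacy.

    A count has L1-sensitivity 1, so Laplace noise with scale
    1/epsilon suffices (the standard Laplace mechanism).
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical genomic records: does a sample carry a given variant?
genomes = [
    {"variant_rs123": True},
    {"variant_rs123": False},
    {"variant_rs123": True},
]
noisy = private_count(genomes, lambda g: g["variant_rs123"], epsilon=1.0)
```

Smaller epsilon means more noise and stronger privacy; the analyst trades accuracy for the guarantee that any single genome's presence barely changes the output distribution.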
In this study, we present different methods of estimating the fuzzy reliability of a two-parameter Rayleigh distribution: the maximum likelihood estimator, the median first-order statistics estimator, the quartile estimator, the L-moment estimator, and a mixed Thompson-type estimator. The mean squared error (MSE) is used as the criterion for comparing the considered methods in a simulation over different parameter values and different sample sizes. The simulation results show that the fuzzy values outperform the real values for all sample sizes, and that the fuzzy reliability estimates from the maximum likelihood method and the mixed Thompson method perform better than the other methods in the sense of MSE, so that
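A minimal Monte Carlo sketch of the MSE comparison for one of the estimators (the maximum likelihood estimator), assuming the location parameter is known so the MLE of the scale has a closed form. The paper's other estimators and its fuzzification step are not reproduced here; parameter values and replication counts are illustrative.

```python
import math
import random

def rayleigh_sample(mu, sigma, n, rng):
    # Inverse-CDF sampling: X = mu + sigma * sqrt(-2 ln U), U ~ Uniform(0, 1].
    return [mu + sigma * math.sqrt(-2.0 * math.log(1.0 - rng.random()))
            for _ in range(n)]

def reliability(t, mu, sigma):
    # Two-parameter Rayleigh reliability: R(t) = exp(-(t - mu)^2 / (2 sigma^2)).
    return math.exp(-((t - mu) ** 2) / (2.0 * sigma ** 2))

def mle_sigma(sample, mu):
    # With mu known, the MLE of sigma^2 is sum((x - mu)^2) / (2n).
    n = len(sample)
    return math.sqrt(sum((x - mu) ** 2 for x in sample) / (2.0 * n))

def mse_of_reliability(mu, sigma, t, n, reps=2000, seed=1):
    # Monte Carlo estimate of the MSE of the plug-in reliability estimator.
    rng = random.Random(seed)
    true_r = reliability(t, mu, sigma)
    err = 0.0
    for _ in range(reps):
        s = rayleigh_sample(mu, sigma, n, rng)
        err += (reliability(t, mu, mle_sigma(s, mu)) - true_r) ** 2
    return err / reps
```

Running `mse_of_reliability` for several sample sizes reproduces the qualitative pattern the abstract describes: the MSE shrinks as the sample size grows.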
In the current study, 2D seismic data from west An-Najaf (the WN-36 line) were received after many steps of processing by the Oil Exploration Company in 2018. Surface Consistent Amplitude Compensation (SCAC) was applied to the seismic data. The processing sequence in our study started by sorting the data into common mid-point (CMP) gathers in order to apply velocity analysis using the Interactive Velocity Analysis Application (INVA) within the Omega system. Velocity semblance was prepared to perform normal move-out (NMO) correction versus time. An accurate root mean square velocity (VRMS) was selected, controlled by the flatness of the primary events. The resultant seismic velocity section for the study area shows that the veloci
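The NMO step rests on the hyperbolic move-out relation t(x) = sqrt(t0^2 + x^2 / V^2), where x is the offset and V the RMS velocity. A minimal sketch, assuming a single reflector and a scalar VRMS (the offsets, t0, and velocity below are illustrative numbers, not values from the WN-36 line):

```python
import math

def nmo_traveltime(t0, offset, v_rms):
    # Hyperbolic move-out: t(x) = sqrt(t0^2 + (x / v)^2).
    return math.sqrt(t0 ** 2 + (offset / v_rms) ** 2)

def nmo_correct(t_observed, offset, v_rms):
    # Remove move-out: recover the zero-offset time t0 from t(x).
    return math.sqrt(max(t_observed ** 2 - (offset / v_rms) ** 2, 0.0))

# A reflection at t0 = 1.2 s with v_rms = 2500 m/s across four offsets:
offsets = [0.0, 500.0, 1000.0, 1500.0]
picks = [nmo_traveltime(1.2, x, 2500.0) for x in offsets]
flattened = [nmo_correct(t, x, 2500.0) for t, x in zip(picks, offsets)]
```

With the correct velocity the corrected times all collapse to t0, which is exactly the flatness criterion used to pick VRMS on the semblance panel.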
3D models derived from digital photogrammetric techniques have massively increased and developed to meet the requirements of many applications. The reliability of these models depends basically on the data processing cycle and the adopted tool solution, in addition to data quality. Agisoft PhotoScan is a professional image-based 3D modelling software package that seeks to create orderly, precise 3D content from still images. It works with arbitrary images acquired under both controlled and uncontrolled conditions. Following the recommendations of many users around the globe, Agisoft PhotoScan has become an important source for generating precise 3D data for different applications. How reliable is this data for accurate 3D mo
The internet of medical things (IoMT) is expected to become one of the most widely distributed technologies worldwide. Using 5th generation (5G) transmission, market opportunities and hazards related to IoMT are improved and detected. This framework describes a strategy for proactively addressing concerns and offering a forum to promote development, change attitudes, and maintain people's confidence in the broader healthcare system without compromising security. It is combined with a data offloading system to speed up the transmission of medical data and improve the quality of service (QoS). As a result of this development, we propose the enriched energy-efficient fuzzy (EEEF) data offloading technique to enhance the delivery of dat
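The EEEF technique itself is not specified in this excerpt, so the following is only a generic sketch of how a fuzzy rule might drive an offloading decision. The membership shapes, the rule base, and the input names (`load`, `delay`) are all illustrative assumptions, not the paper's design.

```python
def tri(x, a, b, c):
    # Triangular membership function on [a, c] with peak at b.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def offload_score(load, delay):
    """Fuzzy decision: high device load and low link delay favor offloading.

    Rule strength is combined with min (fuzzy AND). Inputs are assumed
    normalized to [0, 1]; the EEEF paper's actual rule base and
    membership shapes are not given here.
    """
    load_high = tri(load, 0.4, 1.0, 1.6)      # "device load is high"
    delay_low = tri(delay, -0.6, 0.0, 0.6)    # "network delay is low"
    return min(load_high, delay_low)          # IF load high AND delay low

# A busy device on a fast link should score high:
score = offload_score(load=0.9, delay=0.1)
```

A full controller would aggregate several such rules and defuzzify the result; the single rule above only shows the membership-and-combination mechanics.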
Modern systems based upon hash functions are more suitable than conventional systems; however, the complicated algorithms for generating invertible functions are highly time-consuming. With the use of genetic algorithms (GAs), the key strength is enhanced, which ultimately makes the entire algorithm sufficient. Initially, key generation is performed using the results of the n-queens problem, which is solved by a genetic algorithm with a random number generator and the application of GA operations. Finally, the data are encrypted using the Modified Reverse Encryption Algorithm (MREA). It was noticed that the
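As a rough sketch of the pipeline described, the toy example below solves the n-queens problem with a simple genetic algorithm and hashes the resulting board into key material. The population size, selection scheme, mutation rate, and the SHA-256 key-derivation step are illustrative assumptions, not the paper's construction (which uses the derived key with MREA for the encryption itself).

```python
import hashlib
import random

N = 8  # board size: one queen per column, each gene is a row index

def conflicts(board):
    # Count attacking queen pairs (same row or same diagonal).
    c = 0
    for i in range(N):
        for j in range(i + 1, N):
            if board[i] == board[j] or abs(board[i] - board[j]) == j - i:
                c += 1
    return c

def solve_nqueens(rng, pop_size=100, generations=5000):
    pop = [[rng.randrange(N) for _ in range(N)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=conflicts)
        if conflicts(pop[0]) == 0:
            return pop[0]
        # Elitism, then one-point crossover and mutation on the fitter half.
        nxt = pop[: pop_size // 5]
        while len(nxt) < pop_size:
            a, b = rng.sample(pop[: pop_size // 2], 2)
            cut = rng.randrange(1, N)
            child = a[:cut] + b[cut:]
            if rng.random() < 0.3:
                child[rng.randrange(N)] = rng.randrange(N)
            nxt.append(child)
        pop = nxt
    return min(pop, key=conflicts)

def derive_key(rng):
    # Hash the solved board into 32 bytes of key material (an illustrative
    # step; the paper's exact key-derivation construction is not shown here).
    board = solve_nqueens(rng)
    return hashlib.sha256(bytes(board)).digest()

key = derive_key(random.Random(42))
```

Seeding the generator makes the run reproducible for demonstration; a real key-generation step would of course use an unpredictable entropy source.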
In recent years, due to the economic benefits and technical advances of cloud computing, huge amounts of data have been outsourced to the cloud. To protect the privacy of their sensitive data, data owners have to encrypt their data prior to outsourcing it to the untrusted cloud servers. To facilitate searching over encrypted data, several approaches have been proposed. However, the majority of these approaches handle Boolean search but not ranked search, a widely accepted technique in current information retrieval (IR) systems for retrieving only the top-k relevant files. In this paper, we propose a distributed secure ranked search scheme over the encrypted cloud servers. Such a scheme allows the authorized user to
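The ranking side of such a scheme can be sketched in plaintext. The toy index below scores files with TF-IDF and returns the top-k matches for a keyword; in a secure ranked search scheme these relevance scores would be protected (for example with order-preserving encryption) before being outsourced, a detail this sketch deliberately omits. File names and contents are invented for illustration.

```python
import math
from collections import Counter

def build_index(docs):
    # Plaintext inverted index mapping each word to {doc_id: tf-idf score}.
    n = len(docs)
    df = Counter()
    per_doc_tf = []
    for text in docs.values():
        counts = Counter(text.lower().split())
        per_doc_tf.append(counts)
        df.update(counts.keys())
    index = {}
    for (doc_id, _), counts in zip(docs.items(), per_doc_tf):
        for word, c in counts.items():
            idf = math.log(n / df[word]) + 1.0
            index.setdefault(word, {})[doc_id] = c * idf
    return index

def ranked_search(index, keyword, k):
    # Return the top-k file ids by relevance score for one keyword.
    postings = index.get(keyword.lower(), {})
    return sorted(postings, key=postings.get, reverse=True)[:k]

docs = {
    "f1": "cloud storage cloud security",
    "f2": "cloud computing",
    "f3": "database systems",
}
top = ranked_search(build_index(docs), "cloud", k=2)
```

Because only the relative order of scores matters for top-k retrieval, order-preserving protection of the scores is enough for the server to rank without learning the plaintext relevance values.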
Background: Medication reconciliation can include reviewing medications and providing counseling and a list of all medications during every transition of care. Objectives: To explore in depth the perspectives of Iraqi physicians and pharmacists regarding the necessity of medication reconciliation at hospital discharge, and to identify the possible benefits and challenges that could face its implementation. Subjects and Methods: A qualitative study comprising semi-structured interviews with pharmacists and physicians working at a public teaching hospital in Iraq. The interviews were conducted face-to-face from February to March 2023. Thematic analysis was used to analyze the qualitative data generated from the interviews. Results: In th
In this research, a nonparametric technique is presented to estimate the time-varying coefficient functions for longitudinal balanced data, characterized by observations obtained from (n) independent subjects, each of which is measured repeatedly at a set of (m) specific time points. Although the measurements are independent across different subjects, they are mostly correlated within each subject, and the applied technique is the local linear kernel (LLPK) technique. To avoid the problems of dimensionality and heavy computation, a two-step method is used to estimate the coefficient functions via the former technique. Since the two-
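A minimal sketch of the local linear kernel estimator at a single point, assuming a Gaussian kernel and a hand-picked bandwidth; the study's actual kernel choice, bandwidth selection, and two-step longitudinal structure are not reproduced here.

```python
import math

def local_linear(x0, xs, ys, h):
    """Local linear kernel estimate of the regression function at x0.

    Fits a kernel-weighted least-squares line around x0 (Gaussian kernel,
    bandwidth h) and returns its intercept, i.e. the estimate at x0.
    """
    w = [math.exp(-0.5 * ((x - x0) / h) ** 2) for x in xs]
    s0 = sum(w)
    s1 = sum(wi * (x - x0) for wi, x in zip(w, xs))
    s2 = sum(wi * (x - x0) ** 2 for wi, x in zip(w, xs))
    t0 = sum(wi * y for wi, y in zip(w, ys))
    t1 = sum(wi * (x - x0) * y for wi, x, y in zip(w, xs, ys))
    denom = s0 * s2 - s1 * s1
    return (s2 * t0 - s1 * t1) / denom

# On noise-free linear data the local linear fit is exact, which is the
# estimator's well-known advantage over the local constant (Nadaraya-Watson)
# smoother at the boundaries.
xs = [i / 10.0 for i in range(21)]
ys = [2.0 * x + 1.0 for x in xs]
est = local_linear(1.0, xs, ys, h=0.3)
```

In the two-step setting described above, this smoother would be applied over the time axis to the subject-level quantities obtained in the first step.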
The adsorption of Cr(VI) from aqueous solution by spent tea leaves (STL) was studied at different initial Cr(VI) concentrations, adsorbent doses, pH values, and contact times under batch isotherm experiments. The adsorption experiments were carried out at 30°C, and the effects of the four parameters on chromium uptake were examined to establish a mathematical model describing the percentage removal of Cr(VI). The analysis results showed that the experimental data were adequately fitted by a second-order polynomial model, with a correlation coefficient of (R² = 0.9891). The optimum operating parameters for initial Cr(VI) concentration, adsorbent dose, pH, and contact time were 50 mg/l, 0.7625 g, 3, and 100 min, respectively. At these conditions, th
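The second-order polynomial fit and its R² can be sketched for a single predictor; the study fit a four-variable response surface, which this simplified, single-variable example does not reproduce, and the synthetic data below are illustrative.

```python
def fit_quadratic(xs, ys):
    # Least-squares fit of y = a + b x + c x^2 via the 3x3 normal equations.
    S = [sum(x ** k for x in xs) for k in range(5)]
    T = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    A = [[S[0], S[1], S[2]], [S[1], S[2], S[3]], [S[2], S[3], S[4]]]
    b = T[:]
    # Gaussian elimination (the normal matrix is positive definite here).
    for i in range(3):
        for j in range(i + 1, 3):
            f = A[j][i] / A[i][i]
            for k in range(3):
                A[j][k] -= f * A[i][k]
            b[j] -= f * b[i]
    coef = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        coef[i] = (b[i] - sum(A[i][k] * coef[k] for k in range(i + 1, 3))) / A[i][i]
    return coef  # [a, b, c]

def r_squared(xs, ys, coef):
    # R^2 = 1 - SS_res / SS_tot for the fitted quadratic.
    pred = [coef[0] + coef[1] * x + coef[2] * x * x for x in xs]
    mean = sum(ys) / len(ys)
    ss_res = sum((y - p) ** 2 for y, p in zip(ys, pred))
    ss_tot = sum((y - mean) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot

# Synthetic data generated from y = 1 + 2x + 0.5x^2 is recovered exactly:
xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
ys = [1.0 + 2.0 * x + 0.5 * x * x for x in xs]
coef = fit_quadratic(xs, ys)
r2 = r_squared(xs, ys, coef)
```

With experimental data the residuals are nonzero and R² falls below 1; the study's value of 0.9891 indicates the quadratic response surface explained most of the variation in removal percentage.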
This paper is concerned with the blow-up solutions of a system of two reaction-diffusion equations coupled in both equations and boundary conditions. In order to understand how the reaction terms and the boundary terms affect the blow-up properties, the lower and upper blow-up rate estimates are derived. Moreover, the blow-up set under some restricted assumptions is studied.
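The paper's exact system is not reproduced in this excerpt; a prototypical example of a reaction-diffusion system coupled in both the equations and the boundary conditions (with illustrative exponents p, q, m, n) has the form

```latex
\begin{aligned}
u_t &= \Delta u + v^{p}, & v_t &= \Delta v + u^{q}, && x \in \Omega,\ t > 0,\\
\frac{\partial u}{\partial \eta} &= v^{m}, & \frac{\partial v}{\partial \eta} &= u^{n}, && x \in \partial\Omega,\ t > 0,
\end{aligned}
```

where \(\eta\) is the outward unit normal on \(\partial\Omega\). For such systems, blow-up rate estimates bound how fast \(\sup_{\Omega} u\) and \(\sup_{\Omega} v\) diverge as \(t\) approaches the blow-up time, and the blow-up set is the set of spatial points near which the solution becomes unbounded.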