Ketoprofen has recently been shown to offer therapeutic potential in preventing cancers such as colorectal and lung tumors, as well as in treating neurological disorders. The aim of this review is to survey the methods that have been used to determine ketoprofen in pharmaceutical formulations. Precise product quality control is crucial to confirm the composition of drugs in pharmaceutical use. Several analytical techniques, including chromatographic and spectroscopic methods, have been applied to determine ketoprofen in sample forms such as tablets, capsules, ampoules, gels, and human plasma. The limit of detection of ketoprofen was 0.1 ng/mL using liquid chromatography with tandem mass spectrometry; 0.01-0.30 µg/mL using high-performance liquid chromatography; and 0.00004-0.436 µg/mL, 0.82 µg/mL, 1.0 µg/mL, 10 µg/mL, and 208.5-237.6 µg/mL using flow injection analysis, electrokinetic chromatography, capillary electrophoresis, gas chromatography with flame ionisation detection, and derivative infrared spectroscopy, respectively.
Half of the world's population lacks access to basic sanitation, leading to socioeconomic problems such as scarcity of drinking water and the spread of disease. It is therefore vitally important to develop water-management technologies suited to the target population. In the separation stage of water treatment, the compound most often used as a coagulant is aluminum sulfate, which gives good results for removing turbidity and color from raw water. Studies show, however, that its deposition in the human body can cause serious harm to health and contribute to the development of diseases, including Alzheimer's disease. The study aims to improve the coagulation/flocculation stage with respect to the amount of flocs, i
Cloud computing is a newly developed paradigm that aims to provide computing resources in the most effective and economical manner. The fundamental idea of cloud computing is to share computing resources among a group of users. Cloud computing security is a collection of control-based techniques and strategies intended to comply with regulatory rules and to protect cloud-related information, data, applications, and infrastructure. Data integrity, in turn, is a guarantee that digital data are not corrupted and that only authorized people can access or modify them (i.e., it maintains data consistency, accuracy, and confidence). This review presents an overview of cloud computing concepts, its importance in many
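The data-integrity guarantee described in the abstract above is commonly enforced with cryptographic hashes. As a minimal illustrative sketch (not taken from any specific scheme in the review), a data owner can compute a digest before uploading to the cloud and later use it to detect any unauthorized modification:

```python
import hashlib

def digest(data: bytes) -> str:
    """Return the SHA-256 hex digest used as an integrity tag."""
    return hashlib.sha256(data).hexdigest()

# Owner computes a tag before uploading the data to cloud storage.
original = b"customer-records-v1"
tag = digest(original)

# Later, anyone holding the tag can verify the retrieved copy.
assert digest(b"customer-records-v1") == tag   # unchanged data verifies
assert digest(b"customer-records-v2") != tag   # any modification is detected
```

Real cloud-integrity protocols build on this idea with signed or keyed digests (e.g., HMAC) so that the tag itself cannot be forged by the storage provider.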
This review covers the compound 8-hydroxyquinoline (8HQ) and its derivatives, which are of significant interest owing to their strong fluorescence and their characteristic chelation of divalent metal ions. Their coordination features enhance both organic activity and inorganic behavior, as illustrated by many examples of compounds that act as good chelating ligands capable of forming very stable complexes. The role of 8HQ is not limited to complexes; it also has applications in different fields. This review therefore focuses on the preparation methods and properties of 8HQ derivatives, together with their complexes and applications, hopefully covering a part of scientifi
Ultrasound is a form of mechanical energy that generates alternating zones of compression and rarefaction along its path through tissue. Ultrasound imaging provides real-time screening of blood and multiple organs to aid diagnosis and treatment. However, ultrasound can deposit energy in blood and tissue, causing bioeffects that depend on ultrasound characteristics, including frequency and intensity. These bioeffects involve either stable cavitation, which produces non-thermal effects, or inertial cavitation, which can harm tissue. Stable (non-inertial) cavitation contributes more useful features to diagnostic imaging and treatment than inertial cavitation does. Ultrasound contrast agents are micro
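The dependence of cavitation risk on frequency and pressure mentioned above is conventionally summarized by the mechanical index, MI = p⁻ / √f, where p⁻ is the peak rarefactional pressure in MPa and f the center frequency in MHz. As a small illustrative calculation (the specific pulse values below are invented for the example):

```python
import math

def mechanical_index(peak_negative_pressure_mpa: float, frequency_mhz: float) -> float:
    """Mechanical index MI = p- / sqrt(f), with p- in MPa and f in MHz."""
    return peak_negative_pressure_mpa / math.sqrt(frequency_mhz)

# Example: a 2 MHz pulse with 1.0 MPa peak rarefactional pressure.
mi = mechanical_index(1.0, 2.0)
print(round(mi, 2))  # 0.71, below the diagnostic regulatory ceiling of 1.9
```

Higher MI values raise the likelihood of inertial cavitation, which is why diagnostic scanners display MI on screen and cap it in regulated modes.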
Fractal image compression represents an image using affine transformations. The main concern for researchers in the discipline of fractal image compression (FIC) is to reduce the encoding time needed to compress image data. The basic premise is that each portion of an image is similar to other portions of the same image. Many models have been developed around this process. Fractals were initially noticed and handled using Iterated Function Systems (IFS), which are used for encoding images. In this paper, fractal image compression and its variants are reviewed along with other techniques. A summarized review of contributions is provided to assess the fulfillment of fractal ima
Credit card fraud has become a growing problem due to the increasing reliance on electronic payment systems and technological advances that have improved fraud techniques. Numerous financial institutions are looking for the best ways to leverage technological advances to provide better services to their end users, and researchers have used various protection methods to provide security and privacy for credit cards. It is therefore necessary to identify the challenges and the solutions proposed to address them. This review provides an overview of the most recent research on detecting fraudulent credit card transactions to protect them from tampering or improper use, which includes imbalanced classes, c
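The class-imbalance challenge mentioned above arises because fraudulent transactions are typically a tiny fraction of the data, so a classifier trained naively learns to predict "legitimate" everywhere. One standard remedy is random oversampling of the minority class; a minimal sketch with invented toy data:

```python
import random

def oversample(samples, seed=0):
    """samples: list of (features, label) pairs with labels 0/1.
    Re-sample the minority class with replacement until both classes
    are equally represented, then shuffle."""
    rng = random.Random(seed)
    pos = [s for s in samples if s[1] == 1]
    neg = [s for s in samples if s[1] == 0]
    minority, majority = (pos, neg) if len(pos) < len(neg) else (neg, pos)
    extra = [rng.choice(minority) for _ in range(len(majority) - len(minority))]
    balanced = majority + minority + extra
    rng.shuffle(balanced)
    return balanced

# Toy data: 1000 legitimate transactions vs 20 fraudulent ones.
data = [([i], 0) for i in range(1000)] + [([i], 1) for i in range(20)]
balanced = oversample(data)
print(sum(y for _, y in balanced), len(balanced))  # 1000 2000
```

In practice, oversampling is applied only to the training split (never the test set), and the literature surveyed here also uses alternatives such as undersampling, SMOTE-style synthesis, and cost-sensitive losses.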
Since Internet Protocol version 6 (IPv6) is a relatively new technology, insecure network configurations are inevitable. Researchers have contributed a great deal over the past two decades to spreading knowledge about IPv6 vulnerabilities and how to address them. In this study, a systematic literature review is conducted to analyze research progress in the IPv6 security field, following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) method. A total of 427 studies were reviewed from two databases, IEEE and Scopus. To fulfil the review goal, several key data elements were extracted from each study and two kinds of analysis were administered: descriptive analysis and literature classification. The results show positive signs of t
In this paper, we derive an estimator of the reliability function for the two-parameter Laplace distribution using the Bayes method with a squared-error loss function, Jeffreys' prior, and the conditional probability of the observed random variable. The main objective of this study is to assess the efficiency of the derived Bayesian estimator compared with the maximum-likelihood and moment estimators of this function, using Monte Carlo simulation under different Laplace distribution parameters and sample sizes. The results show that the Bayes estimator is more efficient than the maximum-likelihood and moment estimators at all sample sizes.
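The Monte Carlo comparison described above can be sketched for the maximum-likelihood arm (the Bayes estimator under Jeffreys' prior is more involved and is omitted here). For Laplace(μ, b), the MLEs are the sample median and the mean absolute deviation about it, and the reliability function is R(t) = P(X > t). The parameter values, sample size, and replication count below are illustrative choices, not those of the paper:

```python
import math
import random

def sample_laplace(mu, b, n, rng):
    """Inverse-CDF sampling from Laplace(mu, b)."""
    out = []
    for _ in range(n):
        u = rng.random() - 0.5
        # max() guards against log(0) at the distribution's extreme tail.
        out.append(mu - b * math.copysign(1.0, u) * math.log(max(1e-12, 1.0 - 2.0 * abs(u))))
    return out

def reliability(t, mu, b):
    """R(t) = P(X > t) for Laplace(mu, b)."""
    if t < mu:
        return 1.0 - 0.5 * math.exp((t - mu) / b)
    return 0.5 * math.exp(-(t - mu) / b)

def mle(xs):
    """Maximum-likelihood estimates: sample median, mean absolute deviation."""
    s = sorted(xs)
    n = len(s)
    mu = s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])
    b = sum(abs(x - mu) for x in xs) / n
    return mu, b

# Monte Carlo MSE of the ML reliability estimator at t = 1, n = 30.
rng = random.Random(42)
mu0, b0, t = 0.0, 1.0, 1.0
true_R = reliability(t, mu0, b0)   # 0.5 * e^{-1}, about 0.1839
reps, mse = 2000, 0.0
for _ in range(reps):
    xs = sample_laplace(mu0, b0, 30, rng)
    mse += (reliability(t, *mle(xs)) - true_R) ** 2
mse /= reps
print(round(mse, 5))
```

Ranking estimators by such simulated MSE across parameter settings and sample sizes is exactly the comparison framework the abstract describes; adding the Bayes and moment estimators means inserting their formulas in place of `mle` inside the same loop.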