Digital tampering detection, which identifies image modification, is a significant area of image-analysis research. Over the last five years, machine learning and deep learning-based strategies have pushed the field to exceptional precision, and synthesis- and reinforcement-based learning techniques must now evolve to keep pace with the research. However, before undertaking any experimentation, a scientist must first comprehend the current state of the art in that domain: diverse paths, associated outcomes, and analysis lay the groundwork for successful experimentation and superior results. Universal image forensics approaches must therefore be thoroughly researched before experiments begin, which is why this review of the field's methodologies was created. Unlike previous studies that focused on image splicing or copy-move detection, this study investigates the universal, type-independent strategies required to identify image tampering. The work analyses and evaluates several universal techniques based on resampling, compression, and inconsistency-based detection. Resources beneficial to the academic community, such as journals and datasets, are also surveyed. Finally, a future reinforcement learning model is proposed.
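As a generic illustration of the resampling-based family of detectors mentioned above (this sketch is not taken from the reviewed paper), interpolation leaves a periodic linear dependence behind: in a signal upsampled 2x with linear interpolation, every interpolated sample is the mean of its neighbours, so the second difference vanishes there with period 2. A detector can score that periodicity.

```python
# Illustrative sketch (assumed, not the paper's method): resampling
# detection via periodic zeros in the second difference of a signal.

def upsample_linear_2x(signal):
    """Upsample a 1-D signal by 2 using linear interpolation."""
    out = []
    for a, b in zip(signal, signal[1:]):
        out.append(a)
        out.append((a + b) / 2.0)   # interpolated sample
    out.append(signal[-1])
    return out

def resampling_score(signal, period=2):
    """Fraction of near-zero second differences per phase; a phase
    scoring close to 1.0 hints that the region was resampled."""
    d2 = [signal[i - 1] - 2 * signal[i] + signal[i + 1]
          for i in range(1, len(signal) - 1)]
    scores = []
    for phase in range(period):
        vals = d2[phase::period]
        zeros = sum(1 for v in vals if abs(v) < 1e-9)
        scores.append(zeros / len(vals) if vals else 0.0)
    return max(scores)

original = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0, 6.0]
tampered = upsample_linear_2x(original)   # simulated resampled region
print(resampling_score(original))   # low: no periodic structure
print(resampling_score(tampered))   # 1.0: one phase is all zeros
```

The same idea extends to 2-D image patches, where the detector examines row- and column-wise prediction residues.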
Collapsible soil has a metastable structure that experiences a large reduction in volume, or collapse, upon wetting. These characteristics contribute to various problems for infrastructure constructed on such soil, including cracks and excessive settlement in buildings, railways, channels, bridges, and roads. This paper aims to provide a state-of-the-art review of collapsible soil behavior around the world, the types of collapsible soil, the identification of collapse potential, and the factors that affect soil collapsibility. As urban areas grow in several parts of the world, collapsible soils will be increasingly exposed to water. As a result, the number of wetting-collapse problems will rise, so it's very important to com
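One common way to identify collapse potential, alluded to in the abstract, is the oedometer-based collapse index CP(%) = Δe / (1 + e₀) × 100, where e₀ is the initial void ratio and Δe the reduction in void ratio caused by wetting. The sketch below (a minimal illustration; the numeric values and the severity bands follow classifications often quoted in the literature, not this paper) computes it:

```python
# Illustrative sketch (assumed, not from the paper): collapse potential
# from a single-oedometer wetting test, CP(%) = delta_e / (1 + e0) * 100.

def collapse_potential(e_before_wetting, e_after_wetting, e0):
    """Collapse potential in percent."""
    delta_e = e_before_wetting - e_after_wetting
    return delta_e / (1.0 + e0) * 100.0

def severity(cp):
    """Severity bands commonly quoted in the literature (assumed)."""
    if cp <= 1.0:
        return "no problem"
    if cp <= 5.0:
        return "moderate trouble"
    if cp <= 10.0:
        return "trouble"
    if cp <= 20.0:
        return "severe trouble"
    return "very severe trouble"

# Hypothetical void ratios before/after wetting at the test stress
cp = collapse_potential(e_before_wetting=0.85, e_after_wetting=0.78, e0=0.90)
print(round(cp, 2), severity(cp))   # prints: 3.68 moderate trouble
```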
This paper proposes a new encryption method that combines two cipher algorithms, DES and AES, to generate hybrid keys. This combination strengthens the proposed W-method by generating highly randomized keys. The reliability of any encryption technique rests on two points. The first is key generation: our approach merges 64 bits of DES with 64 bits of AES to produce 128 bits as a root key for the remaining 15 keys. This complexity raises the level of the ciphering process; moreover, each subsequent key is obtained by shifting only a single bit to the right. The second is the nature of the encryption process itself: it uses two keys and mixes one round of DES with one round of AES to reduce the running time. The W-method deals with
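The key-schedule idea in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions: the two 64-bit halves below are placeholders standing in for DES- and AES-derived key material (not outputs of the real key schedules), and each subsequent round key is taken to be a one-bit right rotation of the previous one.

```python
# Sketch of the hybrid-key idea: a 128-bit root key built from two
# 64-bit halves, plus 15 round keys derived by 1-bit right rotations.
# The halves are placeholder values, not real DES/AES key material.

def rotate_right_128(key, bits=1):
    """Rotate a 128-bit integer right by `bits`."""
    mask = (1 << 128) - 1
    return ((key >> bits) | (key << (128 - bits))) & mask

def derive_round_keys(des_half_64, aes_half_64, rounds=15):
    """Concatenate two 64-bit halves into a 128-bit root key and
    derive `rounds` further keys by one-bit right rotations."""
    root = ((des_half_64 & 0xFFFFFFFFFFFFFFFF) << 64) | \
           (aes_half_64 & 0xFFFFFFFFFFFFFFFF)
    keys = [root]
    for _ in range(rounds):
        keys.append(rotate_right_128(keys[-1]))
    return keys

keys = derive_round_keys(0x0123456789ABCDEF, 0xFEDCBA9876543210)
print(len(keys))   # root key + 15 round keys = 16
print(hex(keys[0]))
```

Rotation (rather than a plain shift) keeps all 128 bits of entropy in every round key, which is presumably the intent of the one-bit shift described in the abstract.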
Metal-organic frameworks (MOFs) have emerged as revolutionary materials for developing advanced biosensors, especially for detecting reactive oxygen species (ROS) and hydrogen peroxide (H₂O₂) in biomedical applications. This comprehensive review explores the current state-of-the-art in MOF-based biosensors, covering fundamental principles, design strategies, performance features, and clinical uses. MOFs offer unique benefits, including exceptional porosity (up to 10,400 m²/g), tunable structures, biocompatibility, and natural enzyme-mimicking properties, making them ideal platforms for sensitive and selective detection of ROS and H₂O₂. Recent advances have shown significant improvements in detection capabilities, with limit
Objectives. This study was carried out to quantitatively evaluate and compare the sealing ability of Endoflas using different obturation techniques. Materials and Methods. After 42 extracted primary maxillary incisors and canines were decoronated, their canals were instrumented with K files of sizes ranging from #15 to #50. According to the obturation technique, the samples were divided into three experimental groups, namely, group I: endodontic pressure syringe, group II: modified disposable syringe, and group III: reamer technique, plus two control groups. The dye extraction method was used for leakage evaluation. Data were analyzed using one-way ANOVA and Dunnett's T3 post hoc tests. The level of significance was set at p < 0.05. Results.
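The one-way ANOVA used in the statistical analysis can be computed from scratch as shown below. This is an illustrative sketch only: the group names match the abstract, but the absorbance values are hypothetical placeholders, not the study's data.

```python
# One-way ANOVA F statistic from scratch: F = MS_between / MS_within.
# The leakage (absorbance) readings below are hypothetical.

def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over a list of sample groups."""
    n_total = sum(len(g) for g in groups)
    k = len(groups)
    grand_mean = sum(sum(g) for g in groups) / n_total
    # Between-group sum of squares, weighted by group size
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    # Within-group sum of squares around each group mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g)
                    for g in groups)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n_total - k)
    return ms_between / ms_within

pressure_syringe = [0.12, 0.15, 0.11, 0.14]   # hypothetical values
modified_syringe = [0.22, 0.25, 0.21, 0.24]
reamer_technique = [0.18, 0.17, 0.19, 0.20]
f = one_way_anova_f([pressure_syringe, modified_syringe, reamer_technique])
print(round(f, 2))
```

A large F relative to the F(k−1, n−k) critical value at p < 0.05 would indicate that at least one obturation technique differs in mean leakage, which is when post hoc tests such as Dunnett's T3 are applied.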
This review delves deep into the intricate relationship between urban planning and flood risk management, tracing its historical trajectory and the evolution of methodologies over time. Traditionally, urban centers prioritized defensive measures, like dikes and levees, with an emphasis on immediate solutions over long-term resilience. These practices, though effective in the short term, often overlooked broader environmental implications and the necessity for holistic planning. However, as urban areas burgeoned and climate change introduced new challenges, there has been a marked shift in approach. Modern urban planning now emphasizes integrated blue-green infrastructure, aiming to harmonize human habitation with water cycles. Resil
The laboratory experiment was conducted in the laboratories of the Musayyib Bridge Company for Molecular Analyses in 2021-2022 to study the molecular analysis of the inbred lines and their F1 hybrids, estimating the genetic variation at the DNA level shown by the selected pure inbred lines and the resulting F1 hybrids for the flowering gene. Five pure inbred lines of maize were selected ((ZA17WR) Late, ZM74, Late, ZM19, Early ZM49WZ (Zi17WZ, Late, ZM49W3E)) together with their resulting hybrids, in accordance with the study objective, from fifteen different inbred lines varying in flowering time. The five inbred lines were planted for four seasons (spring and fall 2019) and (spring and fall 2
A database is an organized, distributed collection of data that allows users to access the stored information in a simple and convenient way. However, in the era of big data, traditional data-analytics methods may not be able to manage and process such large volumes of data. To develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique on big data distributed in the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r
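The Map-Reduce pattern the study builds on can be sketched in a few lines. This is a minimal pure-Python illustration, not the authors' Hadoop implementation: it computes a per-channel mean over simulated EEG records, with the channel names and values invented for the example.

```python
# Minimal Map-Reduce sketch (illustrative; not the authors' Hadoop
# code): per-channel mean amplitude over records (channel, value).

from collections import defaultdict

def map_phase(records):
    """Map: emit (channel, (value, 1)) partial pairs."""
    for channel, value in records:
        yield channel, (value, 1)

def shuffle(pairs):
    """Shuffle: group intermediate pairs by key."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce: combine (sum, count) partials into a mean per channel."""
    result = {}
    for channel, partials in grouped.items():
        total = sum(v for v, _ in partials)
        count = sum(c for _, c in partials)
        result[channel] = total / count
    return result

eeg = [("Fp1", 10.0), ("Fp1", 14.0), ("Cz", 4.0), ("Cz", 6.0), ("Cz", 8.0)]
means = reduce_phase(shuffle(map_phase(eeg)))
print(means)   # {'Fp1': 12.0, 'Cz': 6.0}
```

On Hadoop the same three phases run in parallel across nodes, which is where the reported response-time reduction comes from: the mappers process disjoint splits of the EEG data concurrently.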
Background: The integration of modern computer-aided design and manufacturing technologies in diagnosis, treatment planning, and appliance construction is changing the way orthodontic treatment is provided to patients. The aim of this study is to assess the validity of digital and rapid-prototyped orthodontic study models compared with their original stone models. Materials and methods: The sample consisted of 30 study models with well-aligned Angle Class I malocclusion. The models were digitized with a desktop scanner to create digital models. The digital files were then converted to physical plastic casts using a prototyping machine that utilizes fused deposition modeling technology. Polylactic acid polymer was chosen
The current study was performed to detect and quantify epicatechin in two tea samples of Camellia sinensis (black and green tea) by thin-layer chromatography (TLC) and high-performance liquid chromatography (HPLC). Epicatechin was extracted from black and green tea by two different methods: maceration (cold extraction) and decoction (hot extraction), each involving three different solvents (absolute ethanol, 50% aqueous ethanol, and water) at room temperature and under direct heat, respectively. The crude extracts of the two tea samples obtained from the two methods were fractionated using two solvents of different polarity (chloroform and