The recent emergence of sophisticated Large Language Models (LLMs) such as GPT-4, Bard, and Bing has transformed scientific inquiry, particularly through large pre-trained vision-language models. This transformation is opening new frontiers in fields including image processing and digital media verification. At the heart of this evolution, our research focuses on the rapidly growing area of image authenticity verification, a field of immense relevance in the digital era. The study specifically addresses the emerging challenge of distinguishing authentic images from deepfakes, a task that has become critically important in a world increasingly reliant on digital media. Our investigation rigorously assesses the capabilities of these advanced LLMs in identifying and differentiating manipulated imagery. We explore how these models process visual data, their effectiveness in recognizing subtle alterations, and their potential to safeguard against misleading representations. The implications of our findings are far-reaching, affecting security, media integrity, and the trustworthiness of information on digital platforms. Moreover, the study sheds light on the limitations and strengths of current LLMs in handling complex tasks such as image verification, contributing valuable insights to the ongoing discourse on AI ethics and digital media reliability.
In this work, satellite images of Razaza Lake and the surrounding district in Karbala province are classified for the years 1990, 1999, and 2014 using two software packages (MATLAB 7.12 and ERDAS Imagine 2014). Proposed unsupervised and supervised classification methods implemented in MATLAB have been used: the mean value method and Singular Value Decomposition, respectively. In ERDAS Imagine, an unsupervised method (K-Means) and a supervised method (Maximum Likelihood Classifier) are utilized, in order to obtain the most accurate results, compare the results of each method, and calculate the changes that took place in 1999 and 2014 compared with 1990. The results from the classification indicated that
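The unsupervised step described above groups pixels into spectral classes by nearest mean. A minimal pure-Python sketch of one-band K-Means clustering follows; the function name and the toy pixel values are illustrative, not taken from the study's imagery.

```python
# Minimal one-band K-Means: group pixel intensities into k spectral
# classes by iteratively assigning each pixel to its nearest centroid.

def kmeans_1d(pixels, k=3, iters=20):
    # Initialize centroids spread evenly across the observed value range.
    lo, hi = min(pixels), max(pixels)
    centroids = [lo + (hi - lo) * i / (k - 1) for i in range(k)]
    labels = [0] * len(pixels)
    for _ in range(iters):
        # Assignment step: label each pixel with its nearest centroid.
        labels = [min(range(k), key=lambda c: abs(p - centroids[c]))
                  for p in pixels]
        # Update step: move each centroid to the mean of its members.
        for c in range(k):
            members = [p for p, lab in zip(pixels, labels) if lab == c]
            if members:
                centroids[c] = sum(members) / len(members)
    return centroids, labels

# Toy "band": dark water, mid vegetation, bright bare soil.
band = [10, 12, 11, 120, 125, 118, 240, 235, 238]
centroids, labels = kmeans_1d(band, k=3)
```

In practice the same step is run per-band over the full raster; this sketch only shows the clustering logic the unsupervised classification relies on.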
A novel method for a Network Intrusion Detection System (NIDS) has been proposed, based on the way DNA sequences are used to detect disease, as both domains share a similar conceptual method of detection. Three steps are proposed to apply DNA sequences to NIDS: convert the network traffic data into a DNA-sequence form using a cryptography encoding method; discover Short Tandem Repeat (STR) sequence patterns for each network traffic attack using the Teiresias algorithm; and perform classification based on the STR sequences using the Horspool algorithm. The 10% KDD Cup 1999 data set is used for the training phase, and the corrected KDD Cup 1999 data set is used for the testing phase to evaluate the proposed method. The current experiment results sh
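The classification step above hinges on fast exact matching of an STR signature inside an encoded traffic sequence. A minimal sketch of the Boyer-Moore-Horspool search over a DNA-style alphabet follows; the signature and traffic strings are hypothetical, not taken from the KDD data.

```python
# Boyer-Moore-Horspool search: locate an STR attack signature inside an
# encoded traffic sequence. The 4-letter alphabet mirrors DNA bases.

def horspool(text, pattern):
    m, n = len(pattern), len(text)
    if m == 0 or m > n:
        return -1
    # Bad-character shift table: for each symbol in the pattern (except
    # its last position), the distance from its last occurrence to the
    # pattern's end; unseen symbols shift by the full pattern length.
    shift = {}
    for i, c in enumerate(pattern[:-1]):
        shift[c] = m - 1 - i
    i = 0
    while i <= n - m:
        if text[i:i + m] == pattern:
            return i                      # signature found at offset i
        i += shift.get(text[i + m - 1], m)
    return -1                             # no match: treat as normal traffic

# Hypothetical STR signature "ACGACG" inside an encoded traffic string.
pos = horspool("TTGACACGACGTTA", "ACGACG")
```

A hit maps the traffic record to the attack class whose signature matched; a miss leaves it labeled normal.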
In this study, optical fibers were designed and implemented as a chemical sensor based on surface plasmon resonance (SPR) to estimate the age of the oil used in electrical transformers. The study relies on the refractive indices of the oil. The sensor was created by embedding the central portion of the optical fiber in a resin block, followed by polishing and tapering to form the optical fiber sensor. The tapering time was 50 min. The multi-mode optical fiber was coated with a 60 nm-thick gold layer over a deposition length of 4 cm. The sensor's resonance wavelength was 415 nm. The primary sensor parameters were calculated, including sensitivity (6.25), signal-to-noise ratio (2.38), figure of merit (4.88), and accuracy (3.2).
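The sensor parameters listed above are commonly computed from the resonance-dip spectrum. A sketch using the standard SPR definitions follows; the abstract does not give the underlying spectral quantities, so the numeric inputs are illustrative, and the paper's own definitions (and units) may differ.

```python
# Standard SPR performance metrics (widely used definitions; the input
# values below are illustrative, not the paper's measured spectra).

def spr_metrics(d_lambda_res, d_n, fwhm):
    sensitivity = d_lambda_res / d_n   # resonance shift per refractive-index unit
    snr = d_lambda_res / fwhm          # shift relative to resonance dip width
    fom = sensitivity / fwhm           # sensitivity per unit dip width
    return sensitivity, snr, fom

# Hypothetical example: a 40 nm resonance shift for dn = 0.05 RIU,
# with a 50 nm-wide (FWHM) resonance dip.
s, snr, fom = spr_metrics(40.0, 0.05, 50.0)
```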
This research deals with the qualitative and quantitative interpretation of Bouguer gravity anomaly data for a region located to the SW of Qa’im City within Anbar province, using 2D mapping methods. The residual gravity field was obtained graphically by subtracting the regional gravity values from the total Bouguer anomaly values. The residual gravity field was then processed to reduce noise by applying the gradient operator and first directional derivative filtering. This was helpful in locating sudden variations in gravity values, which may be produced by subsurface faults, fractures, cavities, or the limits of lateral variations in subsurface facies. A major fault was predicted to extend in the direction NE-
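The regional-removal and derivative steps described above can be sketched on a single profile: the residual is the Bouguer anomaly minus a smooth regional trend, and a first-difference derivative highlights the abrupt lateral change expected over a fault edge. All values below are illustrative (in mGal, unit station spacing), not the study's data.

```python
# Residual separation and 1st-derivative filtering on a toy profile.

bouguer  = [30.0, 31.0, 32.0, 40.0, 41.0, 42.0]   # observed Bouguer anomaly
regional = [30.0, 30.5, 31.0, 31.5, 32.0, 32.5]   # smooth regional trend

# Residual field: the regional-removal step from the text.
residual = [b - r for b, r in zip(bouguer, regional)]

# 1st horizontal derivative (forward finite difference): the largest
# value marks the sudden variation attributed to a fault edge.
gradient = [residual[i + 1] - residual[i] for i in range(len(residual) - 1)]
fault_index = max(range(len(gradient)), key=lambda i: abs(gradient[i]))
```

On a real grid the same idea is applied in 2D, with directional derivatives taken along chosen azimuths to trace the strike of the anomaly.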
H. pylori is an important cause of gastroduodenal disease, including gastric ulcers, mucosa-associated lymphoid tissue (MALT) lymphoma, and gastric carcinoma. Biosensors are becoming one of the most extensively studied disciplines because easy, rapid, low-cost, highly sensitive, and highly selective biosensors contribute to advances in next-generation medicine, such as individualized medicine and ultrasensitive point-of-care detection of disease markers. Five of ten patients diagnosed with H. pylori, ranging in age from 15 to 85 and presenting with gastritis, duodenitis, duodenal ulcer (DU), or peptic ulcer (PU), participated in this research. Suspected H. pylori colonies w
It is widely accepted that early diagnosis of Alzheimer's disease (AD) makes it possible for patients to gain access to appropriate health care services and would facilitate the development of new therapies. AD starts many years before its clinical manifestations, and a biomarker that provides a measure of the changes in the brain during this period would be useful for early diagnosis of AD. Given the rapid increase in the number of older people suffering from AD, there is a need for an accurate, low-cost, and easy-to-use biomarker that could be used to detect AD in its early stages. Potentially, the electroencephalogram (EEG) can play a vital role in this, but at present no reliable EEG biomarker exists for early diagnosis of AD. The gradual s