The recent emergence of sophisticated Large Language Models (LLMs) such as GPT-4, Bard, and Bing has reshaped scientific inquiry, particularly in the area of large pre-trained vision-language models. This transformation is opening new frontiers in fields including image processing and digital media verification. At the heart of this evolution, our research focuses on image authenticity verification, a field of growing relevance in the digital era. The study specifically addresses the challenge of distinguishing authentic images from deepfakes, a task that has become critically important in a world increasingly reliant on digital media. Our investigation assesses the capabilities of these advanced LLMs in identifying and differentiating manipulated imagery. We examine how these models process visual data, their effectiveness in recognizing subtle alterations, and their potential to safeguard against misleading representations. The implications of our findings extend to security, media integrity, and the trustworthiness of information on digital platforms. The study also highlights the strengths and limitations of current LLMs on complex tasks such as image verification, contributing to the ongoing discourse on AI ethics and digital media reliability.
The current study aims to identify the needs expressed in the stories of the Brothers Grimm. The research sample consisted of three stories: 1- The Thorn Rose (Sleeping Beauty), 2- Snow White, 3- Little Red Riding Hood. The number of pages analyzed reached 15.5 pages. To achieve the research objectives, Murray's classification of needs was adopted, which contains 36 basic needs further divided into 129 sub-needs. The idea was adopted as the unit of analysis and repetition as the unit of enumeration. Reliability was extracted in two ways: 1- agreement between the researcher and himself over time, where the agreement coefficient reached 97%; 2- agreement between the researcher and tw
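The abstract does not state which agreement coefficient was used; a minimal sketch assuming Holsti's simple agreement formula, 2M / (N1 + N2), is given below. The function name and the counts are illustrative, not values taken from the study.

```python
def holsti_agreement(n_agreed, n_coder1, n_coder2):
    """Holsti's coefficient of reliability: 2M / (N1 + N2), where M is the
    number of coding decisions both passes agree on and N1, N2 are the total
    decisions made in each coding pass."""
    return 2 * n_agreed / (n_coder1 + n_coder2)

# Hypothetical counts chosen only to illustrate how a 97% figure arises.
print(f"{holsti_agreement(97, 100, 100):.0%}")  # -> 97%
```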
The extraction, study, and accurate interpretation of the morphology database of a basin are the basic building blocks for a valid geomorphological understanding of that basin. In this work, a new approach is presented in which three different GIS-based methods are used to extract databases with specific geographical information, and the concept of information intersection is then applied to build a realistic geomorphological perspective of the study area.
In the first method, data integration of remote sensing images from Google Map and SRTM DEM images was used to identify the Horan basin borders.
In the second method, the principle of data integration was represented by extracting the quantitative values of the morphometric c
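The information intersection idea can be pictured as a GIS overlay of independently derived layers. The sketch below uses GeoPandas; the layer file names and the specific layers are hypothetical, and the paper's actual extraction methods are not reproduced here.

```python
import geopandas as gpd

# Hypothetical layers derived by the independent extraction methods.
borders   = gpd.read_file("horan_basin_borders.shp")     # from imagery + DEM
drainage  = gpd.read_file("horan_drainage_units.shp")    # from morphometric values
landforms = gpd.read_file("horan_landform_units.shp")    # from classification

# Intersect the layers so only information confirmed by all methods remains.
common = gpd.overlay(borders, drainage, how="intersection")
common = gpd.overlay(common, landforms, how="intersection")
common.to_file("horan_intersection.gpkg", driver="GPKG")
```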
This paper attempts to develop statistical modeling for air quality analysis in Jakarta, Indonesia, during an emergency state of community activity restrictions enforcement (Emergency CARE), using a variety of parameters such as PM10, PM2.5, SO2, CO, O3, and NO2 from five IoT-based air monitoring systems. These parameters are critical for assessing air quality conditions and the concentration of air pollutants. Variations in outdoor air pollution concentrations before and after the Emergency CARE, which was enforced in Indonesia during the COVID-19 pandemic on July 3-21, 2021, were studied. An air quality monitoring system based on the IoT generates sensor data
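A before/after comparison of pollutant concentrations can be sketched as follows. The abstract does not specify the statistical test used; this sketch assumes daily readings in a CSV with one column per pollutant plus a "period" column, and applies a paired Wilcoxon signed-rank test. The file name and column layout are hypothetical.

```python
import pandas as pd
from scipy.stats import wilcoxon

# Hypothetical layout: one row per day, one column per pollutant, plus a
# "period" column marking readings taken before vs. during Emergency CARE.
df = pd.read_csv("jakarta_air_quality.csv")

for p in ["PM10", "PM2.5", "SO2", "CO", "O3", "NO2"]:
    before = df.loc[df["period"] == "before", p].to_numpy()
    during = df.loc[df["period"] == "during", p].to_numpy()
    n = min(len(before), len(during))              # pair equal-length samples
    stat, pval = wilcoxon(before[:n], during[:n])  # paired non-parametric test
    print(f"{p}: mean before={before.mean():.1f}, "
          f"during={during.mean():.1f}, p={pval:.3f}")
```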
In this paper, the series solution for unsteady flow of an incompressible, viscous, electrically conducting second-grade fluid over a stretching sheet subject to a transverse magnetic field is presented using the homotopy analysis method (HAM). We also examine the effects of the viscoelastic parameter, the magnetic parameter, and time, which control the equation of motion.
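For orientation, the generic HAM construction is sketched below; the notation is generic and does not reproduce the paper's specific nonlinear operator for the second-grade MHD problem.

```latex
% Generic HAM deformation equation (illustrative, not the paper's operators):
% q \in [0,1] is the embedding parameter, \hbar the convergence-control
% parameter, \mathcal{L} an auxiliary linear operator, \mathcal{N} the
% nonlinear operator of the governing equation.
(1-q)\,\mathcal{L}\big[\phi(\eta;q)-f_{0}(\eta)\big]
      = q\,\hbar\,\mathcal{N}\big[\phi(\eta;q)\big],
\qquad
f(\eta)=f_{0}(\eta)+\sum_{m=1}^{\infty} f_{m}(\eta),
\quad
f_{m}(\eta)=\frac{1}{m!}\,
\frac{\partial^{m}\phi(\eta;q)}{\partial q^{m}}\bigg|_{q=0}.
```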
In the present work, pattern recognition is carried out using the contrast and relative variance of clouds. The K-means clustering process is then applied to classify the cloud type; texture analysis is also adopted to extract textural features and use them in the cloud classification process. The test image used in the classification process is the Meteosat-7 image of the D3 region. The K-means method is adopted as an unsupervised classification. This method depends on the initially chosen cluster seeds. Since the initial seeds are chosen randomly, the user supplies a set of means, or cluster centers, in the n-dimensional space. K-means clustering has been applied to two bands (the IR2 band and the water vapour band). The textural analysis is used
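A minimal sketch of the two-band K-means step is shown below, assuming the IR2 and water-vapour channels are available as co-registered 2-D arrays. The file names, the cluster count, and the use of scikit-learn are illustrative choices, not details taken from the paper.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical inputs: two co-registered Meteosat-7 channels as 2-D arrays.
ir2 = np.load("meteosat7_ir2_d3.npy")            # infrared (IR2) band
wv  = np.load("meteosat7_water_vapour_d3.npy")   # water-vapour band

# Stack the two bands so each pixel becomes a 2-D feature vector.
features = np.column_stack([ir2.ravel(), wv.ravel()])

# Unsupervised K-means; the cluster count is a user choice.
kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(features)
labels = kmeans.labels_.reshape(ir2.shape)       # per-pixel cloud-class map
```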
With the development of high-speed network technologies, there has been a recent rise in the transfer of significant amounts of sensitive data across the Internet and other open channels. The data will be encrypted using the same key for both the Triple Data Encryption Standard (TDES) and the Advanced Encryption Standard (AES), with the block cipher modes Cipher Block Chaining (CBC) and Electronic Codebook (ECB). Block ciphers are often used for secure data storage on fixed hard drives and portable devices, and for safe network data transport. Therefore, to assess the security of the encryption method, it is necessary to become familiar with and evaluate the algorithms of cryptographic systems. Block cipher users need to be sure that the ciphers the
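To make the mode comparison concrete, a short sketch using one shared key for AES in ECB and CBC modes is given below, based on the pycryptodome package; TDES would follow the same pattern with DES3 in place of AES. The key and plaintext values are illustrative only.

```python
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad

key = get_random_bytes(16)            # one key shared by both modes
iv  = get_random_bytes(16)            # CBC additionally requires an IV
plaintext = b"illustrative plaintext for the ECB vs. CBC mode comparison"

# ECB: each 16-byte block is encrypted independently (identical blocks repeat).
ecb_ct = AES.new(key, AES.MODE_ECB).encrypt(pad(plaintext, AES.block_size))

# CBC: each block is XORed with the previous ciphertext block before encryption.
cbc_ct = AES.new(key, AES.MODE_CBC, iv).encrypt(pad(plaintext, AES.block_size))

print(ecb_ct.hex())
print(cbc_ct.hex())
```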
The present work covers the analytical design process of a three-dimensional (3-D) hip joint prosthesis with numerical fatigue stress analysis. The analytical generation equations describing the different constructive parts of the stem (ball, neck, torus, cone, lower ball) have been presented to express the stem model in mathematical form. The generated surface has been introduced to the FE solver (Ansys version 11) in order to simulate the induced dynamic stresses and investigate the effect of every design parameter (ball radius, angle of neck, radius of neck, neck ratio, main torus radius, and outer torus radius) on the maximum equivalent stresses for a hip prosthesis made from titanium alloy. The dynamic loading case has been studied for a stumbling ca
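As a generic illustration of how such constructive parts can be described analytically (standard parametric-surface notation, not the paper's exact stem equations), the ball and torus portions can be written as:

```latex
% Generic parametric surfaces (illustrative, not the paper's equations):
\text{ball (sphere of radius } R_b\text{):}\quad
\mathbf{r}(\theta,\varphi)=R_b\,\big(\sin\theta\cos\varphi,\ \sin\theta\sin\varphi,\ \cos\theta\big),
\\[4pt]
\text{torus (main radius } R,\ \text{tube radius } r\text{):}\quad
\mathbf{r}(\theta,\varphi)=\big((R+r\cos\theta)\cos\varphi,\ (R+r\cos\theta)\sin\varphi,\ r\sin\theta\big).
```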
In this paper, the series solutions of non-linear delay integral equations are considered using a modified approach of the homotopy analysis method (MAHAM). We split the function into infinite sums. The outcomes of the illustrated examples are included to confirm the accuracy and efficiency of the MAHAM. The exact solution can be obtained using special values of the convergence parameter.
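As an illustration only (the kernel, nonlinearity, and delay below are generic symbols, not the paper's equations), a non-linear delay integral equation of this type and the series splitting can be written as:

```latex
% Generic non-linear delay integral equation and series splitting
% (illustrative notation): K is the kernel, G the nonlinearity, \tau the delay.
u(x) = f(x) + \lambda \int_{0}^{x} K(x,t)\, G\big(u(t-\tau)\big)\, dt,
\qquad
u(x) = \sum_{m=0}^{\infty} u_{m}(x),
```

where the components \(u_m(x)\) are obtained recursively from the higher-order deformation equations, with the convergence-control parameter chosen to ensure (or, for special values, exactly recover) the solution.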
The analysis of the hyperlink structure of the web has led to significant improvements in web information retrieval. This survey evaluates and analyzes relevant research publications on link analysis in web information retrieval that employ diverse methods. The publications are compared on the research year, the aims of the article, the algorithms used to complete the study, and the findings obtained after applying those algorithms. The findings revealed that Page Rank, Weighted Page Rank, and Weighted Page Content Rank are extensively employed by researchers to properly analyze hyperlinks in web information retrieval. Finally, this paper analyzes the previous studies.
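To illustrate the most frequently cited of these algorithms, a compact PageRank iteration is sketched below; the link graph, damping factor, and iteration count are illustrative, and the weighted variants differ only in how the in- and out-link weights are distributed.

```python
def pagerank(graph, damping=0.85, iters=50):
    """Iterative PageRank: graph maps each page to the pages it links to."""
    pages = list(graph)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for p, outlinks in graph.items():
            if not outlinks:                     # dangling page: spread evenly
                for q in pages:
                    new_rank[q] += damping * rank[p] / len(pages)
            else:
                for q in outlinks:
                    new_rank[q] += damping * rank[p] / len(outlinks)
        rank = new_rank
    return rank

# Tiny illustrative link graph.
print(pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]}))
```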
Chronic obstructive pulmonary disease (COPD) is a common respiratory disease with episodes of exacerbation. Various factors, including infectious pathogens, can predispose to exacerbation. The aim of this study is to evaluate the role of intestinal protozoa in COPD exacerbation. A total of 56 patients with COPD were included in this study. Patients were categorized into two groups based on the frequency of exacerbation during the last 6 months: those with ≤1 exacerbation (32 patients) and those with ≥2 exacerbations (24 patients). Stool specimens from each patient were collected twice (one week apart) and examined for intestinal parasites. In univariate analysis, rural residence and parasitic infection were more common among patie
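A univariate comparison of this kind is commonly done with a chi-square (or Fisher's exact) test on a 2x2 table; the sketch below shows the computation with hypothetical counts that are not the study's data.

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table (NOT the study's counts):
# rows = parasitic infection (yes / no),
# columns = exacerbation group (>=2 exacerbations / <=1 exacerbation)
table = [[14, 6],
         [10, 26]]
chi2, pvalue, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={pvalue:.4f}")
```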