Shadow removal is crucial for robot and machine vision, since the accuracy of object detection is strongly affected by the uncertainty and ambiguity of the visual scene. In this paper, we introduce a new algorithm for shadow detection and removal based on Gaussian functions of different shapes, orientations, and spatial extents. The contrast information of the visual scene is exploited for shadow detection and removal through five consecutive processing stages. In the first stage, contrast filtering is performed to obtain the contrast information of the image. The second stage is a normalization process that suppresses noise and produces a balanced intensity at each position relative to the neighboring intensities. In the third stage, the boundary of the target object is extracted, and in the fourth and fifth stages, respectively, the region of interest (ROI) is highlighted and reconstructed. Our model was tested and evaluated on realistic scenarios, including outdoor and indoor scenes. The results demonstrate the ability of our approach to detect and remove shadows and to reconstruct a shadow-free image with a small error of approximately 6%.
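The five stages above can be sketched in code. This is a minimal illustration only: the abstract does not specify the actual filters, so the difference-of-Gaussians contrast measure, the local-mean normalization, and the threshold value are all assumptions, not the paper's method.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def remove_shadow(img, sigma_small=1.0, sigma_large=5.0, thresh=0.5):
    """Hedged sketch of a five-stage contrast-based shadow pipeline.
    Stage names follow the abstract; the concrete operators are assumptions."""
    # Stage 1: contrast filtering -- difference of two Gaussian scales.
    contrast = gaussian_filter(img, sigma_small) - gaussian_filter(img, sigma_large)
    # Stage 2: normalization -- balance each pixel against its neighborhood mean.
    local_mean = gaussian_filter(img, sigma_large) + 1e-8
    normed = img / local_mean
    # Stage 3: boundary extraction -- gradient magnitude of the contrast map.
    gy, gx = np.gradient(contrast)
    boundary = np.hypot(gx, gy)
    # Stage 4: highlight the ROI -- pixels much darker than their surroundings.
    roi = normed < thresh
    # Stage 5: reconstruction -- lift ROI pixels toward the local mean intensity.
    out = img.copy()
    out[roi] = local_mean[roi]
    return out, roi, boundary

# Synthetic scene: bright ground with a dark "shadow" patch.
scene = np.ones((32, 32))
scene[10:22, 10:22] = 0.1
restored, roi, _ = remove_shadow(scene)
```

On the synthetic scene, the dark patch is flagged as the ROI and its pixels are brightened toward the surrounding intensity.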
This research aims to analyze the strategic options for external borrowing in Iraq in order to identify the best future strategic options for external borrowing at the Public Debt Department of the Ministry of Finance. The researcher adopted a case-study methodology, using K-Means clustering analysis to assess the efficiency of external borrowing for the research sample of (81) loans contracted by the Ministry of Finance over the period 2007-2020. The main purpose of the research was to contribute to enabling the Min
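The clustering step described above can be illustrated with a plain K-Means implementation. The loan features below (interest rate, maturity) are hypothetical stand-ins; the study's actual efficiency indicators for the 81 loans are not given in this excerpt.

```python
import numpy as np

def kmeans(X, k, iters=100):
    """Minimal K-Means sketch; not the study's exact configuration."""
    # Deterministic init: spread the initial centres across the ordered data.
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)].astype(float)
    for _ in range(iters):
        # Assign each loan to its nearest centre (squared Euclidean distance).
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        # Recompute centres; keep the old centre if a cluster empties.
        new = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# Hypothetical loan indicators (interest rate %, maturity in years).
loans = np.array([[2.0, 5], [2.5, 6], [3.0, 5],
                  [9.0, 25], [9.5, 24], [10.0, 26]], float)
labels, centers = kmeans(loans, k=2)
```

Loans with similar terms fall into the same cluster, which is the mechanism the study uses to distinguish efficient from inefficient borrowing.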
The aim of the present study was to distinguish between healthy children and those with epilepsy by electroencephalography (EEG). Two biomarkers, the Hurst exponent (H) and Tsallis entropy (TE), were used to investigate the background EEG activity of 10 healthy children and 10 children with epilepsy. EEG artifacts were removed using a Savitzky-Golay (SG) filter. As hypothesized, there were significant changes in irregularity and complexity in epileptic EEG compared with healthy control subjects using a t-test (p < 0.05). The increase in complexity observed in the H and TE results of epileptic subjects makes them suggested EEG biomarkers associated with epilepsy and a reliable tool for the detection and identification of this di
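The TE pipeline can be sketched as follows. The synthetic trace, the filter window, the histogram binning, and the entropic index q are illustrative assumptions; the study's parameters (and its Hurst-exponent computation) are not given in this excerpt.

```python
import numpy as np
from scipy.signal import savgol_filter

def tsallis_entropy(signal, q=2.0, bins=16):
    """Tsallis entropy TE = (1 - sum(p_i^q)) / (q - 1) of an amplitude histogram."""
    counts, _ = np.histogram(signal, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Hypothetical EEG trace: a 10 Hz rhythm plus noise, denoised with Savitzky-Golay.
t = np.linspace(0, 1, 512)
eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.default_rng(0).normal(size=t.size)
clean = savgol_filter(eeg, window_length=31, polyorder=3)
te = tsallis_entropy(clean)
```

For q = 2 the entropy is bounded in [0, 1), so values from different subjects are directly comparable, which is what makes TE usable as a group-level biomarker.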
The recent emergence of sophisticated Large Language Models (LLMs) such as GPT-4, Bard, and Bing has revolutionized the domain of scientific inquiry, particularly in the realm of large pre-trained vision-language models. This pivotal transformation is driving new frontiers in various fields, including image processing and digital media verification. At the heart of this evolution, our research focuses on the rapidly growing area of image authenticity verification, a field gaining immense relevance in the digital era. The study is specifically geared towards addressing the emerging challenge of distinguishing between authentic images and deep fakes, a task that has become critically important in a world increasingly reliant on digital med
In this research, the integral breadth method was used to analyze the X-ray diffraction lines and determine the crystallite size and lattice strain of zirconium oxide nanoparticles; the crystallite size was (8.2 nm) and the lattice strain (0.001955). The results were then compared with three other methods: the Scherrer method, the Scherrer dynamical diffraction theory, and two formulas of the Scherrer and Wilson method. The results were as follows: Scherrer crystallite size (7.4 nm) and lattice strain (0.011968); Scherrer dynamical method crystallite size (7.5 nm); Scherrer and Wilson method crystallite size (8.5 nm) and lattice strain (0.001919). Using another formula of the Scherrer and Wilson method, we obtain the size of the c
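The Scherrer size and the strain term can be computed directly from a measured line breadth. The numerical inputs below (Cu K-alpha wavelength, a 1° breadth at 2θ = 30°) are illustrative only, not the paper's measured data.

```python
import math

def scherrer_size(wavelength_nm, beta_deg, two_theta_deg, K=0.9):
    """Scherrer equation D = K*lambda / (beta * cos(theta)).
    beta is the line breadth (FWHM or integral breadth) in degrees."""
    beta = math.radians(beta_deg)
    theta = math.radians(two_theta_deg / 2.0)
    return K * wavelength_nm / (beta * math.cos(theta))

def lattice_strain(beta_deg, two_theta_deg):
    """Strain term of a Williamson-Hall-type analysis: eps = beta / (4*tan(theta))."""
    beta = math.radians(beta_deg)
    theta = math.radians(two_theta_deg / 2.0)
    return beta / (4.0 * math.tan(theta))

# Illustrative inputs: Cu K-alpha (0.15406 nm), beta = 1.0 deg, 2-theta = 30 deg.
D = scherrer_size(0.15406, 1.0, 30.0)   # crystallite size in nm
eps = lattice_strain(1.0, 30.0)         # dimensionless strain
```

Note that the breadth must be converted to radians before use; forgetting this conversion is the most common source of wildly wrong crystallite sizes.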
Empirical and statistical methodologies have been established to achieve accurate permeability identification and reservoir characterization based on rock type and reservoir performance. The identification of rock facies is usually done either by using core analysis to visually interpret lithofacies or indirectly from well-log data. The use of well-log data for traditional facies prediction is characterized by uncertainty and can be time-consuming, particularly when working with large datasets. Thus, machine learning can be used to predict patterns more efficiently when applied to large data. Taking into account the electrofacies distribution, this work was conducted to predict permeability for the four wells, FH1, FH2, F
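The log-to-permeability mapping described above can be illustrated with the simplest possible learned predictor. The features, target relation, and least-squares model below are all hypothetical; the study's actual well logs and ML algorithm are not specified in this excerpt.

```python
import numpy as np

# Hypothetical well-log features (e.g. porosity fraction, shale volume)
# with a synthetic linear permeability relation plus noise.
rng = np.random.default_rng(1)
logs = rng.uniform(0.0, 1.0, size=(50, 2))
perm = 3.0 * logs[:, 0] - 1.0 * logs[:, 1] + 0.5 + 0.01 * rng.normal(size=50)

# Fit ordinary least squares with an intercept column, then predict.
A = np.column_stack([logs, np.ones(len(logs))])
coef, *_ = np.linalg.lstsq(A, perm, rcond=None)
pred = A @ coef
```

A production workflow would replace this with a nonlinear regressor trained on cored wells and validated on held-out intervals, but the fit-then-predict structure is the same.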
Churning of employees from organizations is a serious problem. Turnover, or churn, of employees within an organization needs to be addressed, since it has a negative impact on the organization. Manual detection of employee churn is quite difficult, so machine learning (ML) algorithms have frequently been used for employee churn detection, as well as for categorizing employees according to turnover. To date, only one study has looked into the categorization of employees using machine learning. A novel multi-criteria decision-making (MCDM) approach coupled with the DE-PARETO principle has been proposed to categorize employees, referred to as the SNEC scheme. An AHP-TOPSIS DE-PARETO PRINCIPLE model (AHPTOPDE) has been designed that uses a 2-stage MCDM s
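The TOPSIS stage of such a model ranks alternatives by closeness to an ideal solution. The sketch below is a generic TOPSIS implementation with made-up employee criteria and weights; the AHP weighting and DE-PARETO cut-offs of the actual AHPTOPDE model are not reproduced here.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Generic TOPSIS: closeness coefficient in [0, 1] for each alternative.
    benefit[j] is True when a higher value of criterion j is better."""
    M = np.asarray(matrix, float)
    # Vector-normalize each criterion column, then apply the criterion weights.
    V = M / np.linalg.norm(M, axis=0) * np.asarray(weights, float)
    # Ideal and anti-ideal solutions per criterion direction.
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)

# Hypothetical scores for three employees on two benefit criteria
# (e.g. performance rating, tenure); weights are illustrative.
scores = topsis([[9, 9], [5, 5], [1, 1]], [0.5, 0.5], [True, True])
```

The employee dominating every criterion gets a closeness coefficient of 1.0, giving an ordering that a Pareto-style rule can then cut into categories.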
Approaching the turn of the millennium, the American theatre witnessed a growing interest in patients suffering from severe diseases as a subject matter for drama. In a discussion of Margaret Edson's Wit, light is shed on how such patients, who were literally immersed in secular visions during their lifetime, become apt to create a different vision on their deathbeds. The newly blossomed vision becomes deeply rooted in the spiritual life; it is a redemptive vision that can amend what those patients' hearts and minds have long ignored. Further, the human touch that has been ignored during man's healthy secular life is ultimately needed for the time being. It helps to enhance man's vision towards the