An oil spill is a release of petroleum products from pipelines, vessels, oil rigs, or tankers into the marine environment or onto land, occurring either naturally or through human action, and resulting in severe environmental damage and financial loss. Satellite imagery is one of the most powerful tools currently used to capture vital information about the Earth's surface, but the complexity and sheer volume of the data make it challenging and time-consuming for humans to process. With the advancement of deep learning techniques, this processing can now be automated to extract vital information from real-time satellite images. This paper applied three deep-learning algorithms for satellite image classification: ResNet50, VGG19, and InceptionV4. They were trained and tested on an open-source satellite image dataset to analyze their efficiency and performance, and their classification accuracy, precision, recall, and F1-score were compared. The results show that InceptionV4 achieves the best classification accuracy of 97% for the cloudy, desert, green area, and water classes, followed by VGG19 with approximately 96% and ResNet50 with 93%. The findings indicate that the InceptionV4 algorithm is suitable for classifying oil-spill and no-spill satellite images on a validated dataset.
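A minimal sketch of the transfer-learning setup this kind of study typically uses, assuming a directory-based dataset with the four classes (cloudy, desert, green area, water); the dataset path, image size, and training hyperparameters are illustrative assumptions, and ResNet50 (one of the three compared models) is used because it is readily available in torchvision:

```python
# Hypothetical sketch: fine-tuning a pretrained ResNet50 on a four-class
# satellite image dataset. Paths and hyperparameters are assumptions.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

CLASSES = ["cloudy", "desert", "green_area", "water"]  # assumed folder names

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

# "satellite_images/train" is an assumed path with one sub-folder per class.
train_set = datasets.ImageFolder("satellite_images/train", transform=transform)
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, len(CLASSES))  # replace the head

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):  # illustrative epoch count
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

The same loop applies unchanged to VGG19 or an Inception variant; only the backbone and its classification head need to be swapped.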
The DFT method at the B3LYP/6-31G level was applied to a hypothetical study of the design of six carbon nanotube materials based on [8]circulene, obtained through cyclic polymerization of two and three [8]circulene molecules. The optimized structure of [8]circulene is saddle-shaped. The six nanotube-forming reactions were characterized thermodynamically by calculating ΔS, ΔG, and ΔH, and the stability of these hypothetical nanotubes was assessed from the HOMO energy level. The resulting nanotubes have efficient energy gaps, making them potentially useful for solar cell applications.
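For reference, the thermodynamic and frontier-orbital quantities mentioned above are conventionally related as follows (a standard-relation sketch, not quoted from the paper; T is the absolute temperature at which the reaction thermodynamics are evaluated):

$$\Delta G = \Delta H - T\,\Delta S, \qquad E_{\mathrm{gap}} = E_{\mathrm{LUMO}} - E_{\mathrm{HOMO}}$$

A negative ΔG indicates a thermodynamically favorable polymerization, and a smaller energy gap generally favors absorption in photovoltaic applications.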
Leishmaniasis is one of the important parasitic diseases, affecting mainly people of low socioeconomic status in developing countries, and is most prevalent and endemic in the tropical and subtropical regions of the Old World and New World. Despite its broad distribution in Iraq, little is known about the genetic characteristics of the causative agents. This study therefore aimed to evaluate the genetic variety of two Iraqi Leishmania tropica isolates based on the heat shock protein 70 (HSP70) gene sequence, in comparison with sequence data recorded for isolates worldwide. After amplification and sequencing of the HSP70 gene, the obtained results were aligned with homologous Leishmania sequences retrieved from NCBI using BLAST. The analysis results showed the presence of particular g
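A minimal sketch of the alignment step described above, assuming the sequenced HSP70 fragment is available as a FASTA file; the file name is hypothetical, and Biopython's NCBIWWW.qblast is used here as one plausible way to query NCBI BLAST (the paper does not specify its tooling):

```python
# Hypothetical sketch: submitting an HSP70 sequence to NCBI BLAST and
# listing the top homologous hits. "hsp70_isolate1.fasta" is an assumed name.
from Bio import SeqIO
from Bio.Blast import NCBIWWW, NCBIXML

record = SeqIO.read("hsp70_isolate1.fasta", "fasta")
result_handle = NCBIWWW.qblast("blastn", "nt", str(record.seq))  # nucleotide BLAST

blast_record = NCBIXML.read(result_handle)
for alignment in blast_record.alignments[:5]:      # top 5 homologous sequences
    best_hsp = alignment.hsps[0]
    identity = 100.0 * best_hsp.identities / best_hsp.align_length
    print(f"{alignment.title[:60]}  identity={identity:.1f}%  E={best_hsp.expect}")
```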
Any software application can be divided into four distinct, interconnected domains, namely the problem domain, usage domain, development domain, and system domain. A methodology for assistive technology software development is presented here that seeks to provide a framework for requirements elicitation studies, together with their subsequent mapping through use-case driven object-oriented analysis for component-based software architectures. Early feedback on the effectiveness of user interface components is obtained through in-process usability evaluation. A model is suggested that consists of three environments, or worlds: the problem, conceptual, and representational environments. This model aims to emphasize the relationship between the objects
Image databases are growing exponentially because of rapid developments in social networking and digital technologies. Searching these databases requires an efficient search technique, and content-based image retrieval (CBIR) is considered one such technique. This paper presents a multistage CBIR approach that addresses computational cost while reasonably preserving accuracy. In the presented work, the first stage acts as a filter that passes images to the next stage based on SKTP, used here for the first time in the CBIR domain. In the second stage, the LBP and Canny edge detectors are employed to extract texture and shape features from the query image and from the images in the newly constructed database. The p
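A minimal sketch of the second-stage feature extraction described above (the SKTP filter stage is not reproduced here); the image path, LBP parameters, and block layout are illustrative assumptions, using scikit-image's local_binary_pattern and OpenCV's Canny detector as one plausible implementation:

```python
# Hypothetical sketch: LBP texture features plus a Canny-based shape
# descriptor for one image, as in the second CBIR stage.
import cv2
import numpy as np
from skimage.feature import local_binary_pattern

# "query.jpg" is an assumed file; resize to a fixed size for block pooling.
gray = cv2.resize(cv2.imread("query.jpg", cv2.IMREAD_GRAYSCALE), (256, 256))

# Texture: uniform LBP histogram (8 neighbours, radius 1 -> 10 bins).
lbp = local_binary_pattern(gray, P=8, R=1, method="uniform")
hist, _ = np.histogram(lbp.ravel(), bins=np.arange(0, 11), density=True)

# Shape: Canny edge map, summarised as edge density over an 8x8 block grid.
edges = cv2.Canny(gray, threshold1=100, threshold2=200)
edge_density = edges.reshape(8, 32, 8, 32).mean(axis=(1, 3)) / 255.0

feature_vector = np.concatenate([hist, edge_density.ravel()])
```

Matching a query against the filtered database then reduces to comparing such feature vectors, for example with a Euclidean or chi-square distance.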
This paper focuses on the optimization of drilling parameters using the Taguchi method to obtain minimum surface roughness. Nine drilling experiments were performed on Al 5050 alloy using high-speed steel twist drills. Three drilling parameters (feed rate, cutting speed, and cutting tool) were used as control factors, and an L9 (3^3) orthogonal array was specified for the experimental trials. The signal-to-noise (S/N) ratio and analysis of variance (ANOVA) were utilized to determine the optimum control factors that minimize surface roughness. The results were tested with the aid of the statistical software package MINITAB-17. After the experimental trials, the tool diameter was found to be the most important factor
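For roughness minimization, the smaller-the-better form of the S/N ratio is normally used; a standard-formula sketch (not quoted from the paper), where $y_i$ is the measured surface roughness in the $i$-th repetition and $n$ is the number of repetitions:

$$S/N = -10 \log_{10}\!\left(\frac{1}{n}\sum_{i=1}^{n} y_i^{2}\right)$$

The factor levels giving the largest S/N ratio are taken as optimal, and ANOVA then apportions each factor's contribution to the variation in roughness.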
Biometrics represent the most practical method for swiftly and reliably verifying and identifying individuals based on their unique biological traits. This study addresses the increasing demand for dependable biometric identification systems by introducing an efficient approach to automatically recognize ear patterns using Convolutional Neural Networks (CNNs). Despite the widespread adoption of facial recognition technologies, the distinct features and consistency inherent in ear patterns provide a compelling alternative for biometric applications. Employing CNNs in our research automates the identification process, enhancing accuracy and adaptability across various ear shapes and orientations. The ear, being visible and easily captured in
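A minimal sketch of a CNN ear classifier of the kind described, assuming cropped grayscale ear images resized to 96x96 and one identity label per subject; the architecture, input size, and NUM_SUBJECTS are illustrative assumptions, not the authors' network:

```python
# Hypothetical sketch: a small CNN for ear-pattern identification.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_SUBJECTS = 50  # assumed number of enrolled identities

model = models.Sequential([
    layers.Input(shape=(96, 96, 1)),            # cropped grayscale ear image
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation="relu"),
    layers.GlobalAveragePooling2D(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(NUM_SUBJECTS, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_images, train_labels, epochs=20, validation_split=0.2)
```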
Pattern matching algorithms are commonly used as the detection process in intrusion detection systems. The efficiency of these algorithms influences the performance of the intrusion detection system, which motivates further investigation in this field. Four matching algorithms and one combination of two algorithms, for an intrusion detection system based on a new DNA encoding, are applied and evaluated. These algorithms are the Brute-force algorithm, the Boyer-Moore algorithm, the Horspool algorithm, the Knuth-Morris-Pratt algorithm, and the combination of the Boyer-Moore and Knuth-Morris-Pratt algorithms. The performance of the proposed approach is measured by execution time, where these algorithms are applied o
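A minimal sketch of one of the compared matchers, the Horspool algorithm, over a DNA-style alphabet; the encoded payload and pattern are illustrative, and the paper's actual DNA encoding step is not reproduced here:

```python
# Hypothetical sketch: Boyer-Moore-Horspool search, one of the evaluated
# matchers, applied to a DNA-encoded string. Inputs are illustrative.
def horspool_search(text: str, pattern: str) -> int:
    """Return the index of the first occurrence of pattern in text, or -1."""
    m, n = len(pattern), len(text)
    if m == 0:
        return 0
    if m > n:
        return -1
    # Bad-character shift: distance from a character's last occurrence in
    # pattern[:-1] to the pattern's end; unseen characters shift by m.
    shift = {c: m - 1 - i for i, c in enumerate(pattern[:-1])}
    pos = 0
    while pos <= n - m:
        if text[pos:pos + m] == pattern:
            return pos
        pos += shift.get(text[pos + m - 1], m)
    return -1

# Example: searching a DNA-encoded payload for an attack signature.
encoded_traffic = "ACGTTGACGGATTACAGGTACGT"      # assumed DNA-encoded data
print(horspool_search(encoded_traffic, "GATTACA"))  # -> 9
```

Timing this routine against the brute-force and KMP variants on the same encoded traffic is the kind of execution-time comparison the abstract describes.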
In recent years, the performance of Spatial Data Infrastructures for governments and companies has gained ample attention. Different categories of geospatial data, such as digital maps, coordinates, web maps, and aerial and satellite images, are required to realize the geospatial data components of Spatial Data Infrastructures. In general, two distinct types of geospatial data sources exist over the Internet: formal and informal data sources. Despite the growth of informal geospatial data sources, the integration of different free sources has not been achieved effectively; addressing this task is the main contribution of this research. This article addresses the research question of how the