An oil spill is a leakage from pipelines, vessels, oil rigs, or tankers that releases petroleum products into the marine environment or onto land; it can occur naturally or through human action and can result in severe damage and financial loss. Satellite imagery is one of the most powerful tools currently used to capture vital information about the Earth's surface, but the complexity and sheer volume of the data make it challenging and time-consuming for humans to process. With the advancement of deep learning techniques, these processes can now be automated to extract vital information from real-time satellite images. This paper applies three deep-learning algorithms to satellite image classification: ResNet50, VGG19, and InceptionV4. They were trained and tested on an open-source satellite image dataset to analyze their efficiency and performance and to compare classification accuracy, precision, recall, and F1-score. The results show that InceptionV4 gives the best classification accuracy of 97% for the cloudy, desert, green-area, and water classes, followed by VGG19 with approximately 96% and ResNet50 with 93%. The findings show that the InceptionV4 algorithm is suitable for classifying oil-spill and no-spill satellite images on a validated dataset.
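A minimal sketch of this kind of transfer-learning comparison, assuming a Keras workflow with placeholder data paths, image size, and hyperparameters (not the paper's exact setup); InceptionV4 is not shipped in keras.applications, so a third-party implementation would be plugged in the same way.

```python
# Hedged sketch: fine-tune pretrained backbones on a 4-class satellite dataset
# and compare accuracy/precision/recall/F1. Paths, epochs, and class names are
# illustrative placeholders; per-model preprocess_input is omitted for brevity.
import numpy as np
import tensorflow as tf
from sklearn.metrics import classification_report

IMG_SIZE = (224, 224)
CLASSES = ["cloudy", "desert", "green_area", "water"]

train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", image_size=IMG_SIZE, batch_size=32)
test_ds = tf.keras.utils.image_dataset_from_directory(
    "data/test", image_size=IMG_SIZE, batch_size=32, shuffle=False)

def build(backbone_fn):
    base = backbone_fn(include_top=False, weights="imagenet",
                       input_shape=IMG_SIZE + (3,), pooling="avg")
    base.trainable = False                      # feature extraction only
    out = tf.keras.layers.Dense(len(CLASSES), activation="softmax")(base.output)
    model = tf.keras.Model(base.input, out)
    model.compile("adam", "sparse_categorical_crossentropy", ["accuracy"])
    return model

# InceptionV4 would be added here from an external implementation.
for name, fn in [("ResNet50", tf.keras.applications.ResNet50),
                 ("VGG19", tf.keras.applications.VGG19)]:
    model = build(fn)
    model.fit(train_ds, epochs=5, verbose=0)
    y_true = np.concatenate([y for _, y in test_ds])
    y_pred = np.argmax(model.predict(test_ds, verbose=0), axis=1)
    print(name)
    print(classification_report(y_true, y_pred, target_names=CLASSES))
```

The per-class precision, recall, and F1-score reported by `classification_report` correspond to the metrics the abstract compares across the three models.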
The demand for single-photon sources in quantum key distribution (QKD) systems has necessitated the use of weak coherent pulses (WCPs), whose photon number follows a Poissonian distribution. Ensuring security against eavesdropping attacks requires keeping the mean photon number (µ) small and known to the legitimate partners. However, accurately determining µ poses challenges because of discrepancies between theoretical calculations and practical implementation. This paper introduces two experiments. The first involves theoretical calculation of µ using several filters to generate the WCPs. The second uses a variable attenuator to generate the WCPs, and the value of µ was estimated from the photons detected by the BB
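For context, these are the standard WCP relations (not reproduced from the paper's own derivation): the photon number of an attenuated laser pulse is Poissonian, and µ is commonly inferred from the measured average optical power.

```latex
% Photon-number statistics of a weak coherent pulse with mean photon number \mu:
P(n \mid \mu) = \frac{\mu^{n} e^{-\mu}}{n!},
\qquad
P(n \ge 2) = 1 - (1 + \mu)\, e^{-\mu}
% A common estimate of \mu from the average optical power P_{\mathrm{avg}},
% pulse repetition rate f, and wavelength \lambda (h: Planck constant, c: speed of light):
\mu = \frac{P_{\mathrm{avg}}\, \lambda}{f\, h\, c}
```

Keeping µ well below 1 keeps the multi-photon probability P(n ≥ 2) small, which is the security requirement the abstract refers to.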
This work includes the synthesis and characterization of some new heterocyclic compounds, as follows. Compound (3), 5-(4-chlorophenyl)-2-hydrazinyl-1,3,4-oxadiazole, was synthesized by two methods: the first involves the direct reaction between hydrazine hydrate (80%) and 5-(4-chlorophenyl)-2-(ethylthio)-1,3,4-oxadiazole (1); the second involves converting 5-(4-chlorophenyl)-1,3,4-oxadiazol-2-amine (2) to a diazonium salt and then reducing this salt to compound (3) with stannous chloride. Compound (3) was used as the starting material for synthesizing several fused heterocyclic compounds. The compound 6-(4-chlorophenyl)[1,2,4]triazolo[3,4-b][1,3,4]oxadiazole-3(2H)-thione (compound 4) was synthesized from the reaction of compo
Because of the rapid growth of electrical instruments used in noxious gas detection, the importance of gas sensors has increased. X-ray diffraction (XRD) can be used to examine the crystal phase structure of sensing materials, which affects their gas-sensing properties. This contributes to the study of the effect of electrochemical synthesis of titanium dioxide (TiO2) materials with various crystal phases, such as rutile TiO2 (R-TiO2NTs) and anatase TiO2 (A-TiO2NTs). In this work, we studied the effect of voltage on the preparation of TiO2 nanotube arrays via the anodization technique for gas-sensor applications. The results acquired from XRD, energy-dispersive spectro
The implementation of technology in the provision of public services and communication with citizens, commonly referred to as e-government, has brought a multitude of benefits, including enhanced efficiency, accessibility, and transparency. Nevertheless, this approach also presents particular security concerns, such as cyber threats, data breaches, and access control. One technology that can help mitigate the effects of security vulnerabilities in e-government is the permissioned blockchain. This work examines the performance of the Hyperledger Fabric private blockchain under high transaction loads by analyzing two scenarios involving six organizations as case studies. Several parameters, such as transaction send rate
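A small illustrative sketch (not from the paper) of how throughput and average latency could be derived from per-transaction submit/commit timestamps collected during such a load test; the record layout used here is a hypothetical example.

```python
# Hedged sketch: compute throughput (tps), average latency, and failure rate
# from per-transaction timestamps logged by a benchmark client. The record
# fields (submit_time, commit_time, status) are hypothetical, not Fabric APIs.
from dataclasses import dataclass

@dataclass
class TxRecord:
    submit_time: float   # seconds since epoch, when the tx was sent
    commit_time: float   # seconds since epoch, when the tx was committed
    status: str          # "VALID" or a failure code

def summarize(records: list[TxRecord]) -> dict:
    valid = [r for r in records if r.status == "VALID"]
    duration = max(r.commit_time for r in valid) - min(r.submit_time for r in valid)
    return {
        "throughput_tps": len(valid) / duration,
        "avg_latency_s": sum(r.commit_time - r.submit_time for r in valid) / len(valid),
        "failure_rate": 1 - len(valid) / len(records),
    }

# Example: three committed transactions over roughly two seconds
print(summarize([TxRecord(0.0, 0.8, "VALID"),
                 TxRecord(0.5, 1.4, "VALID"),
                 TxRecord(1.0, 2.0, "VALID")]))
```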
Optimizing Access Point (AP) deployment plays a major role in wireless applications because of the need to provide efficient communication at low deployment cost. Quality of Service (QoS) is a significant parameter and objective to be considered alongside AP placement, as is the overall deployment cost. This study proposes and investigates a multi-level optimization algorithm called the Wireless Optimization Algorithm for Indoor Placement (WOAIP), based on Binary Particle Swarm Optimization (BPSO). WOAIP aims to obtain the optimum multi-floor AP placement with effective coverage, making it more capable of supporting QoS and cost-effectiveness. Five pairs of (coverage, AP deployment) weights, signal thresholds, and received s
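A minimal, generic binary PSO sketch (an illustration of the BPSO step, not the WOAIP algorithm itself): candidate AP placements are encoded as bit vectors and scored by a coverage-versus-cost fitness. The grid, coverage radius, and weights are placeholder assumptions.

```python
# Hedged sketch of Binary PSO for AP placement: each bit marks whether an AP
# is installed at one of the candidate positions. Fitness rewards the fraction
# of demand points covered and penalizes AP count; the coverage model is a toy.
import numpy as np

rng = np.random.default_rng(0)
N_POS, N_PARTICLES, N_ITERS = 30, 20, 100
candidates = rng.uniform(0, 50, size=(N_POS, 2))      # candidate AP (x, y) in metres
demand = rng.uniform(0, 50, size=(200, 2))            # demand points to cover
RADIUS, W_COV, W_COST = 12.0, 1.0, 0.05               # assumed radius and weights

def fitness(bits):
    aps = candidates[bits.astype(bool)]
    if len(aps) == 0:
        return -1.0
    d = np.linalg.norm(demand[:, None, :] - aps[None, :, :], axis=2)
    covered = (d.min(axis=1) <= RADIUS).mean()        # fraction of points covered
    return W_COV * covered - W_COST * bits.sum()      # coverage minus deployment cost

X = rng.integers(0, 2, size=(N_PARTICLES, N_POS))     # positions (bit vectors)
V = rng.normal(0, 1, size=(N_PARTICLES, N_POS))       # velocities
pbest, pbest_f = X.copy(), np.array([fitness(x) for x in X])
gbest = pbest[pbest_f.argmax()].copy()

for _ in range(N_ITERS):
    r1, r2 = rng.random(X.shape), rng.random(X.shape)
    V = 0.7 * V + 1.5 * r1 * (pbest - X) + 1.5 * r2 * (gbest - X)
    X = (rng.random(X.shape) < 1 / (1 + np.exp(-V))).astype(int)  # sigmoid transfer
    f = np.array([fitness(x) for x in X])
    improved = f > pbest_f
    pbest[improved], pbest_f[improved] = X[improved], f[improved]
    gbest = pbest[pbest_f.argmax()].copy()

print("best fitness:", pbest_f.max(), "APs used:", int(gbest.sum()))
```

The sigmoid transfer step is what distinguishes binary PSO from the continuous variant: velocities are mapped to probabilities of setting each placement bit to 1.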
In this work, an optical fiber biomedical sensor for detecting the level of hemoglobin in blood is presented. A surface plasmon resonance (SPR)-based coreless optical fiber was developed and implemented using single- and multi-mode optical fibers. The sensor is also used to evaluate the refractive indices and concentrations of hemoglobin in blood samples, with a 40 nm metal coating (20 nm Au and 20 nm Ag) to increase the sensitivity. It is found in practice that as the sensed refractive index increases, the resonance wavelength increases owing to the decrease in energy.
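For context (a standard SPR relation, not reproduced from the paper), resonance occurs where the guided evanescent field is phase-matched to the surface plasmon supported by the metal/analyte interface:

```latex
% Phase-matching (resonance) condition for a fiber SPR sensor:
% n_eff is the effective index of the guided/evanescent field,
% \varepsilon_m(\lambda) the metal permittivity, n_s the analyte refractive index.
\frac{2\pi}{\lambda}\, n_{\mathrm{eff}}
  = \operatorname{Re}\!\left\{ \frac{2\pi}{\lambda}
      \sqrt{\frac{\varepsilon_m(\lambda)\, n_s^{2}}{\varepsilon_m(\lambda) + n_s^{2}}} \right\}
```

Since the right-hand side grows with n_s, a higher sample refractive index (e.g., a higher hemoglobin concentration) shifts the matching point to longer wavelengths, consistent with the red shift of the resonance reported above.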
In combinatorial testing development, the construction of covering arrays is the key challenge because of the multiple aspects that influence it. A wide range of combinatorial problems can be solved using metaheuristic and greedy techniques. Combining a greedy technique with a metaheuristic search technique such as hill climbing (HC) can produce feasible results for combinatorial tests. Metaheuristic methods are used to deal with the tuples that may be left after redundancy is removed by the greedy strategy; the result is then assured to be near-optimal by the metaheuristic algorithm. As a result, the use of both greedy and HC algorithms in a single test-generation system is a good candidate if constructed correctly. T
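A compact, generic hill-climbing sketch (an illustration of the HC step, not the paper's generator): starting from a candidate array, it repeatedly flips single cells and keeps moves that do not reduce the number of covered 2-way value pairs. The parameter counts are arbitrary examples.

```python
# Hedged sketch: hill climbing toward a pairwise (t=2) covering array.
# K parameters, each with V values; rows are candidate test cases. A flip of a
# single cell is accepted only if it keeps or increases pair coverage.
import random
from itertools import combinations

random.seed(1)
K, V, ROWS, STEPS = 5, 3, 11, 20000   # 5 parameters, 3 values each, 11 tests

def covered_pairs(array):
    pairs = set()
    for row in array:
        for (i, a), (j, b) in combinations(enumerate(row), 2):
            pairs.add((i, j, a, b))
    return pairs

target = K * (K - 1) // 2 * V * V                    # all 2-way interactions
array = [[random.randrange(V) for _ in range(K)] for _ in range(ROWS)]
best = len(covered_pairs(array))

for _ in range(STEPS):
    if best == target:
        break
    r, c = random.randrange(ROWS), random.randrange(K)
    old = array[r][c]
    array[r][c] = random.randrange(V)                # neighbour: flip one cell
    score = len(covered_pairs(array))
    if score >= best:                                # accept uphill/sideways moves
        best = score
    else:
        array[r][c] = old                            # reject downhill moves

print(f"covered {best}/{target} pairs with {ROWS} tests")
```

In a combined generator, the greedy phase would supply the initial array and HC would then repair the tuples the greedy phase left uncovered, as the abstract describes.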
This paper proposes a new method for network self-fault management (NSFM) based on two technologies: intelligent agents to automate fault-management tasks, and Windows Management Instrumentation (WMI) to identify faults faster when resources are independent (different types of devices). The proposed NSFM reduces the network traffic load by reducing the requests and responses between server and client, which achieves less downtime for each node when a fault occurs in the client. The performance of the proposed system is measured by three metrics: efficiency, availability, and reliability. A high average efficiency is obtained, depending on the faults that occurred in the system, which reaches
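A small illustrative sketch (not the paper's agent) of how a client-side agent could poll WMI for fault indicators using the third-party `wmi` package on Windows; the thresholds and the set of monitored classes are assumptions for the example.

```python
# Hedged sketch: a client-side agent polls WMI locally and reports only the
# detected faults, rather than streaming raw state to the management server.
# Requires Windows and `pip install wmi`; thresholds below are illustrative.
import time
import wmi

LOW_DISK_RATIO = 0.10            # flag disks with <10% free space (assumed threshold)
WATCHED_SERVICES = ["Spooler"]   # example services to watch; placeholder list

def detect_faults(conn):
    faults = []
    for disk in conn.Win32_LogicalDisk(DriveType=3):           # fixed local disks
        if disk.Size and disk.FreeSpace and \
                int(disk.FreeSpace) / int(disk.Size) < LOW_DISK_RATIO:
            faults.append(f"low disk space on {disk.DeviceID}")
    for svc in conn.Win32_Service():
        if svc.Name in WATCHED_SERVICES and svc.State != "Running":
            faults.append(f"service {svc.Name} is {svc.State}")
    return faults

def agent_loop(poll_seconds=30):
    conn = wmi.WMI()                                            # local machine
    while True:
        for fault in detect_faults(conn):
            print("FAULT:", fault)   # real agent: notify the server or self-heal
        time.sleep(poll_seconds)

if __name__ == "__main__":
    agent_loop()
```

Reporting only detected faults, rather than polling every node from the server, is the design choice that reduces the request/response traffic mentioned above.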