An oil spill is a release of petroleum products into the marine environment or onto land from leaking pipelines, vessels, oil rigs, or tankers, whether occurring naturally or through human action, and it can cause severe environmental damage and financial loss. Satellite imagery is one of the most powerful tools currently used for capturing vital information about the Earth's surface, but the complexity and sheer volume of the data make it challenging and time-consuming for humans to process. With the advancement of deep learning techniques, these processes can now be automated to extract vital information from real-time satellite images. This paper applied three deep learning algorithms to satellite image classification: ResNet50, VGG19, and InceptionV4. They were trained and tested on an open-source satellite image dataset to analyze the algorithms' efficiency and performance, comparing classification accuracy, precision, recall, and F1-score. The results show that InceptionV4 gives the best classification accuracy of 97% for the cloudy, desert, green-area, and water classes, followed by VGG19 with approximately 96% and ResNet50 with 93%. The findings show that the InceptionV4 algorithm is suitable for classifying oil-spill and no-spill satellite images on a validated dataset.
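A minimal sketch of the transfer-learning setup such a comparison typically relies on (the paper's training code is not given; torchvision ships no InceptionV4, so pretrained ResNet50 stands in here, and the satellite_data/ folder layout with one subdirectory per class is hypothetical):

```python
# Transfer-learning sketch (assumptions: torchvision's pretrained ResNet50
# stands in for the paper's models, since torchvision has no InceptionV4;
# "satellite_data/train" with one folder per class -- cloudy, desert,
# green_area, water -- is a hypothetical layout).
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),               # input size ResNet50 expects
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])
train_set = datasets.ImageFolder("satellite_data/train", transform=transform)
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
model.fc = nn.Linear(model.fc.in_features, 4)    # 4 classes: cloudy/desert/green_area/water

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
model.train()
for images, labels in loader:                    # a single epoch, for brevity
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```

For VGG19 the classifier head is replaced at model.classifier[6] instead of model.fc; the rest of the loop is unchanged.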
This study presents a modification of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) update (H-version) based on the determinant property of the inverse Hessian matrix (the second derivative of the objective function). The vector s (the difference between the next solution and the current solution) is updated so that the determinant of the next inverse Hessian matrix equals the determinant of the current inverse Hessian matrix at every iteration. Consequently, the sequence of inverse Hessian matrices generated by the method never approaches a near-singular matrix, so the program never breaks down before the minimum value of the objective function is reached. Moreover, the new modification of the BFGS update (H-version) …
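For reference, the standard H-version BFGS update that the modification starts from is:

```latex
H_{k+1} = \left(I - \rho_k s_k y_k^{\top}\right) H_k \left(I - \rho_k y_k s_k^{\top}\right) + \rho_k s_k s_k^{\top},
\qquad
\rho_k = \frac{1}{y_k^{\top} s_k},
\quad s_k = x_{k+1} - x_k,
\quad y_k = \nabla f(x_{k+1}) - \nabla f(x_k).
```

The modification then rescales s_k so that the constraint det(H_{k+1}) = det(H_k) holds at every iteration, which keeps the generated sequence {H_k} away from near-singularity.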
Slow learning has become a problem of the present day, affecting a proportion of students in every school that must not be ignored; it is therefore one of the educational problems faced by parents and teachers alike.
Slow learning is a relatively new subject that attracted attention in the final years of the 20th century: attention had previously focused on other disabilities, but the existence of a number of otherwise healthy children suffering from learning problems drew researchers' interest.
This study therefore aims to identify the degree of self-concept among slow-learner students and the significance of differences in self-concept according to the variables of sex and the parents' academic degree.
Because no such tool existed, the researcher built a …
In this research, the Multi-Objective Optimization on the basis of Ratio Analysis (MOORA) approach, based on a Taguchi design, was used to convert the multi-performance problem into a single-performance problem for nine experiments built as a Taguchi L9 orthogonal array for the carburization operation. The main variables with a great effect on the carburizing operation are carburization temperature (°C), carburization time (h), and tempering temperature (°C). This study also focused on calculating the amount of carbon penetration, the hardness value, and the optimal values obtained during optimization by the Taguchi approach and the MOORA method for multiple parameters. In this study, the carburization process was carried out at temperatures between 850 and 950 °C for 2 to 6 hours …
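A minimal sketch of the MOORA ranking step applied to the nine L9 runs (the decision matrix below is placeholder data, not the paper's measurements; beneficial criteria are added, non-beneficial ones subtracted):

```python
# MOORA ratio-analysis sketch (assumption: the 9x2 decision matrix is
# placeholder data, one row per Taguchi L9 run; columns = [hardness HV,
# case-depth deviation], not the paper's measured values).
import numpy as np

X = np.array([[620., 0.12], [650., 0.10], [600., 0.15],
              [680., 0.11], [630., 0.14], [660., 0.09],
              [610., 0.13], [670., 0.10], [640., 0.12]])
beneficial = np.array([True, False])     # maximize hardness, minimize deviation

R = X / np.sqrt((X ** 2).sum(axis=0))    # vector normalization per criterion
# Assessment value: beneficial ratios added, non-beneficial subtracted.
y = R[:, beneficial].sum(axis=1) - R[:, ~beneficial].sum(axis=1)
print("Assessment values:", np.round(y, 4))
print("Best L9 run:", int(np.argmax(y)) + 1)   # 1-based experiment number
```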
Stereolithography (SLA) has become an essential photocuring 3D printing process for producing parts of complex shapes from photosensitive resin exposed to UV light. Selecting the best printing parameters for good accuracy and surface quality is further complicated by the geometric complexity of the models. This work introduces multi-objective optimization of SLA printing of 3D dental bridges based on simple CAD objects. The effect of the best combination of a low-cost resin 3D printer's machine parameter settings, namely normal exposure time, bottom exposure time, and number of bottom layers, on minimizing dimensional deviation and surface roughness was studied. A multi-objective optimization method was utilized, combining the Taguchi method …
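A minimal sketch of the smaller-the-better signal-to-noise ratio that Taguchi analysis applies to responses such as dimensional deviation and surface roughness (the replicate values are placeholders):

```python
# Smaller-the-better S/N ratio used in Taguchi analysis for responses such
# as dimensional deviation and surface roughness (replicates are placeholders).
import numpy as np

def sn_smaller_the_better(y):
    """S/N = -10 * log10(mean(y^2)); a larger S/N means a better (smaller) response."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

roughness_replicates = [1.8, 2.1, 1.9]   # hypothetical Ra readings (um) for one run
print(f"S/N = {sn_smaller_the_better(roughness_replicates):.2f} dB")
```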
Identifying breast cancer using artificial intelligence technologies is valuable and has a great influence on the early detection of disease; it can also save lives by giving patients a better chance of treatment in the earlier stages of cancer. During the last decade, deep neural network (DNN) and machine learning (ML) systems have been widely adopted in almost every segment of medical centers due to their accurate identification and recognition of diseases, especially when trained on many datasets/samples. This paper proposes a DNN with two hidden layers and a reduced number of additions and multiplications in each neuron. The number of bits and the binary-point position of inputs and weights can be changed using the mask configuration …
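A minimal sketch of the kind of configurable fixed-point quantization the abstract describes (the parameter names and saturation rule are assumptions standing in for the paper's mask configuration, not its exact design):

```python
# Fixed-point quantization sketch (assumptions: signed representation with
# saturation; total_bits and frac_bits stand in for the paper's mask
# configuration of word length and binary-point position).
import numpy as np

def quantize_fixed_point(x, total_bits=8, frac_bits=4):
    """Round x onto a signed fixed-point grid and saturate at the range limits."""
    scale = 2.0 ** frac_bits
    lo = -(2 ** (total_bits - 1)) / scale
    hi = (2 ** (total_bits - 1) - 1) / scale
    return np.clip(np.round(np.asarray(x) * scale) / scale, lo, hi)

weights = np.array([0.7312, -1.25, 0.0419, 9.99])
print(quantize_fixed_point(weights, total_bits=8, frac_bits=4))
# [ 0.75   -1.25    0.0625  7.9375]  -- 9.99 saturates at the Q3.4 maximum
```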
The technological development in the field of information and communication has been accompanied by security challenges related to the transmission of information, and encryption is a good solution. Encryption is one of the traditional methods of protecting plain text by converting it into an unintelligible form, and it can be implemented using substitution techniques, shifting techniques, or mathematical operations. This paper proposes a two-branch method for encrypting text. The first branch is a new mathematical model for creating and exchanging keys; the proposed key-exchange method is a development of Diffie-Hellman, a new mathematical-operations model for exchanging keys based on prime numbers …
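For context, a minimal sketch of the classical Diffie-Hellman exchange the proposed method develops from (toy parameters; the paper's own modified model is not reproduced here):

```python
# Classical Diffie-Hellman exchange -- the baseline the paper develops from,
# not its modified model. Toy parameters: p is the Mersenne prime 2**127 - 1,
# far too small for real security.
import secrets

p = 2 ** 127 - 1
g = 3

a = secrets.randbelow(p - 2) + 1   # Alice's private exponent
b = secrets.randbelow(p - 2) + 1   # Bob's private exponent
A = pow(g, a, p)                   # public values exchanged in the clear
B = pow(g, b, p)

shared_alice = pow(B, a, p)        # both sides derive the same secret:
shared_bob = pow(A, b, p)          # (g^b)^a = (g^a)^b mod p
assert shared_alice == shared_bob
```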
Data security is an important component of data communication and transmission systems; its main role is to keep sensitive information safe and intact from sender to receiver. The proposed system aims to secure text messages through two security principles: encryption and steganography. The system introduces a novel encryption method using graph-theoretic properties: it forms a graph from a password and generates an encryption key as the weight matrix of that graph, then employs the Least Significant Bit (LSB) method to hide the encrypted message in the green component of a color image. Practical experiments on perceptibility, capacity, and robustness were evaluated using similarity measures such as PSNR, MSE, and …
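A minimal sketch of the two pieces the abstract names, deriving a weight matrix from a password-built graph and LSB embedding in the green channel (the edge-weight rule here is an illustrative assumption, not the paper's exact construction):

```python
# Sketch of the two named pieces: a weight matrix derived from a password
# graph (the edge-weight rule here is an illustrative assumption, not the
# paper's exact construction) and LSB hiding in the green channel.
import numpy as np

def key_matrix(password):
    """Each character is a vertex; edge (i, j) is weighted from the
    character codes, giving a symmetric weight matrix (assumed rule)."""
    codes = [ord(c) for c in password]
    n = len(codes)
    W = np.zeros((n, n), dtype=np.int32)
    for i in range(n):
        for j in range(i + 1, n):
            W[i, j] = W[j, i] = (codes[i] * codes[j]) % 256
    return W

def hide_lsb_green(image, bits):
    """Overwrite the least significant bit of the green channel with bits."""
    stego = image.copy()
    green = stego[:, :, 1].flatten()
    green[:len(bits)] = (green[:len(bits)] & 0xFE) | bits
    stego[:, :, 1] = green.reshape(stego.shape[:2])
    return stego

cover = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)  # stand-in image
bits = np.unpackbits(np.frombuffer(b"secret", dtype=np.uint8))  # 48 message bits
stego = hide_lsb_green(cover, bits)
assert np.array_equal(stego[:, :, 1].flatten()[:48] & 1, bits)
print(key_matrix("pass"))
```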