Satellite images can be of great value when they are analyzed to identify changes, using techniques such as the spectral signature to extract features. In this paper, we propose using the spectral signature to extract information from satellite images and then classify them into four categories. The work is based on a dataset from the Kaggle satellite imagery website covering four categories: clouds, deserts, water, and green areas. After preprocessing, each image is transformed into a spectral signature using the Fast Fourier Transform (FFT) algorithm. The data of each image is then reduced by selecting the top 20 features and converting them from a two-dimensional matrix into a one-dimensional vector using the Vector Quantization (VQ) algorithm. The data is divided into training and testing sets and fed into a 23-layer deep neural network (DNN) that classifies the satellite images. The resulting network has 2,145,020 parameters, and the evaluated performance measures were accuracy = 100%, recall = 100%, and F1-score = 100%.
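As a rough illustration of the described feature pipeline, the sketch below computes the FFT spectral signature of a grayscale image and keeps the 20 largest magnitudes as a one-dimensional feature vector. The function name, the use of raw FFT magnitudes, and the data layout are assumptions for illustration, not details taken from the paper, and the codebook stage of a full vector quantization is omitted.

import numpy as np

def spectral_signature(image: np.ndarray, k: int = 20) -> np.ndarray:
    """Return the k largest FFT magnitudes of an image as a 1-D feature vector."""
    spectrum = np.abs(np.fft.fft2(image))  # 2-D spectral signature
    flat = spectrum.ravel()                # 2-D matrix -> 1-D vector
    return np.sort(flat)[-k:]              # keep the top-k coefficients

# Hypothetical usage: `images` is an (n, h, w) array of preprocessed images.
# X = np.stack([spectral_signature(img) for img in images])
# X can then be split into training/testing sets and fed to a DNN classifier.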
In this paper, we introduce the bi-normality set, which is an extension of the normality set, for operators in a Banach algebra. Furthermore, we show some interesting properties and remarkable results. Finally, we prove that it is not invariant under certain transpose linear operators.
In this article, we introduce the concepts of the Neutrosophic d-Filter and the Neutrosophic Prime d-Filter of a d-Algebra by generalizing the notion of the Intuitionistic Fuzzy d-Filter of a d-Algebra. We also establish several of their properties. Further, we study several relations on these notions from the point of view of Neutrosophic d-Algebra.
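For orientation, a neutrosophic set in the sense of Smarandache is characterized by three independent membership functions; the single-valued convention with values in the unit interval is assumed below, and the paper's d-algebra-specific definitions may refine it.

\[
A = \{\langle x,\, T_A(x),\, I_A(x),\, F_A(x)\rangle : x \in X\},\qquad
T_A, I_A, F_A : X \to [0,1],\qquad
0 \le T_A(x) + I_A(x) + F_A(x) \le 3.
\]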
Establishing complete and reliable coverage over a long time span is a crucial issue in dense surveillance wireless sensor networks (WSNs). Many scheduling algorithms model the problem as a maximum disjoint set covers (DSC) problem. The goal of DSC-based algorithms is to schedule sensors into several disjoint subsets: one subset is assigned to be active, while all remaining subsets are set to sleep. An extension of the maximum disjoint set covers problem has also been addressed in the literature, allowing more advanced sensors to adjust their sensing range; the problem is then extended to finding the maximum number of overlapped set covers. Unlike related works, which are concerned with the disc sensing model, the cont…
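As a rough sketch of the DSC idea described above, the greedy heuristic below repeatedly extracts a subset of sensors that covers every target and removes it from the pool, so the resulting subsets are disjoint and can be activated one at a time. The data layout and the greedy ordering are illustrative assumptions, not the algorithms studied in the paper.

def disjoint_set_covers(coverage: dict[int, set[str]], targets: set[str]) -> list[set[int]]:
    """coverage maps sensor id -> set of targets it monitors."""
    remaining = dict(coverage)
    covers = []
    while remaining:
        uncovered = set(targets)
        chosen = set()
        # Favor sensors that cover the most still-uncovered targets.
        for sensor, covered in sorted(remaining.items(),
                                      key=lambda kv: -len(kv[1] & uncovered)):
            if not uncovered:
                break
            if covered & uncovered:
                chosen.add(sensor)
                uncovered -= covered
        if uncovered or not chosen:
            break                      # no further complete cover is possible
        covers.append(chosen)
        for sensor in chosen:          # keep the subsets disjoint
            del remaining[sensor]
    return covers

coverage = {1: {"a", "b", "c"}, 2: {"a", "b"}, 3: {"c"}}
print(disjoint_set_covers(coverage, {"a", "b", "c"}))  # -> [{1}, {2, 3}]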
We apply the direct product concept to the notion of intuitionistic fuzzy semi d-ideals of a d-algebra and investigate some theorems; we also study the notion of the direct product of intuitionistic fuzzy topological d-algebras.
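For reference, the direct product of two intuitionistic fuzzy sets is usually defined componentwise, with the minimum for membership and the maximum for non-membership; this standard convention is assumed here, and the paper's definition for semi d-ideals may add d-algebra-specific conditions. For \(A=(\mu_A,\nu_A)\) on \(X\) and \(B=(\mu_B,\nu_B)\) on \(Y\):

\[
\mu_{A\times B}(x,y) = \min\{\mu_A(x),\, \mu_B(y)\},\qquad
\nu_{A\times B}(x,y) = \max\{\nu_A(x),\, \nu_B(y)\}.
\]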
In this article, we introduce a new type of soft space as a generalization of known soft spaces, and we also study several of its weak forms. The characterizations and fundamental properties of these types of soft spaces, and the relationships among them, are also discussed.
The estimation of the regular regression model requires several assumptions, such as linearity, to be satisfied. One problem occurs when the regression curve is partitioned into two (or more) parts that are joined at threshold point(s); this situation is regarded as a violation of the linearity of regression. The multiphase regression model has therefore received increasing attention as an alternative approach that describes the change in the behavior of a phenomenon through threshold point estimation. The maximum likelihood estimator (MLE) has been used for both the model and threshold point estimation. However, the MLE is not resistant to violations such as the existence of outliers or a heavy-tailed error distribution. The main goal of t…
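For concreteness, a basic two-phase regression model with a single threshold can be written as below; the notation is illustrative, and the paper's model may involve more phases or continuity constraints.

\[
y_i =
\begin{cases}
\beta_{01} + \beta_{11} x_i + \varepsilon_i, & x_i \le \tau,\\
\beta_{02} + \beta_{12} x_i + \varepsilon_i, & x_i > \tau,
\end{cases}
\]

where the threshold \(\tau\) and the phase coefficients are estimated jointly, e.g. by maximizing the likelihood over \((\beta,\sigma^2,\tau)\).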
The science of information security has become a concern of many researchers, whose efforts aim to develop solutions and technologies that ensure the secure transfer of information over networks, especially the Internet, without any compromise of that information, given the risk of digital data being sent between two parties over an insecure channel. This paper includes two data protection techniques. The first technique is cryptography using the Menezes–Vanstone elliptic curve cipher system, which relies on public-key technology. The encoded data is then randomly embedded in the frame, depending on the seed used. The experimental results, using a PSNR within avera…
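For reference, the standard Menezes–Vanstone scheme encrypts a message pair \((m_1, m_2)\) over an elliptic curve \(E(\mathbb{F}_p)\) with base point \(G\), private key \(d\), and public key \(Q = dG\): the sender picks a random session key \(k\) and computes

\[
(x_1, y_1) = kQ,\qquad c_0 = kG,\qquad c_1 \equiv x_1 m_1 \pmod{p},\qquad c_2 \equiv y_1 m_2 \pmod{p};
\]

the receiver recovers \((x_1, y_1) = d\,c_0\) and then \(m_1 \equiv x_1^{-1} c_1\) and \(m_2 \equiv y_1^{-1} c_2 \pmod{p}\). The seeded embedding (steganographic) stage described above is separate from this encryption step.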
One of the costliest problems facing hydrocarbon production in unconsolidated sandstone reservoirs is the production of sand once hydrocarbon production starts. A sanding-onset prediction model is very important for deciding on future sand control, including whether or when sand control should be used. This research developed an easy-to-use computer program to determine the onset of sanding sites in the drainage area. The model is based on estimating the critical pressure drawdown at which sand production begins. The outcomes are drawn as a function of free sand production against the critical flow rates for reservoir pressure decline. The results show that the pressure drawdown required to…
The Internet of Things (IoT) contributes to improving quality of life, as it supports many applications, especially healthcare systems. Data generated by IoT devices is sent to Cloud Computing (CC) for processing and storage, despite the latency caused by the distance. Because of the revolution in IoT devices, the volume of data sent to the CC has been increasing; as a result, growing congestion on the cloud network has been added to the latency problem. Fog Computing (FC) is used to solve these problems because of its proximity to IoT devices, while filtered data is sent on to the CC. FC is a middle layer located between the IoT devices and the CC layer. Due to the massive data generated by IoT devices on the FC, Dynamic Weighted Round Robin (DWRR)…
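As a rough sketch of weighted round robin dispatching on a fog layer, the snippet below expands node weights into a cyclic dispatch order, so each fog node receives requests in proportion to its weight; in a dynamic variant the weights would be recomputed from observed load. The node names and weights are hypothetical, and this is not the paper's DWRR implementation.

from itertools import cycle

def build_schedule(weights: dict[str, int]) -> cycle:
    """Expand node weights into a cyclic dispatch order."""
    return cycle([node for node, w in weights.items() for _ in range(w)])

weights = {"fog-1": 3, "fog-2": 2, "fog-3": 1}  # hypothetical capacities
schedule = build_schedule(weights)
for request_id in range(6):
    print(f"request {request_id} -> {next(schedule)}")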
This research aims to choose the appropriate probability distribution for the reliability analysis of an item, based on data collected on the operating and stoppage times of the case study. The appropriate probability distribution is the one for which the data lie on, or close to, the fitted line of the probability plot and pass the goodness-of-fit test. Minitab 17 software was used for this purpose after arranging the collected data and entering it into the program.
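As a rough analogue of this distribution-identification step, the sketch below fits a few candidate lifetime distributions to operating-time data and compares Kolmogorov–Smirnov goodness-of-fit statistics using SciPy rather than Minitab; the data values and the candidate set are illustrative.

import numpy as np
from scipy import stats

times = np.array([12.0, 35.5, 48.1, 60.2, 71.9, 95.3, 110.7])  # hypothetical operating times

for dist in (stats.weibull_min, stats.lognorm, stats.expon):
    params = dist.fit(times)                       # maximum likelihood fit
    ks = stats.kstest(times, dist.cdf, args=params)
    print(f"{dist.name:12s} KS = {ks.statistic:.3f}, p = {ks.pvalue:.3f}")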