Dust is a frequent contributor to health risks and climate change, one of the most dangerous issues facing people today. Desertification, drought, agricultural practices, and sand and dust storms from neighboring regions all drive this problem. A deep learning (DL) regression based on long short-term memory (LSTM) is proposed to improve the accuracy of dust forecasting and monitoring. The proposed system has two parts for detecting and monitoring dust: in the first step, LSTM and dense layers are used to build a dust-detection model, while in the second step, the proposed Wireless Sensor Network (WSN) and Internet of Things (IoT) model is used for forecasting and monitoring. The DL system was trained and tested on historical dust-phenomena data collected from the Iraqi Meteorological Organization and Seismology (IMOS) raw dataset of 17,023 rows and 10 columns. The LSTM model achieves low runtime, computational complexity, and layer count while remaining effective and accurate for dust prediction. The simulation results show that the model reaches a test mean square error (MSE) of 0.12877 and a test mean absolute error (MAE) of 0.07411 at the same learning rates and feature-vector values in the dense layer, a fully connected neural network layer attached to the proposed LSTM training model. Finally, the suggested model enhances monitoring performance.
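A minimal sketch of such an LSTM-plus-dense regression head is given below (in PyTorch; the hidden size, sequence length, and the 10-feature input are illustrative assumptions based on the dataset's 10 columns, not the paper's exact architecture):

```python
import torch
import torch.nn as nn

class DustLSTM(nn.Module):
    """LSTM followed by a dense (fully connected) regression layer.
    Layer sizes here are assumptions for illustration only."""
    def __init__(self, n_features=10, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.dense = nn.Linear(hidden, 1)   # dense regression output

    def forward(self, x):                   # x: (batch, time, n_features)
        out, _ = self.lstm(x)               # out: (batch, time, hidden)
        return self.dense(out[:, -1])       # regress from last time step

model = DustLSTM()
x = torch.randn(4, 12, 10)                  # 4 sequences of 12 time steps
pred = model(x)                             # shape (4, 1)
mse = nn.functional.mse_loss(pred, torch.zeros(4, 1))
mae = nn.functional.l1_loss(pred, torch.zeros(4, 1))
```

Training would then minimize the MSE over the historical IMOS sequences, with MAE reported alongside as in the abstract.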
This study investigates the feasibility of a mobile robot navigating and localizing itself in unknown environments, followed by the creation of maps of these environments for future use. First, a real mobile robot, the TurtleBot3 Burger, was used to perform the simultaneous localization and mapping (SLAM) technique in a complex environment with 12 obstacles of different sizes, based on the Rviz library, which is built on the Robot Operating System (ROS) running on Linux. The robot can be controlled and this process performed remotely by using an Amazon Elastic Compute Cloud (Amazon EC2) instance service. Then, the map was uploaded to the Amazon Simple Storage Service (Amazon S3) cloud. This provides a database
Today in the digital realm, images constitute a massive share of social-media content but unfortunately suffer from two issues, size and transmission, for which compression is the ideal solution. Pixel-based techniques are among the modern spatially optimized modeling techniques, with deterministic and probabilistic bases that involve the mean, index, and residual. This paper introduces adaptive pixel-based coding techniques for the probabilistic part of a lossy scheme by incorporating the MMSA of the C321 base, while utilizing the deterministic part losslessly. The tested results achieved higher size-reduction performance than the traditional pixel-based techniques and standard JPEG, by about 40% and 50%,
In this paper, some relations between flows and the enveloping semigroup are studied. These relations allow certain properties of the topological compactification to be associated with any pointed flow, and they enable us to study a number of properties of flows using corresponding algebraic properties. Proofs of some theorems concerning these relations are also given.
Sphingolipids are key components of eukaryotic membranes, particularly the plasma membrane. The biosynthetic pathway for the formation of these lipid species is largely conserved. However, in contrast to mammals, which produce sphingomyelin, organisms such as the pathogenic fungi and protozoa synthesize inositol phosphorylceramide (IPC) as the primary phosphosphingolipid. The key step involves the reaction of ceramide and phosphatidylinositol, catalysed by IPC synthase, an essential enzyme with no mammalian equivalent that is encoded by the AUR1 gene in yeast and by recently identified functional orthologues in the pathogenic kinetoplastid protozoa. As such, this enzyme represents a promising target for novel anti-fungal and anti-protozoal drugs. Given
In this research we present the signature as a key for biometric authentication. Moment invariants are used as a tool to decide whether a given signature belongs to a certain person. Eighteen volunteers provided 108 signatures as samples to test the proposed system, six samples from each person. Moment invariants are used to build the feature vectors stored in the system. The Euclidean distance measure is used to compute the distance between a person's signatures saved in the system and a newly acquired sample from the same person, in order to make a decision about the new signature. Each signature is acquired by scanner in JPG format at 300 DPI. MATLAB was used to implement the system.
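The pipeline above can be sketched in a few lines: compute translation- and scale-invariant moment features of a binary signature image, then accept or reject a new sample by Euclidean distance to the stored vector. This is a simplified sketch (only the first two Hu invariants, and an assumed distance threshold), not the paper's exact feature set:

```python
import numpy as np

def hu_invariants(img):
    """First two Hu moment invariants of a binary image (sketch)."""
    ys, xs = np.nonzero(img)
    m00 = len(xs)
    xbar, ybar = xs.mean(), ys.mean()
    def mu(p, q):   # central moment -> translation invariant
        return (((xs - xbar) ** p) * ((ys - ybar) ** q)).sum()
    def eta(p, q):  # normalized central moment -> scale invariant
        return mu(p, q) / m00 ** (1 + (p + q) / 2)
    h1 = eta(2, 0) + eta(0, 2)
    h2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4 * eta(1, 1) ** 2
    return np.array([h1, h2])

def decide(stored, new, threshold):
    """Accept the new signature if its feature vector is close enough."""
    return np.linalg.norm(stored - new) <= threshold

# A translated copy of the same shape yields the same invariants
img = np.zeros((50, 50)); img[10:20, 5:30] = 1
shifted = np.zeros((50, 50)); shifted[25:35, 15:40] = 1
f1, f2 = hu_invariants(img), hu_invariants(shifted)
```

In the described system, each person's six enrolled feature vectors would be stored, and a new sample compared against them by this distance.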
Watermarking can be defined as the process of embedding special, wanted, and reversible information in important secure files to protect the ownership or information of the cover file, based here on a proposed singular value decomposition (SVD) watermark. The proposed digital watermark has a very large domain for constructing the final number, which protects the watermark from collisions. The cover file is the important image that needs to be protected. The hidden watermark is a unique number extracted from the cover file by performing the proposed related and successive operations, starting by dividing the original image into four parts of unequal size. Each of these four parts is treated as a separate matrix, and SVD is applied
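The extraction idea can be illustrated as follows. Since the abstract does not specify the exact successive operations, this sketch simply splits the image into four unequal parts and combines each part's leading singular value into one number; the split point and combining rule are assumptions for illustration:

```python
import numpy as np

def watermark_number(img):
    """Derive a watermark value from a cover image (illustrative sketch).
    Split into four unequal parts, take each part's leading singular
    value, and sum them into a single identifying number."""
    h, w = img.shape
    r, c = h // 3, w // 3   # unequal split point (an assumption)
    parts = [img[:r, :c], img[:r, c:], img[r:, :c], img[r:, c:]]
    sigmas = [np.linalg.svd(p, compute_uv=False)[0] for p in parts]
    return float(sum(sigmas))

rng = np.random.default_rng(1)
img = rng.random((60, 80))
wm = watermark_number(img)          # identical images give the same number
tampered = img.copy()
tampered[0, 0] += 100.0             # a modified cover yields a different number
```

Because the singular values depend on the whole content of each part, the derived number changes when the cover image is altered, which is what makes it usable as an ownership check.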
The huge number of documents on the internet has led to a rapidly growing need for text classification (TC), which is used to organize these text documents. In this research paper, a new model based on the Extreme Learning Machine (ELM) is used. The proposed model consists of several phases: preprocessing, feature extraction, Multiple Linear Regression (MLR), and ELM. The basic idea of the proposed model is built upon calculating feature weights using MLR. These feature weights, together with the extracted features, are introduced as input to the ELM, producing a weighted Extreme Learning Machine (WELM). The results showed a great competence of the proposed WELM compared to the plain ELM.
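The MLR-then-ELM idea can be sketched numerically: solve a least-squares regression for per-feature weights, scale the features by those weights, then train an ELM (random hidden layer, analytic output weights). Hidden-layer size and the synthetic data are assumptions, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def mlr_weights(X, y):
    """Feature weights via multiple linear regression (least squares)."""
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

def elm_train(Xw, y, n_hidden=50):
    """ELM: fixed random hidden layer; output weights solved analytically."""
    W = rng.standard_normal((Xw.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(Xw @ W + b)                 # hidden activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W, b, beta

def elm_predict(Xw, W, b, beta):
    return np.tanh(Xw @ W + b) @ beta

# WELM: scale each feature by its MLR weight before feeding the ELM
X = rng.standard_normal((100, 5))
y = X @ np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # synthetic linear target
w = mlr_weights(X, y)
W, b, beta = elm_train(X * w, y)
pred = elm_predict(X * w, W, b, beta)
```

The only trained parameters are the output weights `beta`, which is what makes ELM training fast compared to backpropagation.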
The quality of Global Navigation Satellite System (GNSS) networks is considerably influenced by the configuration of the observed baselines. This study aims to find an optimal configuration for GNSS baselines, in terms of the number and distribution of baselines, to improve the quality criteria of GNSS networks. The first-order design (FOD) problem was applied in this research to optimize the GNSS network baseline configuration, with the sequential adjustment method used to solve its objective functions.
The proposed model, FOD for optimum precision (FOD-p), is based on the design criteria of A-optimality and E-optimality. These design criteria were selected as objective functions for precision, which
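The two criteria can be stated concretely: with design matrix A and cofactor matrix Qx = (AᵀA)⁻¹, A-optimality minimizes the trace of Qx (average coordinate variance) and E-optimality minimizes its largest eigenvalue (worst-case variance). A minimal sketch, assuming equal-weight observations:

```python
import numpy as np

def design_quality(A):
    """A- and E-optimality scores for a network design matrix A (sketch).
    Qx = (A^T A)^{-1} is the cofactor matrix of the estimated coordinates
    under equal-weight observations (an assumption here)."""
    Qx = np.linalg.inv(A.T @ A)
    a_score = np.trace(Qx)                  # A-optimality: average variance
    e_score = np.linalg.eigvalsh(Qx)[-1]    # E-optimality: worst-case variance
    return a_score, e_score

# An ideal orthonormal design gives trace n and max eigenvalue 1
a, e = design_quality(np.eye(3))
```

Candidate baseline configurations would then be ranked by these scores, with lower values better for both criteria.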
In this research we study the multilevel (partial pooling) model, one of the most important and widely applied multilevel models in data analysis. This model is characterized by treatments that take a hierarchical or structural form. Full maximum likelihood (FML) was used to estimate the parameters of the partial pooling model (fixed and random), and the relative preference of these models was compared. The application was to suspended-dust data in Iraq covering four and a half years; eight stations were selected randomly from among the stations in Iraq. We use Akaike's Informa