Dust is a frequent contributor to health risks and climate change, making it one of the most dangerous issues facing people today. Desertification, drought, agricultural practices, and sand and dust storms from neighboring regions bring on this issue. A deep learning (DL) regression based on long short-term memory (LSTM) was proposed to increase the accuracy of dust forecasting and monitoring. The proposed system has two parts for detecting and monitoring dust: in the first step, LSTM and dense layers are used to build the dust detection system, while in the second step, the proposed Wireless Sensor Network (WSN) and Internet of Things (IoT) model is used for forecasting and monitoring. The DL system was trained and tested on historical dust phenomena data collected from the Iraqi Meteorological Organization and Seismology (IMOS) raw dataset (17,023 rows and 10 columns). The LSTM model achieved short training time, low computational complexity, and a small number of layers while remaining effective and accurate for dust prediction. The simulation results reveal that the model's test mean square error (MSE) reaches 0.12877 and its test mean absolute error (MAE) is 0.07411 at the same learning rates and feature-vector values in the dense layer, a neural network layer that is densely connected to the LSTM in the proposed training model. Finally, the suggested model enhances monitoring performance.
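To make the architecture concrete, the following is a minimal sketch of an LSTM layer followed by dense layers for this kind of regression; the layer sizes, window length, and feature count are illustrative assumptions, not the authors' actual configuration.

```python
# Minimal sketch of an LSTM + dense regression model for dust forecasting.
# Layer sizes, window length, and feature count are illustrative assumptions.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

WINDOW, N_FEATURES = 24, 9   # assumed: 24 past observations, 9 predictor columns

model = Sequential([
    LSTM(64, input_shape=(WINDOW, N_FEATURES)),  # learns temporal structure
    Dense(32, activation="relu"),                # densely connected layer
    Dense(1),                                    # predicted dust value
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])

# X: (samples, WINDOW, N_FEATURES), y: (samples,) built from the historical records
X = np.random.rand(100, WINDOW, N_FEATURES).astype("float32")
y = np.random.rand(100).astype("float32")
model.fit(X, y, epochs=2, batch_size=16, verbose=0)
print(model.evaluate(X, y, verbose=0))  # [MSE, MAE]
```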
A security system can be defined as a method of providing a form of protection to any type of data. Most security systems must perform a sequential process in order to achieve good protection. Authentication is one part of such a sequential process, used to verify a user's permission to access and use the system. Several kinds of methods are used, including knowledge-based and biometric features. The electroencephalograph (EEG) signal is one of the most widely used signals in the bioinformatics field. EEG has five major wave patterns: Delta, Theta, Alpha, Beta, and Gamma. Every wave has five features: amplitude, wavelength, period, speed, and frequency. The linear …
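For illustration only, the sketch below separates a signal into those five conventional EEG bands with band-pass filters; the sampling rate, band edges, and synthetic input are assumptions rather than this work's preprocessing pipeline.

```python
# Sketch: splitting an EEG trace into the five conventional bands.
# The 256 Hz sampling rate and the synthetic signal are assumptions.
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 256  # assumed sampling rate in Hz
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band(signal, lo, hi, fs=FS, order=4):
    sos = butter(order, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, signal)

eeg = np.random.randn(10 * FS)            # 10 s of synthetic EEG
features = {name: band(eeg, lo, hi) for name, (lo, hi) in BANDS.items()}
for name, x in features.items():
    print(name, round(float(np.abs(x).mean()), 4))   # crude amplitude feature
```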
In this research, we study the phenomenon of dust storms of all types (suspended dust, rising dust, and dust storms) and its relationship with some climate variables (temperature, rainfall, wind speed, and relative humidity) through regression models for three different locations (Kirkuk, Rutba, and Diwaniya) that almost cover the area of Iraq for the period 1981–2012. Time series of the storm phenomenon and the climate variables for the period under study have been analyzed to attain the best models for long-range forecasting of dust storms.
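As a rough sketch of such a regression (with synthetic data standing in for the 1981–2012 station records), dust-storm frequency can be regressed on the four climate variables as follows; the column names and coefficients are placeholders.

```python
# Sketch: regressing monthly dust-storm counts on the four climate variables.
# The DataFrame below is synthetic; the real study uses 1981-2012 station records.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "temperature": rng.uniform(10, 45, 120),
    "rainfall": rng.uniform(0, 60, 120),
    "wind_speed": rng.uniform(1, 12, 120),
    "rel_humidity": rng.uniform(10, 80, 120),
})
# synthetic response: more storms when hot, dry, and windy
df["dust_storms"] = (0.3 * df.temperature - 0.1 * df.rainfall
                     + 0.8 * df.wind_speed - 0.05 * df.rel_humidity
                     + rng.normal(0, 2, 120))

X, y = df[["temperature", "rainfall", "wind_speed", "rel_humidity"]], df["dust_storms"]
model = LinearRegression().fit(X, y)
print(dict(zip(X.columns, model.coef_.round(3))), round(model.score(X, y), 3))
```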
Global services with an agent or a multi-agent system are a promising new research area. Several measures have been proposed to demonstrate the benefits of agent technology by supporting distributed services and applying smart agent technology to web dynamics. This paper is designed to build a Semantic Web on the World Wide Web (WWW) to enhance the productivity of managing electronic library applications, addressing a problem faced by researchers and students, represented by the process of exchanging books between e-libraries, where the process is slow or the library requires a large amount of system data.
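As a rough illustration of the Semantic Web idea rather than the paper's implementation, a book record can be published as RDF triples and queried with SPARQL so that cooperating e-libraries can exchange holdings; the namespace and property names below are assumptions.

```python
# Sketch: describing a library book as RDF and querying it with SPARQL.
# The EX namespace and property names are illustrative assumptions.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

EX = Namespace("http://example.org/library/")
g = Graph()
book = EX["book/123"]
g.add((book, RDF.type, EX.Book))
g.add((book, EX.title, Literal("Semantic Web Primer")))
g.add((book, EX.availableAt, EX["branch/central"]))

results = g.query("""
    PREFIX ex: <http://example.org/library/>
    SELECT ?title ?branch
    WHERE { ?b a ex:Book ; ex:title ?title ; ex:availableAt ?branch . }
""")
for title, branch in results:
    print(title, branch)
```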
Performance issues can appear anywhere in a computer system, and finding their root cause is troublesome due to the complexity of modern systems and applications. Microsoft builds multiple mechanisms to help its engineers understand what is happening inside all Windows versions, including Windows 10 Home, and the behavior of any application running on it, whether Microsoft services or even third-party applications. One of these mechanisms is Event Tracing for Windows (ETW), the core of logging and tracing in the Windows operating system, used to trace the internal events of the system and its applications. This study goes deep into internal process activities to investigate …
Given the high importance of attendance for university students, upon which keeping or losing their place in the course depends, it is essential to replace the inefficient manual method of attendance recording with a more efficient one. To handle this problem, technology must be introduced into this process. This paper proposes an automatic attendance system based on passive Radio Frequency Identification (RFID), fog, and cloud computing technologies (AASCF). The system has three sides. The first one, the client side, collects the attendance data and then sends a copy of it. The second side, the server side, calculates an absence ratio for all the students during the …
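As a toy illustration of that server-side step (not the paper's code), the absence ratio can be computed from per-lecture tag reads as follows; the record layout is an assumption.

```python
# Sketch: computing each student's absence ratio from RFID attendance records.
# The record layout (student_id, lecture_id) is an illustrative assumption.
from collections import defaultdict

lectures = ["L1", "L2", "L3", "L4"]
# tag reads collected on the client side and forwarded to the server
reads = [("S1", "L1"), ("S1", "L2"), ("S2", "L1"),
         ("S1", "L3"), ("S2", "L3"), ("S1", "L4")]

attended = defaultdict(set)
for student, lecture in reads:
    attended[student].add(lecture)

for student, seen in attended.items():
    absence_ratio = 1 - len(seen) / len(lectures)
    print(student, f"absence ratio = {absence_ratio:.0%}")
```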
Fuzzy logic is used to solve the load flow and contingency analysis problems, decreasing computing time, and it is a better choice than the traditional methods. The proposed method is very accurate with outstanding computation time, which makes the fuzzy load flow (FLF) suitable for real-time application to small- as well as large-scale power systems. In addition, the FLF is able to efficiently solve the load flow problem of ill-conditioned power systems and contingency analysis. The FLF method using a Gaussian membership function requires fewer iterations and less computing time than the FLF method using a triangular membership function. Using a sparsity technique for the input Ybus sparse matrix data gi…
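For reference, the two membership-function shapes being compared look like the sketch below; the centres and widths are arbitrary illustration values, not the FLF model's parameters.

```python
# Sketch: Gaussian vs. triangular fuzzy membership functions.
# Centre/width values are illustrative, not the FLF model's parameters.
import numpy as np

def gaussian_mf(x, c, sigma):
    return np.exp(-0.5 * ((x - c) / sigma) ** 2)

def triangular_mf(x, a, b, c):
    # rises from a to a peak at b, then falls to c
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0, 1)

x = np.linspace(-1, 1, 5)           # e.g. a normalised power mismatch
print(gaussian_mf(x, c=0.0, sigma=0.3).round(3))
print(triangular_mf(x, a=-0.6, b=0.0, c=0.6).round(3))
```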
In this paper, the design of a hybrid retina matching algorithm for use in identification systems is considered. Retina-based recognition is regarded as the most secure method of identification used to differentiate persons.
The characteristics of the Speeded-Up Robust Features (SURF) and Binary Robust Invariant Scalable Keypoints (BRISK) algorithms have been used in order to produce a faster matching algorithm than the classical ones; these characteristics are important for real-time applications, which usually need quick processing of a growing quantity of data. The algorithm is divided into three stages: retinal image processing and segmentation, extracting the lo…
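A minimal OpenCV sketch of the detect-and-match step is shown below; only BRISK is used because it ships with the standard opencv-python package (SURF needs the contrib build), and the image paths are placeholders.

```python
# Sketch: BRISK keypoint matching between two retinal images with OpenCV.
# Image paths are placeholders; SURF would require the opencv-contrib build.
import cv2

img1 = cv2.imread("retina_enrolled.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("retina_query.png", cv2.IMREAD_GRAYSCALE)

brisk = cv2.BRISK_create()
kp1, des1 = brisk.detectAndCompute(img1, None)
kp2, des2 = brisk.detectAndCompute(img2, None)

# Hamming distance suits BRISK's binary descriptors
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
print(f"{len(matches)} matches; best distance = {matches[0].distance if matches else None}")
```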
Zigbee is considered one of the wireless sensor network (WSN) technologies designed for short-range communication applications. It follows the IEEE 802.15.4 specification, which aims at networks with the lowest possible cost and power consumption in addition to the minimum possible data rate. In this paper, a Zigbee transmitter system is designed based on the PHY layer specifications of this standard. The modulation technique applied in this design is offset quadrature phase shift keying (OQPSK) with half-sine pulse shaping, which achieves the minimum possible number of phase transitions. In addition, the applied spreading technique is the direct sequence spread spectrum (DSSS) technique, which has …
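A numerical sketch of the half-sine-shaped O-QPSK baseband (not the paper's transmitter) is given below: even chips drive the I branch and odd chips drive the Q branch with a one-chip offset. The chip stream here is random, whereas the real PHY first maps 4-bit symbols to 32-chip DSSS sequences.

```python
# Sketch: O-QPSK with half-sine pulse shaping, as used by IEEE 802.15.4.
# The chip stream is random; the real PHY maps 4-bit symbols to 32-chip
# DSSS sequences before this stage.
import numpy as np

SPS = 8                                      # samples per chip period (assumption)
chips = np.random.randint(0, 2, 32) * 2 - 1  # +/-1 chip stream (placeholder)

i_chips, q_chips = chips[0::2], chips[1::2]             # even -> I, odd -> Q
pulse = np.sin(np.pi * np.arange(2 * SPS) / (2 * SPS))  # half-sine over two chip periods

def shape(branch):
    out = np.zeros(len(branch) * 2 * SPS)
    for k, c in enumerate(branch):
        out[k * 2 * SPS:(k + 1) * 2 * SPS] = c * pulse
    return out

i_wave = np.concatenate([shape(i_chips), np.zeros(SPS)])
q_wave = np.concatenate([np.zeros(SPS), shape(q_chips)])  # Q delayed by one chip
baseband = i_wave + 1j * q_wave                           # complex O-QPSK baseband
print(len(baseband), np.max(np.abs(baseband)).round(3))
```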
A remarkable correlation between chaotic systems and cryptography has been established through sensitivity to initial states, unpredictability, and complex behavior. In one development, the stages of a chaotic stream cipher are applied to a discrete chaotic dynamic system for the generation of pseudorandom bits. Some of these generators are based on a 1D chaotic map and others on 2D ones. In the current study, a pseudorandom bit generator (PRBG) based on a new 2D chaotic logistic map is proposed that runs side-by-side and commences from random independent initial states. The structure of the proposed model consists of three components: a mouse input device, the proposed 2D chaotic system, and an initial permutation (IP) table. Statistical …
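To illustrate the general idea only (the authors' exact map is not reproduced here), a coupled 2D logistic-type map can be iterated from two independent seeds and thresholded into bits; the map form, coupling strength, and bit-extraction rule below are assumptions.

```python
# Sketch: a pseudorandom bit generator from a coupled 2D logistic-type map.
# The map form, coupling strength, and bit-extraction rule are illustrative
# assumptions, not the paper's exact construction.
def prbg_2d_logistic(x, y, n_bits, r1=3.99, r2=3.97, eps=0.015):
    bits = []
    for _ in range(n_bits):
        x, y = ((r1 * x * (1 - x) + eps * y) % 1.0,
                (r2 * y * (1 - y) + eps * x) % 1.0)
        bits.append(1 if x > y else 0)   # compare the two state variables
    return bits

# independent initial states (the paper derives its seeds from a mouse input device)
stream = prbg_2d_logistic(0.318530, 0.604661, 64)
print("".join(map(str, stream)))
```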