Akaike’s Information Criterion (AIC) is a popular method for estimating the number of sources impinging on an array of sensors, a problem of great interest in many applications. However, the performance of AIC degrades at low signal-to-noise ratio (SNR). This paper concerns the development and application of quadrature mirror filters (QMF) to improve the performance of AIC. A new system is proposed that estimates the number of sources by applying AIC to the outputs of a filter bank of QMFs. The proposed system can estimate the number of sources at low SNR.
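The paper’s QMF-subband scheme is not reproduced here, but the underlying AIC rule it builds on (the standard Wax–Kailath criterion applied to covariance eigenvalues) can be sketched as follows; the array geometry, angles, and parameter values are illustrative only:

```python
import numpy as np

def aic_source_count(eigvals, N):
    """Wax-Kailath AIC for source enumeration: given the eigenvalues of the
    p x p array covariance (estimated from N snapshots), return the candidate
    source count k that minimizes AIC(k)."""
    lam = np.sort(np.asarray(eigvals, dtype=float))[::-1]  # descending
    p = lam.size
    scores = []
    for k in range(p):
        tail = lam[k:]                       # presumed noise eigenvalues
        g = np.exp(np.mean(np.log(tail)))    # geometric mean
        a = np.mean(tail)                    # arithmetic mean
        scores.append(-2.0 * N * (p - k) * np.log(g / a) + 2.0 * k * (2 * p - k))
    return int(np.argmin(scores))

# toy demo: ideal covariance of 2 unit-power sources on a 6-element ULA in noise
p, N, sigma2 = 6, 1000, 0.1
angles = np.deg2rad([10.0, 40.0])
A = np.exp(1j * np.pi * np.outer(np.arange(p), np.sin(angles)))  # steering matrix
R = A @ A.conj().T + sigma2 * np.eye(p)      # exact covariance, for clarity
lam = np.linalg.eigvalsh(R)
print(aic_source_count(lam, N))              # 2
```

In the proposed system this same rule would be evaluated on each QMF subband output rather than on the full-band covariance, which is what improves robustness at low SNR.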
The investigation of signature validation is crucial to the field of personal authentication. Biometrics-based systems have been developed to support information-security features. A person’s signature, an essential biometric trait of a human being, can be used to verify their identity. In this study, a mechanism for automatically verifying signatures is proposed. The study highlights the offline properties of handwritten signatures and aims to verify whether a handwritten signature is genuine or forged using computer-based machine-learning techniques. The main goal of developing such systems is to verify people through the validity of their signatures. In this research, images of a group o
Dust is a frequent contributor to health risks and changes in the climate, and one of the most dangerous issues facing people today. Desertification, drought, agricultural practices, and sand and dust storms from neighboring regions bring on this issue. Deep-learning (DL) regression based on long short-term memory (LSTM) is proposed as a solution to increase the accuracy of dust forecasting and monitoring. The proposed system has two parts to detect and monitor the dust: in the first step, LSTM and dense layers are used to build a system for detecting dust, while in the second step, the proposed Wireless Sensor Network (WSN) and Internet of Things (IoT) model is used for forecasting and monitoring. The experimental DL system
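The abstract does not give the network’s architecture, so as a rough illustration of the LSTM building block it relies on, a single LSTM time step can be written in NumPy; all sizes, weights, and the toy input sequence below are arbitrary, not the paper’s trained model:

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step (illustrative sketch). x: input vector; h, c:
    previous hidden and cell state; W, U, b: stacked parameters for the
    [input, forget, candidate, output] gates."""
    z = W @ x + U @ h + b
    n = h.size
    i = 1.0 / (1.0 + np.exp(-z[:n]))        # input gate
    f = 1.0 / (1.0 + np.exp(-z[n:2*n]))     # forget gate
    g = np.tanh(z[2*n:3*n])                 # candidate cell update
    o = 1.0 / (1.0 + np.exp(-z[3*n:]))      # output gate
    c_new = f * c + i * g                   # cell state carries long-term memory
    h_new = o * np.tanh(c_new)              # hidden state feeds the dense layer
    return h_new, c_new

# run an untrained cell over a toy sequence of 24 hourly sensor readings
rng = np.random.default_rng(1)
n_in, n_hid = 3, 8                          # e.g. 3 sensor channels, 8 hidden units
W = rng.standard_normal((4 * n_hid, n_in)) * 0.1
U = rng.standard_normal((4 * n_hid, n_hid)) * 0.1
b = np.zeros(4 * n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
for t in range(24):
    x = rng.standard_normal(n_in)           # stand-in for a dust/weather reading
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape)                              # (8,)
```

In a real forecasting setup the final hidden state would feed the dense regression layer mentioned in the abstract.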
The problem of this research is to identify the uses of Facebook by Iraqi women, the motives for these uses, and the gratifications achieved from them, in view of the increasing role played by social-networking sites in the lives of individuals in general and women in particular. The research seeks to identify the extent to which Iraqi women use Facebook, and the usage motives and gratifications achieved by the interviewees when using Facebook. This study belongs to the descriptive studies that are interested in monitoring a particular phenomenon to identify its features and characteristics. This study adopted the survey methodology to test the hypotheses of the study and how
The aim of this study is to highlight the concept of joint arrangements and projects in accordance with IFRS 11; in addition, the study focuses on the accounting standards and IFRSs relevant to this standard. It also describes the legislative and accounting challenges in the Iraqi environment facing the application of IFRS 11, and studies the reality of accounting for this economic activity in companies operating in this sector.
In order to achieve the study objectives, the researcher conducted a comparative analysis between IFRS 11 (Joint Arrangements) and the Iraqi Unified Accounting System. In the second step, IFRS 11 is applied to the Basrah Gas Company's (research sampl
There is no research in Iraq concerned with the identification and ecology of protozoa in sediment. The present study deals with the free-living protozoa community of the Tigris river bank sediment in Baghdad city. Various species of vegetation (reeds and wild grasses) were observed growing on both sides of the river.
For the present study, three sites were chosen on the east side of the river Tigris. Monthly samples were collected from the sediment of each site over the period from January to October 2012. A total of 22 taxa were found in the sediment samples: 12 ciliates, and 5 each of flagellates and sarcodines. The highest number of protozoan taxa (15) was recorded at each of sites 1 and 3, and slightly fewer taxa (13) were found at site 2.
Recently, new concepts such as free data and Volunteered Geographic Information (VGI) have emerged from Web 2.0 technologies. OpenStreetMap (OSM) is one of the most representative projects of this trend. Geospatial data from different sources often have variable accuracy levels due to different data-collection methods; therefore, the most concerning problem with OSM is its unknown quality. This study aims to develop a specific tool that can analyze and assess how well OSM road features match a reference dataset, using the Matlab programming language. This tool was applied to two different study areas in Iraq (Baghdad and Karbala), in order to verify whether the OSM data has the same quality in both study areas. This program, in general, consists
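The study's Matlab tool is not reproduced here, but one common way to score how well an OSM road matches a reference road is a buffer test. The sketch below (hypothetical coordinates and buffer width, in projected metres) computes the fraction of OSM vertices lying within a given distance of the reference polyline:

```python
import math

def frac_within_buffer(osm_line, ref_line, buffer_m):
    """Fraction of OSM road vertices within buffer_m metres of the reference
    polyline (an illustrative matching score, not the study's exact method)."""
    def pt_seg_dist(p, a, b):
        # distance from point p to segment a-b
        ax, ay = a; bx, by = b; px, py = p
        dx, dy = bx - ax, by - ay
        L2 = dx * dx + dy * dy
        t = 0.0 if L2 == 0 else max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / L2))
        return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

    def dist_to_line(p, line):
        return min(pt_seg_dist(p, line[i], line[i + 1]) for i in range(len(line) - 1))

    hits = sum(1 for p in osm_line if dist_to_line(p, ref_line) <= buffer_m)
    return hits / len(osm_line)

# hypothetical road segments in projected coordinates (metres)
ref = [(0, 0), (100, 0), (200, 5)]
osm = [(0, 3), (50, 4), (100, 2), (150, 40)]   # last vertex strays far off
print(frac_within_buffer(osm, ref, buffer_m=10))  # 0.75
```

Comparing such scores between Baghdad and Karbala would mirror the study's goal of checking whether OSM quality is uniform across areas.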
In this work we present a technique to extract heart contours from noisy echocardiograph images. Our technique is based on improving the image before applying contour detection, in order to reduce heavy noise and obtain better image quality. To do this, we combine several pre-processing techniques (filtering, morphological operations, and contrast adjustment) to avoid unclear edges and enhance the low contrast of echocardiograph images. After applying these techniques, we obtain legible detection of heart boundaries and valve movement using traditional edge-detection methods.
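A minimal NumPy sketch of such a pipeline (median denoising, contrast stretching, then Sobel edge detection; the paper's exact filters, morphological operators, and thresholds are not specified in the abstract) might look like:

```python
import numpy as np

def preprocess_and_edges(img, thresh=0.5):
    """Denoise, stretch contrast, then detect edges (illustrative sketch)."""
    H, W = img.shape
    # 3x3 median filter to suppress speckle-like noise
    p = np.pad(img, 1, mode='edge')
    stack = np.stack([p[i:i + H, j:j + W] for i in range(3) for j in range(3)])
    den = np.median(stack, axis=0)
    # min-max contrast stretch to [0, 1]
    den = (den - den.min()) / (den.max() - den.min() + 1e-12)
    # Sobel gradient magnitude via 3x3 correlation
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    pp = np.pad(den, 1, mode='edge')
    gx = np.zeros_like(den)
    gy = np.zeros_like(den)
    for i in range(3):
        for j in range(3):
            win = pp[i:i + H, j:j + W]
            gx += kx[i, j] * win
            gy += ky[i, j] * win
    mag = np.hypot(gx, gy)
    # keep the strongest gradients as the contour map
    return mag > thresh * mag.max()

# toy "image": bright disc (a chamber-like region) on a dark background
y, x = np.mgrid[:64, :64]
img = (((x - 32) ** 2 + (y - 32) ** 2) < 15 ** 2).astype(float)
edges = preprocess_and_edges(img)
print(edges.shape, bool(edges.any()))
```

On real echocardiograph frames the denoising and morphology stages matter most, since speckle noise otherwise produces spurious edges.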
Iris recognition occupies an important rank among biometric approaches as a result of its accuracy and efficiency. The aim of this paper is to suggest a developed system for iris identification based on the fusion of the scale-invariant feature transform (SIFT) with local binary patterns for feature extraction. Several steps are applied. First, the input image is converted to grayscale. Second, the iris is localized using the circular Hough transform. Third, normalization converts polar values to Cartesian coordinates using Daugman's rubber-sheet model, followed by histogram equalization to enhance the iris region. Finally, the features are extracted by utilizing the scale-invariant feature
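The SIFT and fusion stages are beyond a short sketch, but the basic local binary pattern used in the feature-extraction step can be illustrated as follows (assuming the common 8-neighbour, radius-1 LBP; the paper may use a different variant):

```python
import numpy as np

def lbp_8_1(img):
    """Compute the 3x3 local binary pattern code for each pixel: each of the
    8 neighbours contributes one bit, set when neighbour >= centre."""
    H, W = img.shape
    p = np.pad(img, 1, mode='edge')
    # neighbour window offsets in clockwise order starting at top-left
    offs = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    codes = np.zeros((H, W), dtype=np.uint8)
    for bit, (di, dj) in enumerate(offs):
        nb = p[di:di + H, dj:dj + W]
        codes |= (nb >= img).astype(np.uint8) << bit
    return codes

img = np.array([[10, 20, 30],
                [40, 50, 60],
                [70, 80, 90]], dtype=np.uint8)
print(lbp_8_1(img))
```

In an iris system, the histogram of these codes over the normalized iris strip (rather than the raw code map) typically serves as the texture feature that is fused with the SIFT descriptors.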