Akaike’s Information Criterion (AIC) is a popular method for estimating the number of sources impinging on an array of sensors, a problem of great interest in several applications. The performance of AIC degrades at low Signal-to-Noise Ratio (SNR). This paper is concerned with the development and application of quadrature mirror filters (QMF) for improving the performance of AIC. A new system is proposed that estimates the number of sources by applying AIC to the outputs of a filter bank consisting of QMFs; the proposed system can estimate the number of sources even at low SNR.
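The eigenvalue-based AIC criterion that the paper builds on can be sketched as follows. This is a minimal illustration of the standard form of the criterion applied to covariance eigenvalues, not the paper's QMF filter-bank system; the function and variable names are my own.

```python
import numpy as np

def aic_num_sources(eigvals, n_snapshots):
    """Estimate the number of sources from the eigenvalues of the
    p x p sample covariance matrix via Akaike's Information Criterion:
    AIC(k) = -2N(p-k) log(geo_mean/arith_mean of the p-k smallest
    eigenvalues) + 2k(2p-k), minimized over k."""
    lam = np.sort(np.asarray(eigvals, dtype=float))[::-1]  # descending
    p = len(lam)
    aic = np.empty(p)
    for k in range(p):
        tail = lam[k:]                       # the p-k noise eigenvalues
        geo = np.exp(np.mean(np.log(tail)))  # geometric mean
        ari = np.mean(tail)                  # arithmetic mean
        aic[k] = (-2 * n_snapshots * (p - k) * np.log(geo / ari)
                  + 2 * k * (2 * p - k))
    return int(np.argmin(aic))
```

With two strong source eigenvalues and four nearly equal noise eigenvalues, the minimum lands at k = 2; with near-equal eigenvalues only, it lands at k = 0.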
This paper proposes a new approach, Clustering Ultrasound images using a Hybrid Filter (CUHF), to determine the gender of the fetus in the early stages of pregnancy. The possible advantage of CUHF is that a better result can be achieved where Fuzzy C-Means (FCM) alone returns incorrect clusters. The proposed approach is conducted in two steps. First, a preprocessing step decreases the noise present in the ultrasound images by applying the following filters: Local Binary Pattern (LBP); median; median with discrete wavelet transform (DWT); median, DWT & LBP; and median & Laplacian (ML). Second, FCM is applied to cluster the images resulting from the first step. Among those filters, median & Laplacian recorded the best accuracy. Our experimental evaluation on re
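The two-step pipeline (denoising filter, then FCM clustering) can be sketched as below. This is a simplified illustration, assuming a small synthetic grayscale "image" and showing only the median stage of the hybrid filter, with a minimal hand-rolled fuzzy c-means on pixel intensities; it is not the paper's implementation.

```python
import numpy as np
from scipy.ndimage import median_filter

def fcm(data, n_clusters, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means on a 1-D array of pixel intensities.
    m is the fuzziness exponent; returns cluster centers and the
    membership matrix u (one row per pixel)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(data, dtype=float).reshape(-1, 1)
    u = rng.random((len(x), n_clusters))
    u /= u.sum(axis=1, keepdims=True)
    for _ in range(iters):
        um = u ** m
        centers = (um.T @ x) / um.sum(axis=0).reshape(-1, 1)
        d = np.abs(x - centers.T) + 1e-12           # pixel-to-center distances
        inv = d ** (-2.0 / (m - 1))
        u = inv / inv.sum(axis=1, keepdims=True)    # standard FCM update
    return centers.ravel(), u

# synthetic two-region image with additive noise, then median + FCM
rng = np.random.default_rng(1)
img = np.where(rng.random((32, 32)) < 0.5, 50.0, 200.0)
noisy = img + rng.normal(0, 10, img.shape)
smoothed = median_filter(noisy, size=3)             # step 1: denoise
centers, memberships = fcm(smoothed.ravel(), n_clusters=2)  # step 2: cluster
```

The recovered centers should sit near the two true region intensities (50 and 200).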
The world and the business environment are constantly witnessing economic changes that have expanded the volume of business through mergers, increased investment, the growing complexity of business, and the transformation of some systems. This has been reflected in the degree of risk and uncertainty, creating a need for transparent and objective accounting information that reflects the financial performance of economic units and is available to all users of that information. Hence the need for indicators of transparency in the disclosure of accounting information to which these units adhere. Standard & Poor's indicators, which included items
The present paper concerns the problem of estimating system reliability in the stress–strength model, under the assumption that stress and strength are independent, non-identical, and follow the Lomax distribution. Various shrinkage estimation methods were employed in this context, based on maximum likelihood, the method of moments, and shrinkage weight factors, using Monte Carlo simulation. Comparisons among the suggested estimation methods were made using the mean absolute percentage error criterion, implemented in MATLAB.
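For the Lomax stress–strength model, the target quantity R = P(stress < strength) has a simple Monte Carlo estimate; when stress and strength share a common scale λ, it also has the closed form R = α_stress / (α_strength + α_stress). The sketch below (my own naming, not the paper's code) uses inverse-CDF sampling from the Lomax distribution, whose CDF is F(x) = 1 - (1 + x/λ)^(-α).

```python
import numpy as np

def lomax_sample(alpha, lam, size, rng):
    """Inverse-CDF sampling: U ~ Uniform(0,1) gives X = lam*(U**(-1/alpha) - 1)."""
    u = rng.random(size)
    return lam * (u ** (-1.0 / alpha) - 1.0)

def reliability_mc(alpha_strength, alpha_stress, lam=1.0, n=200_000, seed=0):
    """Monte Carlo estimate of R = P(stress < strength) with both
    variables Lomax-distributed and sharing the scale lam."""
    rng = np.random.default_rng(seed)
    strength = lomax_sample(alpha_strength, lam, n, rng)
    stress = lomax_sample(alpha_stress, lam, n, rng)
    return float(np.mean(stress < strength))
```

For example, with shape parameters 2 (strength) and 3 (stress), the closed form gives R = 3/5 = 0.6, and the Monte Carlo estimate agrees closely.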
<p>Generally, the process of sending secret information via a transmission channel or any other carrier medium is not secure; for this reason, information-hiding techniques are needed, and steganography must take place before transmission. In this paper, a secret message is embedded at optimal positions of the cover image in the spatial domain, using a developed particle swarm optimization algorithm (Dev.-PSO) together with Least Significant Bit (LSB) substitution. The main aim of the Dev.-PSO algorithm is to determine optimal paths to the required goals in the specified search space; using Dev.-PSO produces the paths to the required goals with most effi
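The LSB-substitution step itself is straightforward once embedding positions are chosen; a minimal sketch follows. Here the positions are fixed by hand for illustration, whereas in the paper they would come out of the Dev.-PSO search; all names are my own.

```python
import numpy as np

def embed_lsb(cover, bits, positions):
    """Write each message bit into the least significant bit of the
    cover pixel at the given flattened position (LSB substitution)."""
    stego = cover.copy()
    flat = stego.ravel()
    for pos, bit in zip(positions, bits):
        flat[pos] = (int(flat[pos]) & 0xFE) | bit   # clear LSB, set to bit
    return stego

def extract_lsb(stego, positions):
    """Read the embedded bits back from the same positions."""
    flat = stego.ravel()
    return [int(flat[pos]) & 1 for pos in positions]

# demo on a tiny 4x4 grayscale cover
cover = np.arange(16, dtype=np.uint8).reshape(4, 4)
positions = [0, 5, 10, 15]
message = [1, 0, 1, 1]
stego = embed_lsb(cover, message, positions)
```

Because only the least significant bit changes, no pixel value moves by more than 1, which is what keeps the embedding visually imperceptible.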
Voice denoising is the process of removing undesirable noise from a voice signal. In the presence of environmental noise, the discriminative model of a speech recognition system finds it difficult to recognize the waveform of the voice signal, because the environmental noise requires a suitable filter that does not distort the waveform at the input microphone. This paper develops a procedure for a discriminative model using an infinite impulse response filter (Butterworth filter) and a local polynomial approximation (Savitzky-Golay) smoothing filter, which performs polynomial regression on the signal values. The signal-to-noise ratio (SNR) was calculated after filtering to compare the results
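The Butterworth-then-Savitzky-Golay chain with an SNR check can be sketched as below. This is a toy illustration on a synthetic sinusoid standing in for a voice signal; the sample rate, filter order, cutoff, and window length are assumptions of mine, not the paper's settings.

```python
import numpy as np
from scipy.signal import butter, filtfilt, savgol_filter

def snr_db(clean, processed):
    """SNR in dB, treating (processed - clean) as the residual noise."""
    noise = processed - clean
    return 10 * np.log10(np.sum(clean ** 2) / np.sum(noise ** 2))

fs = 8000                                    # assumed sample rate (Hz)
t = np.arange(0, 1, 1 / fs)
clean = np.sin(2 * np.pi * 200 * t)          # stand-in for a voice signal
noisy = clean + 0.3 * np.random.default_rng(0).normal(size=t.size)

# stage 1: 4th-order low-pass Butterworth (IIR), applied zero-phase
b, a = butter(4, 1000 / (fs / 2), btype="low")
stage1 = filtfilt(b, a, noisy)

# stage 2: Savitzky-Golay local polynomial smoothing
denoised = savgol_filter(stage1, window_length=17, polyorder=3)
```

Comparing `snr_db(clean, noisy)` with `snr_db(clean, denoised)` quantifies the improvement, which is the comparison the abstract describes.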
In this paper, the researcher suggests using a genetic algorithm to estimate the parameters of the Wiener degradation process, in order to estimate the reliability of high-efficiency products, given the difficulty of estimating their reliability with traditional techniques that depend only on product failure times. Monte Carlo simulation was applied to demonstrate the efficiency of the proposed method in estimating the parameters, and it was compared with maximum likelihood estimation. The results show that the genetic algorithm method is best according to the AMSE comparison criterion; then the reliab
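The idea of fitting Wiener-process parameters with a genetic algorithm can be sketched as below, under the standard model X(t) = μt + σB(t), whose increments over a step dt are N(μ·dt, σ²·dt). The GA here is a deliberately tiny real-coded one (tournament selection, blend crossover, Gaussian mutation, elitism) of my own design, minimizing the Gaussian negative log-likelihood; it is not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# simulate one Wiener degradation path X(t) = mu*t + sigma*B(t)
mu_true, sigma_true, dt, n = 0.5, 0.2, 0.1, 2000
increments = rng.normal(mu_true * dt, sigma_true * np.sqrt(dt), n)

def neg_log_lik(params):
    """Negative log-likelihood of the observed increments."""
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    var = sigma ** 2 * dt
    return 0.5 * np.sum(np.log(2 * np.pi * var)
                        + (increments - mu * dt) ** 2 / var)

def genetic_minimize(f, bounds, pop=60, gens=80, seed=1):
    """Tiny real-coded GA: tournament selection, blend crossover,
    Gaussian mutation, and elitism (best individual always survives)."""
    rng = np.random.default_rng(seed)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    P = rng.uniform(lo, hi, (pop, len(bounds)))
    for _ in range(gens):
        fit = np.array([f(ind) for ind in P])
        idx = rng.integers(0, pop, (pop, 2))               # tournaments
        parents = P[np.where(fit[idx[:, 0]] < fit[idx[:, 1]],
                             idx[:, 0], idx[:, 1])]
        alpha = rng.random((pop, len(bounds)))             # blend crossover
        children = alpha * parents + (1 - alpha) * parents[rng.permutation(pop)]
        children += rng.normal(0, 0.02, children.shape)    # mutation
        children = np.clip(children, lo, hi)
        children[0] = P[np.argmin(fit)]                    # elitism
        P = children
    fit = np.array([f(ind) for ind in P])
    return P[np.argmin(fit)]

mu_hat, sigma_hat = genetic_minimize(neg_log_lik, [(0.0, 2.0), (0.01, 1.0)])
```

With 2000 simulated increments, the GA estimate lands close to the true (μ, σ) = (0.5, 0.2).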
The evolution of the Internet of Things (IoT) has connected billions of heterogeneous physical devices to improve the quality of human life by collecting data from their environments. However, this creates a need to store huge volumes of data with large storage and high computational capabilities, and cloud computing can be used to store such big data. The data of IoT devices is transferred using two types of protocols: Message Queuing Telemetry Transport (MQTT) and Hypertext Transfer Protocol (HTTP). This paper aims to build a high-performance, more reliable system through efficient use of resources. Thus, load balancing in cloud computing is used to dynamically distribute the workload across nodes and avoid overloading any individual r
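One common way to distribute workload dynamically across nodes is least-loaded dispatch, which can be sketched with a min-heap keyed on each node's outstanding work. This is a generic illustration of the load-balancing idea, not the paper's system; node names and the cost model are mine.

```python
import heapq
import itertools

class LeastLoadedBalancer:
    """Dispatch incoming messages to the cloud node with the smallest
    current load, tracked in a min-heap of (load, tiebreak, node)."""
    def __init__(self, nodes):
        self._counter = itertools.count()          # stable tiebreaker
        self._heap = [(0, next(self._counter), n) for n in nodes]
        heapq.heapify(self._heap)

    def dispatch(self, cost=1):
        """Pick the least-loaded node and charge it `cost` units of work."""
        load, _, node = heapq.heappop(self._heap)
        heapq.heappush(self._heap, (load + cost, next(self._counter), node))
        return node

lb = LeastLoadedBalancer(["node-a", "node-b", "node-c"])
assignments = [lb.dispatch() for _ in range(6)]
```

With equal per-message cost this degenerates to round-robin, so six messages split evenly across the three nodes; with varying costs, heavier nodes are automatically skipped until they catch up.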
A fault is an error that affects system behaviour. A software metric is a value that represents the degree to which software processes work properly and where faults are more likely to occur. In this research, we study the effects of removing redundancy and applying a log transformation, based on threshold values, for identifying fault-prone classes of software. The study also compares the metric values of the original dataset with those after removing redundancy and applying the log transformation. E-learning and system datasets were taken as case studies. The fault ratio ranged from 1%-31% and 0%-10% for the original datasets, and 1%-10% and 0%-4% after removing redundancy and log transformation, respectively. These results impacted direct
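The two preprocessing operations the study applies (redundancy removal and thresholded log transformation) can be sketched as follows. This is a simplified reading of the pipeline, assuming "redundancy" means duplicate metric vectors and using log(1+x) above a threshold; the paper's exact rules may differ.

```python
import numpy as np

def preprocess_metrics(rows, threshold=1.0):
    """Drop duplicate metric vectors, then apply log1p to every metric
    value above the threshold, leaving small values unchanged."""
    unique = sorted(set(map(tuple, rows)))          # remove redundancy
    arr = np.array(unique, dtype=float)
    return np.where(arr > threshold, np.log1p(arr), arr)
```

For example, three rows with one duplicate reduce to two rows, and only the values above the threshold are compressed by the log.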
The present research aims to design an electronic system based on cloud computing to develop electronic tasks for students of the University of Mosul. Achieving this goal required designing an electronic system that includes all the theoretical information, applied procedures, instructions, and computer program commands, and determining its effectiveness in developing electronic tasks for students of the University of Mosul. Accordingly, the researchers formulated three hypotheses related to the cognitive and performance aspects of the electronic tasks. To verify the research hypotheses, a sample of (91) students was intentionally chosen from the research community, represented by the students of the college of education for humanities and col
Currently, no one can deny the importance of data protection, especially with the proliferation of hackers and the theft of personal information in all parts of the world. For these reasons, encryption has become one of the important fields in the protection of digital information.
This paper adopts a new image encryption method to overcome the obstacles of previous image encryption methods. Our method uses the Duffing map to shuffle all image pixels; the resulting image is then divided into a group of blocks, on which the shuffling process is performed via a cross chaotic map.
Finally, an image called the key image is created using quadratic number spirals, which will be used to generate nu
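The Duffing-map pixel-shuffling stage can be sketched as below: the chaotic orbit of the Duffing map, x_{k+1} = y_k, y_{k+1} = -b·x_k + a·y_k - y_k³, is turned into a pixel permutation by sorting its values. The parameters a = 2.75, b = 0.2 and the initial state are common textbook choices, not necessarily the paper's; the cross-chaotic-map block stage and the key image are not shown.

```python
import numpy as np

def duffing_permutation(n, a=2.75, b=0.2, x0=0.1, y0=0.1, burn=100):
    """Generate a permutation of n pixel indices from the Duffing map
    orbit, discarding a burn-in prefix and sorting the chaotic values."""
    x, y = x0, y0
    vals = np.empty(n)
    for k in range(burn + n):
        x, y = y, -b * x + a * y - y ** 3
        if k >= burn:
            vals[k - burn] = y
    return np.argsort(vals)          # argsort of chaotic values = permutation

def shuffle_image(img, perm):
    """Scatter pixels according to the chaotic permutation."""
    return img.ravel()[perm].reshape(img.shape)

def unshuffle_image(img, perm):
    """Invert the shuffle, recovering the original pixel order."""
    out = np.empty(img.size, dtype=img.dtype)
    out[perm] = img.ravel()
    return out.reshape(img.shape)

# demo on a tiny 8x8 grayscale image
img = np.arange(64, dtype=np.uint8).reshape(8, 8)
perm = duffing_permutation(img.size)
shuffled = shuffle_image(img, perm)
```

Because the permutation is fully determined by (a, b, x0, y0), those values act as part of the secret key: the same parameters regenerate the same permutation for decryption.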