Signal denoising is closely related to estimating the parameters of received signal samples, whether of the target reflections themselves or of the surrounding noise and clutter that accompany the data of interest. Radar signals recorded by analogue or digital devices are not immune to noise. Random (white) noise with no coherency arises mainly from random electron motion caused by heat, the environment, and stray circuit losses. These factors perturb the output signal voltage, creating detectable noise. Differential Evolution (DE) is an effective and robust optimisation method used to solve a wide range of engineering and scientific problems, including signal processing. This paper examines the feasibility of using the differential evolution algorithm to estimate the parameters of a received linear frequency modulation (LFM) signal for radar signal denoising. The results gave high target recognition and demonstrated the feasibility of denoising received signals.
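The abstract does not give the authors' implementation, but the core idea can be sketched as follows: generate candidate LFM (chirp) parameter pairs, score each candidate by how well it matches the noisy observation, and evolve the population with the classic DE/rand/1/bin scheme. All signal settings and DE constants below (sampling rate, f0, chirp rate k, NP, F, CR) are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic LFM (chirp) signal: s(t) = cos(2*pi*(f0*t + 0.5*k*t^2))
fs = 1000.0                               # sampling rate (Hz), assumed
t = np.arange(0, 0.5, 1 / fs)
true_f0, true_k = 50.0, 200.0             # start frequency (Hz) and chirp rate (Hz/s), assumed
clean = np.cos(2 * np.pi * (true_f0 * t + 0.5 * true_k * t ** 2))
noisy = clean + 0.5 * rng.standard_normal(t.size)

def chirp(params):
    f0, k = params
    return np.cos(2 * np.pi * (f0 * t + 0.5 * k * t ** 2))

def cost(params):
    # Fitness: mean squared error between the candidate chirp and the observation
    return float(np.mean((noisy - chirp(params)) ** 2))

# Classic DE/rand/1/bin over the two chirp parameters
bounds = np.array([[0.0, 100.0], [0.0, 400.0]])   # search range for [f0, k]
NP, F, CR, GENS = 30, 0.7, 0.9, 200
pop = bounds[:, 0] + rng.random((NP, 2)) * (bounds[:, 1] - bounds[:, 0])
fit = np.array([cost(p) for p in pop])

for _ in range(GENS):
    for i in range(NP):
        idx = [j for j in range(NP) if j != i]
        a, b, c = pop[rng.choice(idx, 3, replace=False)]
        mutant = np.clip(a + F * (b - c), bounds[:, 0], bounds[:, 1])
        cross = rng.random(2) < CR
        cross[rng.integers(2)] = True             # guarantee at least one mutated gene
        trial = np.where(cross, mutant, pop[i])
        trial_fit = cost(trial)
        if trial_fit < fit[i]:                    # greedy selection
            pop[i], fit[i] = trial, trial_fit

best = pop[np.argmin(fit)]
denoised = chirp(best)   # noise-free reconstruction from the estimated parameters
```

Once the parameters are estimated, the "denoised" signal is simply the ideal chirp regenerated from them, which is what makes parameter estimation equivalent to denoising here.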
The consensus algorithm is the core mechanism of blockchain and is used to ensure data consistency among blockchain nodes. The PBFT (Practical Byzantine Fault Tolerance) consensus algorithm is widely used in consortium chains because it tolerates Byzantine faults. However, PBFT still suffers from random master node selection and high communication complexity. This study proposes an improved consensus algorithm, IBFT, based on node trust values and the BLS (Boneh-Lynn-Shacham) aggregate signature. In IBFT, multi-level indicators are used to calculate the trust value of each node, and on the basis of this calculation some nodes are selected to take part in network consensus. The master node is chosen
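The excerpt does not specify which multi-level indicators IBFT uses, but the selection step it describes can be sketched as a weighted trust score followed by a top-k ranking. The indicator names (`uptime`, `valid_msg_ratio`, `history`) and their weights below are purely hypothetical placeholders, not the paper's actual indicators.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Node:
    node_id: str
    uptime: float           # fraction of time online, in [0, 1] (assumed indicator)
    valid_msg_ratio: float  # share of well-formed consensus messages (assumed indicator)
    history: float          # historical behaviour score (assumed indicator)

WEIGHTS = (0.4, 0.4, 0.2)   # assumed weights for the three indicators

def trust(n: Node) -> float:
    """Weighted sum of the multi-level indicators."""
    w1, w2, w3 = WEIGHTS
    return w1 * n.uptime + w2 * n.valid_msg_ratio + w3 * n.history

def select_consensus_set(nodes: List[Node], k: int) -> Tuple[Node, List[Node]]:
    """Rank nodes by trust; the top k join consensus and the most trusted leads."""
    ranked = sorted(nodes, key=trust, reverse=True)
    return ranked[0], ranked[:k]

nodes = [
    Node("a", 0.99, 0.97, 0.90),
    Node("b", 0.80, 0.90, 0.70),
    Node("c", 0.60, 0.50, 0.40),
    Node("d", 0.95, 0.99, 0.95),
]
master, committee = select_consensus_set(nodes, k=3)
```

Deterministic, trust-based selection replaces PBFT's random master choice, which is the improvement the abstract describes.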
Three-dimensional (3D) reconstruction from images is one of the most beneficial methods of regenerating objects in a photo-realistic way and can be used in many fields. In industry, it can be used to visualize cracks within alloys or walls. In medicine, it has been used as a 3D scanner to reconstruct human organs, such as the internal nose for plastic surgery or the ear canal for fabricating a hearing aid, among others. These applications demand highly accurate detail and measurement, which is the main issue to be considered, along with cost, portability, and ease of use. This work has presented an approach for design and construc
In this research we study the variance component model, one of the most important and most widely used models in data analysis. It is a type of multilevel model and is considered a linear model. There are three types of linear variance component models: the fixed-effect model, the random-effect model, and the mixed-effect model. In this paper we examine the mixed-effect linear variance component model with a one-way random effect. The mixed model combines fixed and random effects in the same model, where it contains the overall mean (μ) and the treatment effect (τi), which has
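For reference, the one-way mixed model described here is conventionally written as follows (a standard textbook formulation, not taken from the paper itself):

```latex
y_{ij} = \mu + \tau_i + \varepsilon_{ij},
\qquad i = 1,\dots,a,\quad j = 1,\dots,n_i,
```

where $\mu$ is the fixed overall mean, $\tau_i \sim N(0, \sigma_\tau^2)$ are the random treatment effects, and $\varepsilon_{ij} \sim N(0, \sigma^2)$ are the errors, so that $\operatorname{Var}(y_{ij}) = \sigma_\tau^2 + \sigma^2$ decomposes into the two variance components the model is named for.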
Absence or hypoplasia of the internal carotid artery (ICA) is a rare congenital anomaly that is mostly unilateral and highly associated with other intracranial vascular anomalies, of which saccular aneurysm is the most common. Blood flow to the circulation of the affected side is maintained by collateral pathways, some of which include the anterior communicating artery (Acom) as part of their anatomy. Therefore, temporary clipping during microsurgery on Acom aneurysms in patients with unilateral ICA anomalies could jeopardize these collaterals and place the patient at risk of ischemic damage. In this paper, we review the literature on cases with a unilaterally absent ICA associa
International companies strive to reduce their costs and increase their profits, and these aims have produced many methods and techniques for achieving them: some of these methods are heuristic, while others are optimization-based. This research attempts to adapt some of these techniques to Iraqi companies, namely determining the optimal lot size using the Wagner-Whitin algorithm under the theory of constraints. The research adopted the case study methodology to objectively identify the research problem, namely determining the optimal lot size for each of the products of the electronic measurement laboratory in Diyala in light of the bottlenecks in w
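The Wagner-Whitin procedure mentioned above is a dynamic program over ordering periods. A minimal sketch follows, without the theory-of-constraints extension the research applies; the demand and cost figures in the example call are illustrative, not data from the case study.

```python
def wagner_whitin(demand, setup_cost, holding_cost):
    """Minimum-cost lot sizes via the Wagner-Whitin dynamic program.

    demand[t]: demand in period t; setup_cost: fixed cost per order placed;
    holding_cost: cost to carry one unit in inventory for one period.
    """
    T = len(demand)
    INF = float("inf")
    best = [0.0] + [INF] * T       # best[t] = min cost to satisfy periods 0..t-1
    order_at = [0] * (T + 1)       # period in which the last order was placed
    for t in range(1, T + 1):
        for j in range(t):         # an order in period j covers periods j..t-1
            hold = sum(holding_cost * (p - j) * demand[p] for p in range(j, t))
            c = best[j] + setup_cost + hold
            if c < best[t]:
                best[t], order_at[t] = c, j
    # Walk back through order_at to recover the lot-sizing plan
    lots = [0] * T
    t = T
    while t > 0:
        j = order_at[t]
        lots[j] = sum(demand[j:t])
        t = j
    return best[T], lots

total_cost, lots = wagner_whitin([10, 20, 30], setup_cost=50, holding_cost=1)
```

In this toy instance the optimum is to order 30 units in period 1 (covering periods 1 and 2) and 30 units in period 3, at a total cost of 120.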
In this paper, several algorithms combining Partial Update LMS (PU LMS) methods with a previously proposed algorithm, New Variable Length LMS (NVLLMS), have been developed. The new sets of proposed algorithms were then applied to an Acoustic Echo Cancellation (AEC) system in order to reduce the number of filter coefficients, shorten the convergence time, and enhance performance in terms of Mean Square Error (MSE) and Echo Return Loss Enhancement (ERLE). These proposed algorithms use ERLE to control the variation of the filter's coefficient length. In addition, a time-varying step size is used. The total number of coefficients required was reduced by about 18%, 10%, 6%
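The combined PU/variable-length algorithms are not reproduced in the excerpt, but the plain LMS baseline they build on, applied to a toy echo-cancellation setup, can be sketched as follows. The echo path h, filter length, and step size mu are illustrative assumptions, and the ERLE computed at the end is the metric the abstract uses to judge performance.

```python
import numpy as np

rng = np.random.default_rng(1)

def lms(x, d, n_taps=16, mu=0.01):
    """Plain LMS adaptive filter: estimate d from x; return error signal and weights."""
    w = np.zeros(n_taps)
    e = np.zeros(len(x))
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1 : n + 1][::-1]   # most recent sample first
        y = w @ u                             # filter output (echo estimate)
        e[n] = d[n] - y                       # error = echo-cancelled signal
        w += mu * e[n] * u                    # LMS weight update
    return e, w

# Toy echo path: far-end signal x, echo d = x filtered by a short room response h
x = rng.standard_normal(5000)
h = np.array([0.6, 0.3, 0.1])
d = np.convolve(x, h)[: len(x)]

e, w = lms(x, d)
# ERLE in dB over the last 1000 samples (after convergence)
erle_db = 10 * np.log10(np.mean(d[-1000:] ** 2) / np.mean(e[-1000:] ** 2))
```

Partial-update variants adapt only a subset of `w` per iteration, and variable-length variants grow or shrink `n_taps` online; monitoring `erle_db` to drive that length change is the control idea the paper describes.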
The article is devoted to Russian-Arabic translation, a particular theory of which has not been developed in domestic translation studies to the extent that the mechanisms of translation from and into European languages have been described. In this regard, and with the growing volume of Russian-Arabic translation, this particular theory of translation requires significant additions and new approaches. The authors set out to determine the means of translation (cognitive and mental operations and language transformations) that contribute to achieving the most equivalent correspondences between such typologically different languages as Russian and Arabic. The work summarizes and analyzes the accumulated exper
…in order to increase the level of security, this system encrypts the secret image before sending it over the internet to the recipient (using the Blowfish method). Although the Blowfish method is known for its strong security, its encryption time is long. In this research we apply a smoothing filter to the secret image, which decreases its size and consequently the encryption and decryption times. After encryption, the secret image is hidden inside another image, called the cover image, using one of two methods: "Two-LSB" or "hiding most bits in the blue pixels". Finally, we compare the results of the two methods to determine which one is better according to the PSNR measure
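The Blowfish encryption step requires an external cipher library, so this sketch shows only the "Two-LSB" hiding step described above: each byte of the cover image carries two bits of the (already encrypted) secret bit stream in its two least significant bits. The function names are our own, not from the paper.

```python
import numpy as np

def embed_two_lsb(cover: np.ndarray, secret_bits) -> np.ndarray:
    """Hide a bit stream in the two least significant bits of each cover byte."""
    flat = cover.astype(np.uint8).flatten()
    if len(secret_bits) > 2 * flat.size:
        raise ValueError("cover image too small for this payload")
    stego = flat.copy()
    for i in range(0, len(secret_bits), 2):
        hi = secret_bits[i]
        lo = secret_bits[i + 1] if i + 1 < len(secret_bits) else 0
        # Clear the two low bits, then write the two payload bits
        stego[i // 2] = (stego[i // 2] & 0xFC) | (hi << 1) | lo
    return stego.reshape(cover.shape)

def extract_two_lsb(stego: np.ndarray, n_bits: int):
    """Recover the first n_bits hidden by embed_two_lsb."""
    flat = stego.flatten()
    bits = []
    for i in range((n_bits + 1) // 2):
        v = int(flat[i]) & 0b11
        bits.extend([v >> 1, v & 1])
    return bits[:n_bits]

rng = np.random.default_rng(2)
cover = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)
payload = [int(b) for b in rng.integers(0, 2, size=50)]
stego = embed_two_lsb(cover, payload)
recovered = extract_two_lsb(stego, len(payload))
```

Because each pixel changes by at most 3 grey levels, the distortion stays small, which is why PSNR between cover and stego image is the natural quality measure to compare the two hiding methods.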
The aim of this study was to identify the effect of using computer-based linear programming and branching programming on the learning and retention of movement concatenation (link work) on the parallel bars in artistic gymnastics. The researchers used the experimental method. The sample consisted of (30) male students in the second class of the College of Physical Education/University of Baghdad, divided into three groups: the first group applied linear programming by computer, the second group applied branching programming by computer, while the third group used the traditional method in the college. The researchers obtained the results using the Statistical Package for the Social Sciences (SPSS), such as both