Contour extraction from two-dimensional echocardiographic images has long been a challenge in digital image processing. This is essentially due to the heavy noise and poor quality of these images, and to artifacts such as papillary muscles, intra-cavity structures such as chordae, and valves that can interfere with endocardial border tracking. In this paper, we present a technique to extract the contours of heart boundaries from a sequence of echocardiographic images, starting with pre-processing to reduce noise and improve image quality. By pre-processing the images, unclear edges are avoided and we can obtain an accurate detection of both the heart boundary and the movement of the heart valves.
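The abstract does not spell out the pipeline; as a rough illustration of a noise-reducing pre-processing step followed by boundary extraction, the sketch below uses OpenCV with a median filter, Otsu thresholding, and contour extraction (the file name, filter size, and threshold choice are assumptions for demonstration, not the paper's parameters).

```python
import cv2

# Illustrative sketch only: median filtering to reduce speckle noise,
# then Otsu thresholding and contour extraction with OpenCV.
frame = cv2.imread("echo_frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file name

# Pre-processing: suppress speckle noise while preserving edges.
denoised = cv2.medianBlur(frame, 5)

# Segment the cavity and extract candidate boundary contours.
_, binary = cv2.threshold(denoised, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

# Keep the largest contour as a crude estimate of the endocardial border.
border = max(contours, key=cv2.contourArea)
```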
In this paper, we describe a new method for image denoising. We analyze the properties of the Multiwavelet coefficients of natural images and suggest a method for computing the Multiwavelet transform using the first-order approximation. The paper presents a simple and effective model for noise removal by suggesting a new technique that retrieves the image, allowing us to estimate it from its noisy observation. The proposed algorithm depends on mixing soft-thresholding with a Mean filter, applied concurrently to the noisy image after dividing it into blocks of equal size (the blocks are processed concurrently to increase the performance of the enhancement process and to decrease the time needed for implementing the proposed algorithm).
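As a minimal sketch of the idea of combining soft-thresholding with a Mean filter, the snippet below substitutes a standard discrete wavelet from PyWavelets for the Multiwavelet transform and uses a universal threshold; the wavelet choice, threshold rule, and blending weights are assumptions rather than the paper's algorithm.

```python
import numpy as np
import pywt
from scipy.ndimage import uniform_filter

def denoise_block(block, wavelet="db2", level=2):
    """Illustrative sketch: soft-threshold wavelet detail coefficients
    (standing in for the paper's Multiwavelet transform) and blend with
    a simple mean filter. Threshold and blend weights are assumptions."""
    coeffs = pywt.wavedec2(block, wavelet, level=level)
    # Universal threshold estimated from the finest diagonal detail band.
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    t = sigma * np.sqrt(2 * np.log(block.size))
    new_coeffs = [coeffs[0]] + [
        tuple(pywt.threshold(c, t, mode="soft") for c in detail)
        for detail in coeffs[1:]
    ]
    wavelet_est = pywt.waverec2(new_coeffs, wavelet)[: block.shape[0], : block.shape[1]]
    mean_est = uniform_filter(block.astype(float), size=3)
    return 0.5 * wavelet_est + 0.5 * mean_est  # simple average of the two estimates
```

In the scheme described above, the noisy image would first be divided into equal-size blocks and a function like this applied to each block independently, so the blocks can be processed concurrently (e.g. with a process pool).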
Rumors are typically described as remarks whose truth value is unknown. A rumor on social media has the potential to spread erroneous information to a large group of individuals, and those false facts will influence decision-making in a variety of societies. In online social media, where enormous amounts of information are easily distributed over a large network of sources with unverified authority, detecting rumors is critical. This research proposes that rumor detection be done using Natural Language Processing (NLP) tools as well as six distinct Machine Learning (ML) methods (Naïve Bayes (NB), Random Forest (RF), K-Nearest Neighbor (KNN), Logistic Regression (LR), Stochastic Gradient Descent (SGD), and Decision Tree (DT)).
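A minimal scikit-learn sketch of such a comparison is shown below, assuming TF-IDF features and default hyperparameters; the feature representation and the toy data are illustrative assumptions, not the study's setup.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression, SGDClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.pipeline import make_pipeline

# Placeholder data: real experiments would use a labelled rumor corpus.
texts = [
    "BREAKING: celebrity spotted fleeing the city, share before deleted!",
    "Official weather service issues storm warning for the coast.",
    "They are hiding the truth about the outbreak, forward this now!!",
    "The ministry published the final exam schedule on its website.",
    "Unconfirmed reports say the bridge has collapsed, avoid downtown.",
    "The match kicks off at 8 pm local time, tickets still available.",
]
labels = [1, 0, 1, 0, 1, 0]  # 1 = rumor, 0 = non-rumor

classifiers = {
    "NB": MultinomialNB(),
    "RF": RandomForestClassifier(),
    "KNN": KNeighborsClassifier(n_neighbors=3),
    "LR": LogisticRegression(max_iter=1000),
    "SGD": SGDClassifier(),
    "DT": DecisionTreeClassifier(),
}

for name, clf in classifiers.items():
    pipe = make_pipeline(TfidfVectorizer(stop_words="english"), clf)
    pipe.fit(texts, labels)
    print(name, pipe.predict(["Unverified claim: schools closed tomorrow?"]))
```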
Enhancing the quality of image fusion is proposed using new algorithms for auto-focus image fusion. The first algorithm is based on determining the standard deviation to combine two images. The second algorithm concentrates on the contrast at edge points and the correlation method as the criterion for the quality of the resulting image. This algorithm considers three blocks of different sizes in a homogeneous region and moves each block 10 pixels within the same homogeneous region. These blocks examine the statistical properties of the region and automatically decide the next step. The resulting combined image is better in contrast.
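A minimal sketch of the first (standard-deviation) rule, assuming a fixed block size and two registered grayscale source images:

```python
import numpy as np

def fuse_by_std(img_a, img_b, block=16):
    """Illustrative sketch of the standard-deviation fusion rule: for each
    block, keep the block from whichever source image is locally sharper
    (higher standard deviation). The block size is an assumption."""
    fused = np.empty_like(img_a)
    h, w = img_a.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            a = img_a[y:y + block, x:x + block]
            b = img_b[y:y + block, x:x + block]
            fused[y:y + block, x:x + block] = a if a.std() >= b.std() else b
    return fused
```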
Purpose: To provide practical knowledge of the requirements of a detailed feasibility study for selecting an investment project.
Findings: Directing the private sector towards investing in productive projects (the pre-cast reinforced concrete project), as it achieves a financial return as well as provides foreign currency savings by reducing imports and exploiting available natural resources.
Practical implications: The importance of a detailed feasibility study in determining whether the project can be implemented or not.
The precast concrete method is one of the best modern construction methods.
... Show MoreMany approaches of different complexity already exist to edge detection in
color images. Nevertheless, the question remains of how different are the results
when employing computational costly techniques instead of simple ones. This
paper presents a comparative study on two approaches to color edge detection to
reduce noise in image. The approaches are based on the Sobel operator and the
Laplace operator. Furthermore, an efficient algorithm for implementing the two
operators is presented. The operators have been applied to real images. The results
are presented in this paper. It is shown that the quality of the results increases by
using second derivative operator (Laplace operator). And noise reduced in a good
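As a simple illustration of applying both operators to a color image, the sketch below uses OpenCV and takes the maximum per-channel response; the smoothing, kernel size, and color-handling choices are assumptions, not the paper's efficient implementation.

```python
import cv2
import numpy as np

# Illustrative comparison of the two operators with OpenCV; applying them
# per channel and taking the maximum response is one simple way to handle
# color images (an assumption, not necessarily the paper's scheme).
img = cv2.imread("input.png")                      # hypothetical file name
smoothed = cv2.GaussianBlur(img, (3, 3), 0)        # light smoothing to limit noise

# First-derivative (Sobel) magnitude per channel.
gx = cv2.Sobel(smoothed, cv2.CV_64F, 1, 0, ksize=3)
gy = cv2.Sobel(smoothed, cv2.CV_64F, 0, 1, ksize=3)
sobel_edges = np.sqrt(gx ** 2 + gy ** 2).max(axis=2)

# Second-derivative (Laplace) response per channel.
laplace_edges = np.abs(cv2.Laplacian(smoothed, cv2.CV_64F, ksize=3)).max(axis=2)

cv2.imwrite("sobel_edges.png", np.clip(sobel_edges, 0, 255).astype(np.uint8))
cv2.imwrite("laplace_edges.png", np.clip(laplace_edges, 0, 255).astype(np.uint8))
```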
In data transmission, a change in a single bit of the received data may lead to misunderstanding or a disaster. Each bit in the sent information has high priority, especially with information such as the address of the receiver. The importance of detecting every single-bit change is a key issue in the data transmission field.
The ordinary single-parity detection method can detect an odd number of errors efficiently, but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, showed better results but failed to cope with an increasing number of errors.
Two novel methods were suggested to detect binary bit-change errors when transmitting data over noisy media. Those methods were: the 2D-Checksum method
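For background, conventional two-dimensional (row/column) parity can be sketched as follows; this is the classical scheme referred to above, not the proposed 2D-Checksum method itself:

```python
import numpy as np

def two_d_parity(bits, width=8):
    """Background sketch of conventional two-dimensional (row/column) parity,
    not the paper's novel 2D-Checksum method. The data word is arranged in a
    matrix and even parity is computed for every row and column."""
    rows = np.array(bits, dtype=np.uint8).reshape(-1, width)
    row_parity = rows.sum(axis=1) % 2          # one check bit per row
    col_parity = rows.sum(axis=0) % 2          # one check bit per column
    return rows, row_parity, col_parity

def check(rows, row_parity, col_parity):
    """A single-bit error flips exactly one row parity and one column parity,
    which also locates the corrupted bit."""
    bad_rows = np.flatnonzero(rows.sum(axis=1) % 2 != row_parity)
    bad_cols = np.flatnonzero(rows.sum(axis=0) % 2 != col_parity)
    return bad_rows, bad_cols

data = [1, 0, 1, 1, 0, 0, 1, 0,
        0, 1, 1, 0, 1, 0, 0, 1]
rows, rp, cp = two_d_parity(data)
rows[1, 3] ^= 1                                # inject a single-bit error
print(check(rows, rp, cp))                     # reports row 1, column 3
```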
Iraq is one of the Arab-region countries considered among the drier areas on earth. Although two main rivers (the Tigris and the Euphrates) pass through it, it suffers from the same problem (drought), and only the regions near the rivers make use of their water for domestic, agricultural, and industrial purposes. One usable solution is to utilize groundwater, especially in the desert regions. Remote sensing and geographic information systems are rapid and cost-effective techniques; they provide information on large and inaccessible areas within a short time span for assessing, monitoring, and managing groundwater resources. In this study, an adaptive algorithm based on Canny edge detection is applied.
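The study's adaptive algorithm is not reproduced here; as a stand-in, the sketch below shows a common median-based heuristic for choosing Canny thresholds automatically:

```python
import cv2
import numpy as np

def auto_canny(gray, sigma=0.33):
    """A common median-based 'auto Canny' heuristic, shown only as a stand-in;
    the study's adaptive algorithm is not reproduced here. Thresholds are
    derived from the image median so the detector adapts to image brightness."""
    v = np.median(gray)
    lower = int(max(0, (1.0 - sigma) * v))
    upper = int(min(255, (1.0 + sigma) * v))
    return cv2.Canny(gray, lower, upper)

# Hypothetical usage on a satellite band loaded as a grayscale image.
band = cv2.imread("satellite_band.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file
edges = auto_canny(cv2.GaussianBlur(band, (5, 5), 0))
```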
Most companies use social media data for business. Sentiment analysis automatically gathers, analyses, and summarizes this type of data. Managing unstructured social media data is difficult, and noisy data is a challenge for sentiment analysis. Since over 50% of the sentiment analysis process is data pre-processing, processing big social media data is challenging too. If pre-processing is carried out correctly, data accuracy may improve. Also, the sentiment analysis workflow is highly dependent on its pre-processing steps. Because no pre-processing technique works well in all situations or with all data sources, choosing the most important ones is crucial, and prioritization is an excellent technique for doing so. As one of many Multi-Criteria Decision-Making (MCDM) methods
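Typical pre-processing steps for such noisy social media text can be sketched as follows; the step list and stop-word set are illustrative assumptions rather than the specific techniques the study prioritizes:

```python
import re

def preprocess(post, stop_words=frozenset({"the", "a", "an", "is", "are", "and", "or"})):
    """Illustrative sketch of common social-media pre-processing steps
    (lower-casing, URL/mention/hashtag removal, punctuation removal,
    stop-word filtering). The step list and stop-word set are assumptions."""
    text = post.lower()
    text = re.sub(r"https?://\S+", " ", text)   # remove URLs
    text = re.sub(r"[@#]\w+", " ", text)        # remove mentions and hashtags
    text = re.sub(r"[^a-z\s]", " ", text)       # keep letters only
    return [t for t in text.split() if t not in stop_words]

print(preprocess("Loving the new phone!! https://t.co/xyz @BrandX #awesome"))
```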
Cognitive radios have the potential to greatly improve spectral efficiency in wireless networks. Cognitive radios are considered lower-priority or secondary users of spectrum allocated to a primary user. Their fundamental requirement is to avoid interference with potential primary users in their vicinity. Spectrum sensing has been identified as a key enabling functionality to ensure that cognitive radios do not interfere with primary users, by reliably detecting primary-user signals. In addition, reliable sensing creates spectrum opportunities for increasing the capacity of cognitive networks. One of the key challenges in spectrum sensing is the robust detection of primary signals in highly negative signal-to-noise ratio (SNR) regimes. In this paper,
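Energy detection is one widely used spectrum-sensing approach and is shown here only as general background, not as this paper's method: the received energy over a sensing window is compared with a threshold set from the noise power and a target false-alarm probability.

```python
import numpy as np
from scipy.stats import norm

def energy_detector(samples, noise_power, target_pfa=0.01):
    """Background sketch of classical energy detection, not this paper's method.
    Declares a primary user present when the measured energy exceeds a threshold
    chosen for the desired false-alarm probability (Gaussian approximation,
    real-valued noise samples assumed)."""
    n = samples.size
    energy = np.sum(samples ** 2)
    threshold = noise_power * (n + norm.ppf(1 - target_pfa) * np.sqrt(2 * n))
    return energy > threshold

# Hypothetical usage: a weak sinusoid (about -9 dB SNR) buried in unit-power noise.
rng = np.random.default_rng(0)
noise = rng.standard_normal(4096)
signal = 0.5 * np.cos(2 * np.pi * 0.05 * np.arange(4096))
print(energy_detector(noise + signal, noise_power=1.0))
```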