This paper proposes and tests a computerized approach for constructing a 3D model of blood vessels from angiogram images. The approach is divided into two steps: image feature extraction and solid model formation. In the first step, image morphological operations and post-processing techniques are used to extract geometrical entities from the angiogram image. These entities are the middle curve and outer edges of the blood vessel, which are then passed to a computer-aided graphical system for the second step of processing. The system has embedded programming capabilities and pre-programmed libraries for automating a sequence of events, which are exploited to create a solid model of the blood vessel. The gradient of the middle curve is adopted to steer the vessel's direction, while the cross-sections of the blood vessel are formed as a sequence of circles lying in planes orthogonal to the middle-curve gradient. The radius of each circle is estimated from the distance between the points where the blood vessel edges intersect the plane orthogonal to the middle-curve gradient. The system then uses these circles and the middle-curve gradients to produce a solid volume that represents the 3D shape of the blood vessel. The method was tested and evaluated on several angiogram images and showed reasonable agreement between the generated shapes and the tested images.
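A minimal sketch of the cross-section construction described above, assuming the middle curve and the two edge curves have already been extracted and resampled to matching lengths (the function name, the sampling strategy, and the radius rule are illustrative, not the paper's exact implementation):

```python
import numpy as np

def vessel_cross_sections(centerline, left_edge, right_edge, n_theta=24):
    """Build circular cross-sections along a vessel centerline.

    centerline, left_edge, right_edge: (N, 2) arrays of image points.
    Returns an (N, n_theta, 3) array of circle points: the 2D centerline
    is lifted into the z = 0 plane and each circle is tilted to be
    orthogonal to the local centerline tangent (gradient).
    """
    # Lift the 2D centerline into 3D and estimate its unit tangent.
    c = np.column_stack([centerline, np.zeros(len(centerline))])
    t = np.gradient(c, axis=0)
    t /= np.linalg.norm(t, axis=1, keepdims=True)

    # Radius at each station: half the distance between the two edges.
    r = 0.5 * np.linalg.norm(left_edge - right_edge, axis=1)

    # Orthonormal frame (u, v) spanning the plane orthogonal to the tangent.
    ref = np.array([0.0, 0.0, 1.0])
    u = np.cross(t, ref)
    u /= np.linalg.norm(u, axis=1, keepdims=True)
    v = np.cross(t, u)

    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    circles = (c[:, None, :]
               + r[:, None, None] * (np.cos(theta)[None, :, None] * u[:, None, :]
                                     + np.sin(theta)[None, :, None] * v[:, None, :]))
    return circles
```

The solid volume would then be obtained by lofting through these circles in the CAD system, which is outside the scope of this sketch.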
With the increasing use of social media, many researchers have become interested in topic extraction from Twitter. Tweets are short, unstructured, and noisy, which makes deriving topics from them difficult. Topic modeling algorithms such as Latent Semantic Analysis (LSA) and Latent Dirichlet Allocation (LDA) were originally designed to derive topics from long documents such as articles and books, and they are often less effective when applied to short text content like Twitter. Luckily, Twitter has many features that represent the interaction between users, and tweets carry rich user-generated hashtags that act as keywords. In this paper, we exploit the hashtag feature to improve the topics learned from tweets.
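As an illustration of how hashtags can be folded into a standard topic model (a sketch only; the token-repetition weighting here is an assumption, not necessarily the paper's method):

```python
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

tweets = [
    "Loving the new camera #photography #tech",
    "Great match last night #football",
    "New lens arrived today #photography",
]

# One simple way to exploit hashtags: repeat them so they weigh more
# heavily in the bag-of-words representation (assumption for this sketch).
def boost_hashtags(text, weight=3):
    boosted = []
    for tok in text.split():
        boosted.extend([tok.lstrip("#")] * (weight if tok.startswith("#") else 1))
    return " ".join(boosted)

docs = [boost_hashtags(t) for t in tweets]
X = CountVectorizer(stop_words="english").fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
```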
Authentication is the process of determining whether someone or something is, in fact, who or what it is declared to be. As dependence upon computers and computer networks grows, the need for user authentication has increased. A user's claimed identity can be verified by one of several methods; one of the most popular is knowledge-based ("something the user knows"), such as a password or Personal Identification Number (PIN). Biometrics is the science and technology of authentication by identifying an individual's physiological or behavioral attributes. Keystroke authentication is a behavioral access-control technique that identifies legitimate users by their typing behavior. The objective of this paper is to provide user authentication based on keystroke dynamics.
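A minimal illustration of the keystroke-timing idea: dwell times (how long a key is held) and flight times (the gap between consecutive keys) form a feature vector that is compared against an enrolled template. The distance rule and threshold below are assumptions of this sketch, not the paper's scheme:

```python
import numpy as np

def keystroke_features(press_times, release_times):
    """Dwell times (hold duration) and flight times (gaps between keys)."""
    press = np.asarray(press_times)
    release = np.asarray(release_times)
    dwell = release - press
    flight = press[1:] - release[:-1]
    return np.concatenate([dwell, flight])

def authenticate(sample, template_mean, template_std, threshold=3.0):
    """Accept if the sample lies, on average, within `threshold` standard
    deviations of the enrolled template (illustrative rule only)."""
    z = np.abs(sample - template_mean) / (template_std + 1e-9)
    return z.mean() < threshold
```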
Iris research focuses on developing techniques for identifying and locating relevant biometric features, accurate segmentation, and efficient computation, while lending themselves to compression methods. Most iris segmentation methods are based on complex modelling of traits and characteristics, which in turn reduces the effectiveness of the system in real-time use. This paper introduces a novel parameterized technique for iris segmentation. The method proceeds in a number of steps: converting the grayscale eye image to a bit-plane representation, selecting the most significant bit planes, and parameterizing the iris location, resulting in accurate segmentation of the iris from the original image.
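The bit-plane step can be sketched directly in NumPy; which planes count as "most significant" and the subsequent parameterization are the paper's contribution and are only hinted at in the commented usage:

```python
import numpy as np

def bit_planes(gray):
    """Decompose an 8-bit grayscale image into its 8 bit planes.

    gray: (H, W) uint8 array. Returns an (8, H, W) boolean array,
    where plane 7 is the most significant bit.
    """
    return np.stack([(gray >> b) & 1 for b in range(8)]).astype(bool)

# Illustrative usage: keep the two most significant planes as a coarse
# mask from which the iris location could be parameterized.
# gray = cv2.imread("eye.png", cv2.IMREAD_GRAYSCALE)
# planes = bit_planes(gray)
# mask = planes[6] | planes[7]
```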
Speech is the essential way for humans to interact with each other or with machines. However, it is often contaminated by different types of environmental noise. Speech enhancement algorithms (SEAs) have therefore emerged as a significant approach in the speech processing field to suppress background noise and recover the original speech signal. In this paper, a new efficient two-stage SEA with low distortion is proposed based on the minimum mean square error sense. The clean signal is estimated by exploiting Laplacian modeling of the speech and noise coefficient distributions in an orthogonal transform domain, the Discrete Krawtchouk-Tchebichef transform, which combines the Krawtchouk and Tchebichef polynomial bases.
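A simplified transform-domain gain rule conveys the structure of such an estimator. Here a DCT stands in for the Krawtchouk-Tchebichef transform and a Wiener-style gain stands in for the paper's Laplacian MMSE estimator; both substitutions are assumptions of this sketch:

```python
import numpy as np
from scipy.fft import dct, idct

def enhance_frame(noisy_frame, noise_var, gain_floor=0.1):
    """Per-coefficient noise suppression in an orthogonal transform domain.

    noisy_frame: 1-D array of speech samples for one analysis frame.
    noise_var:   noise variance per coefficient, e.g. estimated from
                 noise-only frames (assumed available).
    """
    coeffs = dct(noisy_frame, norm="ortho")          # orthogonal transform
    power = coeffs ** 2
    gain = np.maximum(1.0 - noise_var / np.maximum(power, 1e-12), gain_floor)
    return idct(gain * coeffs, norm="ortho")
```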
A novel median filter based on the crow search optimization algorithm (OMF) is proposed to reduce random salt-and-pepper noise and improve the quality of RGB color and grayscale images. The fundamental idea of the approach is that the crow optimization algorithm first detects noise pixels and then replaces them with an optimal median value chosen by maximizing a fitness function. Finally, the standard measures of peak signal-to-noise ratio (PSNR), structural similarity, absolute square error, and mean square error were used to assess the performance of the filters (the original and the improved median filter) in removing noise from images. The simulation was implemented in MATLAB R2019b, and the results demonstrate the improved performance of the proposed filter.
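The detect-then-replace structure can be sketched with a fixed extreme-value detector standing in for the crow search optimizer (the optimizer and its fitness function are the paper's contribution and are not reproduced here):

```python
import numpy as np

def selective_median_filter(img, win=3):
    """Replace only suspected salt-and-pepper pixels (0 or 255) with the
    median of their neighborhood, leaving clean pixels untouched.

    img: 2-D uint8 array (apply per channel for RGB images).
    """
    pad = win // 2
    padded = np.pad(img, pad, mode="edge")
    out = img.copy()
    noisy = (img == 0) | (img == 255)      # stand-in noise detector
    for i, j in zip(*np.nonzero(noisy)):
        window = padded[i:i + win, j:j + win]
        out[i, j] = np.median(window)
    return out
```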
Orthogonal polynomials and their moments serve as pivotal elements across various fields. Discrete Krawtchouk polynomials (DKraPs) are a versatile family of orthogonal polynomials, widely used in fields such as probability theory, signal processing, digital communications, and image processing. Various recurrence algorithms have been proposed to address the numerical instability that arises for large orders and signal sizes. However, DKraP coefficients have typically been computed with sequential algorithms, which are computationally expensive for large order values and polynomial sizes. To this end, this paper introduces a computationally efficient solution that utilizes parallel computation of the DKraP coefficients.
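For reference, the classical three-term recurrence that such algorithms build on can be sketched as follows. This evaluates unnormalized Krawtchouk polynomials; the paper's numerically stabilized, parallel scheme is not reproduced here:

```python
import numpy as np

def krawtchouk(n_max, N, p, x):
    """Evaluate Krawtchouk polynomials K_0..K_{n_max} at points x via the
    three-term recurrence
        p(N-n) K_{n+1}(x) = [p(N-n) + n(1-p) - x] K_n(x) - n(1-p) K_{n-1}(x),
    with K_0(x) = 1 and K_1(x) = 1 - x / (pN).

    Note: this plain recurrence loses accuracy for large n and N, which
    is exactly the instability such papers address.
    """
    x = np.asarray(x, dtype=float)
    K = np.empty((n_max + 1, x.size))
    K[0] = 1.0
    if n_max >= 1:
        K[1] = 1.0 - x / (p * N)
    for n in range(1, n_max):
        a = p * (N - n)
        b = n * (1.0 - p)
        K[n + 1] = ((a + b - x) * K[n] - b * K[n - 1]) / a
    return K
```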
This paper focuses on optimizing drilling parameters using the Taguchi method to obtain minimum surface roughness. Nine drilling experiments were performed on Al 5050 alloy using high-speed steel twist drills. Three drilling parameters (feed rate, cutting speed, and cutting tool) were used as control factors, and an L9 (3³) orthogonal array was specified for the experimental trials. The signal-to-noise (S/N) ratio and analysis of variance (ANOVA) were utilized to determine the optimum control factors that minimize surface roughness. The results were analyzed with the statistical software package MINITAB-17. After the experimental trials, the tool diameter was found to be the most important factor affecting surface roughness.
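The smaller-the-better signal-to-noise ratio used in such Taguchi analyses is straightforward to compute; the roughness readings below are made up for illustration:

```python
import numpy as np

def sn_smaller_is_better(y):
    """Taguchi smaller-the-better S/N ratio: -10 * log10(mean(y^2)).
    A larger S/N value corresponds to lower (better) surface roughness."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

# Hypothetical roughness readings (micrometers) for one trial:
print(sn_smaller_is_better([1.2, 1.1, 1.3]))   # approx -1.6 dB
```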
Leishmaniasis is an important parasitic disease, affecting mainly people of low socioeconomic status in developing countries, and it is prevalent and endemic in the tropical and subtropical regions of the Old World and the New World. Despite its broad distribution in Iraq, little is known about the genetic characteristics of its causative agents. This study therefore aimed to evaluate the genetic variation of two Iraqi Leishmania tropica isolates based on the heat shock protein 70 (HSP70) gene sequence, in comparison with recorded sequence data of isolates worldwide. After amplification and sequencing of the HSP70 gene, the obtained sequences were aligned with homologous Leishmania sequences retrieved from NCBI using BLAST. The analysis showed the presence of particular genetic variations in the Iraqi isolates.
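The BLAST comparison step can be sketched with Biopython; the query fragment below is hypothetical, and a real analysis would use the sequenced isolate's HSP70 gene and the study's search parameters:

```python
from Bio.Blast import NCBIWWW, NCBIXML

# Hypothetical HSP70 fragment (placeholder, not a real isolate sequence).
query = "ATGGCTAAAGCTCCAGCCGTTGGTATCGATCTGGGTACCACC"

# Submit a nucleotide BLAST search against the NCBI nt database
# (network access required; parameters are illustrative).
handle = NCBIWWW.qblast("blastn", "nt", query)
record = NCBIXML.read(handle)
for alignment in record.alignments[:5]:
    # Report the hit description and E-value of its best local alignment.
    print(alignment.title, alignment.hsps[0].expect)
```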