This paper proposes and tests a computerized approach for constructing a 3D model of blood vessels from angiogram images. The approach is divided into two steps: image feature extraction and solid model formation. In the first step, image morphological operations and post-processing techniques are used to extract geometrical entities from the angiogram image. These entities are the middle curve and outer edges of the blood vessel, which are then passed to a computer-aided graphical system for the second step of processing. The system has embedded programming capabilities and pre-programmed libraries for automating a sequence of events, which are exploited to create a solid model of the blood vessel. The gradient of the middle curve is adopted to steer the vessel's direction, while the cross-sections of the blood vessel are formed as a sequence of circles lying in planes that are orthogonal to the gradients of the middle curve. The radius of each circle is estimated from the distance between the points where the blood vessel edges intersect the plane orthogonal to the middle-curve gradient. The system then uses these circles and the middle-curve gradients to produce a solid volume that represents the 3D shape of the blood vessel. The method was tested and evaluated using different cases of angiogram images, and showed a reasonable agreement between the generated shapes and the tested images.
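As a rough illustration of the cross-section construction described above, the sketch below works in the image plane: the midline tangent (gradient) is estimated numerically, the line orthogonal to it is intersected with the two extracted edges, and the radius is taken from the edge-to-edge distance. The function names, the nearest-point stand-in for an exact intersection, and the half-distance radius rule are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def cross_section_circles(midline, edge_a, edge_b):
    """For each midline point, estimate the local tangent (gradient), find where the line
    orthogonal to it meets the two extracted edges, and return (centre, tangent, radius)
    triples that would define the cross-section circles passed to the CAD stage."""
    midline = np.asarray(midline, float)
    tangents = np.gradient(midline, axis=0)                     # local direction of the vessel
    tangents /= np.linalg.norm(tangents, axis=1, keepdims=True)
    sections = []
    for p, t in zip(midline, tangents):
        n = np.array([-t[1], t[0]])                 # in-plane direction orthogonal to the tangent
        a = nearest_on_normal(edge_a, p, n)         # intersection with the first edge
        b = nearest_on_normal(edge_b, p, n)         # intersection with the second edge
        radius = 0.5 * np.linalg.norm(a - b)        # half the edge-to-edge distance (assumption)
        sections.append((p, t, radius))
    return sections

def nearest_on_normal(edge, origin, direction):
    """Pick the edge point closest to the line origin + s*direction (a crude stand-in
    for an exact line/edge intersection)."""
    edge = np.asarray(edge, float)
    d = edge - origin
    return edge[np.argmin(np.abs(d[:, 0] * direction[1] - d[:, 1] * direction[0]))]
```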
Features are descriptions of image content and may take the form of corners, blobs, or edges. Corners are among the most important features for describing an image, and many algorithms have been developed to detect them, such as Harris, FAST, and SUSAN. Harris is an efficient and accurate corner detection method; it is rotation invariant but not scale invariant. This paper presents a Harris corner detector that is also invariant to scale, an improvement achieved by applying the Gaussian function at different scales. The experimental results illustrate that the multi-scale Gaussian approach is very useful for dealing with this weakness of the Harris detector.
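A minimal sketch of the multi-scale idea, assuming OpenCV is available: the Harris response is computed after smoothing the image with Gaussians of increasing sigma, and candidate corners are collected per scale. The sigma values, threshold, and other parameter choices are illustrative, not taken from the paper.

```python
import cv2
import numpy as np

def multiscale_harris(gray, sigmas=(1.0, 2.0, 4.0), k=0.04, thresh=0.01):
    """Run the Harris detector on Gaussian-smoothed copies of the image at several scales."""
    gray = np.float32(gray)
    corners_per_scale = {}
    for sigma in sigmas:
        smoothed = cv2.GaussianBlur(gray, (0, 0), sigma)               # scale-space smoothing
        response = cv2.cornerHarris(smoothed, blockSize=2, ksize=3, k=k)
        corners_per_scale[sigma] = np.argwhere(response > thresh * response.max())
    return corners_per_scale
```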
In this paper, a fast lossless compression method for medical images is introduced. It is based on splitting the image into blocks according to their nature, using polynomial approximation to decompose the image signal, and then applying run-length coding to the residue part of the image, which represents the error caused by the polynomial approximation. Huffman coding is applied as a last stage to encode the polynomial coefficients and the run-length codes. The test results indicate that the suggested method can lead to promising performance.
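The sketch below illustrates the polynomial-approximation and run-length stages on a single block, assuming a first-order (planar) fit; the actual block-splitting rule and the final Huffman stage are not reproduced here, and the function names are illustrative.

```python
import numpy as np

def fit_block(block):
    """Fit z = a + b*x + c*y to an image block and return the coefficients and the residue
    (the error left after removing the polynomial approximation)."""
    h, w = block.shape
    y, x = np.mgrid[0:h, 0:w]
    A = np.stack([np.ones(block.size), x.ravel(), y.ravel()], axis=1)
    coef, *_ = np.linalg.lstsq(A, block.ravel().astype(float), rcond=None)
    residue = block.astype(int) - np.rint(A @ coef).astype(int).reshape(h, w)
    return coef, residue

def run_length_encode(values):
    """Encode a 1D residue sequence as (value, run-length) pairs."""
    runs, prev, count = [], None, 0
    for v in values:
        if v == prev:
            count += 1
        else:
            if prev is not None:
                runs.append((prev, count))
            prev, count = v, 1
    if prev is not None:
        runs.append((prev, count))
    return runs
```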
In this work we present a technique to extract heart contours from noisy echocardiograph images. The technique is based on improving the image before applying contour detection, in order to reduce heavy noise and obtain better image quality. To do this, we combine several pre-processing techniques (filtering, morphological operations, and contrast adjustment) to avoid unclear edges and to enhance the low contrast of echocardiograph images. After applying these techniques, traditional edge detection methods yield a legible detection of the heart boundaries and valve movement.
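A minimal sketch of a pre-processing chain of the kind described, assuming OpenCV: median filtering, morphological opening and closing, CLAHE contrast adjustment, and a traditional edge detector. The specific filters, kernel sizes, and thresholds are illustrative choices, not the authors' settings.

```python
import cv2

def extract_heart_contours(gray):
    """Pre-process a grayscale echocardiograph frame, then detect edges and contours."""
    denoised = cv2.medianBlur(gray, 5)                              # suppress speckle noise
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    opened = cv2.morphologyEx(denoised, cv2.MORPH_OPEN, kernel)     # remove small bright artefacts
    closed = cv2.morphologyEx(opened, cv2.MORPH_CLOSE, kernel)      # bridge small gaps in walls
    equalized = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8)).apply(closed)  # boost low contrast
    edges = cv2.Canny(equalized, 50, 150)                           # traditional edge detection
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return contours
```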
Starting from 4,4′-dimercaptobiphenyl, a variety of phenolic Schiff base derivatives (methylolic, etheric, and epoxy) have been synthesized. All proposed structures were supported by FTIR, 1H-NMR, 13C-NMR, and elemental analysis; all analyses were performed at the Center of Consultation at Jordan University.
This paper presents the Taguchi approach for optimizing the hardness of a shape memory alloy (Cu-Al-Ni). The influence of powder metallurgy parameters on hardness has been investigated, using the Taguchi technique and ANOVA for the analysis. Nine experimental runs based on Taguchi's L9 orthogonal array (OA) were performed for two parameters (pressure and sintering temperature), each studied at three levels: (300, 500, and 700) MPa and (700, 800, and 900) °C, respectively. The main effects and the signal-to-noise (S/N) ratio were studied, and analysis of variance (ANOVA) was used to investigate the micro-hardness characteristics of the shape memory alloy. After application, the results of the study showed the hei ...
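For reference, the larger-is-better signal-to-noise ratio commonly used in Taguchi analysis of a hardness response can be computed as in the sketch below; the hardness readings shown are illustrative, not the experimental data, while the factor levels match those stated above.

```python
import numpy as np

def sn_larger_is_better(y):
    """S/N = -10 * log10( (1/n) * sum(1 / y_i^2) ) for responses where larger is better."""
    y = np.asarray(y, float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

pressures = [300, 500, 700]        # MPa, the three pressure levels
temperatures = [700, 800, 900]     # deg C, the three sintering-temperature levels

# Example: S/N ratio for three repeated hardness readings from one run (values illustrative).
print(sn_larger_is_better([62.0, 64.5, 63.1]))
```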
Effect of Using Computer on Acquiring and Retaining Information among First-Stage Students in the Biology Subject. MIAAD NATHIM RASHEED, Lecturer. Abstract: The goal of the present research is to determine the influence of computer use on the acquisition and retention of information by first-class students in the biology subject. To achieve this, the researcher formulated the following null hypothesis: there are no statistically significant differences at the (0.05) level between the average marks of students who studied using the computer and the average marks of students who studied by the classical method, in terms of acquisition and retention. The researcher intentionally selected the Medical Technical Institute, which included two branches of the first class (A
This paper focuses on the most important element of scientific research: the research problem. The research problem is confined to the concern surrounding the researcher about any event, phenomenon, or issue that needs to be studied and addressed in order to find solutions. It influences most of the steps of scientific research, from asking questions and formulating hypotheses, to employing suitable methods and tools, choosing the research community and sample, and employing measurement and analysis tools. Addressing this problem calls for a great intellectual and material effort by the researcher to develop solutions.
In this paper, the queuing system (M/Er/1/N) is considered in equilibrium. The method of stages introduced by Erlang is used, and the system of equations that governs the equilibrium probabilities of the various stages is given. For general N, the probability that j stages of service are left in the system is derived, and the probability of an empty system is calculated in explicit form.
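A small numerical sketch of the method of stages, assuming the standard formulation: the state is the number of service stages left in the system, an arrival adds r stages at rate lam while fewer than N customers are present, and each stage completes at rate r*mu. The parameter values are illustrative only; the paper derives these probabilities analytically.

```python
import numpy as np

def m_er_1_n_stationary(lam, mu, r, N):
    """Build the stage-level generator of the M/Er/1/N queue and solve pi Q = 0, sum(pi) = 1."""
    n_states = N * r + 1                      # j = 0, 1, ..., N*r stages left in the system
    Q = np.zeros((n_states, n_states))
    for j in range(n_states):
        customers = -(-j // r)                # ceil(j / r) customers present
        if customers < N:                     # room for an arrival: j -> j + r
            Q[j, j + r] += lam
        if j >= 1:                            # one stage completes: j -> j - 1
            Q[j, j - 1] += r * mu
        Q[j, j] = -Q[j].sum()
    A = np.vstack([Q.T, np.ones(n_states)])   # balance equations plus normalization
    b = np.zeros(n_states + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

pi = m_er_1_n_stationary(lam=0.8, mu=1.0, r=3, N=4)
print("P(empty system) =", pi[0])
```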
The searching process using a binary codebook that combines the Block Truncation Coding (BTC) method and Vector Quantization (VQ), i.e., a full codebook search for each input image vector to find the best-matched code word in the codebook, requires a long time. Therefore, in this paper, after designing a small binary codebook, we adopted a new method of rotating each binary code word in this codebook from 90° to 270° in steps of 90°. Then, we organized each code word, depending on its angle, into four types of binary codebooks (i.e., Pour when , Flat when , Vertical when , or Zigzag). The proposed scheme was used to decrease the time of the coding procedure, with very small distortion per block, by designing s
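A minimal sketch of the rotation step, assuming square binary code words: each code word is rotated by 90°, 180°, and 270°, which is the operation used to group code words by orientation before the search. The grouping rule itself (the angle condition for each class) is not reproduced here, and the example code word is illustrative.

```python
import numpy as np

def rotations(codeword):
    """Return the 0°, 90°, 180°, and 270° rotations of a square binary code word."""
    block = np.asarray(codeword)
    return {angle: np.rot90(block, k) for k, angle in enumerate((0, 90, 180, 270))}

# Example 4x4 binary code word and its rotated versions.
cw = np.array([[1, 1, 0, 0],
               [1, 1, 0, 0],
               [0, 0, 0, 0],
               [0, 0, 0, 0]])
for angle, rot in rotations(cw).items():
    print(angle, rot.tolist())
```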