The increased use of hybrid PET/CT scanners, which combine detailed anatomical information with functional data, has benefits for both diagnostic and therapeutic purposes. The present study compares cross sections for the production of 18F, 82Sr, and 68Ge via different reactions at particle incident energies up to 60 MeV, as part of systematic studies of particle-induced activation on enriched natNe, natRb, natGa, 18O, 85Rb, and 69Ga targets. It also covers theoretical calculation of the production yield, calculation of the required target, and a suggestion of the optimum reaction for producing Fluorine-18, Strontium-82, and Germanium-68 for use in hybrid PET/CT scanners.
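The "theoretical calculation of production yield" referred to above conventionally follows the standard thick-target activation-yield integral; the following is a sketch with symbols as usually defined in the activation literature, not taken from this particular paper (abundance/enrichment factors are omitted for brevity):

```latex
A_{\mathrm{EOB}} \;=\; \frac{N_A\, I}{M\, z e}\,
\bigl(1 - e^{-\lambda t_{\mathrm{irr}}}\bigr)
\int_{E_{\mathrm{out}}}^{E_{\mathrm{in}}}
\sigma(E)\,\left(\frac{dE}{d(\rho x)}\right)^{-1} dE
```

Here $N_A$ is Avogadro's number, $M$ the target molar mass, $I$ the beam current, $ze$ the projectile charge, $\lambda$ the decay constant of the product, $t_{\mathrm{irr}}$ the irradiation time, $\sigma(E)$ the reaction cross section, and $dE/d(\rho x)$ the mass stopping power of the target; the integral runs from the particle exit energy to its incident energy.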
This investigation aims to study some properties of lightweight aggregate concrete reinforced by mono or hybrid fibers of different sizes and types. The considered lightweight aggregate was Light Expanded Clay Aggregate, while the adopted fibers included hooked, straight, polypropylene, and glass fibers. Eleven lightweight concrete mixes were considered: one plain concrete mix (without fibers), two mixes reinforced with a mono fiber (hooked or straight), six mixes reinforced with double hybrid fibers, and two mixes reinforced with triple hybrid fibers. Hardened concrete properties were investigated in this study.
The objective of the current research is to find an optimum design of hybrid laminated, moderately thick composite plates under a static constraint. The stacking sequence and ply angles are optimized to achieve minimum deflection for hybrid laminated composite plates consisting of long glass and carbon fiber reinforcements embedded in an epoxy matrix, with known plate dimensions and loading. The plate analysis adopts the first-order shear deformation theory and uses Navier's solution together with a Genetic Algorithm to reach this objective. A program was written in MATLAB to find the best stacking sequence and ply angles that give minimum deflection, and the results were compared with ANSYS.
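The genetic-algorithm search over stacking sequences can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `deflection` function here is a hypothetical placeholder for the FSDT/Navier deflection computation, and the population sizes, angle set, and mutation rate are assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
ANGLES = np.array([0, 45, -45, 90])   # candidate ply angles in degrees (assumed set)
N_PLIES = 8                           # assumed laminate size

def deflection(layup):
    # Hypothetical stand-in for the FSDT/Navier maximum-deflection solver.
    return np.sum(np.cos(np.radians(layup)) ** 2)

def ga_minimize(pop_size=30, generations=60):
    """Elitist GA: keep the better half, refill by crossover + mutation."""
    pop = rng.choice(ANGLES, size=(pop_size, N_PLIES))
    for _ in range(generations):
        fit = np.array([deflection(ind) for ind in pop])
        pop = pop[np.argsort(fit)]             # best layups first
        elite = pop[: pop_size // 2]
        kids = elite.copy()
        for kid in kids:
            mate = elite[rng.integers(len(elite))]
            cut = rng.integers(1, N_PLIES)     # single-point crossover
            kid[cut:] = mate[cut:]
            if rng.random() < 0.2:             # mutation: redraw one ply angle
                kid[rng.integers(N_PLIES)] = rng.choice(ANGLES)
        pop = np.vstack([elite, kids])
    fit = np.array([deflection(ind) for ind in pop])
    return pop[np.argmin(fit)], fit.min()

best_layup, best_val = ga_minimize()
```

Swapping the placeholder fitness for a real FSDT/Navier deflection evaluation recovers the structure of the optimization described in the abstract.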
Text-based image clustering (TBIC) is an insufficient approach for clustering related web images. It is a challenging task to abstract the visual features of images with the support of textual information in a database. In content-based image clustering (CBIC), image data are clustered on the basis of specific features such as texture, colors, boundaries, and shapes. In this paper, an effective CBIC technique is presented, which uses texture and statistical features of the images. The statistical features, or color moments (mean, skewness, standard deviation, kurtosis, and variance), are extracted from the images. These features are collected in a one-dimensional array, and then a genetic algorithm (GA) is applied for image clustering.
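The color-moment feature extraction described above can be sketched directly from the moment definitions; this is an illustrative implementation (function name and the per-channel ordering of the moments are assumptions, not the paper's code):

```python
import numpy as np

def color_moments(image):
    """Per-channel mean, standard deviation, skewness, kurtosis, and variance,
    flattened into one 1-D feature vector (H x W x 3 input assumed)."""
    feats = []
    for ch in range(image.shape[2]):
        x = image[:, :, ch].astype(float).ravel()
        mu = x.mean()
        var = x.var()
        sd = np.sqrt(var)
        # Guard against flat channels, where higher moments are undefined.
        skew = ((x - mu) ** 3).mean() / sd ** 3 if sd > 0 else 0.0
        kurt = ((x - mu) ** 4).mean() / sd ** 4 if sd > 0 else 0.0
        feats.extend([mu, sd, skew, kurt, var])
    return np.asarray(feats)   # length 15 for an RGB image
```

The resulting 15-element vector is the kind of one-dimensional array the abstract says is handed to the GA for clustering.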
Although Wiener filtering is the optimal tradeoff between inverse filtering and noise smoothing, in the case where the blurring filter is singular, Wiener filtering actually amplifies the noise. This suggests that a denoising step is needed to remove the amplified noise. A wavelet-based denoising scheme provides a natural technique for this purpose.
In this paper a new image restoration scheme is proposed. The scheme contains two separate steps: Fourier-domain inverse filtering and wavelet-domain image denoising. The first stage is Wiener filtering of the input image; the filtered image is then passed to adaptive-threshold wavelet denoising.
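The two-step scheme can be sketched as a frequency-domain Wiener deconvolution followed by soft thresholding of wavelet detail coefficients. This is a minimal illustration under stated assumptions: a one-level Haar transform stands in for whatever wavelet the paper uses, and the function and parameter names are hypothetical.

```python
import numpy as np

def wiener_filter(blurred, psf, noise_to_signal=0.01):
    """Frequency-domain Wiener deconvolution (NSR regularizes singular H)."""
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.fft.fft2(blurred)
    W = np.conj(H) / (np.abs(H) ** 2 + noise_to_signal)
    return np.real(np.fft.ifft2(W * G))

def haar_soft_denoise(img, thresh):
    """One-level 2-D Haar transform, soft-threshold the detail bands, invert."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    LL = (a + b + c + d) / 2          # approximation band (kept intact)
    LH = (a + b - c - d) / 2
    HL = (a - b + c - d) / 2
    HH = (a - b - c + d) / 2
    soft = lambda x: np.sign(x) * np.maximum(np.abs(x) - thresh, 0)
    LH, HL, HH = soft(LH), soft(HL), soft(HH)
    out = np.empty_like(img)          # inverse Haar transform
    out[0::2, 0::2] = (LL + LH + HL + HH) / 2
    out[0::2, 1::2] = (LL + LH - HL - HH) / 2
    out[1::2, 0::2] = (LL - LH + HL - HH) / 2
    out[1::2, 1::2] = (LL - LH - HL + HH) / 2
    return out

def restore(blurred, psf, nsr=0.01, thresh=0.1):
    return haar_soft_denoise(wiener_filter(blurred, psf, nsr), thresh)
```

An adaptive scheme would choose `thresh` per subband (e.g. from a noise estimate) rather than using a fixed value as here.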
This paper presents a proposed method for content-based image retrieval (CBIR) using the Discrete Cosine Transform with the Kekre Wavelet Transform (DCT/KWT), and the Daubechies Wavelet Transform with the Kekre Wavelet Transform (D4/KWT), to extract features for a distributed database system in which clients and server form a star topology: the client sends the query image, and the server (which holds the database) performs all the work and then sends the retrieved images back to the client. A comparison between two approaches is made: first, DCT compared with DCT/KWT; second, D4 compared with D4/KWT. The work is experimented over an image database of 200 images in 4 categories, and the performance of image retrieval is evaluated with respect to two similarity measures, namely Euclidean distance (ED) and sum of absolute differences.
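The two similarity measures used for the server-side ranking are straightforward; a sketch of the ranking step (function name and metric keywords are illustrative, not the paper's API):

```python
import numpy as np

def rank_by_similarity(query_vec, db_vecs, metric="euclidean"):
    """Return database indices ordered from most to least similar.
    db_vecs is an (n_images, n_features) array of extracted features."""
    diff = db_vecs - query_vec
    if metric == "euclidean":                 # ED measure
        dist = np.sqrt((diff ** 2).sum(axis=1))
    else:                                     # sum of absolute differences
        dist = np.abs(diff).sum(axis=1)
    return np.argsort(dist)
```

In the star topology described above, this ranking runs on the server over the stored DCT/KWT or D4/KWT feature vectors, and the top-ranked images are returned to the client.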
Heart sound is an electric signal affected by several factors during the recording process, which add unwanted information to the signal. Recently, many studies have been interested in noise removal and signal recovery problems. The first step in signal processing is noise removal; many filters have been used and proposed for treating this problem. Here, a Hankel matrix is constructed from a given signal, and the signal is cleaned by removing unwanted information from the Hankel matrix. The first step is detecting the unwanted information by defining a binary operator. This operator is defined under some threshold: the unwanted information is replaced by zero, and the wanted information is kept in the estimated matrix. The resulting matrix
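The Hankel-matrix construction and the thresholding binary operator described above can be sketched as follows; the threshold value and function names are illustrative assumptions, and the paper's actual operator may differ in how the threshold is chosen:

```python
import numpy as np

def hankel_from_signal(x, rows):
    """Build a Hankel matrix whose anti-diagonals carry the signal samples."""
    cols = len(x) - rows + 1
    return np.array([x[i:i + cols] for i in range(rows)])

def threshold_clean(H, thresh):
    """Binary operator under a threshold: entries whose magnitude exceeds
    `thresh` are kept (wanted information); the rest are set to zero."""
    mask = np.abs(H) > thresh      # 1 = wanted, 0 = unwanted
    return H * mask
```

Averaging along the anti-diagonals of the cleaned matrix would then map the estimate back to a one-dimensional signal, the usual final step in Hankel-based denoising.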