Developing an efficient algorithm for automated Magnetic Resonance Imaging (MRI) segmentation that characterizes tumor abnormalities accurately and reproducibly remains in high demand. This paper presents an overview of recent developments and challenges of the energy-minimizing active contour segmentation model, known as the snake, for MRI. The model has been used successfully for contour detection in object recognition, computer vision, and graphics, as well as in biomedical image processing of X-ray, MRI, and ultrasound images. Snakes are deformable curves defined in the image domain that move under the influence of internal forces arising from the curve itself and external forces derived from the image data. We provide a critical appraisal of the current status of semi-automated and automated methods for the segmentation of MR images, together with the important issues and terminology. Advantages and disadvantages of the various segmentation methods, their salient features, and their relevance are also discussed.
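For reference, the classical snake formulation of Kass, Witkin, and Terzopoulos, on which this family of models is based, minimizes an energy functional of the form

\[
E_{\text{snake}} = \int_{0}^{1} \left[ \tfrac{1}{2}\Big( \alpha\,\lvert v_s(s)\rvert^{2} + \beta\,\lvert v_{ss}(s)\rvert^{2} \Big) + E_{\text{ext}}\big(v(s)\big) \right] ds,
\]

where \(v(s) = (x(s), y(s))\) parameterizes the contour, the first two (internal) terms penalize stretching and bending with weights \(\alpha\) and \(\beta\), and the external energy \(E_{\text{ext}}\) is derived from image features such as edges or gradient magnitude.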
Many patients with advanced type 2 diabetes mellitus (T2DM) and all patients with type 1 diabetes mellitus (T1DM) require insulin to keep blood glucose levels in the target range. The most common route of insulin administration is subcutaneous injection. There are many ways to deliver insulin subcutaneously, such as vials and syringes, insulin pens, and insulin pumps. Though subcutaneous delivery is the standard route of insulin administration, it is associated with injection pain, needle phobia, lipodystrophy, noncompliance, and peripheral hyperinsulinemia. Therefore, there is a need to deliver insulin in a minimally invasive or noninvasive manner that is as physiological as possible. Inhaled insulin was the first approved noninvasive and alternative way
Given the importance of ecology and its entry into various fields in general, and into the urban environment in particular, ecological cities have found a wide range of applications at multiple regional and global levels. However, it has repeatedly been noted that there is a state of cognitive confusion and overlap surrounding the term ecology, arising from the diversity of its implementation across several disciplines. Architects, designers, and planners have instilled biological development directly into the formal principles as well as the social structures of ecological cities. Therefore, the research presents a rapid review of the most relevant areas that have dealt with ecological cities, through research and analysis at various levels, from the concept and definition of
Objective(s): To assess the quality of life (QoL) of children aged 8 to less than 13 years with acute lymphocytic leukemia undergoing chemotherapy, and to find out the relationship between the QoL of these children and their illness history.
Methodology: A descriptive study included 40 children with acute lymphocytic leukemia, aged 8 to less than 13 years, at the Hematology Center in Medical City for the period from 4th March 2021 to 1st September 2021. The sample was a non-probability (purposive) sample of children (male and female). A questionnaire consisting of two main parts was used; the first part focused on sociodemographic characteristics
The support vector machine (SVM) is a type of supervised learning model that can be used for classification or regression, depending on the dataset. SVM classifies data points by determining the best separating hyperplane between two or more groups. Working with enormous datasets, on the other hand, can give rise to a variety of issues, including poor accuracy and long computation times. In this research, the SVM was updated by applying several kernel transformations, namely linear, polynomial, radial basis, and multi-layer kernels. The non-linear SVM classification model was illustrated and summarized in an algorithm using kernel tricks. The proposed method was examined using three simulation datasets with different sample
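As a minimal illustration of the kernel choices listed above (not the authors' exact experimental setup), the following Python sketch fits an SVM with linear, polynomial, RBF, and sigmoid (MLP-style, i.e. multi-layer) kernels on a synthetic dataset using scikit-learn; the dataset and hyperparameters are assumptions for demonstration only.

```python
# Sketch: comparing SVM kernels on a synthetic dataset (illustrative only;
# the data and hyperparameters here are assumptions, not the paper's setup).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Synthetic two-class data standing in for the paper's simulation datasets.
X, y = make_classification(n_samples=500, n_features=10, n_informative=5,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)

# 'sigmoid' plays the role of the multi-layer (MLP-like) kernel.
for kernel in ("linear", "poly", "rbf", "sigmoid"):
    clf = SVC(kernel=kernel, C=1.0, gamma="scale")
    clf.fit(X_train, y_train)
    acc = accuracy_score(y_test, clf.predict(X_test))
    print(f"{kernel:8s} kernel -> test accuracy {acc:.3f}")
```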
In this paper, a fusion of K models of full-rank weighted nonnegative tensor factor two-dimensional deconvolution (K-wNTF2D) is proposed to separate acoustic sources that have been mixed in an underdetermined reverberant environment. The model is adapted in an unsupervised manner under the hybrid framework of the generalized expectation-maximization and multiplicative update algorithms. The derivation of the algorithm and the development of the proposed full-rank K-wNTF2D are shown. The algorithm also encodes a set of variable sparsity parameters, derived from a Gibbs distribution, into the K-wNTF2D model. This optimizes each sub-model in K-wNTF2D with the required sparsity to model the time-varying variances of the sources in the s
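The full K-wNTF2D algorithm is beyond the scope of a short snippet, but the multiplicative-update principle it builds on can be illustrated with plain nonnegative matrix factorization; the sketch below is a simplification, not the authors' model, and the data, rank, and iteration count are arbitrary assumptions.

```python
# Sketch: Lee-Seung multiplicative updates for plain NMF (Euclidean cost).
# This only illustrates the multiplicative-update principle; it is NOT the
# full-rank weighted NTF2D (K-wNTF2D) model described in the paper.
import numpy as np

rng = np.random.default_rng(0)
V = np.abs(rng.standard_normal((64, 200)))   # nonnegative "spectrogram-like" data
K = 8                                        # number of components (assumed)
W = np.abs(rng.standard_normal((64, K)))
H = np.abs(rng.standard_normal((K, 200)))
eps = 1e-12                                  # guard against division by zero

for _ in range(200):
    H *= (W.T @ V) / (W.T @ W @ H + eps)     # update activations
    W *= (V @ H.T) / (W @ H @ H.T + eps)     # update basis

print("reconstruction error:", np.linalg.norm(V - W @ H))
```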
Recently, wireless communication environments with high speed and low complexity have become increasingly essential. Free-space optics (FSO) has emerged as a promising solution for providing direct connections between devices in such high-spectrum wireless setups. However, FSO communications are susceptible to weather-induced signal fluctuations, leading to fading and signal weakness at the receiver. To mitigate the effects of these impairments, several mathematical models have been proposed to describe the transition from weak to strong atmospheric turbulence, including the Rayleigh, lognormal, Málaga, Nakagami-m, K, Weibull, negative-exponential, inverse-Gaussian, Gamma-Gamma (G-G), and Fisher-Snedecor F distributions. This paper extensive
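To make one of the listed models concrete, the widely used Gamma-Gamma (G-G) irradiance probability density function can be evaluated numerically as below; the scintillation parameters chosen are arbitrary illustrative values, not taken from the paper.

```python
# Sketch: Gamma-Gamma (G-G) irradiance pdf, one of the turbulence models listed.
# alpha and beta are the large- and small-scale scintillation parameters;
# the values below are arbitrary illustrative choices.
import numpy as np
from scipy.special import gamma, kv        # kv: modified Bessel fn, 2nd kind
from scipy.integrate import trapezoid

def gamma_gamma_pdf(I, alpha, beta):
    coef = 2.0 * (alpha * beta) ** ((alpha + beta) / 2.0) / (gamma(alpha) * gamma(beta))
    return coef * I ** ((alpha + beta) / 2.0 - 1.0) * kv(alpha - beta, 2.0 * np.sqrt(alpha * beta * I))

I = np.linspace(0.01, 3.0, 300)                   # normalized irradiance
pdf = gamma_gamma_pdf(I, alpha=4.0, beta=1.9)     # moderate turbulence (assumed)
print("pdf integrates to ~", trapezoid(pdf, I))   # should be close to 1
```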
In this work, a study and calculation of the normal approach between two bodies, a sphere and a rough flat surface, was conducted with the aid of an image processing technique. Four metals with different work-hardening indices were used as surface specimens, and by capturing images at a resolution of 0.006565 mm/pixel a good estimate of the normal approach could be obtained. The compression tests were carried out in the strength of materials laboratory of the mechanical engineering department, and a Monsanto tensometer was used to conduct the indentation tests. A light-section measuring microscope (BK 70x50) was used to determine the surface texture parameters of the profile, such as the standard deviation of asperity peak heights.
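As an illustration of how such a texture parameter can be extracted from a sampled roughness trace (a generic sketch, not the authors' procedure), the standard deviation of asperity peak heights can be computed as follows; the profile here is synthetic and the peak criterion is an assumption.

```python
# Sketch: standard deviation of asperity peak heights from a roughness profile.
# The profile below is synthetic; a real trace would come from the light-section
# microscope measurements described above.
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(1)
x = np.linspace(0.0, 5.0, 2000)                                         # mm along the surface
profile = 0.002 * np.sin(40 * x) + 0.001 * rng.standard_normal(x.size)  # heights in mm

peaks, _ = find_peaks(profile)               # local maxima taken as asperity peaks
peak_heights = profile[peaks]

sigma_p = np.std(peak_heights, ddof=1)       # std. dev. of asperity peak heights
print(f"peaks: {peaks.size}, sigma of peak heights: {sigma_p:.5f} mm")
```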