The active contour model has been used extensively in image segmentation and analysis. It has been applied to contour detection in object recognition, computer graphics and vision, and biomedical image processing, covering both ordinary images and medical images such as Magnetic Resonance Imaging (MRI), X-ray, and ultrasound. Kass, Witkin, and Terzopoulos introduced the energy-minimizing "Active Contour Models" (also known as snakes) in 1987. A snake is a curve defined in the image domain that is set in motion by internal forces from the curve itself and external forces derived from the image data. The present study proposes a hybrid image segmentation technique to obtain precise segmentation results: the Alpha Shape (α-Shape) is first used to derive the initial contour, which is then refined with a conventional active contour model. Empirical results show that the proposed computational method is highly promising. Experiments indicate that the initial contour can be placed accurately near the target contour and that the target contours can be extracted effectively without any manual contour initialization. The benefits of the new hybrid contour include reduced computational cost, enhanced anti-jamming capability, and a wider range of applications for the snake model.
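A minimal sketch of the hybrid idea described above, assuming scikit-image and the third-party alphashape package: rough foreground points feed an alpha-shape that supplies the initial contour, and a classic snake refines it. The Otsu threshold step, the alpha value, and the sample image are placeholder choices, not the paper's method, and the alpha-shape is assumed to come back as a single polygon.

import numpy as np
import alphashape
from skimage import data
from skimage.filters import gaussian, threshold_otsu
from skimage.segmentation import active_contour

image = data.coins()[60:160, 70:170].astype(float)     # any grayscale image

# 1) Rough foreground points (placeholder step, not the paper's method).
mask = image > threshold_otsu(image)
points = np.column_stack(np.nonzero(mask))[::10]       # subsampled (row, col) samples

# 2) Alpha-shape of the points -> initial contour polygon (alpha may need tuning).
hull = alphashape.alphashape([tuple(p) for p in points], alpha=0.05)
init = np.array(hull.exterior.coords, dtype=float)

# 3) Refine the initial contour with a conventional active contour (snake).
snake = active_contour(gaussian(image, sigma=3), init,
                       alpha=0.015, beta=10.0, gamma=0.001)
print(snake.shape)                                      # refined (N, 2) contour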
A new, simple, sensitive, and accurate spectrophotometric method has been developed for the determination of the drug sulfanilamide (SNA) in pure form and in a synthetic sample. The method is based on the reaction of sulfanilamide with 1,2-naphthoquinone-4-sulphonic acid (NQS) to form an N-alkylamino naphthoquinone through replacement of the sulphonate group of the naphthoquinone sulphonic acid by an amino group. The colored chromogen shows an absorption maximum at 455 nm. The optimum conditions of the condensation reaction were investigated by (1) a univariable method, optimizing the effect of the experimental variables (different bases, reagent concentration, borax concentration, and reaction time), and (2) a central composite design (CCD).
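A small illustrative sketch of how such a determination is usually quantified: a straight-line calibration of absorbance against concentration at the 455 nm maximum, assuming Beer-Lambert behaviour. The concentration and absorbance values below are hypothetical placeholders, not the paper's measured data.

import numpy as np

conc = np.array([2.0, 4.0, 6.0, 8.0, 10.0])        # ug/mL of SNA (hypothetical)
absorb = np.array([0.11, 0.22, 0.34, 0.45, 0.56])  # absorbance at 455 nm (hypothetical)

slope, intercept = np.polyfit(conc, absorb, 1)      # least-squares calibration line
r = np.corrcoef(conc, absorb)[0, 1]                 # correlation coefficient

unknown_A = 0.30                                    # absorbance of an unknown sample
estimated_conc = (unknown_A - intercept) / slope
print(f"slope={slope:.4f}, intercept={intercept:.4f}, r={r:.4f}")
print(f"estimated SNA concentration ~ {estimated_conc:.2f} ug/mL")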
The spectacular film is a type of feature film with specific elements that contribute to heightening the aesthetics of form in its structure. The researcher began studying this type of film by examining the concept of the spectacular film, the history of its development, and its most important stars, and then turned to Indian cinema as represented by Bollywood, which is considered a school for this type of film. The researcher addressed the most important elements that enter into its production and studied the elements that contribute to building the form, including composition, camera movements, lenses, lighting, colors, costumes, and so on, and the influence they have in forming a special aesthetic.
One of the primary problems on the internet is security, especially as computer use increases across all social and business areas. Secure communication over public and private channels is therefore a major goal of researchers. Information hiding is one method of obtaining a secure communication medium and protecting data during transmission.
This research offers a new method that uses two levels of hiding: the first level hides by embedding and addition, while the second level hides by injection. The first level embeds a secret-message bit in the LSB of the FFT and adds one kashida. Subtraction of two random images (STRI) is used as a random-number generator (RNG) to find the positions for hiding within the text. The second level is the injection stage.
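The kashida component of the first level can be illustrated with a short sketch: one bit per eligible Arabic letter, where a '1' is signalled by appending a kashida (U+0640). The set of eligible letters and the sequential position rule below are simplifications for illustration; the paper's own position selection relies on the STRI-based RNG, which is omitted here.

# Sketch of kashida-based bit hiding in Arabic text (cover text is assumed
# to contain no pre-existing kashidas).
KASHIDA = "\u0640"
CONNECTABLE = set("بتثجحخسشصضطظعغفقكلمنهي")  # letters that may take a kashida

def embed_bits(cover: str, bits: str) -> str:
    out, i = [], 0
    for ch in cover:
        out.append(ch)
        if i < len(bits) and ch in CONNECTABLE:
            if bits[i] == "1":
                out.append(KASHIDA)        # '1' -> append a kashida
            i += 1                         # '0' -> leave the letter unchanged
    if i < len(bits):
        raise ValueError("cover text too short for the message")
    return "".join(out)

def extract_bits(stego: str, n_bits: int) -> str:
    bits = []
    chars = list(stego)
    for idx, ch in enumerate(chars):
        if ch in CONNECTABLE and len(bits) < n_bits:
            nxt = chars[idx + 1] if idx + 1 < len(chars) else ""
            bits.append("1" if nxt == KASHIDA else "0")
    return "".join(bits)

stego = embed_bits("بسم الله الرحمن الرحيم", "1011")
print(extract_bits(stego, 4))              # -> 1011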
This paper determines the difference between a first, healthy image and a second, infected image by using logic gates. The proposed algorithm was applied first to binary images, second to grayscale images, and third to color images. The algorithm begins by processing the images: convolution is applied to images extended with zero padding in order to obtain more visual detail and features; the images are then enhanced with an edge-detection filter (the Laplacian operator) and smoothed with a mean filter. To determine the change between the original image and the injured one, logic gates, specifically XOR gates, are applied. Applying the technique to tooth decay, this comparison can locate the injured regions.
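A compact sketch of the comparison pipeline just described: zero-pad both images, enhance edges with a Laplacian, smooth with a mean filter, binarize, and XOR the results to highlight changed (decayed) regions. The filter sizes, the mean-based threshold, and the synthetic test images are assumptions for illustration, not the paper's exact settings.

import numpy as np
from scipy import ndimage

def change_map(healthy: np.ndarray, infected: np.ndarray) -> np.ndarray:
    def prepare(img):
        img = np.pad(img.astype(float), 1, mode="constant")   # zero padding
        edges = ndimage.laplace(img)                           # Laplacian edge enhancement
        smooth = ndimage.uniform_filter(edges, size=3)         # 3x3 mean filter
        return smooth > smooth.mean()                          # simple binarization
    return np.bitwise_xor(prepare(healthy), prepare(infected)) # per-pixel XOR gate

healthy = np.random.randint(0, 256, (64, 64))
infected = healthy.copy()
infected[20:30, 20:30] = 0                                     # simulated decayed region
diff = change_map(healthy, infected)
print(diff.sum(), "pixels flagged as changed")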
Internet technology has revolutionized the landscape of communication technologies in the modern era. However, because the internet is open to the public, communication security cannot be guaranteed. As a result, data concealment approaches have been developed to ensure confidential information sharing, and various methods have emerged to achieve secure data communication via multimedia documents. This study proposes a method, both adaptive and imperceptible, for concealing a secret text in a color image. From an adaptivity perspective, image corners are detected using the Harris corner detection algorithm and utilized as anchor points for picking the optimal hiding regions of interest using Bezier curve interpolation.
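A minimal sketch of the region-selection step, assuming scikit-image for the Harris detector: detected corners act as control points of a Bezier curve, and the sampled curve points suggest where hiding regions could be anchored. Using the corners directly as Bezier control points is an illustrative assumption, not the paper's exact rule.

import numpy as np
from scipy.special import comb
from skimage import data
from skimage.feature import corner_harris, corner_peaks

image = data.astronaut()[..., 1]                        # any grayscale channel
corners = corner_peaks(corner_harris(image), min_distance=20)[:6]

def bezier(control: np.ndarray, n_samples: int = 50) -> np.ndarray:
    """Evaluate a Bezier curve from (k+1) control points via the Bernstein basis."""
    k = len(control) - 1
    t = np.linspace(0.0, 1.0, n_samples)[:, None]
    basis = [comb(k, i) * (1 - t) ** (k - i) * t ** i for i in range(k + 1)]
    return sum(b * p for b, p in zip(basis, control))

curve = bezier(corners.astype(float))
print(curve.shape)                                      # sampled (row, col) points along the curve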
The research studied and analyzed hybrid parallel-series systems of asymmetrical components through different simulation experiments used to estimate the reliability function of those systems, employing the maximum likelihood method as well as the standard Bayes method under both symmetrical and asymmetrical loss functions, with a Rayleigh distribution and an informative prior distribution. The simulation experiments included different sample sizes and default parameters, which were then compared with one another on the basis of mean square error. The standard Bayes method with the entropy loss function proved successful throughout the experimental work in finding the reliability function.
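A small sketch of the simulation idea under a Rayleigh lifetime model: draw samples, estimate the scale by maximum likelihood, evaluate the reliability function R(t) = exp(-t^2 / (2*sigma^2)), and score the estimator by mean square error across replications. The sample sizes, true scale, evaluation point, and replication count are illustrative; the Bayes estimators and the system structure are omitted here.

import numpy as np

rng = np.random.default_rng(0)
true_sigma, t0, n_reps = 2.0, 1.5, 1000

def reliability(t, sigma):
    return np.exp(-t**2 / (2 * sigma**2))

for n in (10, 30, 100):                                  # different sample sizes
    errors = []
    for _ in range(n_reps):
        sample = rng.rayleigh(scale=true_sigma, size=n)
        sigma_mle = np.sqrt(np.sum(sample**2) / (2 * n))  # Rayleigh MLE of the scale
        errors.append((reliability(t0, sigma_mle) - reliability(t0, true_sigma))**2)
    print(f"n={n:4d}  MSE of R(t0) via MLE = {np.mean(errors):.6f}")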
Estimating the semantic similarity between short texts plays an increasingly prominent role in many fields related to text mining and natural language processing, especially with the large increase in the volume of textual data produced daily. Traditional approaches that calculate the degree of similarity between two texts from the words they share do not perform well with short texts, because two similar texts may be written in different terms through the use of synonyms; short texts should therefore be compared semantically. This paper presents a semantic similarity measurement method that combines knowledge-based and corpus-based semantic information to build a semantic network that represents the texts being compared.
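A sketch of one knowledge-based ingredient of such hybrid measures: WordNet path similarity between word pairs, aggregated by a simple best-match average in both directions. The aggregation rule is an illustrative choice, not the paper's formulation, and the corpus-based component is omitted. Requires NLTK with the WordNet data downloaded (nltk.download("wordnet")).

from nltk.corpus import wordnet as wn

def word_sim(w1: str, w2: str) -> float:
    best = 0.0
    for s1 in wn.synsets(w1):
        for s2 in wn.synsets(w2):
            sim = s1.path_similarity(s2)      # None when no connecting path
            if sim is not None and sim > best:
                best = sim
    return best

def text_sim(a: str, b: str) -> float:
    ta, tb = a.lower().split(), b.lower().split()
    score_ab = sum(max(word_sim(w, v) for v in tb) for w in ta) / len(ta)
    score_ba = sum(max(word_sim(w, v) for v in ta) for w in tb) / len(tb)
    return (score_ab + score_ba) / 2

# Synonym-heavy short texts still score as similar despite sharing few words.
print(text_sim("the car is fast", "the automobile is quick"))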