Identifying people by their ears has recently received considerable attention in the literature. Accurate segmentation of the ear region is vital for making successful person-identification decisions. This paper presents an effective approach for ear region segmentation from color ear images. First, the RGB color model was converted to the HSV color model. Second, thresholding was used to segment the ear region. Finally, morphological operations were applied to remove small islands and fill gaps. The proposed method was tested on a database of 105 ear images taken from the right side of 105 subjects. The experimental results of the proposed approach on a variety of ear images revealed that this approach …
Contour extraction from two-dimensional echocardiographic images has been a challenge in digital image processing. This is essentially due to the heavy noise and poor quality of these images, and to artifacts such as papillary muscles, intra-cavity structures such as chordae, and valves that can interfere with endocardial border tracking. In this paper, we present a technique to extract the contours of heart boundaries from a sequence of echocardiographic images, starting with pre-processing to reduce noise and produce better image quality. By pre-processing the images, unclear edges are avoided, and accurate detection of both the heart boundary and the movement of the heart valves can be obtained.
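As an illustration of the kind of pre-processing step described (noise reduction before border tracking), a 3x3 median filter is a common choice for heavy, impulse-like noise. This sketch is an assumption about the pre-processing stage, not the paper's actual pipeline:

```python
import numpy as np

def median_filter3(img):
    """3x3 median filter with edge replication, pure numpy.
    A common pre-processing step to suppress impulse-like noise in
    echocardiographic frames before edge detection and tracking."""
    h, w = img.shape
    padded = np.pad(img, 1, mode='edge')
    # Collect the 9 shifted views of the image, one per window position.
    windows = np.stack([padded[i:i + h, j:j + w]
                        for i in range(3) for j in range(3)], axis=0)
    return np.median(windows, axis=0)
```

A single bright outlier pixel is completely removed by this filter, while constant regions pass through unchanged, which is why median filtering preserves edges better than simple averaging.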
Texture synthesis using genetic algorithms, proposed in previous research, is one way to synthesize texture quickly and easily. In genetic texture-synthesis algorithms, the chromosome consists of random blocks selected manually by the user. However, this method of selection depends heavily on the user's experience; hence, a wrong selection of blocks will greatly degrade the synthesized texture. In this paper, a new method is suggested for selecting the blocks automatically, without user participation. The results show that this method of selection eliminates some of the blending caused by the previous manual selection method.
Ultrasound imaging is often preferred over other medical imaging modalities because it is non-invasive, non-ionizing, and low-cost. However, the main weakness of medical ultrasound imaging is poor image quality, due to the presence of speckle noise and blurring. Speckle is a characteristic phenomenon in ultrasound images that can be described as random multiplicative noise; its occurrence is often undesirable, since it hampers human interpretation and diagnosis. Blurring is a form of bandwidth reduction of an ideal image owing to the imperfect image-formation process. Image denoising involves processing the image data to produce a visually high-quality image. Denoising algorithms may be classified into two categories …
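To make the multiplicative speckle model concrete, the following sketch simulates unit-mean speckle (observed = clean x noise) and applies a simple Lee-style local-statistics filter. The gamma noise parameters and the filter choice are illustrative assumptions, not the methods evaluated in the paper:

```python
import numpy as np

def lee_filter3(img, noise_var):
    """Minimal 3x3 Lee-style filter sketch: pull each pixel toward its
    local mean with gain var/(var + noise_var), so flat regions are
    smoothed strongly while high-variance edges are mostly preserved."""
    h, w = img.shape
    padded = np.pad(img, 1, mode='edge')
    windows = np.stack([padded[i:i + h, j:j + w]
                        for i in range(3) for j in range(3)], axis=0)
    mean = windows.mean(axis=0)
    var = windows.var(axis=0)
    gain = var / (var + noise_var)
    return mean + gain * (img - mean)

# Multiplicative speckle model: observed = clean * unit-mean random noise.
rng = np.random.default_rng(0)
clean = np.full((64, 64), 100.0)
noise = rng.gamma(shape=16.0, scale=1.0 / 16.0, size=clean.shape)  # mean ~1
speckled = clean * noise
denoised = lee_filter3(speckled, noise_var=speckled.var())
```

A common alternative, since the noise is multiplicative, is to take the logarithm of the image first, which turns the speckle into additive noise that standard denoisers can handle.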
Deep learning (DL) plays a significant role in several tasks, especially classification and prediction. Classification tasks can be achieved efficiently via convolutional neural networks (CNN) given a huge dataset, while recurrent neural networks (RNN) can perform prediction tasks thanks to their ability to remember time-series data. In this paper, three models are proposed to evaluate classification and prediction tasks on four datasets (two for each task). These models are a CNN and two RNN variants: Long Short-Term Memory (LSTM) and the Gated Recurrent Unit (GRU). Each model is employed in turn on the two tasks to draw a road map of deep learning models …
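For reference, one common formulation of the GRU cell mentioned above can be written in a few lines of numpy: an update gate z, a reset gate r, a candidate state, and a convex blend of old and candidate states. This is a generic sketch of the cell equations under one standard convention, not the paper's trained models:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h, params):
    """One GRU time step (one common convention).
    params = (Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh)."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(x @ Wz + h @ Uz + bz)            # update gate
    r = sigmoid(x @ Wr + h @ Ur + br)            # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh + bh)  # candidate state
    return (1.0 - z) * h + z * h_tilde           # blended new state
```

The reset gate r decides how much of the previous state enters the candidate, while z interpolates between keeping the old state and adopting the candidate; this gating is what lets the GRU remember time-series context over many steps.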
Text documents are unstructured and high-dimensional. Effective feature selection is required to select the most important and significant features from the sparse feature space. Thus, this paper proposes an embedded feature-selection technique based on Term Frequency-Inverse Document Frequency (TF-IDF) and Support Vector Machine-Recursive Feature Elimination (SVM-RFE) for unstructured, high-dimensional text classification. This technique has the ability to measure feature importance in a high-dimensional text document. In addition, it aims to increase the efficiency of feature selection, and hence obtain a promising text-classification accuracy. TF-IDF acts as a filter approach that measures feature importance …
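The TF-IDF filter stage described above can be sketched in a few lines of pure Python (the SVM-RFE stage is omitted). This minimal implementation, using tf = term count / document length and idf = log(N / document frequency), is illustrative; the paper's exact weighting variant may differ:

```python
import math
from collections import Counter

def tfidf(docs):
    """Minimal TF-IDF sketch over tokenized documents (lists of terms).
    Returns one {term: weight} dict per document."""
    n = len(docs)
    df = Counter()                    # document frequency per term
    for doc in docs:
        df.update(set(doc))
    weights = []
    for doc in docs:
        tf = Counter(doc)
        length = len(doc)
        weights.append({t: (c / length) * math.log(n / df[t])
                        for t, c in tf.items()})
    return weights
```

Note that a term appearing in every document gets idf = log(1) = 0, so it is filtered out automatically, which is exactly the "uninformative feature" behavior a filter approach wants.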
A -set in the projective line is a set of projectively distinct points. By the fundamental theorem of the projective line, all -sets are projectively equivalent. In this research, the inequivalent -sets in have been computed, and each -set has been classified into its -sets, where . Also, the has been split into two distinct classes of -sets, equivalent and inequivalent.
Analysis of image content is important in image classification, identification, retrieval, and recognition. The medical image datasets used for content-based medical image retrieval (CBMIR) are large, and retrieval over them is limited by high computational costs and poor performance. The aim of the proposed method is to enhance image retrieval and classification by using a genetic algorithm (GA) to choose a reduced set of features and dimensions. The process consists of three stages. In the first stage, two algorithms are applied to extract the important features: the first is a contrast-enhancement method and the second is a Discrete Cosine Transform algorithm. In the next stage, we used datasets of the medical …
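GA-based feature selection of the kind described can be sketched as a search over bitmasks, where each bit marks whether a feature is kept. This toy example (truncation selection, single-point crossover, bit-flip mutation) is an illustrative assumption about the GA design; the abstract does not specify the paper's operators or fitness function:

```python
import random

def ga_select(n_features, fitness, pop_size=20, gens=30, p_mut=0.05, seed=0):
    """Toy GA feature-selection sketch. Individuals are 0/1 masks over
    features; the top half survives each generation (truncation selection)
    and children are produced by single-point crossover plus mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)]
           for _ in range(pop_size)]
    for _ in range(gens):
        elite = sorted(pop, key=fitness, reverse=True)[: pop_size // 2]
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n_features)      # single-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (rng.random() < p_mut)   # bit-flip mutation
                     for bit in child]
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)
```

In a CBMIR setting, `fitness` would score a mask by retrieval or classification accuracy over the reduced feature set (e.g. the DCT coefficients kept), so the GA trades dimensionality against performance.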
This research study examines the technical features of Islamic and Chinese ceramics. The study consists of four chapters. The first chapter presents the general framework of the research, containing the problem, which poses the following question: what is the extent of the influence of artistic features on Islamic and Chinese porcelain, and are there aesthetic, intellectual, and ideological dimensions in Islamic and Chinese porcelain? The importance of the research lies in its being a qualitative study that adds a scientific treatment of the theme of artistic features in Islamic and Chinese porcelain. Among the objectives of the study was the identification of the technical features of Islamic and Chinese ceramics. The study covers the period of the ninth and tenth centuries AD, and …