Melanoma, a highly malignant form of skin cancer, affects individuals of all genders and is associated with high mortality rates, especially in advanced stages. Tele-dermatology has emerged as an effective diagnostic approach for skin lesions and is particularly beneficial in rural areas with limited access to dermatologists. However, accurately and efficiently segmenting melanoma remains a challenging task due to the significant diversity in the morphology, pigmentation, and dimensions of cutaneous nevi. To address this challenge, we propose a novel approach, DenseUNet-169 with a dilated-convolution encoder-decoder, for automatic segmentation of RGB dermoscopic images. By incorporating dilated convolution, our model enlarges the receptive field of the kernels without increasing the number of parameters. Additionally, we use a Copy and Concatenation Attention Block (CCAB) for robust feature computation. To evaluate the performance of the proposed framework, we used the International Skin Imaging Collaboration (ISIC) 2017 dataset. The experimental results demonstrate the reliability and effectiveness of the proposed approach compared to existing methodologies: the framework achieved high accuracy (98.38%), precision (96.07%), recall (94.32%), Dice score (95.07%), and Jaccard score (90.45%), outperforming current techniques.
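To make the dilation argument concrete, the following minimal PyTorch sketch (a generic illustration, not the authors' DenseUNet-169 code) compares two 3x3 convolutions that differ only in dilation rate: the dilated one covers a 5x5 effective receptive field with exactly the same number of weights.

```python
import torch
import torch.nn as nn

# Two 3x3 convolutions with identical weight counts; only the dilation differs.
conv_plain = nn.Conv2d(3, 16, kernel_size=3, padding=1, dilation=1)
conv_dilated = nn.Conv2d(3, 16, kernel_size=3, padding=2, dilation=2)

n_plain = sum(p.numel() for p in conv_plain.parameters())
n_dilated = sum(p.numel() for p in conv_dilated.parameters())
assert n_plain == n_dilated  # dilation adds no parameters

x = torch.randn(1, 3, 64, 64)
# Padding is chosen so both preserve spatial size; the dilated kernel's
# effective extent is dilation * (k - 1) + 1 = 5 instead of 3.
print(conv_plain(x).shape, conv_dilated(x).shape)
```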
In this research, the Fuzzy Analytic Hierarchy Process (Fuzzy AHP), one of the multi-criteria decision-making techniques, is applied to evaluate the criteria for urban planning projects; the project of developing the master plan of Al-Muqdadiyah city up to 2035 was chosen as a case study. The researcher prepared a list of criteria, drawing on the criteria of the authorized departments and on previous research, in order to choose the optimal master plan according to these criteria. This research aims to employ the foundations of the Fuzzy AHP technique to evaluate urban planning criteria precisely and flexibly. The data analysis is based on a sample of individuals who are specialists in this field.
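As an illustration of the arithmetic behind Fuzzy AHP, the following sketch implements one common variant (Buckley's fuzzy geometric-mean method) on a hypothetical three-criterion pairwise comparison matrix; the criteria, the judgments, and the centroid defuzzification are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Pairwise comparisons as triangular fuzzy numbers (l, m, u) for three
# hypothetical planning criteria (e.g., accessibility, land use, services).
M = np.array([
    [[1, 1, 1],       [2, 3, 4],     [4, 5, 6]],
    [[1/4, 1/3, 1/2], [1, 1, 1],     [1, 2, 3]],
    [[1/6, 1/5, 1/4], [1/3, 1/2, 1], [1, 1, 1]],
])

# Fuzzy geometric mean per criterion, componentwise: r_i = (prod_j a_ij)^(1/n).
r = np.prod(M, axis=1) ** (1 / M.shape[0])

# Fuzzy weight w_i = r_i * (sum_j r_j)^(-1); inverting a TFN swaps l and u.
s = r.sum(axis=0)          # total fuzzy weight (l, m, u)
w = r * (1 / s[::-1])      # divide by the reversed bounds

# Defuzzify by the centroid (mean of l, m, u) and normalize to crisp weights.
crisp = w.mean(axis=1)
print(crisp / crisp.sum())
```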
Accurate localization of the basic components of the human face (i.e., eyebrows, eyes, nose, mouth, etc.) in images is an important step in face-processing techniques such as face tracking, facial expression recognition, and face recognition. However, it is a challenging task due to variations in scale, orientation, pose, facial expression, partial occlusion, and lighting conditions. In the current paper, a scheme comprising a three-stage hierarchical method for facial component extraction is presented; it works regardless of illumination variance. Adaptive contrast enhancement methods such as gamma correction and contrast stretching are used to simulate the variance in lighting conditions among images.
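The two enhancement operations named above can be sketched in a few lines; the gamma value and the synthetic test image below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def gamma_correction(img, gamma=0.6):
    """Power-law intensity transform on a uint8 grayscale image."""
    norm = img.astype(np.float64) / 255.0
    return np.clip(255.0 * norm ** gamma, 0, 255).astype(np.uint8)

def contrast_stretch(img):
    """Linearly map the image's observed intensity range onto [0, 255]."""
    lo, hi = int(img.min()), int(img.max())
    if hi == lo:                      # flat image: nothing to stretch
        return img.copy()
    return ((img.astype(np.float64) - lo) * 255.0 / (hi - lo)).astype(np.uint8)

# Synthetic low-contrast "face crop" standing in for a real image.
img = (np.random.rand(64, 64) * 180 + 30).astype(np.uint8)
print(gamma_correction(img).mean(), contrast_stretch(img).std())
```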
This paper proposes a novel method for generating true random numbers (TRNs) using electromechanical switches. The proposed generator is implemented on an FPGA board. The system exploits the phenomenon of electromechanical switch bounce to produce a randomly fluctuating signal that triggers a counter, which in turn generates a binary random number. Compared to other true random number generation methods, the proposed approach offers a high degree of randomness with a simple circuit that can be built from off-the-shelf components. The system is implemented using a commercial relay circuit connected to an FPGA board that processes and records the generated random sequences, and statistical tests were applied to the generated sequences.
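Since the design itself is a hardware circuit, the following Python sketch only simulates the sampling idea: a fast free-running counter is latched at each unpredictably timed bounce edge, and the counter's least-significant bit is kept as one random bit. The exponential bounce-timing model and all constants are assumptions made for illustration.

```python
import random

def bounce_trng_bits(n_bits, clock_hz=50_000_000, mean_bounce_s=0.5e-3):
    """Simulate latching a free-running counter on relay-bounce edges."""
    bits, t = [], 0.0
    for _ in range(n_bits):
        # Jittery interval until the next contact bounce (illustrative model).
        t += random.expovariate(1.0 / mean_bounce_s)
        counter = int(t * clock_hz) & 0xFFFF   # 16-bit free-running counter
        bits.append(counter & 1)               # LSB is the least predictable
    return bits

bits = bounce_trng_bits(1000)
print(sum(bits) / len(bits))  # should be close to 0.5 for a balanced source
```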
Time series have gained great importance and are widely applied in economic, financial, health, and social fields, where they are used to analyze changes in a phenomenon and forecast its future. One of the most important black-box models is the ARMAX model, a mixed model combining autoregression and moving averages with exogenous inputs. Modeling proceeds in several stages: determining the order of the model, estimating its parameters, and then forecasting, here to predict the amount of compensation to be granted to workers so that the Fund can meet its future obligations. The parameters were estimated using the ordinary least squares method.
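As a concrete sketch of the model class described above, the following fits an ARMAX(1, 1) to synthetic data with statsmodels (a SARIMAX with no differencing and an exogenous regressor is an ARMAX); the data-generating values are illustrative, since the Fund's compensation series is not public.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)                 # exogenous input series
eps = rng.normal(scale=0.5, size=n)
y = np.zeros(n)
for t in range(1, n):
    # ARMAX(1, 1): AR(1) term + exogenous effect + MA(1) noise.
    y[t] = 0.6 * y[t - 1] + 1.5 * x[t] + eps[t] + 0.3 * eps[t - 1]

model = sm.tsa.statespace.SARIMAX(y, exog=x, order=(1, 0, 1))
fit = model.fit(disp=False)
print(fit.params)                                   # AR, MA, exog coefficients
print(fit.forecast(steps=5, exog=np.zeros((5, 1)))) # 5-step-ahead forecast
```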
Water scarcity is one of the most important problems facing humanity in fields such as economics, industry, agriculture, and tourism, and it may push people to use low-quality water such as industrial wastewater. Applying chemical compounds to remove heavy metals such as cadmium is an environmentally harmful approach. It is well known that heavy metals such as cadmium can cause serious problems when present in water, from which they invade the soil, plants, and the human food chain. In such circumstances, people may be forced to use low-quality water for irrigation. Applying natural materials instead of chemicals to remove cadmium from polluted water is an environmentally friendly approach, and attention in this research work was therefore directed to such materials.
In this research, we discuss parameter estimation and variable selection in the Tobit quantile regression model in the presence of multicollinearity. We use the elastic net, an important technique for dealing with both multicollinearity and variable selection. Based on the data, we propose a Bayesian hierarchical Tobit model with four levels of prior distributions. We assume that both tuning parameters are random variables and estimate them together with the other unknown parameters of the model. A simulation study is used to demonstrate the efficiency of the proposed method, and we compare our approach with Alhamzwi (2014) and standard quantile regression. The results illustrate the efficiency of our approach relative to these alternatives.
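For orientation, the following sketch shows the penalized-loss analogue of the model above rather than the paper's Bayesian sampler: Powell-style left-censored (Tobit) quantile regression with an elastic net penalty, solved by a generic derivative-free optimizer. The penalty weights, censoring point, and data are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def check_loss(u, tau):
    """Quantile (pinball) loss rho_tau(u) = u * (tau - 1{u < 0})."""
    return np.maximum(tau * u, (tau - 1) * u)

def tobit_qr_enet(beta, X, y, tau, lam1, lam2):
    # Left-censoring at zero: fitted values are max(X @ beta, 0).
    resid = y - np.maximum(X @ beta, 0.0)
    return (check_loss(resid, tau).sum()
            + lam1 * np.abs(beta).sum()     # lasso component
            + lam2 * (beta ** 2).sum())     # ridge component

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
beta_true = np.array([1.5, 0.0, -1.0, 0.0, 0.5])
y = np.maximum(X @ beta_true + rng.normal(size=100), 0.0)  # censored outcome

res = minimize(tobit_qr_enet, np.zeros(5),
               args=(X, y, 0.5, 0.1, 0.1), method="Nelder-Mead")
print(res.x.round(2))   # sparse-ish estimate near beta_true
```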
This article estimates the partially linear model using two methods, the wavelet smoother and the kernel smoother. Simulation experiments are used to study small-sample behavior under different functions, sample sizes, and variances. The results show that the wavelet smoother performs best according to the mean average squared error criterion in all cases considered.
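To fix ideas, here is a minimal sketch of the kernel-smoothing side of that comparison: a Nadaraya-Watson estimator with a Gaussian kernel scored by squared error on a synthetic partially linear setup. The bandwidth, the test function, and the assumption that the linear coefficient is known are all simplifications for illustration.

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, h=0.1):
    """Gaussian-kernel Nadaraya-Watson regression estimate at x_eval."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / h) ** 2)
    return (w * y_train).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 1, 150))
z = rng.normal(size=150)                        # covariate of the linear part
# Partially linear model: y = beta * z + g(x) + noise, with g = sin(2*pi*x).
y = 2.0 * z + np.sin(2 * np.pi * x) + rng.normal(scale=0.2, size=150)

# With beta assumed known here, smooth the residual to recover g.
g_hat = nadaraya_watson(x, y - 2.0 * z, x)
print("MSE of kernel smoother:", np.mean((g_hat - np.sin(2 * np.pi * x)) ** 2))
```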