Skin cancer is one of the most serious health problems worldwide because of its high incidence compared with other types of cancer. Melanoma and non-melanoma are the two most common kinds of skin cancer. One of the most difficult problems in medical image processing is the automatic detection of skin cancer, in which skin melanoma is classified as either benign or malignant. Artifacts in dermoscopic images hinder the analysis and decrease the achievable precision. In this research work, an automatic technique comprising segmentation and classification is proposed. First, a pre-processing step based on the DullRazor tool is used for hair removal, and a semi-supervised mean-shift algorithm is used to segment the affected areas of the skin cancer images. Finally, the segmented images are fed to a deep learning classifier, Deep Forest, for prediction of skin cancer. Experiments are carried out on two publicly available datasets, ISIC-2019 and HAM10000, to analyse both segmentation and classification. The outcomes clearly verify that the proposed model achieves better performance than existing deep learning techniques.
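A minimal sketch of how such a pipeline could be wired together is given below; it is not the authors' implementation. The DullRazor-style hair removal is approximated with a black-hat filter plus inpainting, scikit-learn's MeanShift stands in for the semi-supervised mean-shift segmentation, and a RandomForestClassifier is used only as an illustrative stand-in for Deep Forest, so all parameter values here are assumptions.

```python
# Illustrative sketch only: approximates the described pipeline with common
# open-source tools (OpenCV, scikit-learn); not the authors' implementation.
import cv2
import numpy as np
from sklearn.cluster import MeanShift, estimate_bandwidth
from sklearn.ensemble import RandomForestClassifier  # stand-in for Deep Forest

def remove_hair(image_bgr):
    """DullRazor-style hair removal: detect dark hair strands with a
    black-hat filter, then inpaint the detected pixels."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (17, 17))
    blackhat = cv2.morphologyEx(gray, cv2.MORPH_BLACKHAT, kernel)
    _, mask = cv2.threshold(blackhat, 10, 255, cv2.THRESH_BINARY)
    return cv2.inpaint(image_bgr, mask, 3, cv2.INPAINT_TELEA)

def segment_lesion(image_bgr):
    """Mean-shift clustering of pixel colours; the cluster labels serve as a
    coarse lesion/background segmentation (resize the image first for speed)."""
    pixels = image_bgr.reshape(-1, 3).astype(float)
    bandwidth = estimate_bandwidth(pixels, quantile=0.1, n_samples=500)
    labels = MeanShift(bandwidth=bandwidth, bin_seeding=True).fit_predict(pixels)
    return labels.reshape(image_bgr.shape[:2])

def extract_features(image_bgr, seg_map):
    """Toy feature vector: mean colour of the largest segment."""
    largest = np.argmax(np.bincount(seg_map.ravel()))
    return image_bgr[seg_map == largest].mean(axis=0)

# Training step (X = features of segmented lesions, y = benign/malignant labels):
# clf = RandomForestClassifier(n_estimators=200).fit(X, y)
```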
Nowadays, the power industry is changing from a centralized and vertically integrated form into regional, competitive, and functionally separate units. This is done with the aim of increasing efficiency through better management and better use of existing equipment, and of lowering the price of electricity for all types of customers while retaining a reliable system. This research aims to solve the optimal power flow (OPF) problem. OPF is used to minimize the total generation fuel cost function and may be formulated as a single-objective or multi-objective problem. In this thesis, an attempt is made to minimize the objective function while keeping the voltage magnitudes of all load buses, real outp
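As an illustration of the kind of objective involved (the exact formulation used in this work is not reproduced here), a widely used OPF objective is the quadratic generator fuel-cost function minimized subject to voltage and generation limits:

```latex
\min_{P_G} \; F(P_G) = \sum_{i=1}^{N_G} \left( a_i + b_i P_{G_i} + c_i P_{G_i}^{2} \right)
\quad \text{s.t.} \quad
P_{G_i}^{\min} \le P_{G_i} \le P_{G_i}^{\max}, \qquad
V_m^{\min} \le V_m \le V_m^{\max},
```

where a_i, b_i, c_i are the cost coefficients of generator i, P_{G_i} is its real power output, and V_m is the voltage magnitude of load bus m.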
In the present research, we study the deposition of naturally occurring radioactive elements, particularly radioactive radon gas, in body parts of organisms that are of direct relevance to human life in the city of Baghdad. Samples were collected from the bones and skin of several kinds of birds and chickens, based on the principle that radioactive elements always concentrate in the bones. The solid-state nuclear track detector (CR-39) was used with the cylindrical diffusion technique. The results indicated that the largest radon concentration was found in the bones of the tapered seagull, at (625 ± 37) Bq.cm-3, and the lowest concentration of radon gas was in the chicken bones of Al-kafeel, at (105 ± 10) Bq.c
Background: Recurrent breast cancer is cancer that comes back after initial treatment. Risk factors for recurrence include lymph node involvement, larger tumor size, positive or close tumor margins, lack of radiation treatment following lumpectomy, younger age, and inflammatory breast cancer.
Objective: Assess the rate of recurrence of early breast cancer in Iraqi female patients in relation to certain risk factors.
Patients and methods: A prospective study was conducted on 100 consecutive female patients with stage I and stage II breast cancer, treated by mastectomy and axillary dissection by the same team. Patients were assessed postoperatively every three months, and recurrences were detected by physical examination and ultr
With the technological development of societies, cloud computing has become one of the most important technologies. It provides users with software, hardware, and platforms as remote services over the Internet. The increasing number of cloud users has created a critical problem: how clients receive cloud services when the cloud is unstable and cannot provide the required services, so that delays occur. Therefore, an algorithm is proposed to provide high efficiency and stability, because all existing tasks must operate without delay. The proposed system is an enhanced shortest job first (ESJF) algorithm using a time slice, which works by taking the task with the shortest time first and then the l
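The general idea, shortest-job-first ordering combined with a round-robin style time slice, can be sketched as follows; the quantum value and task format are illustrative assumptions, not the paper's exact ESJF specification.

```python
# Illustrative sketch of shortest-job-first scheduling with a time slice;
# the quantum and task representation are assumptions, not the exact ESJF.
from collections import deque

def esjf_schedule(tasks, quantum=2):
    """tasks: list of (task_id, burst_time). Returns the execution order
    as (task_id, time_run) slices."""
    # Sort by burst time so the shortest task is served first.
    queue = deque(sorted(tasks, key=lambda t: t[1]))
    timeline = []
    while queue:
        task_id, remaining = queue.popleft()
        run = min(quantum, remaining)
        timeline.append((task_id, run))
        remaining -= run
        if remaining > 0:
            # Re-insert the unfinished task and keep the queue ordered by
            # remaining time (shortest first).
            queue.append((task_id, remaining))
            queue = deque(sorted(queue, key=lambda t: t[1]))
    return timeline

print(esjf_schedule([("T1", 5), ("T2", 2), ("T3", 8)]))
```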
Solid waste is a major issue in today's world and can contribute to pollution and the spread of vector-borne diseases. Because of its complicated nonlinear processes, this problem is difficult to model and optimize using traditional methods. In this study, a mathematical model was developed to optimize the cost of solid waste recycling and management. In the optimization phase, the salp swarm algorithm (SSA) is utilized to determine the levels of discarded and reclaimed solid waste. SSA is a recent optimization technique that finds the optimal solution of a mathematical relationship based on leaders and followers. It takes many random solutions, as well as their outward or inward fluctuations, t
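For reference, the core leader and follower position updates of the standard SSA can be sketched as below; the sphere objective and parameter choices are placeholders, not the paper's waste-management cost model.

```python
# Illustrative sketch of the standard salp swarm algorithm (SSA) updates;
# the sphere objective below is a placeholder, not the paper's cost model.
import numpy as np

def ssa_minimize(objective, lb, ub, n_salps=30, n_iter=200, seed=0):
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    dim = len(lb)
    salps = rng.uniform(lb, ub, size=(n_salps, dim))
    fitness = np.apply_along_axis(objective, 1, salps)
    best, best_fit = salps[fitness.argmin()].copy(), fitness.min()  # food source F
    for l in range(1, n_iter + 1):
        c1 = 2 * np.exp(-(4 * l / n_iter) ** 2)     # exploration/exploitation balance
        for i in range(n_salps):
            if i == 0:                               # leader salp moves around F
                c2, c3 = rng.random(dim), rng.random(dim)
                step = c1 * ((ub - lb) * c2 + lb)
                salps[i] = np.where(c3 >= 0.5, best + step, best - step)
            else:                                    # followers track the salp ahead
                salps[i] = 0.5 * (salps[i] + salps[i - 1])
            salps[i] = np.clip(salps[i], lb, ub)
            f = objective(salps[i])
            if f < best_fit:
                best, best_fit = salps[i].copy(), f
    return best, best_fit

# Placeholder usage: minimize the sphere function on [-10, 10]^3.
print(ssa_minimize(lambda x: float(np.sum(x ** 2)), [-10] * 3, [10] * 3))
```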
Swarm intelligence and evolutionary methods are commonly utilized by researchers to solve difficult combinatorial and Non-Deterministic Polynomial (NP) problems. The N-Queen problem is a combinatorial problem that becomes intractable for large values of n and is therefore placed in the NP class of problems. In the present study, a solution to the N-Queen problem is suggested on the basis of the Meerkat Clan Algorithm (MCA). The n-Queen problem is a generalized form of the 8-Queen problem, in which the aim is to place 8 queens so that no queen can attack another using the standard moves of the chess queen. The Meerkat Clan environm
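While the MCA operators themselves are not detailed here, the fitness that such metaheuristics typically optimize for the N-Queen problem is the number of attacking queen pairs under a permutation encoding (one queen per column), for example:

```python
# Illustrative N-Queen fitness under a permutation encoding: board[c] is the
# row of the queen in column c, so only diagonal attacks need to be counted.
def attacking_pairs(board):
    n = len(board)
    conflicts = 0
    for c1 in range(n):
        for c2 in range(c1 + 1, n):
            if abs(board[c1] - board[c2]) == c2 - c1:  # shared diagonal
                conflicts += 1
    return conflicts  # a valid solution has fitness 0

# Example: a valid 8-Queen placement scores 0 conflicts.
print(attacking_pairs([0, 4, 7, 5, 2, 6, 1, 3]))
```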
An Optimal Algorithm for HTML Page Building Process
Krawtchouk polynomials (KPs) and their moments are promising techniques for applications in information theory, coding theory, and signal processing, owing to the special capabilities of KPs in feature extraction and classification. The main challenge in existing KP recurrence algorithms is numerical error, which arises during the computation of the coefficients at large polynomial sizes, particularly when the KP parameter (p) deviates from 0.5 toward 0 or 1. To this end, this paper proposes a new recurrence relation to compute the coefficients of KPs of high order. In particular, the paper discusses the development of a new algorithm and presents a new mathematical model for computing the
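For context (the paper's new recurrence is not reproduced here), the classical three-term recurrence in the order n for the Krawtchouk polynomials K_n(x; p, N) is:

```latex
-x\,K_n(x;p,N) = p\,(N-n)\,K_{n+1}(x;p,N)
 - \bigl[p\,(N-n) + n\,(1-p)\bigr]\,K_n(x;p,N)
 + n\,(1-p)\,K_{n-1}(x;p,N),
```

with K_0(x;p,N) = 1 and K_1(x;p,N) = 1 - x/(pN). Signal-processing applications usually work with the weighted, normalized form of these polynomials, whose coefficients become numerically unstable at high orders and extreme values of p, which is the problem the proposed algorithm targets.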