Kidney tumors occur in several types with distinct characteristics and remain a challenging problem in biomedicine. Detecting and classifying a tumor at an early stage is critical so that appropriate treatment can be planned, and accurate estimation of kidney tumor volume is essential for clinical diagnosis and therapeutic decisions in renal disease. The main objective of this research is to use Computer-Aided Diagnosis (CAD) algorithms to support the early detection of kidney tumors, addressing the difficulty of accurate tumor volume estimation caused by the extensive variation in kidney shape, size, and orientation across subjects.
In this paper, we implement an automated segmentation method for gray-level CT images. Segmentation is performed with Fuzzy C-Means (FCM) clustering to detect and delineate the kidney region in CT images. The proposed method starts with pre-processing of the kidney CT image to separate the kidney from the abdominal CT, enhance its contrast, and remove undesired noise, making the image suitable for further processing. The segmented CT images are then used to extract the tumor region from the kidney image. Defining the tumor volume (size) is not straightforward, because the 2D tumor cross-sections in the CT slices are irregular. The area in each slice is therefore taken from the convex hull of the tumor region, and to combine these per-slice areas into a volume we use a frustum model for the fragmented data.
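The frustum-based volume estimate can be sketched as follows: each pair of consecutive slices bounds a frustum whose volume follows from the two cross-sectional areas and the slice spacing. The per-slice areas and spacing below are hypothetical illustration values, not data from the paper.

```python
import math

def frustum_volume(a1, a2, h):
    """Volume of a frustum bounded by two parallel cross-sections of
    areas a1 and a2 (e.g. mm^2) separated by distance h (e.g. mm)."""
    return h / 3.0 * (a1 + a2 + math.sqrt(a1 * a2))

def tumor_volume(slice_areas, slice_spacing):
    """Sum the frustum volumes between every pair of consecutive slices."""
    return sum(frustum_volume(a, b, slice_spacing)
               for a, b in zip(slice_areas, slice_areas[1:]))

# Hypothetical convex-hull areas (mm^2) of the tumor in four CT slices,
# with 2.5 mm spacing between slices.
areas = [120.0, 150.0, 140.0, 90.0]
volume = tumor_volume(areas, 2.5)
```

When both areas are equal the frustum degenerates to a prism, so the formula reduces to area times spacing, which is a convenient sanity check.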
Grain size and shape are important yield indicators, and the width of the wheat grain suggests that the visual markers of grain weight deserve re-examination. A digital vernier caliper was used to measure length, width, and thickness. The data set consisted of 1296 wheat grains, with measurements recorded for each grain; the average weight (We) of each group of twenty-four grains was measured and recorded. The measured length (L), width (W), thickness (T), weight (We), and volume (V) were used to develop two mathematical models fitted by multiple regression. The results of the weight model demonstrated that the length and width of the grain
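A weight model of the kind described can be fitted by ordinary least squares on the measured dimensions. The measurements below are hypothetical illustration values, and the model form (intercept plus linear terms in L and W) is an assumption, not the paper's reported equation.

```python
import numpy as np

# Hypothetical grain measurements: length L (mm), width W (mm), weight We (g).
L = np.array([6.1, 6.4, 5.9, 6.8, 6.2, 7.0])
W = np.array([3.2, 3.4, 3.0, 3.6, 3.3, 3.7])
We = np.array([0.038, 0.042, 0.035, 0.047, 0.040, 0.049])

# Design matrix with an intercept column: We ~ b0 + b1*L + b2*W.
X = np.column_stack([np.ones_like(L), L, W])
coeffs, *_ = np.linalg.lstsq(X, We, rcond=None)
predicted = X @ coeffs
```

The same pattern extends directly to the full set of predictors (T and V) by adding columns to the design matrix.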
This research estimates the mass transfer coefficient in a batch packed-bed distillation column as a function of the physical properties: the liquid-to-vapour molar rate ratio (L/V), relative volatility (α), ratio of vapour to liquid diffusivities (DV/DL), ratio of vapour to liquid densities (ρV/ρL), and ratio of vapour to liquid viscosities (μV/μL).
The experiments were performed with the binary systems (Ethanol-Water), (Methanol-Water), (Methanol-Ethanol), (Benzene-Hexane), and (Benzene-Toluene). Multiple regression analysis was used to estimate the overall mass transfer coefficients of the vapour and liquid phases (KOV and KOL) in a correlation that represents the data fairly well.
KOV = 3.3 × 10^-10
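Correlations of this form are typically power laws in the dimensionless groups, fitted by multiple regression after taking logarithms. The sketch below shows the linearization on synthetic data with only two of the groups (L/V and α) for brevity; the exponents and data are illustrative assumptions, not the paper's fitted values.

```python
import numpy as np

# Synthetic data for two of the dimensionless groups (hypothetical values).
lv = np.array([0.5, 0.7, 0.9, 1.1, 1.3])      # L/V molar rate ratio
alpha = np.array([1.8, 2.1, 2.5, 3.0, 3.4])   # relative volatility
kov = 3.3e-10 * lv**0.6 * alpha**1.2          # assumed power law, no noise

# Linearize: ln k = ln a + b*ln(L/V) + c*ln(alpha), then solve by least squares.
X = np.column_stack([np.ones_like(lv), np.log(lv), np.log(alpha)])
beta, *_ = np.linalg.lstsq(X, np.log(kov), rcond=None)
a, b, c = np.exp(beta[0]), beta[1], beta[2]
```

Because the synthetic data contain no noise, the regression recovers the assumed coefficient and exponents exactly, which is a useful check before fitting real measurements.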
Individuals across different industries, including but not limited to agriculture, drones, pharmaceuticals, and manufacturing, are increasingly using thermal cameras to achieve various safety and security goals. This widespread adoption is made possible by advances in thermal imaging sensor technology. The current literature provides an in-depth exploration of thermography camera applications for detecting faults in sectors such as fire protection, manufacturing, aerospace, automotive, non-destructive testing, and structural materials. The present discussion builds on previous studies, emphasising the effectiveness of thermography cameras in revealing defects undetectable by the human eye. Various methods for defect
In this research, the behaviour of the standard Hueckel edge detection algorithm is analysed using three-dimensional representations of the edge-goodness criterion after applying it to a real, highly textured satellite image; the edge-goodness criterion is analysed statistically. The Hueckel algorithm showed an exponential relationship between execution time and the disk radius used. The restrictions Hueckel stated in his papers are adopted in this research. A discussion of the resulting edge shape and malformation is presented, since this is the first practical study applying the Hueckel edge detection algorithm to a real, highly textured image containing ramp edges (a satellite image).
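One contributor to the runtime growth with disk radius is simply the number of pixels the operator must visit at every image position. The sketch below counts that neighbourhood size; it illustrates the cost driver only and is not Hueckel's model-fitting step itself.

```python
def disk_pixel_count(radius):
    """Count the integer pixel positions inside a disk of the given radius,
    i.e. the neighbourhood a disk-based operator visits per image position."""
    return sum(1 for x in range(-radius, radius + 1)
                 for y in range(-radius, radius + 1)
                 if x * x + y * y <= radius * radius)

# Neighbourhood size grows rapidly as the disk radius doubles.
counts = [disk_pixel_count(r) for r in (2, 4, 8, 16)]
```

The per-position work grows at least quadratically with radius, and the fitting work on top of it makes the measured execution-time curve steeper still.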
The internet has long been a source of medical information and is now used for online medical consultation (OMC). OMC is offered by many providers internationally, with diverse models and features, and consultations and treatments are available 24/7. During the worldwide COVID-19 pandemic, many people were unable to go to a hospital or clinic because of the spread of the virus. This paper addresses two research questions. The first asks how OMC can help patients during the COVID-19 pandemic; a literature review was conducted to answer it. The second asks how to develop an OMC system for the COVID-19 pandemic; the system was developed with Visual Studio 2019 using an object-oriented approach. O
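The paper's system was built in Visual Studio 2019; as a language-neutral illustration of the object-oriented structure such an OMC system might use, the following Python sketch models patients and consultation booking. All class and field names here are hypothetical, not the paper's design.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Patient:
    """A registered patient with self-reported symptoms."""
    name: str
    symptoms: list = field(default_factory=list)

@dataclass
class Consultation:
    """A single remote consultation between a patient and a doctor."""
    patient: Patient
    doctor: str
    created: datetime = field(default_factory=datetime.now)
    notes: str = ""

class OMCSystem:
    """Hypothetical consultation registry: book and look up consultations."""
    def __init__(self):
        self.consultations = []

    def book(self, patient, doctor):
        consultation = Consultation(patient, doctor)
        self.consultations.append(consultation)
        return consultation

    def for_doctor(self, doctor):
        return [c for c in self.consultations if c.doctor == doctor]

system = OMCSystem()
system.book(Patient("Alice", ["cough"]), "Dr. B")
```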
Permeability data are of major importance in all reservoir simulation studies, and their importance increases in mature oil and gas fields because some improved-recovery methods are particularly sensitive to them. However, the industry holds a huge stock of air permeability measurements against only a small number of liquid permeability values, owing to the relatively high cost of special core analysis.
The current study suggests a correlation to convert the air permeability data conventionally measured during laboratory core analysis into liquid permeability. This correlation provides a feasible estimate in cases of data loss and poorly consolidated formations, or in cas
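The paper's own correlation is not reproduced in this excerpt. As a hedged illustration of why such a conversion is needed, the classical Klinkenberg gas-slippage relation can be inverted to estimate an equivalent liquid permeability from an air measurement; the slippage factor and pressure below are hypothetical values.

```python
def liquid_perm_from_air(k_air, b, p_mean):
    """Invert the classical Klinkenberg relation k_gas = k_liq * (1 + b / p).
    k_air: measured air permeability (mD); b: Klinkenberg slippage factor
    (same units as pressure); p_mean: mean test pressure.
    Returns the estimated equivalent liquid permeability (mD)."""
    return k_air / (1.0 + b / p_mean)

# Hypothetical example: 10 mD air permeability at 1 atm mean pressure
# with slippage factor b = 1 atm halves the estimated liquid permeability.
k_liq = liquid_perm_from_air(10.0, 1.0, 1.0)
```

Air permeability always overstates liquid permeability because of gas slippage at pore walls, which is why a dedicated correlation is valuable when liquid measurements are scarce.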
A database is characterised as an arrangement of data organised and distributed so that the client can access the stored data in a simple and convenient way. In the era of big data, however, traditional data-analytics methods may be unable to manage and process such large volumes. To develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique on big data distributed in the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study, and it showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r
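The map-shuffle-reduce pattern behind Hadoop can be sketched in a few lines. The toy record format below (channel name, sample value) and the averaging reducer are illustrative assumptions, not the paper's actual EEG pipeline.

```python
from collections import defaultdict

def map_phase(records, mapper):
    """Apply the mapper to each record, yielding (key, value) pairs."""
    for record in records:
        yield from mapper(record)

def shuffle(pairs):
    """Group values by key, as the framework does between map and reduce."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups, reducer):
    """Apply the reducer to each key's grouped values."""
    return {key: reducer(key, values) for key, values in groups.items()}

# Toy example: mean sample amplitude per EEG channel (hypothetical format).
records = [("ch1", 0.2), ("ch2", 0.5), ("ch1", 0.4)]
mapper = lambda record: [record]              # identity map: (channel, sample)
reducer = lambda key, vs: sum(vs) / len(vs)   # mean per channel
result = reduce_phase(shuffle(map_phase(records, mapper)), reducer)
```

On a real cluster the map and reduce phases run in parallel across nodes, which is the source of the response-time reduction the study reports.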
Because of the significance of image compression in reducing data volume, the need for compression is permanent: compressed images are transferred more quickly over communication channels and occupy less memory. This study suggests an efficient compression system based on transform coding (the Discrete Cosine Transform or the bi-orthogonal tap-9/7 wavelet transform) combined with the LZW compression technique. The scheme was applied to colour and gray models, with the transform coding used to decompose each colour and gray sub-band individually; quantisation is then performed, followed by LZW coding to compress the images. The suggested system was applied to a set of seven stand
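The LZW stage applied after transform coding and quantisation can be sketched as a minimal byte-oriented encoder: the dictionary is seeded with all single bytes and grows as repeated patterns appear. This is a generic LZW sketch, not the paper's tuned implementation.

```python
def lzw_encode(data):
    """Minimal LZW encoder over a bytes object. The code table starts with
    all 256 single-byte strings; each new phrase gets the next free code."""
    table = {bytes([i]): i for i in range(256)}
    next_code = 256
    current = b""
    codes = []
    for byte in data:
        candidate = current + bytes([byte])
        if candidate in table:
            current = candidate          # extend the current phrase
        else:
            codes.append(table[current]) # emit code for the known phrase
            table[candidate] = next_code # register the new phrase
            next_code += 1
            current = bytes([byte])
    if current:
        codes.append(table[current])
    return codes

codes = lzw_encode(b"ABABAB")
```

Quantised transform coefficients contain long repeated runs, which is exactly the redundancy LZW's growing dictionary exploits.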