Considering the rising incidence of breast cancer and the high prevalence of vitamin D3 [25(OH)D3] insufficiency, this investigation aimed to examine the relationship between the serum level of 25(OH)D3 (the sunshine vitamin) and breast cancer risk. The current study assessed how serum levels of 25(OH)D3, HbA1c%, total cholesterol (TC), high-density lipoprotein cholesterol (HDL-C), low-density lipoprotein cholesterol (LDL-C), and triglycerides (TG) affected a woman’s risk of developing breast cancer. The study included 40 apparently healthy volunteers and 69 untreated breast cancer patients with clinical and histological confirmation, comprising outpatients and hospitalized patients at the Oncology Center, Medical City, Baghdad, Iraq. Venous blood samples were withdrawn from breast cancer patients and healthy women between June and December 2020. The serum lipid profile (TC, HDL-C, LDL-C, and TG), 25(OH)D3, and HbA1c% were determined under aseptic conditions using a standard fully automatic chemistry analyzer. The independent-samples t-test was used to compare the mean serum levels of the lipid profile and the other biomarkers between breast cancer patients and apparently healthy women. TG, TC, LDL-C, and HDL-C levels were 190.7 mg/dl, 104.3 mg/dl, 108.1 mg/dl, and 53 mg/dl, respectively. Breast cancer was associated with a significant increase in serum TC, LDL-C, and TG and a significant decrease in HDL-C. Breast cancer development could be one of the factors causing changes in vitamin D, HbA1c, and lipid profile levels.
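The group comparison above relies on the independent-samples t-test. A minimal sketch of that test in Python using scipy.stats, with hypothetical serum TG values standing in for the actual patient and control measurements:

```python
import numpy as np
from scipy.stats import ttest_ind

# Hypothetical serum triglyceride values (mg/dl); not the study's data.
patients = np.array([185.0, 201.2, 176.4, 198.9, 190.7, 210.3])
controls = np.array([132.5, 118.9, 141.0, 127.3, 135.6, 122.8])

# Independent-samples t-test (Welch's variant, which does not assume equal variances).
t_stat, p_value = ttest_ind(patients, controls, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```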
Image recognition is one of the most important applications of information processing. In this paper, a comparison of three-level image recognition techniques based on the discrete wavelet transform (DWT) and the stationary wavelet transform (SWT) has been carried out, covering the level orderings stationary-stationary-stationary (sss), stationary-stationary-wavelet (ssw), stationary-wavelet-stationary (sws), stationary-wavelet-wavelet (sww), wavelet-stationary-stationary (wss), wavelet-stationary-wavelet (wsw), wavelet-wavelet-stationary (wws), and wavelet-wavelet-wavelet (www). These techniques have been compared according to the peak signal-to-noise ratio (PSNR), root mean square error (RMSE), compression ratio (CR), and the coding noise e(n) of each third …
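The comparison metrics can be illustrated with a small, self-contained sketch (not the paper's pipeline): a single-level 2-D DWT with PyWavelets, the detail sub-bands discarded as a crude compression step, and RMSE/PSNR computed for the reconstruction of a synthetic 8-bit image.

```python
import numpy as np
import pywt

# Synthetic 8-bit test image (a stand-in for a real image).
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(128, 128)).astype(float)

# Single-level 2-D discrete wavelet transform; the paper cascades three levels,
# mixing DWT and SWT, but one DWT level is enough to show the metrics.
cA, (cH, cV, cD) = pywt.dwt2(img, "haar")

# Crude "compression": keep only the approximation sub-band, zero the details.
zeros = np.zeros_like(cH)
rec = pywt.idwt2((cA, (zeros, zeros, zeros)), "haar")

rmse = np.sqrt(np.mean((img - rec) ** 2))
psnr = 20 * np.log10(255.0 / rmse)   # PSNR for 8-bit data
print(f"RMSE = {rmse:.2f}, PSNR = {psnr:.2f} dB")
```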
Advances in digital technology and the World Wide Web have led to an increase in digital documents that are used for various purposes such as publishing and digital libraries. This phenomenon raises awareness of the need for effective techniques to support the search and retrieval of text. One of the most needed tasks is clustering, which categorizes documents automatically into meaningful groups. Clustering is an important task in data mining and machine learning, and its accuracy depends strongly on the choice of text representation method. Traditional methods of text representation model documents as bags of words using term frequency-inverse document frequency (TF-IDF). This method ignores the relationship …
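A minimal sketch of the TF-IDF bag-of-words representation feeding a clustering step, using scikit-learn on a toy corpus (the corpus and the number of clusters are assumptions chosen for illustration):

```python
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

# Toy corpus standing in for a real document collection.
docs = [
    "digital library search and text retrieval",
    "text retrieval techniques for digital libraries",
    "machine learning for image recognition",
    "deep learning improves image recognition accuracy",
]

# Bag-of-words TF-IDF: each document becomes a sparse vector of term weights.
vectors = TfidfVectorizer(stop_words="english").fit_transform(docs)

# Group the documents into two clusters (k = 2 chosen for the toy data).
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)
print(labels)   # e.g. [0 0 1 1]
```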
The feature extraction step plays a major role in proper object classification and recognition. This step depends mainly on correct object detection in the given scene, but object detection algorithms may produce noise that affects the final object shape. A novel approach is introduced in this paper for filling the holes in the detected object, yielding better object detection and correct feature extraction. The method is based on the definition of a hole as a black pixel surrounded by a connected boundary region; it therefore tries to find a connected contour region that surrounds the background pixel using a roadmap racing algorithm (a simplified flood-fill alternative is sketched after the keywords below). The method shows good results for objects in 2D space.
Keywords: object filling, object detection, objec…
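The paper's roadmap racing algorithm is not reproduced here; the sketch below achieves the same goal (filling background pixels enclosed by a connected object boundary) with the standard flood-fill-from-border operation provided by SciPy, applied to a small synthetic binary mask:

```python
import numpy as np
from scipy.ndimage import binary_fill_holes

# Synthetic binary object: a square ring whose interior is a "hole"
# (a background region fully enclosed by the object boundary).
mask = np.zeros((9, 9), dtype=bool)
mask[2:7, 2:7] = True     # solid square ...
mask[3:6, 3:6] = False    # ... with the centre carved out as a hole

# Fill every background region that is not connected to the image border.
filled = binary_fill_holes(mask)

print(int(mask.sum()), "object pixels before filling")    # 16
print(int(filled.sum()), "object pixels after filling")   # 25
```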
Agent-based modeling is currently utilized extensively to analyze complex systems. It has supported such growth because it is able to convey distinct levels of interaction in a complex, detailed environment. Meanwhile, agent-based models tend to become progressively more complex, so powerful modeling and simulation techniques are needed to address this rise in complexity. In recent years, a number of platforms for developing agent-based models have been developed. In practice, most agent-based models use a discrete representation of the environment and a single level of interaction; two or three levels are rarely considered. The key issue is that modellers' work in these areas is not assisted by simulation platforms …
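To make the notions of a discrete environment and a single level of interaction concrete, here is a minimal, hypothetical agent-based sketch (not tied to any of the platforms discussed): agents occupy cells of a grid and interact only with their immediate neighbours.

```python
import random

GRID, STEPS = 10, 20   # discrete environment: a 10 x 10 lattice, 20 time steps

# Populate roughly 30% of the cells; each agent carries a binary state.
random.seed(1)
agents = {(x, y): random.randint(0, 1)
          for x in range(GRID) for y in range(GRID) if random.random() < 0.3}

def neighbours(cell):
    """Cells one step away: the single level of interaction."""
    x, y = cell
    return [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]

for _ in range(STEPS):
    updated = {}
    for cell, state in agents.items():
        near = [agents[c] for c in neighbours(cell) if c in agents]
        # Local rule: adopt the rounded average (majority) state of the neighbourhood.
        updated[cell] = round(sum(near) / len(near)) if near else state
    agents = updated

print(sum(agents.values()), "of", len(agents), "agents end in state 1")
```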
Merging biometrics with cryptography has become more familiar, and a rich scientific field has emerged for researchers. Biometrics adds a distinctive property to security systems, because biometric features are unique and individual for every person. In this study, a new method is presented for ciphering data based on fingerprint features. The research is carried out by addressing the plaintext message, according to the positions of the minutiae extracted from a fingerprint, into a generated random text file, regardless of the size of the data. The proposed method can be explained in three scenarios. In the first scenario, the message is placed inside the random text directly at the positions of the minutiae; in the second scenario, the message is encrypted with a chosen word before ciphering …
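A simplified, hypothetical illustration of the first scenario (not the paper's implementation): message characters are written into a random text buffer at offsets derived from minutiae coordinates, and can be read back from the same offsets. The minutiae list and the offset mapping are invented for the example.

```python
import random
import string

def embed(message, minutiae, buffer_len=200, seed=42):
    """Hide message characters in a random text buffer at minutiae-derived offsets."""
    random.seed(seed)
    buffer = [random.choice(string.ascii_letters) for _ in range(buffer_len)]
    # Map each (x, y) minutia to a buffer offset (hypothetical mapping).
    offsets = [(x * 31 + y) % buffer_len for x, y in minutiae]
    for ch, pos in zip(message, offsets):
        buffer[pos] = ch
    return "".join(buffer), offsets

def extract(cipher_text, offsets, length):
    """Recover the message by reading the same offsets back."""
    return "".join(cipher_text[pos] for pos in offsets[:length])

minutiae = [(12, 40), (55, 17), (73, 88), (21, 64), (90, 5)]   # invented example points
cipher, offsets = embed("HELLO", minutiae)
print(extract(cipher, offsets, 5))   # -> HELLO
```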
For businesses that provide delivery services, the efficiency of the delivery process in terms of punctuality is very important. In addition to increasing customer trust, efficient route management, and selection are required to reduce vehicle fuel costs and expedite delivery. Some small and medium businesses still use conventional methods to manage delivery routes. Decisions to manage delivery schedules and routes do not use any specific methods to expedite the delivery settlement process. This process is inefficient, takes a long time, increases costs and is prone to errors. Therefore, the Dijkstra algorithm has been used to improve the delivery management process. A delivery management system was developed to help managers and drivers
A security system can be defined as a method of providing a form of protection for any type of data. In most security systems, a sequential process must be performed in order to achieve good protection. Authentication is one part of such a sequential process; it is used to verify a user's permission to access and use the system. Several kinds of methods are utilized, including knowledge-based and biometric features. The electroencephalogram (EEG) signal is one of the most widely used signals in the bioinformatics field. EEG has five major wave patterns: Delta, Theta, Alpha, Beta, and Gamma. Every wave has five features: amplitude, wavelength, period, speed, and frequency. The linear …
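A minimal illustration of separating the EEG bands listed above, using an FFT-based band-power estimate on a synthetic signal; the sampling rate and band edges are conventional assumptions, not values from the study.

```python
import numpy as np

FS = 256                       # sampling rate in Hz (assumed)
BANDS = {                      # conventional band edges in Hz
    "Delta": (0.5, 4), "Theta": (4, 8), "Alpha": (8, 13),
    "Beta": (13, 30), "Gamma": (30, 45),
}

# Synthetic 10-second "EEG": a 10 Hz (Alpha) rhythm buried in noise.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / FS)
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

# Power spectrum via the real FFT.
freqs = np.fft.rfftfreq(signal.size, 1 / FS)
power = np.abs(np.fft.rfft(signal)) ** 2

for name, (lo, hi) in BANDS.items():
    band = (freqs >= lo) & (freqs < hi)
    print(f"{name:5s} band power: {power[band].sum():.1f}")
```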