The Doppler broadening technique is suggested for monitoring the development of tumours. It depends on the sensitivity of positronium (Ps) annihilation parameters to sub-microstructural changes in biological tissues. The technique uses a high-resolution HPGe detector to measure the lineshape parameters (S and W) in normal mouse mammary tissues and adenocarcinoma mammary tissues as a function of tumour growth. The results demonstrate that the central parameter (S) decreases and the wing parameter (W) increases as the tumour grows. The S parameter is found to change considerably with the distribution of voids, which is affected by tumour development. The present technique can therefore be employed successfully to monitor the development of tumours.
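The S and W parameters are window ratios over the 511 keV annihilation peak. A minimal sketch of their computation follows, assuming a calibrated spectrum held in NumPy arrays; the window widths are illustrative placeholders, not the paper's values.

```python
import numpy as np

def lineshape_parameters(energy_keV, counts, center=511.0,
                         s_half_width=0.8, w_inner=2.6, w_outer=6.0):
    """Doppler-broadening lineshape parameters for a 511 keV peak.

    S = counts in the central window / total peak counts
    W = counts in the two wing windows / total peak counts
    Window widths (keV) are illustrative assumptions, not the paper's values.
    """
    offset = np.abs(energy_keV - center)
    total = counts[offset <= w_outer].sum()          # whole annihilation peak
    s_counts = counts[offset <= s_half_width].sum()  # central (low-momentum) region
    w_mask = (offset >= w_inner) & (offset <= w_outer)
    w_counts = counts[w_mask].sum()                  # wing (high-momentum) region
    return s_counts / total, w_counts / total
```

On this definition, a larger open-volume (void) fraction raises S and lowers W, which is why the two parameters move in opposite directions as the tumour grows.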
A G-system composed of three isolates, G3 (Bacillus), G12 (Arthrobacter) and G27 (Brevibacterium), was used to detect the mutagenicity of the anticancer drug cyclophosphamide (CP) under conditions similar to those used for the standard mutagen nitrosoguanidine (NTG). CP affected the survival fraction of the isolates after 15 min of treatment with gradually increasing concentrations, but to a lesser extent than NTG. The mutagenic effect of CP was higher than that of NTG when streptomycin resistance was used as a genetic marker, but the situation was reversed when rifampicin resistance was used as a reporter marker. The latter effect appeared upon recording the mutagen efficiency (i.e., the number of induced mutants per microgram of mutagen). Measuring the R…
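The mutagen efficiency defined in the abstract is a simple ratio; a one-line helper with hypothetical numbers (not the paper's data) makes the unit explicit.

```python
def mutagen_efficiency(induced_mutants, mutagen_micrograms):
    """Mutagen efficiency = number of induced mutants / micrograms of mutagen."""
    return induced_mutants / mutagen_micrograms

# Hypothetical illustration only: 240 streptomycin-resistant mutants induced
# by 50 ug of CP gives an efficiency of 4.8 mutants per microgram.
print(mutagen_efficiency(240, 50))  # 4.8
```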
The synthesized ligand (3-(2-amino-5-(3,4,5-trimethoxybenzyl)pyrimidin-4-ylamino)-5,5-dimethylcyclohex-2-enone) [H1L1] was characterized via Fourier-transform infrared spectroscopy (FTIR), 1H and 13C NMR, mass spectra, CHN analysis, and UV-Vis spectroscopic approaches. Analytical and spectroscopic techniques including chloride content, micro-analysis, magnetic susceptibility, UV-visible, conductance, and FTIR spectra were used to identify the mixed-ligand complexes (ML13ph) [M = Co(II), Ni(II), Cu(II), Zn(II), and Cd(II); H1L1 = β-enaminone ligand = L1; 3ph = 3-aminophenol = L2]. The results demonstrate that the complexes are produced with a molar ratio M:L1:L2 of 1:1:1. To generate the appropriate compl…
This research describes a new model inspired by MobileNetV2 that was trained on a very diverse dataset. The goal is to enable fire detection in open areas, replacing physical sensor-based fire detectors and reducing false fire alarms, in order to achieve the lowest losses in open areas via deep learning. A diverse fire dataset was created that combines images and videos from several sources. In addition, a further self-collected dataset was taken from the farms of the holy shrine of Al-Hussainiya in the city of Karbala. The model was then trained on the collected dataset. The test accuracy on the fire dataset trained with the new model reached 98.87%.
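A minimal transfer-learning sketch of a MobileNetV2-based fire classifier is shown below, assuming a binary fire/no-fire image folder; the directory path, layer sizes, and hyperparameters are illustrative assumptions, not the paper's exact architecture.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Assumed layout (illustrative): dataset/{fire,no_fire}/*.jpg
train_ds = tf.keras.utils.image_dataset_from_directory(
    "dataset", image_size=(224, 224), batch_size=32)

base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False  # freeze the pretrained backbone

model = models.Sequential([
    layers.Rescaling(1.0 / 127.5, offset=-1),  # MobileNetV2 expects [-1, 1]
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.2),
    layers.Dense(1, activation="sigmoid"),     # fire vs. no-fire
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=10)
```

Freezing the ImageNet-pretrained backbone and training only a small head is a common choice when the fire dataset, however diverse, is far smaller than ImageNet.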
With the development of communication technologies for mobile devices and electronic communications, and the move toward e-government, e-commerce and e-banking, it became necessary to protect these activities from intrusion and misuse, so it is important to design powerful and efficient systems for this purpose. In this paper, several variants of the negative selection algorithm of the artificial immune system are used: negative selection with real-valued detectors, negative selection with fixed-radius detectors, and negative selection with variable-sized detectors, applied to misuse-type network intrusion detection, where the algorithm generates a set of detectors to distinguish the self samples. Practica…
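A minimal sketch of real-valued negative selection with fixed-radius detectors, one of the variants mentioned, follows; the feature dimension, radius, and self region are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def generate_detectors(self_samples, n_detectors, radius, dim):
    """Sample random candidate detectors and keep only those that do not
    cover any self sample (real-valued negative selection, fixed radius)."""
    detectors = []
    while len(detectors) < n_detectors:
        candidate = rng.random(dim)
        # reject candidates lying within `radius` of any self sample
        if np.min(np.linalg.norm(self_samples - candidate, axis=1)) > radius:
            detectors.append(candidate)
    return np.array(detectors)

def is_anomalous(sample, detectors, radius):
    """A sample is flagged as non-self if any detector covers it."""
    return bool(np.any(np.linalg.norm(detectors - sample, axis=1) <= radius))

# Illustrative self set: normal traffic features normalised to [0, 1].
self_set = rng.random((200, 2)) * 0.4              # self region in one corner
dets = generate_detectors(self_set, 50, 0.1, 2)
print(is_anomalous(np.array([0.9, 0.9]), dets, 0.1))  # likely True (non-self)
```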
Plagiarism is becoming more of a problem in academia. It is made worse by the ease with which a wide range of resources can be found on the internet, as well as the ease with which they can be copied and pasted. It is academic theft, since the perpetrator has "taken" and presented the work of others as his or her own. Manual detection of plagiarism by a human being is difficult, imprecise, and time-consuming, because it is difficult for anyone to compare their work against all existing data. Plagiarism is a big problem in higher education, and it can happen on any topic. Plagiarism detection has been studied in many scientific articles, and methods for recognition have been created utilizing plagiarism analysis, authorship identification, and…
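One common building block of automated plagiarism analysis is shingle-based text similarity; a minimal sketch using word n-gram Jaccard overlap follows, with an illustrative decision threshold that is an assumption, not a value from the paper.

```python
def ngrams(text, n=3):
    """Set of lower-cased word n-grams (shingles) for comparison."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard_similarity(a, b, n=3):
    """Jaccard overlap of n-gram sets: |A & B| / |A | B|."""
    ga, gb = ngrams(a, n), ngrams(b, n)
    return len(ga & gb) / len(ga | gb) if ga | gb else 0.0

suspect = "plagiarism is becoming more of a problem in academics today"
source = "plagiarism is becoming more of a problem in academics"
# Flag a pair for manual review above an illustrative threshold of 0.5.
print(jaccard_similarity(suspect, source) > 0.5)
```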
Glaucoma is a visual disorder and one of the leading causes of visual impairment. Glaucoma disrupts the transmission of visual information to the brain. Unlike other eye illnesses such as myopia and cataracts, the damage caused by glaucoma cannot be cured. The Disc Damage Likelihood Scale (DDLS) can be used to assess glaucoma. The proposed methodology suggests a simple method to extract the neuroretinal rim (NRM) region, divide the region into four sectors, calculate the width of each sector, and select the minimum value for use in the DDLS factor. The features were fed to an SVM classification algorithm; the DDLS successfully classified glaucoma d…
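A minimal sketch of the sector step described above follows: split the rim into four sectors, average the rim width in each, and take the minimum as the DDLS input. It assumes the disc and cup boundaries have already been segmented; all numbers are illustrative.

```python
import numpy as np

def min_sector_rim_width(disc_radii, cup_radii):
    """Given per-angle disc and cup boundary radii (already segmented),
    split 360 degrees into four sectors, average the rim width
    (disc - cup) per sector, and return the minimum sector width."""
    rim = np.asarray(disc_radii) - np.asarray(cup_radii)  # rim width per angle
    sectors = np.array_split(rim, 4)                      # four 90-degree sectors
    return min(sector.mean() for sector in sectors)

# Illustrative boundaries sampled at 360 angles (not real measurements).
angles = np.linspace(0, 2 * np.pi, 360, endpoint=False)
disc = np.full(360, 1.0)
cup = 0.6 + 0.1 * np.cos(angles)        # rim thinnest where the cup bulges
print(min_sector_rim_width(disc, cup))  # feeds the DDLS factor / SVM features
```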
HM Al-Dabbas, RA Azeez, AE Ali, Iraqi Journal of Science, 2023
Individuals across different industries, including but not limited to agriculture, drones, pharmaceuticals and manufacturing, are increasingly using thermal cameras to achieve various safety and security goals. This widespread adoption is made possible by advancements in thermal imaging sensor technology. The current literature provides an in-depth exploration of thermography camera applications for detecting faults in sectors such as fire protection, manufacturing, aerospace, automotive, non-destructive testing and structural material industries. The current discussion builds on previous studies, emphasising the effectiveness of thermography cameras in distinguishing defects undetectable by the human eye. Various methods for defect…
For several applications, it is very important to have an edge detection technique that matches human visual contour perception and is less sensitive to noise. The edge detection algorithm described in this paper is based on the results obtained by maximum a posteriori (MAP) and maximum entropy (ME) deblurring algorithms. The technique makes a trade-off between sharpening and smoothing the noisy image. One advantage of the described algorithm is that it is less sensitive to noise than the Marr and Geuen techniques, which are considered to be the best edge detection algorithms in terms of matching human visual contour perception.
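A minimal sketch of the smoothing/sharpening trade-off follows: suppress noise first, then locate edges by gradient magnitude on the restored image. MAP/ME deblurring is beyond a short example, so Gaussian smoothing explicitly stands in for the restoration step; sigma and threshold are illustrative.

```python
import numpy as np
from scipy import ndimage

def edges_after_restoration(image, smooth_sigma=1.5, threshold=0.2):
    """Trade-off sketch: denoise first, then detect edges by gradient
    magnitude. Gaussian smoothing stands in here for the MAP/ME
    deblurring step described in the paper."""
    restored = ndimage.gaussian_filter(image.astype(float), smooth_sigma)
    gx = ndimage.sobel(restored, axis=1)
    gy = ndimage.sobel(restored, axis=0)
    magnitude = np.hypot(gx, gy)
    magnitude /= magnitude.max() or 1.0     # normalise to [0, 1]
    return magnitude > threshold            # binary edge map

# Illustrative noisy step edge.
img = np.zeros((64, 64)); img[:, 32:] = 1.0
img += 0.1 * np.random.default_rng(0).standard_normal(img.shape)
print(edges_after_restoration(img).sum())   # number of edge pixels
```

A larger smoothing sigma suppresses more noise but blurs weak contours, which is exactly the trade-off the abstract describes.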