Tissue culture of Catharanthus roseus was established under many parameters to ensure good results for detection of the alkaloids present in this plant. Nitsch and Nitsch medium containing 8 µM benzylaminopurine plus naphthalene acetic acid was found to be the best, and callus of C. roseus left to grow in the dark had a much better influence on alkaloid production. The precursor phenylalanine gave better results than the other precursor (tryptophan). Abscisic acid had an inhibitory effect on alkaloid production.
Ninety-eight samples were collected from various clinical sources (burns, wounds, urine, sputum, and blood) in the city of Baghdad. After biochemical and microscopic examination, 52 Pseudomonas aeruginosa isolates were obtained: 17 (32.7%) from burn infections, 12 (23%) from wound infections, 11 (21.2%) from urine infections, 7 (13.5%) from sputum, and 5 (9.6%) from blood. The ability of the bacteria to form biofilm was detected by the microtiter plate method; the results showed that 80% of the bacterial isolates produced biofilm in different proportions. The algD gene (alginate production) was detected by polymerase chain reaction (PCR); this gene plays an essential role in the fo…
This paper reports a fiber Bragg grating (FBG) as a biosensor. The FBGs were etched using a chemical agent, namely hydrofluoric acid (HF), which removes part of the cladding layer. Consequently, the evanescent field propagating out of the core is closer to the environment and becomes more sensitive to changes in the surroundings. The proposed FBG sensor was utilized to detect toxic heavy metal ions in aqueous medium, namely copper ions (Cu2+). Two FBG sensors were fabricated with etched diameters of 20 and 40 μm. The sensors were studied against Cu2+ at different concentrations using the wavelength shift that results from the interaction between the evanescent field and the copper ions. The FBG sensors showed …
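For context on the sensing mechanism: the standard Bragg condition relates the reflected wavelength to the effective index of the guided mode, and once etching exposes the evanescent field that effective index depends on the surrounding medium, so a change in Cu2+ concentration shifts the reflected wavelength. A minimal statement of the relation (standard FBG theory, not taken from this abstract):

```latex
% Bragg condition: the reflected (Bragg) wavelength is set by the
% effective refractive index n_eff of the guided mode and the
% grating period Lambda.
\[
  \lambda_B = 2\, n_{\mathrm{eff}}\, \Lambda
\]
% After etching, n_eff depends on the surrounding refractive index
% n_sur, so a concentration-induced change in n_sur shifts lambda_B:
\[
  \Delta\lambda_B \approx 2\,\Lambda\,
  \frac{\partial n_{\mathrm{eff}}}{\partial n_{\mathrm{sur}}}\,
  \Delta n_{\mathrm{sur}}
\]
```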
In the task of detecting intrinsic plagiarism, cases where a reference corpus is absent must be dealt with; the task is based entirely on inconsistencies within a given document. Detection of internal plagiarism is treated as a classification problem and can be estimated by taking into consideration self-based information from the given document.
The core contribution of the work proposed in this paper lies in the document representation: the document, as well as the disjoint segments generated from it, are represented as weight vectors capturing their main content, where each element of these vectors carries its average weight rather than its raw frequency.
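As an illustration only (the abstract does not give the exact weighting scheme, so the interpretation of "average weight" below is an assumption), a minimal sketch of representing a document and its disjoint segments as average-weight term vectors might look like this:

```python
from collections import Counter

def segment(tokens, size=200):
    """Split a token list into disjoint, fixed-size segments."""
    return [tokens[i:i + size] for i in range(0, len(tokens), size)]

def avg_weight_vector(tokens, vocab):
    """Vector over `vocab` whose entries are average weights:
    term count divided by segment length, i.e. the mean occurrence
    of the term per token position (assumed interpretation of
    'average weight instead of frequency')."""
    counts = Counter(tokens)
    n = max(len(tokens), 1)
    return [counts[t] / n for t in vocab]

# Usage: represent the whole document and each disjoint segment.
doc = "some tokenized suspicious document text".split()
vocab = sorted(set(doc))
doc_vec = avg_weight_vector(doc, vocab)
segment_vecs = [avg_weight_vector(s, vocab) for s in segment(doc, size=2)]
```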
Plagiarism is described as using someone else's ideas or work without their permission. Using lexical and semantic text-similarity notions, this paper presents a plagiarism detection system for examining suspicious texts against available sources on the Web. The user can upload suspicious files in PDF or DOCX format. The system searches three popular search engines (Google, Bing, and Yahoo) for the source text and tries to identify the top five results for each search engine on the first retrieved page. The corpus is made up of the downloaded files and the scraped web-page text of the search engines' results. The corpus text and the suspicious documents are then encoded as vectors. For lexical plagiarism detection, the system will …
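The abstract does not specify the exact vector encoding, but a common choice for this kind of lexical comparison is TF-IDF vectors scored with cosine similarity. A minimal sketch under that assumption (using scikit-learn; not necessarily the authors' implementation):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def lexical_similarity(suspicious_text, corpus_texts):
    """Score a suspicious document against each corpus document using
    TF-IDF vectors and cosine similarity (illustrative only)."""
    vectorizer = TfidfVectorizer(stop_words="english")
    matrix = vectorizer.fit_transform([suspicious_text] + corpus_texts)
    # Row 0 is the suspicious document; compare it to every other row.
    return cosine_similarity(matrix[0:1], matrix[1:]).ravel()

# Usage with downloaded/scraped candidate sources:
sources = ["text of the first search result", "text of the second result"]
print(lexical_similarity("uploaded suspicious text", sources))
```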
In this paper, the botnet detection problem is defined as a feature selection problem, and the genetic algorithm (GA) is used to search for the best significant combination of features from the entire search space of the feature set. Furthermore, the Decision Tree (DT) classifier is used as an objective function to direct the proposed GA toward the combination of features that can correctly classify activities into normal traffic and botnet attacks. Two datasets, namely UNSW-NB15 and the Canadian Institute for Cybersecurity Intrusion Detection System 2017 dataset (CICIDS2017), are used as evaluation datasets. The results reveal that the proposed DT-aware GA can effectively find the relevant features from …
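As an illustration of the general technique described (a GA with a decision-tree classifier as the fitness function), and not the authors' exact implementation, a simple bit-string feature-selection GA could be sketched as follows, using scikit-learn:

```python
import random
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

def fitness(mask, X, y):
    """DT cross-validated accuracy on the selected feature subset."""
    if not mask.any():
        return 0.0
    clf = DecisionTreeClassifier(random_state=0)
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

def ga_select(X, y, pop_size=20, generations=30, p_mut=0.05):
    """Bit-string GA: each individual is a boolean feature mask."""
    n = X.shape[1]
    pop = [np.random.rand(n) < 0.5 for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda m: fitness(m, X, y), reverse=True)
        parents = scored[: pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n)             # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = np.random.rand(n) < p_mut         # bit-flip mutation
            children.append(child ^ flip)
        pop = parents + children
    return max(pop, key=lambda m: fitness(m, X, y))

# Usage on any tabular IDS dataset loaded as numpy arrays (X, y),
# e.g. UNSW-NB15 or CICIDS2017 feature matrices:
# best_mask = ga_select(X, y); selected = np.flatnonzero(best_mask)
```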