Motif templates are the input to many bioinformatics systems, such as codon finding, transcription and translation analysis, sequential pattern mining, and bioinformatics database analysis. Motif sizes range from a single base up to several megabases; consequently, typing errors increase with motif size. In addition, when structured motifs are submitted to bioinformatics systems, the specifications of the motif components are required, i.e. the simple motifs, the gaps, and the lower and upper bound of each gap. Motifs may be DNA, RNA, or protein. In this research, a motif parser and visualization module is designed based on a proposed context-free grammar (CFG) and the human color-recognition system. The CFG describes the motif structure in order to parse motifs, detect and debug errors, and decompose a motif template into its components. Many experiments were carried out using motif templates of various sizes ranging from 10 Kbase to 10 Mbase, with numbers of gaps ranging from 15 to 15,000 and numbers of errors ranging from 100 to 1,820. In all these experiments, the proposed system exhibited linear behavior in both the parsing phase and the visualization phase, indicating its scalability with respect to motif template size.
BN Rashid, AKF Jameel, Al- Ustath: Quarterly Scientific Journal, 2017 - Cited by 15
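The abstract above describes decomposing a motif template into simple motifs and bounded gaps. A minimal sketch of that idea, assuming an illustrative template syntax of the form `MOTIF[lo,hi]MOTIF` over a DNA/RNA alphabet (the paper's actual grammar is not given here, so this syntax and the function name are hypothetical):

```python
import re

# One token is either a run of bases (a simple motif) or a gap "[lo,hi]".
# The alphabet and bracket syntax are assumptions for illustration only.
TOKEN = re.compile(r"([ACGTUacgtu]+)|\[(\d+),(\d+)\]")

def parse_motif_template(template):
    """Split a motif template into ("motif", seq) and ("gap", lo, hi) parts,
    reporting the position of the first syntax error, if any."""
    parts, pos = [], 0
    while pos < len(template):
        m = TOKEN.match(template, pos)
        if not m:
            raise ValueError(f"syntax error at position {pos}: {template[pos:pos+10]!r}")
        if m.group(1):                       # a simple motif of bases
            parts.append(("motif", m.group(1).upper()))
        else:                                # a gap with lower/upper bounds
            lo, hi = int(m.group(2)), int(m.group(3))
            if lo > hi:
                raise ValueError(f"gap bounds reversed at position {pos}")
            parts.append(("gap", lo, hi))
        pos = m.end()
    return parts
```

For example, `parse_motif_template("ACGT[2,5]TTGA")` yields two simple motifs separated by a gap of 2 to 5 bases, while a template with reversed bounds or a stray character raises a `ValueError` pointing at the offending position, mirroring the error detection and debugging role the abstract assigns to the CFG.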
A modified version of the generalized standard addition method (GSAM) was developed. This modified version was used for the quantitative determination of arginine (Arg) and glycine (Gly) in the arginine acetylsalicylate-glycine complex. According to this method, two linear equations were solved to obtain the amounts of Arg and Gly. The first equation was obtained by spectrophotometric measurement of the total absorbance of the Arg and Gly colored complexes with ninhydrin. The second equation was obtained by measuring the total acid consumed by the total amino groups of Arg and Gly. The titration was carried out in non-aqueous media using perchloric acid in glacial acetic acid as the titrant. The developed metho
The current study aims to identify the needs in the stories of the Brothers Grimm. The research sample consisted of (3) stories, namely: 1- The Thorn Rose (Sleeping Beauty), 2- Snow White, and 3- Little Red Riding Hood. The number of pages analyzed reached (15.5) pages. To achieve the research objectives, Murray's classification of needs was adopted, which contains (36) basic needs that are further divided into (129) sub-needs. The idea was adopted as the unit of analysis and repetition as the unit of enumeration. Reliability was extracted in two ways: 1- agreement between the researcher and himself over time, where the agreement coefficient reached 97%. The second was agreement between the researcher and tw
Language as a means of communication has long been the concern of many conversation analysts, as in the studies of Sacks et al. (1974), Schegloff et al. (1977), Duncan (1972), Grice (1975), and Burton (1980). Burton attempted to analyze the first ten transitions of the play "The Dumb Waiter" as a mere presentation of her approach. This paper aims at analyzing the conversational structure of a forum on the subject of literary fiction and genre fiction by applying Burton's (1980) model of analysis, to answer the question of to what extent this model is applicable to the presented text. The findings of the investigation have proved the applicability of the structure of conversation formulated by Burton (1980) in her model wit
Multilayer reservoirs are currently modeled as a single-zone system by averaging the reservoir parameters associated with each reservoir zone. However, this type of modeling is rarely accurate because a single-zone system does not account for the fact that each zone's pressure declines independently. The pressure drop in each zone affects the total output and can result in inter-flow and the premature depletion of one of the zones. Understanding reservoir performance requires a precise estimation of each layer's permeability and skin factor. Multilayer Transient Analysis is a well-testing technique designed to determine formation properties in more than one layer, and its effectiveness over the past two decades has been
BN Rashid, Social Sciences, 2022
A new modified differential evolution algorithm, DE-BEA, is proposed to improve the reliability of the standard DE/current-to-rand/1/bin by implementing a new mutation scheme inspired by the bacterial evolutionary algorithm (BEA). The crossover and selection schemes of the DE method are also modified to fit the new DE-BEA mechanism. The new scheme diversifies the population by applying to all individuals a segment-based scheme that generates multiple copies (clones) of each individual one by one and applies the BEA segment-wise mechanism. These new steps are embedded in the DE/current-to-rand/1/bin scheme. The performance of the new algorithm has been compared with several DE variants over eighteen benchmark functions including sever
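The baseline scheme this abstract modifies, DE/current-to-rand/1/bin, can be sketched as one trial-vector step. This is a sketch of the standard scheme only, not of the paper's DE-BEA modification; the parameter names (`F`, `K`, `CR`) are the conventional DE ones, not taken from the paper:

```python
import random

def de_current_to_rand_1_bin(pop, i, F=0.5, K=0.5, CR=0.9):
    """Build one trial vector for individual i under DE/current-to-rand/1/bin.
    pop is a list of real-valued vectors; r1, r2, r3 are distinct and != i."""
    dim = len(pop[i])
    r1, r2, r3 = random.sample([j for j in range(len(pop)) if j != i], 3)
    x, a, b, c = pop[i], pop[r1], pop[r2], pop[r3]
    # current-to-rand/1 mutation: move the current vector toward a random
    # individual, plus a scaled difference of two other random individuals.
    mutant = [x[d] + K * (a[d] - x[d]) + F * (b[d] - c[d]) for d in range(dim)]
    # binomial ("bin") crossover, with one dimension guaranteed from the mutant
    j_rand = random.randrange(dim)
    return [mutant[d] if (random.random() < CR or d == j_rand) else x[d]
            for d in range(dim)]
```

In a full DE loop the trial vector would then replace `pop[i]` only if it scores better on the objective function; the DE-BEA clone-and-segment step described above would sit on top of this baseline.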
The main problem of the current study is applying critical discourse analysis to examine the textual, discoursal, and social features of reduplication in selected English newspaper headlines. The main aim is to analyze the linguistic features of reduplication by adopting Fairclough's three-dimensional model (2001). The study sets forth the following hypotheses: (1) English newspaper headlines comprise various textual, discoursal, and social features; (2) the model of analysis is well suited to the current study. To achieve the aims and verify the hypotheses, a critical discourse analysis approach is used, represented by Fairclough's socio-cultural approach (2001). The present study has examined the use of
With the increasing rates of cancer worldwide, a great deal of scientific discourse is devoted to arguments and statements about cancer and its causes. Scientists from different fields try to seize any available chance to warn people of the risk of consuming and being exposed to carcinogens that have, unfortunately, become essential parts of modern life. The present paper attempts to investigate the proximization strategy through which scientists construct carcinogen risk to enhance people's preventive actions against these carcinogens. The paper targets the construction that depends on producing a conflict between the values of the people themselves and the contrasting values assigned to carcinogens. To achieve this aim, Cap's (2
In recent years, the migration of computational workloads to computational clouds has attracted intruders to target and exploit cloud networks internally and externally. The investigation of such hazardous network attacks in the cloud requires comprehensive network forensics methods (NFMs) to identify the source of the attack. However, cloud computing lacks NFMs that can identify network attacks affecting various cloud resources by disseminating through cloud networks. This study is motivated by the need to assess the applicability of current NFMs (C-NFMs) to cloud networks in cloud computing. The applicability is evaluated based on strengths, weaknesses, opportunities, and threats (SWOT) to outlook the cloud network. T