Pathology reports are necessary for specialists to make an appropriate diagnosis of diseases in general and blood diseases in particular. Therefore, specialists check blood cells and other blood details; to diagnose a disease, they must analyze the factors of the patient's blood and medical history. Doctors have increasingly tended to use intelligent agents to help them with CBC analysis. However, these agents need analytical tools to extract the CBC parameters employed in predicting the development of life-threatening bacteremia and offering prognostic data. Therefore, this paper proposes an enhancement to the Rabin–Karp algorithm and combines it with the fuzzy ratio to make the algorithm suitable for working with CBC test data. These algorithms were selected after evaluating the utility of various string matching algorithms in order to choose the best ones for an accurate text collection tool, serving as a baseline for building a general report on patient information. The proposed method includes several basic steps. Firstly, the CBC-driven parameters are extracted using an efficient method for retrieving information from PDF files or images of the CBC tests. This is performed by implementing 12 traditional string matching algorithms, identifying the most effective ones based on the implementation results, and subsequently introducing a hybrid approach that addresses their shortcomings to obtain a more effective and faster algorithm for analyzing the pathological tests. The proposed algorithm (Razy) was implemented using the Rabin–Karp algorithm and the fuzzy ratio method. The results show that the proposed algorithm is fast and efficient, with an average accuracy of 99.94% when retrieving the results. Moreover, we can conclude that the string matching algorithm is a crucial tool in the report analysis process and directly affects the efficiency of the analytical system.
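The abstract does not spell out how the Rabin–Karp stage and the fuzzy ratio are combined inside Razy, so the following is only a minimal Python sketch of one plausible pairing: a rolling-hash exact search first, with a sliding-window fuzzy ratio (here Python's difflib.SequenceMatcher) as a fallback for OCR-noisy text. All function names, the hash modulus, and the 0.8 threshold are illustrative assumptions, not values from the paper.

```python
from difflib import SequenceMatcher

def rabin_karp_candidates(text, pattern, base=256, mod=101):
    """Classic Rabin-Karp rolling hash: return start indices whose window
    hash equals the pattern hash (candidates; may include collisions)."""
    n, m = len(text), len(pattern)
    if m == 0 or n < m:
        return []
    high = pow(base, m - 1, mod)
    p_hash = w_hash = 0
    for i in range(m):
        p_hash = (p_hash * base + ord(pattern[i])) % mod
        w_hash = (w_hash * base + ord(text[i])) % mod
    hits = []
    for i in range(n - m + 1):
        if w_hash == p_hash:
            hits.append(i)
        if i < n - m:
            w_hash = ((w_hash - ord(text[i]) * high) * base + ord(text[i + m])) % mod
    return hits

def fuzzy_locate(text, pattern, threshold=0.8):
    """Fallback: slide a window and keep positions whose fuzzy ratio
    against the pattern exceeds the threshold (tolerates OCR noise)."""
    m = len(pattern)
    return [(i, SequenceMatcher(None, text[i:i + m], pattern).ratio())
            for i in range(len(text) - m + 1)
            if SequenceMatcher(None, text[i:i + m], pattern).ratio() >= threshold]

def locate_parameter(text, name):
    """Exact Rabin-Karp hits first (cheap); fuzzy hits only if none found."""
    exact = [i for i in rabin_karp_candidates(text, name)
             if text[i:i + len(name)] == name]
    return [(i, 1.0) for i in exact] or fuzzy_locate(text, name)

# Example: find a misrecognized "Hemoglobin" field in noisy extracted text.
print(locate_parameter("Patient CBC: Hemogl0bin 13.1 g/dL", "Hemoglobin"))
```

The split mirrors the intuition in the abstract: the exact matcher is fast for clean extractions, while the fuzzy ratio rescues fields that OCR has slightly corrupted.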
BN Rashid
Low-pressure sprinklers have been widely used to replace high-pressure impact sprinklers in lateral move sprinkler irrigation systems due to their low operating cost and high efficiency. However, runoff losses under a low-pressure sprinkler irrigation machine can be significant. This study aims to evaluate the performance of the variable pulsed irrigation algorithm (VPIA) in reducing runoff losses under a low-pressure lateral move sprinkler irrigation machine for three different soil types. The VPIA uses the ON-OFF pulsing technique to reduce runoff losses by controlling the number and width of the pulses, considering the soil and irrigation machine properties. Als
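As a rough illustration of the ON-OFF pulsing idea (not the VPIA itself, whose pulse-number and pulse-width rules depend on soil and machine properties not detailed in the abstract), the sketch below sizes a duty cycle so that the time-averaged application rate stays within an assumed soil infiltration capacity; all rates and the cycle time are hypothetical.

```python
def pulse_duty_cycle(application_rate_mm_h, infiltration_rate_mm_h):
    """Fraction of time the sprinkler should be ON so that the time-averaged
    application rate does not exceed the soil's infiltration capacity
    (simplified steady-state view)."""
    return min(1.0, infiltration_rate_mm_h / application_rate_mm_h)

def pulse_schedule(cycle_time_min, application_rate_mm_h, infiltration_rate_mm_h):
    """Split one cycle into an ON pulse and an OFF recovery period."""
    duty = pulse_duty_cycle(application_rate_mm_h, infiltration_rate_mm_h)
    on_min = duty * cycle_time_min
    return on_min, cycle_time_min - on_min

# Example: 60 mm/h applied over a soil that only infiltrates 20 mm/h.
on, off = pulse_schedule(cycle_time_min=10, application_rate_mm_h=60,
                         infiltration_rate_mm_h=20)
print(f"ON {on:.1f} min, OFF {off:.1f} min per 10-min cycle")
```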
Polymethylmethacrylate (PMMA) film of thickness 75 μm was evaluated spectrophotometrically for use as a low-dose gamma radiation dosimeter. Doses were examined in the range 0.1 mrad to 10 krad. Within an absorption band of 200-400 nm, the irradiated films showed an increase in absorption intensity with increasing absorbed dose. Calibration curves for the changes in the absorption differences were obtained at 218, 301, and 343 nm. At 218 nm, the response to the absorbed dose is linear in the range 10 mrad to 10 krad. Hence, it is recommended for adoption as an environmental low-dose dosimeter.
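Since the 218 nm response is reported as linear over 10 mrad to 10 krad, a generic calibration relation of the following form could be fitted and inverted for dose read-out; the coefficients $a$ and $b$ are placeholders, not values from the study.

```latex
\Delta A_{218} = a\,D + b,
\qquad
D = \frac{\Delta A_{218} - b}{a},
\qquad
10\ \text{mrad} \le D \le 10\ \text{krad}
```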
Quadruped (four-legged) robot locomotion has potential for use in different applications, such as walking over soft and rough terrain, while guaranteeing mobility and flexibility. In general, quadruped robots have three main periodic gaits: the creeping gait, the running gait, and the galloping gait. The main problem of the quadruped robot during walking is the need to remain statically stable for slow gaits such as the creeping gait. Statically stable walking as a condition depends on the stability margins calculated particularly for this gait. In this paper, the creeping gait sequence analysis of each leg step during the swing and fixed phases has been carried out. The calculation of the minimum stability margins depends up
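For context on the stability-margin condition, a common formulation (not necessarily the one used in the paper) takes the margin as the shortest distance from the projected centre of mass to the edges of the support polygon formed by the stance feet. The sketch below computes it for a three-legged support phase of the creeping gait; the foot coordinates are made up for illustration.

```python
import math

def point_segment_distance(p, a, b):
    """Shortest distance from point p to segment ab in the ground plane."""
    px, py = p; ax, ay = a; bx, by = b
    abx, aby = bx - ax, by - ay
    apx, apy = px - ax, py - ay
    denom = abx * abx + aby * aby
    t = 0.0 if denom == 0 else max(0.0, min(1.0, (apx * abx + apy * aby) / denom))
    cx, cy = ax + t * abx, ay + t * aby
    return math.hypot(px - cx, py - cy)

def static_stability_margin(com_xy, support_feet):
    """Minimum distance from the projected centre of mass to the edges of
    the support polygon; meaningful as a margin only when the centre of
    mass projection lies inside the polygon."""
    n = len(support_feet)
    return min(point_segment_distance(com_xy, support_feet[i],
                                      support_feet[(i + 1) % n])
               for i in range(n))

# Creeping gait: three feet support the body while the fourth leg swings.
feet = [(0.0, 0.0), (0.4, 0.0), (0.2, 0.6)]
print(static_stability_margin((0.2, 0.2), feet))
```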
The purpose of this paper is to statistically classify and categorize Building Information Modelling (BIM)-Facility Management (FM) publications in order to extract useful information related to the adoption and use of BIM in FM.
This study employs a quantitative approach, using science mapping techniques to examine BIM-FM publications in the Web of Science (WOS) database for the period between 2000 and April 2018.
The findi
Background: The referral system constitutes a key element of the health system. An effective referral system between different levels of health care delivery represents a cornerstone in addressing patients' health needs.
Objectives: To assess the referral system in Baghdad/Al-Rusafa Health Directorate by evaluating the referral pattern and identifying the quality of the referral letters and feedback reports.
Type of the study: A cross-sectional study.
Methodology: The study was conducted in five primary health care centres (PHCCs) at Baghdad/Al-Rusafa Health Directorate from 1st July 2015 to 31st December 2015. The study population (sampled population) included all ref
The main intention of this study was to investigate the development of a new optimization technique based on the differential evolution (DE) algorithm for the purpose of linear frequency modulation radar signal de-noising. As the standard DE algorithm is a fixed-length optimizer, it is not suitable for solving signal de-noising problems that call for variability. A modified crossover scheme called rand-length crossover was designed to fit the proposed variable-length DE, and the new DE algorithm is referred to as the random variable-length crossover differential evolution (rvlx-DE) algorithm. The measurement results demonstrate a highly efficient capability for target detection in terms of frequency response and peak forming that was isola
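The truncated abstract does not give the exact rand-length crossover rule, so the snippet below is only a hedged guess at how a variable-length DE crossover might look: the trial vector's length is drawn between the target's and the mutant's lengths, and genes are then mixed binomially as in standard DE. The function name and the cr value are illustrative assumptions.

```python
import random

def rand_length_crossover(target, mutant, cr=0.9):
    """Illustrative variable-length crossover: the trial vector's length is
    drawn uniformly between the lengths of the target and the mutant, then
    each position takes the mutant gene with probability cr (binomial-style
    DE crossover), falling back to the target where possible."""
    lo, hi = sorted((len(target), len(mutant)))
    trial_len = random.randint(lo, hi)
    j_rand = random.randrange(min(trial_len, len(mutant)))  # force one mutant gene
    trial = []
    for j in range(trial_len):
        take_mutant = (j == j_rand or random.random() < cr)
        if take_mutant and j < len(mutant):
            trial.append(mutant[j])
        elif j < len(target):
            trial.append(target[j])
        else:
            trial.append(mutant[j])
    return trial

# Example: a 3-gene target crossed with a 5-gene mutant.
print(rand_length_crossover([0.1, 0.2, 0.3], [0.9, 0.8, 0.7, 0.6, 0.5]))
```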
The Internet provides vital communications between millions of individuals. It is also increasingly utilized as a commerce tool; thus, security is of high importance for securing communications and protecting vital information. Cryptography algorithms are essential in the field of security. Brute force attacks are the major attacks on the Data Encryption Standard. This is the main reason that warranted the need for an improved structure of the Data Encryption Standard algorithm. This paper proposes a new, improved structure for the Data Encryption Standard to make it secure and immune to attacks. The improved structure of the Data Encryption Standard was accomplished using the standard Data Encryption Standard with a new way of two key gene
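To make the brute-force motivation concrete, a back-of-the-envelope calculation contrasts the 56-bit DES keyspace with the nominal keyspace of a scheme using two independent 56-bit keys; the search rate below is an assumed figure for illustration only, not from the paper.

```python
# Effective keyspaces: single-key DES vs. a hypothetical two-independent-key variant.
single_key_space = 2 ** 56        # DES uses a 56-bit effective key
two_key_space = 2 ** (56 * 2)     # nominal upper bound with two independent 56-bit keys

keys_per_second = 1e12            # assumed exhaustive-search rate, purely illustrative
years = lambda space: space / keys_per_second / (3600 * 24 * 365)

print(f"56-bit keyspace : {single_key_space:.3e} keys, ~{years(single_key_space):.3e} years to exhaust")
print(f"112-bit keyspace: {two_key_space:.3e} keys, ~{years(two_key_space):.3e} years to exhaust")
```

Note that naively cascading two keys is weakened by meet-in-the-middle attacks, so simply encrypting twice is not enough; strengthened designs typically change how keys are generated or scheduled inside the cipher structure instead.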