Pathology reports are essential for specialists to diagnose diseases in general and blood diseases in particular: to diagnose a disease, a specialist must analyze the patient's blood parameters together with the medical history. Doctors increasingly rely on intelligent agents to assist with complete blood count (CBC) analysis, but these agents need analytical tools to extract the CBC parameters used to predict the development of life-threatening bacteremia and to offer prognostic data. This paper therefore proposes an enhancement to the Rabin–Karp algorithm and combines it with the fuzzy ratio to make the algorithm suitable for working with CBC test data. These algorithms were selected after evaluating the utility of a range of string matching algorithms, in order to build an accurate text collection tool as a baseline for generating a general report on patient information. The proposed method proceeds in several basic steps. First, the CBC-driven parameters are extracted with an efficient method for retrieving information from PDF files or images of the CBC tests. Twelve traditional string matching algorithms are implemented, the most effective ones are identified from the implementation results, and a hybrid approach is then introduced to address the shortcomings of those methods and obtain a faster, more effective algorithm for analyzing the pathological tests. The proposed algorithm (Razy) combines the Rabin–Karp algorithm with the fuzzy ratio method. The results show that the proposed algorithm is fast and efficient, with an average accuracy of 99.94% when retrieving results.
Moreover, we conclude that the string matching algorithm is a crucial tool in the report analysis process and directly affects the efficiency of the analytical system.
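As a minimal sketch of the hybrid idea described above (exact Rabin–Karp matching backed by a fuzzy-ratio fallback), the following Python uses `difflib.SequenceMatcher` as a stand-in for the fuzzy ratio. The report lines, parameter names, and the 80-point threshold are illustrative assumptions, not values or code from the paper.

```python
from difflib import SequenceMatcher

def rabin_karp_find(text: str, pattern: str, base: int = 256, mod: int = 10**9 + 7) -> int:
    """Return the first index of an exact match of `pattern` in `text`, else -1."""
    n, m = len(text), len(pattern)
    if m == 0 or m > n:
        return -1
    high = pow(base, m - 1, mod)          # weight of the window's leading character
    p_hash = t_hash = 0
    for i in range(m):                    # hashes of the pattern and the first window
        p_hash = (p_hash * base + ord(pattern[i])) % mod
        t_hash = (t_hash * base + ord(text[i])) % mod
    for i in range(n - m + 1):
        # verify on hash hit to rule out collisions
        if p_hash == t_hash and text[i:i + m] == pattern:
            return i
        if i < n - m:                     # roll the hash one character forward
            t_hash = ((t_hash - ord(text[i]) * high) * base + ord(text[i + m])) % mod
    return -1

def fuzzy_ratio(a: str, b: str) -> float:
    """Similarity score in [0, 100], analogous to a simple fuzzy ratio."""
    return 100.0 * SequenceMatcher(None, a.lower(), b.lower()).ratio()

def find_parameter(report_lines, parameter, threshold=80.0):
    """Exact Rabin-Karp match first; fall back to the best fuzzy token match."""
    best_line, best_score = None, 0.0
    for line in report_lines:
        if rabin_karp_find(line.upper(), parameter.upper()) != -1:
            return line                   # label read cleanly, e.g. by OCR
        for token in line.split():        # tolerate OCR noise like "Platelats"
            score = fuzzy_ratio(token, parameter)
            if score > best_score:
                best_line, best_score = line, score
    return best_line if best_score >= threshold else None
```

For example, `find_parameter(lines, "Platelets")` can still locate a line whose label was garbled to "Platelats" by OCR, which is the kind of robustness the fuzzy fallback adds over exact matching alone.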
Hedging is the linguistic avoidance of full commitment or precision: the use of vague language. The main objectives of this study are to
Sampling is the selection of a representative portion of a material, and it is as important as the testing itself. The minimum weight of a gravel field or laboratory sample depends on the nominal maximum particle size, and the sample weight will always be greater than the portion required for testing; the approximate precision desired for the testing controls the weight of the gravel sample. In this study, a gravel sample has been simulated using a multilinear approximation of Fuller's curve on the logarithmic scale. Gravel particles are divided into classes according to their medium diameter, and each class was simulated separately. A stochastic analysis, using 100 realizations in s
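The gradation model referred to above can be illustrated with the classical Fuller–Thompson ideal curve, P(d) = 100·(d/D)^n with n ≈ 0.5, approximated piecewise-linearly on the logarithmic sieve-size scale. This is a sketch under that assumption; the sieve sizes below are standard illustrative values, not the study's data, and the function names are hypothetical.

```python
import math

def fuller_passing(d: float, d_max: float, n: float = 0.5) -> float:
    """Percent passing sieve size d (mm) on the Fuller-Thompson ideal curve."""
    return 100.0 * (d / d_max) ** n

def multilinear_log_approx(sieves, d_max, n=0.5):
    """Anchor the curve at each sieve size; between anchors, percent passing is
    interpolated linearly in log10(d) - a multilinear approximation on the
    logarithmic scale."""
    anchors = [(math.log10(d), fuller_passing(d, d_max, n)) for d in sieves]

    def passing(d: float) -> float:
        x = math.log10(d)
        for (x0, y0), (x1, y1) in zip(anchors, anchors[1:]):
            if x0 <= x <= x1:
                return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
        raise ValueError("d outside the sieve range")

    return passing

# Example: common gravel sieves, 25 mm nominal maximum particle size
curve = multilinear_log_approx([2.36, 4.75, 9.5, 12.5, 19.0, 25.0], d_max=25.0)
```

Sampling diameters from such a curve, class by class, is one way a stochastic realization of a gravel sample could be generated.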
Praise be to God, Lord of the worlds, and prayers and peace be upon our master Muhammad and upon his family and companions as a whole.
The topic of consideration drew the attention of the speakers (the scholars of kalām), which led me to ponder it carefully; my goal is to understand why the speakers were interested in it and took such care over it, and to clarify from their books why they made this topic one of the advanced investigations in their discipline.
The idea of writing a paper on the subject of consideration was never absent from my thinking, because I saw the speakers' attention to this issue: they made it one of the first issues they take up in their works.
Fine aggregate (sand) is a necessary material in concrete construction. It is naturally available and widely used around the world in different parts of any building, mainly for filling the voids between gravel. Sand gradation is important for different composite materials, and it gives good cohesion compared with coarse sand, which provides strength for the building. Sand therefore needs to be tested before it is used and mixed with other building materials in construction, and the specimen must be selected carefully to represent the real material in the field. The specimen weight must be larger than the weight required for the test. When t
The most influential theory of 'Politeness' was formulated in 1978 and revised in 1987 by Brown and Levinson. 'Politeness', which represents the interlocutors' desire to be pleasant to each other through a positive manner of addressing, was claimed to be a universal phenomenon. The gist of the theory is the intention to mitigate 'Face' threats carried by certain 'Face'-threatening acts towards others.
'Politeness Theory' is based on the concept that interlocutors have 'Face' (i.e., a self- and public image) which they consciously project and try to protect and preserve. The theory holds that various politeness strategies are used to prot
This article presents a comprehensive study of edge detection methods and algorithms for digital images, a basic process in the field of image processing and analysis. The purpose of edge detection is to discover the borders that separate the distinct areas of an image, which contributes to a better understanding of the image contents and to extracting structural information. The article starts by clarifying the idea of an edge and its importance in image analysis, then surveys the most prominent edge detection methods used in this field (e.g., the Sobel, Prewitt, and Canny filters), besides other schemes based on detecting abrupt changes in light intensity and color gradient. The research also discuss
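As a minimal illustration of the gradient-based filters named above, the sketch below applies the two Sobel kernels with a naive 2-D convolution and combines them into a gradient magnitude. No image library is assumed; the synthetic step image is an illustrative example, not material from the article.

```python
import numpy as np

KX = np.array([[-1, 0, 1],
               [-2, 0, 2],
               [-1, 0, 1]], dtype=float)   # responds to vertical edges
KY = KX.T                                   # responds to horizontal edges

def convolve2d(img, kernel):
    """Naive 'valid' 2-D convolution (output shrinks by kernel_size - 1)."""
    kh, kw = kernel.shape
    h, w = img.shape
    flipped = kernel[::-1, ::-1]            # true convolution flips the kernel
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * flipped)
    return out

def sobel_magnitude(img):
    gx = convolve2d(img, KX)
    gy = convolve2d(img, KY)
    return np.hypot(gx, gy)                 # sqrt(gx**2 + gy**2)

# Synthetic image: dark left half, bright right half -> one vertical edge
img = np.zeros((6, 6))
img[:, 3:] = 255.0
mag = sobel_magnitude(img)
```

The gradient magnitude is large only in the columns straddling the intensity step, which is exactly the "abrupt change in light intensity" such filters are designed to localize.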
The demand for electronic-passport photo (frontal facial) images has grown rapidly. It now extends to Electronic Government (E-Gov) applications such as social benefits, driver's licenses, e-passports, and e-visas. With COVID-19 (coronavirus disease), facial (formal) images are becoming more widely used and are spreading quickly, being used to verify an individual's identity. Unfortunately, they carry insignificant detail in a constant background, which leads to huge byte consumption that affects storage space and transmission; the optimal solution aims to curtail the data size using compression techniques based on exploiting image redundancies efficiently.
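One simple instance of exploiting the constant-background redundancy described above is run-length encoding, which collapses long runs of identical background pixels into (value, count) pairs. The sketch below is a generic illustration of that idea, not the compression scheme proposed in the paper; the pixel values are made up.

```python
def rle_encode(pixels):
    """Collapse runs of identical values into (value, run_length) pairs."""
    if not pixels:
        return []
    runs, current, count = [], pixels[0], 1
    for p in pixels[1:]:
        if p == current:
            count += 1
        else:
            runs.append((current, count))
            current, count = p, 1
    runs.append((current, count))
    return runs

def rle_decode(runs):
    """Inverse of rle_encode: expand each run back into pixels."""
    out = []
    for value, count in runs:
        out.extend([value] * count)
    return out

# A scanline that is mostly uniform background compresses strongly:
# 100 pixels collapse to just 5 runs.
row = [255] * 90 + [34, 35, 36] + [255] * 7
encoded = rle_encode(row)
```

Real formal-photo codecs exploit spatial and statistical redundancy in far more sophisticated ways, but the storage saving on a flat background follows the same principle.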
Emotion can be expressed through unimodal, bimodal, or multimodal social behaviours. This survey describes the background of facial emotion recognition and surveys emotion recognition using the visual modality. Some publicly available datasets are covered for performance evaluation. A summary of research efforts to classify emotion using the visual modality over the five years from 2013 to 2018 is given in tabular form.