COVID-19 has spread rapidly around the world due to the lack of a suitable vaccine; therefore, the early prediction of infected individuals is extremely important for attempting to control the disease by quarantining infected people and giving them medical attention to limit its spread. This work proposes a model for predicting COVID-19 infection using feature selection techniques. The proposed model consists of three stages: the preprocessing stage, the feature selection stage, and the classification stage. The work uses a dataset of 8,571 records with forty features for patients from different countries. Two feature selection techniques are used to select the features that most affect the prediction of the proposed model: Recursive Feature Elimination (RFE) as a wrapper feature selection method and the Extra Tree Classifier (ETC) as an embedded feature selection method. Two classification methods are applied to the feature vectors: the Naïve Bayesian method and the Restricted Boltzmann Machine (RBM) method. The resulting accuracies were 56.181% and 97.906%, respectively, when classifying all features, and 66.329% and 99.924%, respectively, when classifying the best ten features chosen by the feature selection techniques.
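The two-stage pipeline described above (wrapper and embedded selection of ten features, then classification) can be sketched as follows. This is a minimal illustration with synthetic data standing in for the paper's 8,571-record, 40-feature dataset; the estimator inside RFE and all hyperparameters are illustrative assumptions, not the authors' exact configuration.

```python
# Hypothetical sketch: RFE (wrapper) and Extra Trees (embedded) feature
# selection, each keeping 10 features, followed by a Naive Bayes classifier.
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.feature_selection import RFE, SelectFromModel
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

# Synthetic stand-in for the paper's dataset (8571 records, 40 features).
X, y = make_classification(n_samples=8571, n_features=40,
                           n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Wrapper selection: RFE recursively drops the weakest features
# according to an external estimator (logistic regression here).
rfe = RFE(LogisticRegression(max_iter=1000),
          n_features_to_select=10).fit(X_tr, y_tr)

# Embedded selection: Extra Trees ranks features by impurity importance;
# threshold=-1.0 defers entirely to the max_features cap.
etc = SelectFromModel(ExtraTreesClassifier(random_state=0),
                      max_features=10, threshold=-1.0).fit(X_tr, y_tr)

for name, sel in [("RFE", rfe), ("ETC", etc)]:
    clf = GaussianNB().fit(sel.transform(X_tr), y_tr)
    acc = clf.score(sel.transform(X_te), y_te)
    print(f"{name}: 10 features, accuracy = {acc:.3f}")
```

The RBM classifier from the abstract is omitted here for brevity; the same selected-feature matrices would simply be fed to it in place of `GaussianNB`.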
Polycystic ovary syndrome (PCOS) is the main cause of female infertility. The role of insulin resistance in the development of polycystic ovaries is actively discussed. The study included patients with PCOS without insulin resistance (n = 48) and with insulin resistance (n = 39). The comparison groups were patients with no history of PCOS: a control group without insulin resistance (n = 46) and a group of patients with insulin resistance (n = 45). The following parameters were determined for the patients: FSH, LH, TSH, T3f, T4f, PRL, E2, 17-OHd, Pr, AMH, Test total, Testf, DHEAS, DHEASs, SHBG, ACTH, cortisol, IRI, IGF-1, C-peptide, and glucose level. The HOMA-IR index and the LH/FSH ratio and t
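The HOMA-IR index mentioned in the abstract is computed from fasting insulin and fasting glucose. A minimal sketch, using the standard formula (insulin in µU/mL times glucose in mmol/L, divided by 22.5); note that the example values and any diagnostic cutoff are illustrative, as thresholds vary between laboratories and studies.

```python
def homa_ir(fasting_insulin_uU_ml: float, fasting_glucose_mmol_l: float) -> float:
    """Homeostatic Model Assessment of Insulin Resistance (standard formula)."""
    return fasting_insulin_uU_ml * fasting_glucose_mmol_l / 22.5

# Illustrative example: fasting insulin 18 µU/mL, fasting glucose 5.0 mmol/L.
print(round(homa_ir(18.0, 5.0), 2))  # → 4.0
```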
MM Abdulwahhab, Kufa Journal for Nursing Sciences, 2017 - Cited by 1
Image compression is a serious issue in computer storage and transmission; it simply makes efficient use of the redundancy embedded within an image itself and, in addition, may exploit human vision or perception limitations to reduce imperceivable information. Polynomial coding is a modern image compression technique based on a modelling concept that effectively removes the spatial redundancy embedded within the image; it is composed of two parts, the mathematical model and the residual. In this paper, a two-stage proposed technique is adopted that starts by utilizing the lossy predictor model along with a multiresolution base and thresholding techniques in the first stage, followed by incorporating the near-lossless com
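The model-plus-residual decomposition at the heart of polynomial coding can be sketched as below. This is an illustrative neighbour-averaging predictor, not the paper's fitted polynomial model, and the near-lossless thresholding of the residual is omitted; the point is only that the image splits exactly into a prediction and a (typically small, easily compressed) residual.

```python
import numpy as np

def predict_and_residual(img):
    """Predict each pixel from causal neighbours; return (prediction, residual)."""
    img = img.astype(np.int32)
    pred = np.zeros_like(img)
    # Interior: average of the left and upper neighbours.
    pred[1:, 1:] = (img[1:, :-1] + img[:-1, 1:]) // 2
    # Borders: copy the single available causal neighbour.
    pred[0, 1:] = img[0, :-1]
    pred[1:, 0] = img[:-1, 0]
    residual = img - pred
    return pred, residual

img = np.array([[10, 12, 13],
                [11, 12, 14],
                [12, 13, 15]])
pred, res = predict_and_residual(img)
# Lossless reconstruction: model + residual recovers the original exactly.
assert np.array_equal(pred + res, img)
```

In a near-lossless variant, small residual values would be quantized or thresholded before entropy coding, trading a bounded per-pixel error for a higher compression ratio.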
This study explores the challenges Artificial Intelligence (AI) systems face in generating image captions, a task that requires effective integration of computer vision and natural language processing techniques. A comparative analysis is conducted between traditional approaches (such as retrieval-based methods and linguistic templates) and modern deep-learning-based approaches (such as encoder-decoder models, attention mechanisms, and transformers). Theoretical results show that modern models perform better in accuracy and in the ability to generate more complex descriptions, while traditional methods are superior in speed and simplicity. The paper proposes a hybrid framework that combines the advantages of both approaches, where conventional methods prod
The concept of narration has taken on an aesthetic field farther than the primitive human act that was imposed by the necessities of social communication in an ancient historical period. The importance of the research lies in connecting the concept of narration with the elements of theatre directing. The research aims at discovering the fields of narration in theatre directing represented by perceived videos, audio, and motions. The research time limit was 2014. The theoretical framework is divided into three chapters:
The first chapter addressed (the concept of narration in literature and criticism); the second addressed
Cohesion is well known as the study of the relationships, whether grammatical and/or lexical, between the different elements of a particular text through the use of what are commonly called 'cohesive devices'. These devices bring connectivity and bind a text together. Besides, the nature and the amount of such cohesive devices usually affect the understanding of that text, in the sense of making it easier to comprehend. The present study is intended to examine the use of grammatical cohesive devices in relation to narrative techniques. The story of Joseph from the Holy Quran has been selected to be examined using Halliday and Hasan's Model of Cohesion (1976, 1989). The aim of the study is to comparatively examine to what extent the type
For the most reliable and reproducible results for calibration or general testing purposes of two immiscible liquids, such as water in engine oil, good emulsification is vital. This study explores the impact of emulsion quality on the Fourier transform infrared (FT-IR) spectroscopy calibration standards for measuring water contamination in used or in-service engine oil, in an attempt to strengthen the specific guidelines of ASTM International standards for sample preparation. By using different emulsification techniques and readily available laboratory equipment, this work is an attempt to establish the ideal sample preparation technique for reliability, repeatability, and reproducibility for FT-IR analysis while still considering t
In regression testing, test case prioritization (TCP) is a technique for ordering all the available test cases. TCP techniques can improve fault detection performance, which is measured by the Average Percentage of Faults Detected (APFD). History-based TCP is a TCP technique that considers past execution data to prioritize test cases. The allocation of equal priority to test cases is a common problem for most TCP techniques; however, this problem has not been explored in history-based TCP techniques. To solve this problem in regression testing, most researchers resort to random ordering of test cases. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement
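The APFD metric named in the abstract has a standard closed form: for n tests and m faults, APFD = 1 − (TF₁ + … + TFₘ)/(n·m) + 1/(2n), where TFᵢ is the 1-based position of the first test in the prioritized order that detects fault i. A minimal sketch (the example positions are illustrative, not from the study):

```python
def apfd(first_detecting_positions, n_tests):
    """Average Percentage of Faults Detected for a prioritized test order.

    first_detecting_positions: 1-based position of the first test that
    detects each fault; n_tests: total number of tests in the suite.
    """
    m = len(first_detecting_positions)
    return 1 - sum(first_detecting_positions) / (n_tests * m) + 1 / (2 * n_tests)

# 5 tests; 3 faults first detected by the tests at positions 1, 2, and 4.
print(round(apfd([1, 2, 4], 5), 4))  # → 0.6333
```

A higher APFD means faults are detected earlier in the ordering, which is why TCP techniques are compared on it.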
Background: This study aimed to evaluate the long-term results of dacryocystorhinostomy (DCR) techniques in a specialized eye care center in Iraq.
Subjects and Methods: This is a prospective study of 650 patients with nasolacrimal duct obstruction at Ibn Al Haitham Eye Teaching Hospital from July 2014 to July 2019. A preoperative questionnaire was administered, then repeated at one month, three months, six months, and one year postoperatively. The success of surgery was defined as follows: complete absence of epiphora; resolution of dacryocele or mucocele with no new attack of dacryocystitis; appearance of fluorescein dye from the nose in the fluorescein disappearance test; successful irriga
To date, comprehensive reviews and discussions of the strengths and limitations of Remote Sensing (RS) standalone and combination approaches, and of Deep Learning (DL)-based RS datasets, in archaeology have been limited. The objective of this paper is, therefore, to review and critically discuss existing studies that have applied these advanced approaches in archaeology, with a specific focus on digital preservation and object detection. RS standalone approaches, including range-based and image-based modelling (e.g., laser scanning and SfM photogrammetry), have several disadvantages in terms of spatial resolution, penetration, textures, colours, and accuracy. These limitations have led some archaeological studies to fuse/integrate multip