Medical images play a crucial role in the classification of various diseases and conditions. X-ray imaging, one such modality, provides valuable visual information that helps in the identification and characterization of medical conditions. Chest radiograph (CXR) images have long been used to examine and monitor numerous lung disorders, such as tuberculosis, pneumonia, atelectasis, and hernia, and they can also be used to detect COVID-19. COVID-19, a viral disease that infects the lungs and the airways of the upper respiratory tract, was first identified in 2019 in Wuhan, China, and is thought to cause substantial airway damage, badly affecting the lungs of infected persons. The virus spread rapidly around the world, with growing numbers of cases and fatalities recorded daily. CXR can be used to monitor the effects of COVID-19 on lung tissue. This study presents a comparative analysis of the k-nearest neighbors (KNN), Extreme Gradient Boosting (XGBoost), and Support Vector Machine (SVM) classification approaches, combined with feature selection using the Moth-Flame Optimization (MFO), Grey Wolf Optimizer (GWO), and Glowworm Swarm Optimization (GSO) algorithms. The study employed a data set consisting of two parts. The first part contains 9,544 2D X-ray images, classified into two groups by validated tests: 5,500 images of healthy lungs and 4,044 images of lungs with COVID-19. The second part includes 800 images: 400 of healthy lungs and 400 of lungs affected by COVID-19. Each image was resized to 200x200 pixels. Precision, recall, and the F1-score were among the quantitative evaluation criteria used in this study.
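As a rough illustration of the evaluation pipeline described above, the sketch below trains the three named classifiers on flattened 200x200 image vectors and reports precision, recall, and F1. The random data, sample sizes, and hyperparameters are placeholders, and the swarm-based feature selection (MFO/GWO/GSO) is omitted; it would act by choosing a subset of the feature columns before training.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.metrics import precision_score, recall_score, f1_score
from xgboost import XGBClassifier

# Placeholder stand-in for the 200x200 CXR images, flattened to vectors.
rng = np.random.default_rng(0)
X = rng.random((200, 200 * 200)).astype(np.float32)
y = rng.integers(0, 2, 200)  # 0 = healthy, 1 = COVID-19

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

for name, clf in [("KNN", KNeighborsClassifier()),
                  ("SVM", SVC()),
                  ("XGBoost", XGBClassifier(n_estimators=50))]:
    pred = clf.fit(X_tr, y_tr).predict(X_te)
    print(name,
          precision_score(y_te, pred),
          recall_score(y_te, pred),
          f1_score(y_te, pred))
```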
The research addresses the problem marked (the concept of experimentation in cave drawings) and its role in the deviation of the formal features displayed in cave drawings. The research comes in four sections: a general framework that identifies the research problem, the need for it, and an indication of its importance; the research objectives (revealing the nature and role of experimentation in determining the nature of the drawings documented on cave walls); the establishment of the three research limits (objective, temporal, and spatial); and the definition of the terms related to the title. The theoretical framework and the indicators that resulted from academic theorizing are then presented…
The classical principal components analysis is sensitive to outliers because it is computed from the eigenvalues and eigenvectors of a non-robust correlation or variance matrix, which yields incorrect results when the data contain outlying values. To treat this problem, we resort to robust methods, of which there are many; some of them will be discussed here.
The robust estimators include direct robust estimation of the eigenvalues by using the eigenvectors, without relying on robust estimators of the variance and covariance matrices. Also, the analysis of the principal…
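To illustrate the general idea, a minimal sketch is given below contrasting classical PCA (eigendecomposition of the ordinary covariance matrix) with a robust variant built on the Minimum Covariance Determinant estimator. MCD is used here only as one representative robust covariance estimator, not necessarily the method the abstract has in mind.

```python
import numpy as np
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
X[:10] += 10  # inject a block of outliers

# Classical PCA: eigendecomposition of the ordinary covariance matrix,
# which the outliers distort heavily.
classical_vals, classical_vecs = np.linalg.eigh(np.cov(X, rowvar=False))

# Robust alternative: eigendecomposition of the MCD covariance estimate.
mcd = MinCovDet(random_state=0).fit(X)
robust_vals, robust_vecs = np.linalg.eigh(mcd.covariance_)

print("classical eigenvalues:", classical_vals)  # ascending order
print("robust eigenvalues:   ", robust_vals)
```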
The processes of model estimation and significant-variable selection are crucial in semi-parametric modeling. At the beginning of the modeling process there are often many explanatory variables, and to avoid the loss of any explanatory element that may be important, the selection of significant variables becomes necessary; variable selection is intended not only to simplify model complexity and interpretation, but also to improve prediction. In this research, semi-parametric methods (LASSO-MAVE, MAVE, and the proposed Adaptive LASSO-MAVE) were used for variable selection and estimation of the semi-parametric single-index model (SSIM) at the same time.
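As a simplified illustration of the penalized variable-selection component, the sketch below applies cross-validated LASSO to data generated from a single-index structure. It shows only the sparsity-inducing selection step; the MAVE estimation of the index direction is not reproduced here, since no standard Python implementation is assumed.

```python
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(1)
n, p = 200, 10
X = rng.normal(size=(n, p))

beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]   # only the first three variables are relevant
index = X @ beta              # the single linear index
y = index + 0.05 * index**3 + 0.1 * rng.normal(size=n)  # monotone link

# Cross-validated LASSO drives irrelevant coefficients to exactly zero.
lasso = LassoCV(cv=5).fit(X, y)
print("selected variables:", np.flatnonzero(lasso.coef_))
```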
The present study investigates the use of intensifiers as linguistic devices employed by Charles Dickens in Hard Times. For ease of analysis, the data are obtained by a rigorous observation of spontaneously occurring intensifiers in the text. The study aims at exploring the pragmatic functions and aesthetic impact of using intensifiers in Hard Times. The study is mainly descriptive-analytical and is based on analyzing and interpreting the use of intensifiers in terms of Holmes's (1984) and Cacchiani's (2009) models. The findings show that the novelist uses intensifiers heavily, to the extent that 280 intensifiers occur in the text, of which 218 are undistinguished…
This research studies the relationship between success factors as the independent variable and product strategies as the dependent variable; the interaction of these variables forms the research framework. The research problem centers on the extent to which industrial companies possess the vision and knowledge needed to survive and develop in the business market, which can be expressed as the extent of the companies' awareness of success factors, their use of product strategies, and the relationship between the two. The importance of the research lies in its focus on the product, which occupies paramount importance in industrial-sector companies in relation to…
Image classification is the process of finding common features in images from various classes and using them to categorize and label the images. The key obstacles in image classification are the abundance of images, the high complexity of the data, and the shortage of labeled data. The cornerstone of the approach is extracting convolutional features from deep learning models and training machine learning classifiers on them. This study proposes a new "hybrid learning" approach that combines deep learning with machine learning for image classification, based on convolutional feature extraction using the VGG-16 deep learning model and seven machine learning classifiers…
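A minimal sketch of this hybrid pattern follows: a frozen VGG-16 backbone extracts convolutional features, and a classical classifier is trained on them. The input size, the SVM choice, and the random placeholder data are assumptions for illustration; the study uses seven classifiers, of which one is shown.

```python
import numpy as np
from tensorflow.keras.applications.vgg16 import VGG16, preprocess_input
from sklearn.svm import SVC

# Frozen VGG-16 backbone used purely as a convolutional feature extractor;
# global average pooling yields one 512-dimensional vector per image.
backbone = VGG16(weights="imagenet", include_top=False, pooling="avg",
                 input_shape=(224, 224, 3))

def extract_features(images):
    """images: float array of shape (n, 224, 224, 3) with values in [0, 255]."""
    return backbone.predict(preprocess_input(images), verbose=0)

# Placeholder data; in practice these come from the labeled image set.
X_train = np.random.rand(8, 224, 224, 3) * 255
y_train = np.array([0, 1] * 4)

clf = SVC().fit(extract_features(X_train), y_train)
```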
Traffic classification is the task of categorizing traffic flows into application-aware classes such as chat, streaming, VoIP, etc. Most network traffic identification systems are based on features, which may be static signatures, port numbers, statistical characteristics, and so on. Although current data-flow classification methods are effective, they still lack inventive approaches to meet vital requirements such as real-time traffic classification, low power consumption, low Central Processing Unit (CPU) utilization, etc. Our novel Fast Deep Packet Header Inspection (FDPHI) traffic classification proposal employs a 1-Dimensional Convolution Neural Network (1D-CNN) to automatically learn more representational characteristics…
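The sketch below shows the general shape of a 1D-CNN operating on raw packet header bytes, in the spirit of the FDPHI proposal. The header length, class count, and layer sizes are illustrative assumptions, not the paper's actual architecture.

```python
from tensorflow.keras import layers, models

# Each sample: the first 40 header bytes of a packet, scaled to [0, 1],
# treated as a length-40 sequence with one channel.
HEADER_BYTES, N_CLASSES = 40, 4  # hypothetical sizes

model = models.Sequential([
    layers.Input(shape=(HEADER_BYTES, 1)),
    layers.Conv1D(32, kernel_size=3, activation="relu"),
    layers.MaxPooling1D(2),
    layers.Conv1D(64, kernel_size=3, activation="relu"),
    layers.GlobalAveragePooling1D(),
    layers.Dense(N_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```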
Recording an electromyogram (EMG) signal is essential for diagnostic procedures such as muscle health assessment and the study of motor neuron control. EMG signals have been used as a control source for powered prosthetics, helping people accomplish their activities of daily living (ADLs). This work studies different types of hand grips and their relationship with EMG activity. Five subjects carried out four functional movements (fine pinch, tripod grip, grip with the middle finger and thumb, and power grip). A hand dynamometer was used while recording the EMG activity from three muscles, namely the Flexor Carpi Radialis (FCR), Flexor Digitorum Superficialis (FDS), and Abductor Pollicis Brevis (ABP), with different…
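For readers unfamiliar with how raw EMG is turned into an activity measure, the sketch below shows a typical band-pass, rectify, and moving-RMS envelope pipeline. The sampling rate, filter band, and window length are common defaults assumed here, not parameters reported by the study.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000  # sampling rate in Hz (assumed)

def emg_envelope(raw, fs=FS, band=(20, 450), win_ms=100):
    """Band-pass filter, full-wave rectify, then moving-RMS envelope."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, raw)          # zero-phase band-pass
    rectified = np.abs(filtered)            # full-wave rectification
    win = int(fs * win_ms / 1000)
    return np.sqrt(np.convolve(rectified**2, np.ones(win) / win, mode="same"))

signal = np.random.randn(5 * FS)  # placeholder for a recorded FCR channel
envelope = emg_envelope(signal)   # smooth measure of muscle activation
```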
A new modified differential evolution algorithm, DE-BEA, is proposed to improve the reliability of the standard DE/current-to-rand/1/bin by implementing a new mutation scheme inspired by the bacterial evolutionary algorithm (BEA). The crossover and selection schemes of the DE method are also modified to fit the new DE-BEA mechanism. The new scheme diversifies the population by applying to every individual a segment-based scheme that generates multiple copies (clones) of each individual one by one and applies the BEA segment-wise mechanism. These new steps are embedded in the DE/current-to-rand/bin scheme. The performance of the new algorithm has been compared with several DE variants over eighteen benchmark functions, including several…
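For context, the sketch below implements the base DE/current-to-rand/1 mutation that the paper builds on; the BEA-inspired segment-wise cloning step and the modified crossover/selection are not shown. The scale factors F and K are conventional DE parameters assumed here.

```python
import numpy as np

def de_current_to_rand_1(pop, i, f=0.5, k=0.5, rng=None):
    """DE/current-to-rand/1 mutation for individual i:
    v_i = x_i + K*(x_r1 - x_i) + F*(x_r2 - x_r3),
    with r1, r2, r3 distinct indices different from i."""
    rng = rng or np.random.default_rng()
    others = [j for j in range(len(pop)) if j != i]
    r1, r2, r3 = rng.choice(others, size=3, replace=False)
    return pop[i] + k * (pop[r1] - pop[i]) + f * (pop[r2] - pop[r3])

pop = np.random.default_rng(0).uniform(-5, 5, size=(20, 10))
mutant = de_current_to_rand_1(pop, i=0)
```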
The research addresses the problem marked (performative treatments of time between Impressionism and Surrealism): the concept of time and how it is presented in artwork. The research comes in four sections: a general framework that identifies the research problem, the need for it, and an indication of its importance; the research objectives (revealing the performative treatments of the concept of time in the artworks of Impressionism and Surrealism, and comparing them to uncover similarities and differences); the establishment of the three research limits (objective, temporal, and spatial); and the definition of the terms related to the title. The theoretical framework is then provided…