The meniscus has a crucial function in human anatomy, and Magnetic Resonance Imaging (M.R.I.) plays an essential role in meniscus assessment. Identifying cartilage lesions with typical image processing approaches is difficult because M.R.I. data is so diverse. An M.R.I. sequence comprises numerous images, and the region of interest may differ from one image in the series to the next. Feature extraction therefore becomes more complicated, and traditional image processing in particular becomes very complex. In traditional image processing, a human tells the computer what should be there, whereas a deep learning (D.L.) algorithm automatically extracts the features of what is already there. Subtle surface changes are valuable when diagnosing a tissue sample: small, barely noticeable changes in pixel density may indicate early-stage cancer or torn tissue, details that even expert pathologists might miss. Artificial intelligence (A.I.) and D.L. have revolutionized radiology by enhancing the efficiency and accuracy of both interpretive and non-interpretive tasks. When looking at A.I. applications, it is important to consider how they work. The Convolutional Neural Network (C.N.N.) is a class of D.L. model that can be used to diagnose knee problems. Existing algorithms can detect and categorize cartilage lesions and meniscus tears on M.R.I., offer an automated quantitative evaluation of healing, and forecast who is most likely to suffer recurrent meniscus tears based on radiographs.
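As an illustration of the kind of C.N.N. classifier such algorithms rely on, the sketch below defines a small network that maps a single-channel M.R.I. slice to a normal/tear score. The 128x128 input size, the layer widths, and the MeniscusCNN name are assumptions made for demonstration only; none of the cited algorithms is reproduced here.

```python
import torch
import torch.nn as nn

class MeniscusCNN(nn.Module):
    """Small illustrative CNN: one-channel MRI slice -> normal/tear logits."""
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 128), nn.ReLU(),  # 128x128 input -> 16x16 feature maps
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# One 128x128 single-channel slice produces one logit per class (normal vs. tear).
logits = MeniscusCNN()(torch.randn(1, 1, 128, 128))
print(logits.shape)  # torch.Size([1, 2])
```

Trained slice by slice on labelled M.R.I. sequences, a network of this kind learns the small pixel-density changes described above rather than relying on hand-crafted rules.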
In recent years, the performance of Spatial Data Infrastructures for governments and companies has gained ample attention. Different categories of geospatial data, such as digital maps, coordinates, web maps, and aerial and satellite images, are required to realize the geospatial data components of Spatial Data Infrastructures. In general, two distinct types of geospatial data sources exist over the Internet: formal and informal data sources. Despite the growth of informal geospatial data sources, the integration of different free sources has not been achieved effectively. Addressing this task can be considered the main contribution of this research. This article addresses the research question of how the …
This paper focuses on the optimization of drilling parameters using the Taguchi method to obtain the minimum surface roughness. Nine drilling experiments were performed on Al 5050 alloy using high-speed steel twist drills. Three drilling parameters (feed rate, cutting speed, and cutting tool) were used as control factors, and an L9 (3³) orthogonal array was specified for the experimental trials. The signal-to-noise (S/N) ratio and analysis of variance (ANOVA) were utilized to determine the optimum control factors that minimized the surface roughness. The results were analyzed with the aid of the statistical software package MINITAB-17. After the experimental trials, the tool diameter was found to be the most important factor …
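For reference, the smaller-the-better S/N ratio that underlies this kind of Taguchi analysis is S/N = -10 * log10((1/n) * Σ yᵢ²). The short sketch below computes it for a set of invented roughness readings; the values are purely illustrative and are not the paper's measurements.

```python
import numpy as np

def sn_smaller_is_better(y):
    """Taguchi S/N ratio for a smaller-the-better response such as surface roughness (Ra)."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

# Hypothetical Ra readings (micrometres) for one row of the L9 array.
print(round(sn_smaller_is_better([1.82, 1.75, 1.90]), 3))  # approx. -5.22 dB
```

For each control factor, the level with the highest mean S/N across its rows of the L9 array is the one the analysis selects as optimal.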
Biometrics represent the most practical method for swiftly and reliably verifying and identifying individuals based on their unique biological traits. This study addresses the increasing demand for dependable biometric identification systems by introducing an efficient approach to automatically recognize ear patterns using Convolutional Neural Networks (CNNs). Despite the widespread adoption of facial recognition technologies, the distinct features and consistency inherent in ear patterns provide a compelling alternative for biometric applications. Employing CNNs in our research automates the identification process, enhancing accuracy and adaptability across various ear shapes and orientations. The ear, being visible and easily captured in …
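One common way to realize such a recognizer, shown here only as a hedged sketch and not as the network described in the study, is to fine-tune a pretrained backbone so that its final layer emits one score per enrolled subject. The 224x224 input size, the subject count, and the choice of ResNet-18 are assumptions for demonstration.

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_SUBJECTS = 50  # hypothetical number of enrolled people, not a figure from the paper

# Load an ImageNet-pretrained ResNet-18 (weights are downloaded on first use)
# and replace its classification head with one output per subject.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = nn.Linear(backbone.fc.in_features, NUM_SUBJECTS)

batch = torch.randn(4, 3, 224, 224)  # a batch of four ear crops resized to 224x224
print(backbone(batch).shape)         # torch.Size([4, 50]): one score per subject
```

Fine-tuning a pretrained backbone is a common shortcut when ear datasets are small, since early convolutional filters transfer well from natural images.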
Face recognition is required in various applications, and major progress has been witnessed in this area. Many face recognition algorithms have been proposed thus far; however, achieving high recognition accuracy and low execution time remains a challenge. In this work, a new scheme for face recognition is presented using hybrid orthogonal polynomials to extract features. The embedded image kernel technique is used to decrease the complexity of feature extraction, and a support vector machine is then adopted to classify these features. Moreover, a fast overlapping-block processing algorithm for feature extraction is used to reduce the computation time. Extensive evaluation of the proposed method was carried out on two different face image …
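The sketch below illustrates the general shape of such a pipeline: block-wise orthogonal-polynomial moments are computed for each image and fed to an SVM. A QR-orthonormalised monomial basis stands in for the paper's hybrid orthogonal polynomials, and the block size, polynomial degree, and synthetic data are assumptions made only to show the flow.

```python
import numpy as np
from sklearn.svm import SVC

def orthonormal_poly_basis(n: int, degree: int) -> np.ndarray:
    """Orthonormal polynomial basis over n sample points (a stand-in for the
    hybrid orthogonal polynomials used in the paper)."""
    x = np.linspace(-1.0, 1.0, n)
    vander = np.vander(x, degree, increasing=True)  # columns 1, x, x^2, ...
    q, _ = np.linalg.qr(vander)                     # orthonormalise the columns
    return q.T                                      # shape (degree, n)

def block_moment_features(img: np.ndarray, block: int = 8, degree: int = 4) -> np.ndarray:
    """Moments T @ B @ T.T of every block B, concatenated into one feature vector."""
    t = orthonormal_poly_basis(block, degree)
    feats = []
    for r in range(0, img.shape[0] - block + 1, block):
        for c in range(0, img.shape[1] - block + 1, block):
            b = img[r:r + block, c:c + block]
            feats.append((t @ b @ t.T).ravel())
    return np.concatenate(feats)

# Toy data: eight synthetic 32x32 "faces" in two classes, just to exercise the pipeline.
rng = np.random.default_rng(0)
X = np.stack([block_moment_features(rng.random((32, 32))) for _ in range(8)])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])
clf = SVC(kernel="rbf").fit(X, y)
print(clf.predict(X[:2]))
```

Because each block contributes only degree² moment coefficients, the feature vector is much shorter than the raw pixel vector, which is one reason moment-based features can keep the classifier fast.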
Volunteerism is an element of many human cultures. It represents a positive cooperative act between individuals and groups and expresses social value systems. As a social phenomenon, it develops in societies according to innumerable circumstances and conditions. This study uses a functional approach that assumes that volunteering performs six functions for volunteers. Namely, we assume that volunteering (1) creates a sense of protection, (2) meets significant cultural values, (3) improves the professional status of volunteers, (4) strengthens their social relationships, (5) helps them achieve a better understanding of life, and finally, (6) enhances their outlook and self-esteem. The central aim of the study is to discuss these functions …
The present study aims to detect CTX-M-type ESBL from Escherichia coli clinical isolates and to analyze their antibiotic susceptibility patterns. One hundred E. coli isolates were collected from different clinical samples at a tertiary hospital. ESBL positivity was determined by the disk diffusion method, and PCR was used to amplify the CTX-M-type ESBL genes produced by E. coli. Out of 100 E. coli isolates, twenty-four (24%) were ESBL producers. E. coli isolated from pus was the most frequent clinical specimen that produced ESBL (41.66%), followed by urine (34.21%), respiratory (22.23%), and blood (19.05%) specimens. After PCR amplification of these 24 isolates, 10 (41.66%) were found to possess CTX-M genes. The CTX-M-type ESBL …
In today's world, the science of bioinformatics is developing rapidly, especially with regard to the analysis and study of biological networks. Scientists have used various nature-inspired algorithms to find protein complexes in protein-protein interaction (PPI) networks. These networks help scientists infer the molecular function of unknown proteins and show how cells carry out their regular functions. It is very common in PPI networks for a protein to participate in multiple functions and belong to many complexes, and as a result, complexes may overlap within the network. However, developing an efficient and reliable method for detecting overlapping protein complexes remains a challenge, since it is considered a complex and hard …
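As a simple baseline that makes the notion of overlap concrete (it is not the nature-inspired method the paper is concerned with), clique percolation allows a protein to belong to more than one detected complex. The toy network below is entirely hypothetical.

```python
import networkx as nx
from networkx.algorithms.community import k_clique_communities

# Toy PPI graph; the edges are invented interactions, not real protein data.
G = nx.Graph([
    ("A", "B"), ("A", "C"), ("B", "C"),   # a small triangle of proteins
    ("C", "D"), ("C", "E"), ("D", "E"),   # a second dense group sharing protein C
    ("E", "F"), ("D", "F"),
])

# Clique percolation with k = 3 merges adjacent triangles into complexes,
# and a single protein may end up in several of them.
complexes = [set(c) for c in k_clique_communities(G, 3)]
print(complexes)  # e.g. [{'A', 'B', 'C'}, {'C', 'D', 'E', 'F'}]
```

In the output, protein C appears in both detected complexes, which is exactly the overlapping behaviour that makes this problem harder than disjoint clustering.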
Stock markets move up and down over time, and some companies affect others because of their mutual dependencies. In this work, the network model of the stock market is described as a complete weighted graph. This paper aims to investigate the Iraqi stock market using graph-theory tools. The vertices of this graph correspond to the companies in the Iraqi market, and the weights of the edges are set to the ultrametric distances derived from the minimum spanning tree.
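A minimal sketch of this construction is given below, using synthetic returns and hypothetical ticker names in place of the actual Iraqi market data: correlations between returns are mapped to the Mantegna distance d(i, j) = sqrt(2 * (1 - corr(i, j))), the complete weighted graph is built, and its minimum spanning tree is extracted.

```python
import numpy as np
import networkx as nx

# Synthetic daily returns for five hypothetical tickers (stand-ins for Iraqi companies).
rng = np.random.default_rng(1)
returns = rng.normal(size=(250, 5))
tickers = ["BNK1", "BNK2", "TEL1", "IND1", "IND2"]

# Mantegna-style distance d_ij = sqrt(2 * (1 - corr_ij)) turns correlations into a metric.
corr = np.corrcoef(returns, rowvar=False)
dist = np.sqrt(2.0 * (1.0 - corr))

# Complete weighted graph: every pair of companies gets one weighted edge.
G = nx.Graph()
for i in range(len(tickers)):
    for j in range(i + 1, len(tickers)):
        G.add_edge(tickers[i], tickers[j], weight=dist[i, j])

# The minimum spanning tree keeps the strongest links between companies.
mst = nx.minimum_spanning_tree(G, weight="weight")
print(sorted(mst.edges(data="weight")))
```

The subdominant ultrametric distance between two companies is then the largest edge weight along the unique MST path connecting them, which is how the edge weights referred to in the abstract are obtained.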