The meniscus has a crucial function in human anatomy, and Magnetic Resonance Imaging (M.R.I.) plays an essential role in meniscus assessment. It is difficult to identify cartilage lesions using typical image processing approaches because M.R.I. data is so diverse. An M.R.I. data sequence comprises numerous images, and the region of interest may differ from one image in the series to the next. Feature extraction therefore becomes more complicated, and traditional image processing in particular grows very complex. In traditional image processing, a human tells a computer what should be there, whereas a deep learning (D.L.) algorithm automatically extracts the features of what is already there. Surface changes become valuable when diagnosing a tissue sample: small, unnoticeable changes in pixel density may indicate the early stages of cancer or tear tissue, details that even expert pathologists might miss. Artificial intelligence (A.I.) and D.L. have revolutionized radiology by enhancing the efficiency and accuracy of both interpretive and non-interpretive tasks. When evaluating A.I. applications, it is important to consider how they operate. The Convolutional Neural Network (C.N.N.) is a class of D.L. model that can be used to diagnose knee problems. Existing algorithms can detect and categorize cartilage lesions and meniscus tears on M.R.I., offer an automated quantitative evaluation of healing, and forecast who is most likely to have recurring meniscus tears based on radiographs.
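The contrast drawn above, between hand-designed filters and learned ones, comes down to the convolution operation at the heart of a C.N.N. layer. The following minimal sketch (illustrative only; the image, kernel, and sizes are hypothetical, and production systems use learned kernels in a D.L. framework) shows a hand-crafted edge-detection kernel responding to a sharp intensity step, the kind of subtle pixel-density change described above:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2D convolution (cross-correlation, as in C.N.N. usage):
    the core operation a convolutional layer applies. In traditional
    image processing the kernel is hand-designed; in a C.N.N. the
    kernel weights are learned from data."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A hand-crafted vertical-edge (Sobel) kernel: the "human tells the
# computer what should be there" case.
sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)

# Hypothetical 8x8 "slice" with a sharp vertical intensity step,
# standing in for a density change in an M.R.I. image.
slice_ = np.zeros((8, 8))
slice_[:, 4:] = 1.0

response = conv2d(slice_, sobel_x)
print(response.shape)   # (6, 6)
```

The filter responds strongly only at the columns where the intensity changes; a C.N.N. learns many such kernels automatically rather than having them specified by hand.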
The research aimed to identify “The impact of an instructional-learning design based on the brain-compatible model on systemic thinking among first intermediate grade female students in Mathematics”, in the day schools of the second Karkh Educational Directorate. In order to achieve the research objective, the following null hypothesis was formulated: there is no statistically significant difference at the significance level (0.05) between the average scores of the experimental group students, who will be taught by applying an (instructional-learning) design based on the brain-compatible model, and the average scores of the control group students, who will be taught through the traditional method, in the systemic thinking test. The resear
The chemical bath deposition (CBD) technique is considered the cheapest and easiest compared with other deposition techniques. However, it is highly sensitive to the values of effective deposition parameters such as pH, temperature, and so on. The pH value of the reaction solution has a direct impact on both the nucleation and the growth rate of the film. Consequently, this study presents a novel investigation into the effect of a precise change in the pH value of the reaction solution on the structural, morphological, and photoresponse characteristics of tin monosulphide (SnS) films. The films were grown on a flexible polyester substrate at pH values of 7.1, 7.4, and 7.7. The X-ray diffraction patterns of the films grown at pH 7.1 and 7.4 confirmed
Modern communication and media technology has pioneered new horizons and carried out deep changes in the various fields of social life; it has also affected human communication enormously.
Anyone who follows the developments that have affected social relations due to the new media, especially Facebook, will certainly notice the far-reaching changes in the network of social relations, which has been affected, in one way or another, by this accelerated development and the appearance of the so-called virtual society.
Facebook has embodied this means of communication, which has become an important turning point in social communication.
This is the point the present paper tries to expose and discuss through a field study carried out on a sam
A security system can be defined as a method of providing a form of protection to any type of data. A sequential process must be performed in most security systems in order to achieve good protection. Authentication can be defined as a part of such sequential processes; it is utilized to verify the user's permission to access and utilize the system. Several kinds of methods are utilized, including knowledge-based methods and biometric features. The electroencephalograph (EEG) signal is one of the most widely used signals in the bioinformatics field. EEG has five major wave patterns: Delta, Theta, Alpha, Beta, and Gamma. Every wave has five features: amplitude, wavelength, period, speed, and frequency. The linear
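The five EEG wave patterns named above are conventionally separated by frequency. The sketch below maps a dominant frequency to its band name; the band boundaries are the commonly cited approximate values (exact cut-offs vary between sources), and the function name is illustrative, not from the paper:

```python
# Approximate EEG frequency bands in Hz; boundaries vary slightly by source.
EEG_BANDS = [
    ("Delta", 0.5, 4.0),
    ("Theta", 4.0, 8.0),
    ("Alpha", 8.0, 13.0),
    ("Beta", 13.0, 30.0),
    ("Gamma", 30.0, 100.0),
]

def classify_band(frequency_hz):
    """Map a dominant frequency to its conventional EEG band name."""
    for name, low, high in EEG_BANDS:
        if low <= frequency_hz < high:
            return name
    return "Out of range"

print(classify_band(10.0))   # Alpha
print(classify_band(2.0))    # Delta
```

Frequency and period are directly related (period = 1/frequency), so either feature determines the band.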
A frequently used approach for denoising is the shrinkage of the coefficients of the noisy signal's representation in a transform domain. This paper proposes an algorithm based on a hybrid transform (a stationary wavelet transform followed by a slantlet transform): the slantlet transform is applied to the approximation subband of the stationary wavelet transform. The BlockShrink thresholding technique is then applied to the hybrid transform coefficients. This technique can decide the optimal block size and threshold for every wavelet subband by minimizing Stein's unbiased risk estimate (SURE). The proposed algorithm was executed using MATLAB R2010a with natural images contaminated by white Gaussian noise. Numerical results show that our algorithm co
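The key idea of block thresholding is that coefficients are shrunk in groups, based on each block's total energy, rather than one at a time. A minimal numpy sketch of block-wise (James-Stein-style) soft shrinkage follows; the block size and threshold are fixed here for illustration, whereas BlockShrink selects both per subband by minimizing SURE:

```python
import numpy as np

def block_soft_threshold(coeffs, block_size, threshold):
    """Shrink each block of coefficients by a factor based on the
    block's total energy, rather than thresholding each coefficient
    individually. Low-energy (noise-dominated) blocks are zeroed;
    high-energy (signal-dominated) blocks are kept, slightly shrunk."""
    out = np.zeros_like(coeffs, dtype=float)
    for start in range(0, len(coeffs), block_size):
        block = coeffs[start:start + block_size]
        energy = np.sum(block ** 2)
        factor = 0.0
        if energy > 0:
            factor = max(0.0, 1.0 - threshold ** 2 * len(block) / energy)
        out[start:start + block_size] = factor * block
    return out

# Hypothetical 1D coefficient vector: two noise blocks, one signal block.
noisy = np.array([0.1, -0.2, 5.0, 4.0, 0.05, 0.1])
denoised = block_soft_threshold(noisy, block_size=2, threshold=1.0)
print(denoised)  # noise blocks zeroed, signal block kept (slightly shrunk)
```

In the paper's setting the same shrinkage would be applied to 2D subband coefficients of the hybrid transform; this 1D version only illustrates the mechanism.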
Image recognition is one of the most important applications of information processing. In this paper, a comparison between 3-level techniques for image recognition has been carried out using the discrete wavelet transform (DWT) and the stationary wavelet transform (SWT): stationary-stationary-stationary (sss), stationary-stationary-wavelet (ssw), stationary-wavelet-stationary (sws), stationary-wavelet-wavelet (sww), wavelet-stationary-stationary (wss), wavelet-stationary-wavelet (wsw), wavelet-wavelet-stationary (wws), and wavelet-wavelet-wavelet (www). These techniques have been compared according to the peak signal-to-noise ratio (PSNR), root mean square error (RMSE), compression ratio (CR), and the coding noise e(n) of each third
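Two of the comparison metrics named above, RMSE and PSNR, have standard definitions that can be sketched directly (a minimal illustration assuming 8-bit images with peak value 255; the arrays here are hypothetical, not the paper's data):

```python
import numpy as np

def rmse(original, processed):
    """Root mean square error between two images."""
    diff = original.astype(float) - processed.astype(float)
    return np.sqrt(np.mean(diff ** 2))

def psnr(original, processed, peak=255.0):
    """Peak signal-to-noise ratio in dB: 20*log10(peak / RMSE)."""
    err = rmse(original, processed)
    if err == 0:
        return float("inf")
    return 20.0 * np.log10(peak / err)

# Hypothetical 4x4 8-bit image pair differing in a single pixel.
a = np.full((4, 4), 100, dtype=np.uint8)
b = a.copy()
b[0, 0] = 110                 # one pixel differs by 10

print(round(rmse(a, b), 2))   # 2.5
```

Higher PSNR (equivalently, lower RMSE) indicates that the reconstructed image is closer to the original.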
Advances in digital technology and the World Wide Web have led to an increase in digital documents used for various purposes such as publishing and digital libraries. This phenomenon raises awareness of the requirement for effective techniques that can help during the search and retrieval of text. One of the most needed tasks is clustering, which categorizes documents automatically into meaningful groups. Clustering is an important task in data mining and machine learning, and its accuracy depends tightly on the selection of the text representation method. Traditional methods of text representation model documents as bags of words using term frequency-inverse document frequency (TFIDF) weighting. This method ignores the relationship an
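The bag-of-words TFIDF weighting described above can be sketched in a few lines of Python (a minimal illustration of the classic TF x IDF formula, not the paper's implementation; the toy documents are hypothetical):

```python
import math
from collections import Counter

def tfidf(docs):
    """Compute TFIDF weights for a list of tokenized documents.

    TF = term count / document length; IDF = log(N / document frequency).
    Each document becomes an independent bag of weighted terms, which is
    exactly why this representation ignores relationships between words.
    """
    n = len(docs)
    df = Counter(term for doc in docs for term in set(doc))
    weights = []
    for doc in docs:
        tf = Counter(doc)
        total = len(doc)
        weights.append({t: (c / total) * math.log(n / df[t])
                        for t, c in tf.items()})
    return weights

docs = [["data", "mining", "clustering"],
        ["data", "clustering", "clustering"],
        ["machine", "learning"]]
w = tfidf(docs)
print(w[0]["mining"] > w[0]["data"])  # True: rarer terms score higher
```

Terms appearing in many documents get a small IDF and thus a low weight, while rare, discriminative terms are emphasized, which is what makes TFIDF useful for clustering despite its word-independence assumption.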