Generally, radiologists analyse Magnetic Resonance Imaging (MRI) scans by visual inspection to detect and identify tumours or abnormal tissue in brain MR images. The huge number of such MR images makes this visual interpretation process not only laborious and expensive but often erroneous. Furthermore, the sensitivity of the human eye and brain in elucidating such images decreases as the number of cases grows, especially when only some slices contain information about the affected area. Therefore, an automated system for the analysis and classification of MR images is essential. In this paper, we propose a new method for abnormality detection from T1-weighted MRI of human head scans using three planes: the axial, coronal, and sagittal planes. Three different thresholds, based on the texture features mean, energy and entropy, are obtained automatically. These thresholds allow each MRI slice to be accurately separated into normal and abnormal classes. However, the initial abnormality detection wrongly assigns some normal blocks as abnormal and vice versa; this problem is overcome by applying a fine-tuning mechanism. Finally, slice-level abnormality detection is achieved by selecting the abnormal slices together with their tumour regions (Regions of Interest, ROI).
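The abstract does not give implementation details; as a rough illustration of how block-wise texture features such as mean, energy, and entropy might be computed and thresholded, consider the following sketch (the block size, threshold values, and function names are illustrative assumptions, not the authors' code):

```python
import numpy as np

def block_texture_features(block, bins=64):
    """Mean, energy, and entropy of one image block (block assumed normalised to [0, 1])."""
    hist, _ = np.histogram(block, bins=bins, range=(0.0, 1.0))
    p = hist / max(hist.sum(), 1)                    # grey-level probabilities
    mean = float(block.mean())
    energy = float(np.sum(p ** 2))                   # angular second moment
    entropy = float(-np.sum(p[p > 0] * np.log2(p[p > 0])))
    return mean, energy, entropy

def flag_abnormal_blocks(slice_img, block=16, t_mean=0.5, t_energy=0.2, t_entropy=4.0):
    """Label each block abnormal when its features cross the (assumed) thresholds."""
    flags = []
    for r in range(0, slice_img.shape[0] - block + 1, block):
        for c in range(0, slice_img.shape[1] - block + 1, block):
            m, e, h = block_texture_features(slice_img[r:r + block, c:c + block])
            flags.append(((r, c), m > t_mean and e < t_energy and h > t_entropy))
    return flags
```

Slice-level classification would then follow from aggregating the block-level flags, with the fine-tuning step reassigning isolated misclassified blocks.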
The aerodynamic characteristics of general three-dimensional rectangular wings are considered using a non-linear interaction between a two-dimensional viscous-inviscid panel method and a vortex ring method. The potential flow about a two-dimensional airfoil, computed with the pioneering Hess & Smith method, was coupled with viscous laminar, transitional and turbulent boundary-layer models to solve the flow about complex airfoil configurations, including the stalling effect. The Viterna method was used to extend the aerodynamic characteristics of the specified airfoil to high angles of attack. A modified vortex ring method was used to find the circulation values along the spanwise direction of the wing, which were then coupled with the sectional circulation obtained from the Kutta-Joukowsky theorem.
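For reference (a standard statement, not the paper's modified formulation), the Kutta-Joukowsky theorem relates the sectional lift per unit span to the local spanwise circulation:

```latex
l(y) = \rho_{\infty}\, V_{\infty}\, \Gamma(y),
\qquad
\Gamma(y) = \tfrac{1}{2}\, V_{\infty}\, c(y)\, c_{l}(y),
```

where l(y) is the lift per unit span, Γ(y) the local circulation, c(y) the local chord, and c_l(y) the sectional lift coefficient obtained from the two-dimensional viscous-inviscid solution.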
Numeral recognition is considered an essential preliminary step for optical character recognition, document understanding, and other applications. Although several handwritten numeral recognition algorithms have been proposed so far, achieving adequate recognition accuracy and execution time remains challenging to date. In particular, recognition accuracy depends on the feature extraction mechanism. As such, a fast and robust numeral recognition method is essential, one that meets the desired accuracy by extracting the features efficiently while maintaining a fast implementation time. Furthermore, to date most of the existing studies have focused on evaluating their methods in clean environments, thus limiting understanding of their potential
Carbon-fiber-reinforced polymer (CFRP) is widely acknowledged as a leading advanced structural material, offering superior properties compared to traditional materials, and has found diverse applications in several industrial sectors, such as the automobile, aircraft, and power-plant industries. However, the production of CFRP composites is prone to fabrication problems, leading to structural defects arising from cycling and aging processes. Identifying these defects at an early stage is crucial to prevent service issues that could result in catastrophic failures. Hence, routine inspection and maintenance are essential to prevent system collapse. To achieve this objective, conventional nondestructive testing (NDT) methods are utilized to
In many scientific fields, Bayesian models are commonly used in recent research. This research presents a new Bayesian model for estimating parameters and forecasting using the Gibbs sampler algorithm. Posterior distributions are generated using the inverse gamma distribution and the multivariate normal distribution as prior distributions. The new method was used to investigate and summarize the resulting Bayesian posterior distribution. The theory and derivation of the posterior distribution are explained in detail in this paper. The proposed approach is applied to three simulated datasets with sample sizes of 100, 300, and 500. Also, the procedure was extended to a real dataset, the rock intensity dataset. The actual dataset is collected
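The sampler itself is not included in this excerpt; a minimal sketch of a Gibbs sampler for a Bayesian linear model with a multivariate normal prior on the coefficients and an inverse gamma prior on the variance (standard conjugate choices, assumed here rather than taken from the paper) might look like:

```python
import numpy as np

def gibbs_linear_model(X, y, n_iter=2000, a0=2.0, b0=1.0, tau2=100.0, seed=0):
    """Gibbs sampler: beta | sigma2 ~ MVN, sigma2 | beta ~ Inverse-Gamma (illustrative)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta, sigma2 = np.zeros(p), 1.0
    XtX, Xty = X.T @ X, X.T @ y
    draws = []
    for _ in range(n_iter):
        # Conditional posterior of beta: multivariate normal (prior N(0, tau2 * I))
        prec = XtX / sigma2 + np.eye(p) / tau2
        cov = np.linalg.inv(prec)
        mean = cov @ (Xty / sigma2)
        beta = rng.multivariate_normal(mean, cov)
        # Conditional posterior of sigma2: inverse gamma, sampled via 1 / Gamma
        resid = y - X @ beta
        a_n = a0 + n / 2.0
        b_n = b0 + 0.5 * float(resid @ resid)
        sigma2 = 1.0 / rng.gamma(a_n, 1.0 / b_n)
        draws.append((beta.copy(), sigma2))
    return draws
```

Posterior summaries (means, credible intervals) and forecasts would then be computed from the retained draws after discarding a burn-in period.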
The basic solution for overcoming difficult issues related to the huge size of digital images is to employ image compression techniques that reduce image size for efficient storage and fast transmission. In this paper, a new pixel-based scheme is proposed for grayscale image compression that implicitly utilizes a hybrid of a minimum-residual spatial modelling technique and the transform-based Discrete Wavelet Transform (DWT), and that also mixes lossless and lossy techniques to ensure high performance in terms of compression ratio and quality. The proposed technique has been applied to a set of standard test images, and the results obtained are significantly encouraging compared with the Joint Photographic Experts Group (JPEG).
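The hybrid scheme itself is not spelled out in this excerpt; as a rough illustration of the lossy transform stage only, a single-level 2-D DWT with coefficient thresholding (using the PyWavelets package; the wavelet choice and keep ratio are assumptions) could look like:

```python
import numpy as np
import pywt

def dwt_compress_reconstruct(image, wavelet="haar", keep_ratio=0.1):
    """Single-level 2-D DWT, keep only the largest detail coefficients, then invert.

    Illustrates the lossy transform stage only; the paper's pixel-based
    minimum-residual modelling and entropy coding are not reproduced here.
    """
    cA, (cH, cV, cD) = pywt.dwt2(image.astype(float), wavelet)
    details = np.concatenate([c.ravel() for c in (cH, cV, cD)])
    thresh = np.quantile(np.abs(details), 1.0 - keep_ratio)   # keep the top fraction
    cH, cV, cD = (np.where(np.abs(c) >= thresh, c, 0.0) for c in (cH, cV, cD))
    return pywt.idwt2((cA, (cH, cV, cD)), wavelet)
```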
The traditional centralized network management approach presents severe efficiency and scalability limitations in large-scale networks. The process of data collection and analysis typically involves huge transfers of management data to the manager, which consume considerable network throughput and cause bottlenecks at the manager side. All these problems are addressed using agent technology as a solution that distributes the management functionality over the network elements. The proposed system consists of a server agent that works together with client agents to monitor the logging on and off of the client computers and which user is working on each of them. A file system watcher mechanism is used to indicate any change in files. The results were presented
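The excerpt names a file system watcher mechanism but not its implementation; a minimal, assumed sketch of the idea in Python, polling directory modification times with the standard library rather than any specific framework's watcher class, is:

```python
import os
import time

def watch_directory(path, interval=2.0):
    """Poll a directory and report created, modified, or deleted files (illustrative)."""
    snapshot = {f: os.path.getmtime(os.path.join(path, f)) for f in os.listdir(path)}
    while True:                                  # runs until interrupted
        time.sleep(interval)
        current = {f: os.path.getmtime(os.path.join(path, f)) for f in os.listdir(path)}
        for name in current.keys() - snapshot.keys():
            print("created:", name)
        for name in snapshot.keys() - current.keys():
            print("deleted:", name)
        for name in current.keys() & snapshot.keys():
            if current[name] != snapshot[name]:
                print("modified:", name)
        snapshot = current
```

In the described system, a client agent would forward such change events to the server agent instead of printing them.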
... Show MoreThe present research aimed to test the imagination of children, and may build sample consisted of (400) a baby and child, selected by random way of four Directorates (first Resafe, second Resafe ,first alkarkh , second alkarkh), in order to achieve the objective of research the tow researchers have a test of imagination and extract the virtual and honesty plants distinguish paragraphs and paragraphs and difficulty factor became the test consists of (32), statistical methods were used (Pearson correlation coefficient, coefficient of difficult passages, highlight paragraphs, correlation equation, an equation wrong Standard) the tow researchers have a number of recommendations and proposals.
This research deals with a very important subject, as it tries to change the theoretical and scientific heritage and some of the professional rules adopted in the newsroom. Most media students have difficulty writing press news correctly. The researcher tries to identify how far what is published by local news agencies complies with professional and academic standards.
The research sets out detailed editorial rules for a number of news formats, which will play an important role in making press news writing easier, especially for beginners and newcomers. It also uncovers a new fact that contradicts the belief of some researchers and writers that news edited according to the inverted pyramid pattern contains no conclusion.