Radiologists generally analyse Magnetic Resonance Imaging (MRI) by visual inspection to detect and identify tumours or abnormal tissue in brain MR images. The huge number of such MR images makes this visual interpretation process not only laborious and expensive but often erroneous. Furthermore, the sensitivity of the human eye and brain in elucidating such images decreases as the number of cases grows, especially when only some slices contain information about the affected area. Therefore, an automated system for the analysis and classification of MR images is essential. In this paper, we propose a new method for abnormality detection from T1-weighted MRI of human head scans using three planes: the axial, coronal, and sagittal planes. Three thresholds, based on the texture features mean, energy, and entropy, are obtained automatically. These allow an MRI slice to be accurately separated into normal and abnormal regions. However, the initial abnormality detection assigns some normal blocks wrongly as abnormal, and vice versa. This problem is surmounted by applying a fine-tuning mechanism. Finally, MRI slice abnormality detection is achieved by selecting the abnormal slices along with their tumour regions (Regions of Interest, ROIs).
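The thresholding step described above can be illustrated with a minimal sketch: the three texture features (mean, energy, entropy) are computed per block from the grey-level histogram, and a block is flagged abnormal when the features cross their thresholds. The threshold values and the comparison directions below are hypothetical choices for illustration, not the paper's.

```python
import numpy as np

def block_features(block):
    """Texture features of one grey-level image block: mean, energy, entropy."""
    hist, _ = np.histogram(block, bins=256, range=(0, 256))
    p = hist / hist.sum()          # normalised grey-level probabilities
    p = p[p > 0]                   # drop empty bins before the log
    mean = block.mean()
    energy = np.sum(p ** 2)        # 1.0 for a perfectly uniform block
    entropy = -np.sum(p * np.log2(p))
    return mean, energy, entropy

def classify_block(block, t_mean, t_energy, t_entropy):
    """Illustrative rule: flag abnormal when the block is bright (high mean),
    heterogeneous (low energy) and irregular (high entropy)."""
    m, e, h = block_features(block)
    return m > t_mean and e < t_energy and h > t_entropy
```

A homogeneous block has energy 1 and entropy 0, so it falls on the "normal" side of any such rule.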
Bayesian models are commonly used in recent research across many scientific fields. This research presents a new Bayesian model for estimating parameters and forecasting using the Gibbs sampler algorithm. Posterior distributions are generated using the inverse gamma distribution and the multivariate normal distribution as prior distributions. The new method was used to investigate and summarise the posterior distribution in Bayesian statistics. The theory and derivation of the posterior distribution are explained in detail in this paper. The proposed approach is applied to three simulated datasets of 100, 300, and 500 sample sizes. The procedure was also extended to a real dataset, the rock intensity dataset. The actual dataset is collected
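The conjugate structure described (a multivariate normal prior on the coefficients, an inverse gamma prior on the variance) yields closed-form full conditionals, so a Gibbs sampler simply alternates two draws. A minimal sketch for a linear model follows; the hyperparameters `a0`, `b0`, `tau2` and the model form are illustrative assumptions, not the paper's specification.

```python
import numpy as np

def gibbs_linreg(X, y, n_iter=2000, a0=2.0, b0=2.0, tau2=100.0, seed=0):
    """Gibbs sampler for y = X @ beta + eps, eps ~ N(0, sigma2 * I),
    with priors beta ~ N(0, tau2 * I) and sigma2 ~ InvGamma(a0, b0)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    sigma2 = 1.0
    XtX, Xty = X.T @ X, X.T @ y
    draws = []
    for _ in range(n_iter):
        # beta | sigma2, y  ~  N(mu_n, V_n)   (conjugate normal update)
        V_n = np.linalg.inv(XtX / sigma2 + np.eye(p) / tau2)
        mu_n = V_n @ (Xty / sigma2)
        beta = rng.multivariate_normal(mu_n, V_n)
        # sigma2 | beta, y  ~  InvGamma(a0 + n/2, b0 + RSS/2)
        resid = y - X @ beta
        sigma2 = 1.0 / rng.gamma(a0 + n / 2, 1.0 / (b0 + resid @ resid / 2))
        draws.append((beta.copy(), sigma2))
    return draws
```

After discarding a burn-in prefix, the retained draws approximate the joint posterior of (beta, sigma2).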
A load-shedding controller suitable for small- to medium-size loads is designed and implemented based on preprogrammed priorities and the power consumption of individual loads. The main controller decides whether a particular load can be switched ON according to the amount of available power generation, the load consumption, and the load priorities. When the maximum allowed power consumption is reached and the user wants to deliver power to an additional load, the controller will deny that load power if its priority is low. Otherwise, it can be granted power if its priority is high, and in this case lower-priority loads are automatically switched OFF so as not to overload the power generation.
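The admission logic described can be sketched as follows. The data layout (`{name: (priority, watts)}`) and the convention that a larger number means higher priority are assumptions for illustration, not the controller's actual implementation.

```python
def request_power(loads, on, capacity, new_load):
    """Decide whether new_load may switch ON, shedding lower-priority
    loads if needed.  loads: {name: (priority, watts)};
    on: set of names currently ON; capacity: available generation in watts."""
    used = sum(loads[n][1] for n in on)
    pr, w = loads[new_load]
    if used + w <= capacity:
        on.add(new_load)               # enough headroom: grant directly
        return True
    # Over capacity: try shedding strictly lower-priority loads, lowest first.
    candidates = sorted((n for n in on if loads[n][0] < pr),
                        key=lambda n: loads[n][0])
    freed, to_shed = 0, []
    for n in candidates:
        to_shed.append(n)
        freed += loads[n][1]
        if used - freed + w <= capacity:
            for s in to_shed:          # shedding these makes room
                on.discard(s)
            on.add(new_load)
            return True
    return False                       # denied: priority too low to make room
```

A high-priority request displaces cheaper loads; a low-priority request over the limit is simply refused.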
This article aims to provide a bibliometric analysis of intellectual capital research published in the Scopus database from 1956 to 2020 to trace the development of scientific activities that can pave the way for future studies by shedding light on gaps in the field. The analysis focuses on 638 intellectual capital-related papers published in the Scopus database over 60 years, drawing upon a bibliometric analysis using VOSviewer. This paper highlights the mainstream of current research in the intellectual capital field, based on the Scopus database, by presenting a detailed bibliometric analysis of the trend and development of intellectual capital research over the past six decades, including journals, authors, countries, and institutions
Porous materials play an important role in creating a sustainable environment by improving the efficacy of wastewater treatment. Porous materials, including adsorbents or ion exchangers, catalysts, metal–organic frameworks, composites, carbon materials, and membranes, have widespread applications in treating wastewater and air pollution. This review examines recent developments in porous materials, focusing on their effectiveness against different wastewater pollutants. Specifically, they can treat a wide range of water contaminants, and many remove over 95% of targeted contaminants. Recent advancements include a wider range of adsorption options, heterogeneous catalysis, and a new UV/H2O2
This research deals with a very important subject, as it attempts to revise the theoretical and scientific heritage and some professional rules adopted in the newsroom. Most media students have difficulties in writing press news correctly. The researcher tries to identify the compatibility of what is published by local news agencies with professional and academic standards.
The research establishes detailed editorial rules for a number of news formats, which will play an important role in writing press news easily, especially for beginners and newcomers. It also uncovers a new fact refuting the belief of some researchers and writers that news edited according to the inverted-pyramid pattern has no conclusion.
The traditional centralized network-management approach presents severe efficiency and scalability limitations in large-scale networks. The process of data collection and analysis typically involves huge transfers of management data to the manager, which consume considerable network throughput and cause bottlenecks at the manager side. All these problems are addressed using agent technology as a solution that distributes the management functionality over the network elements. The proposed system consists of a server agent that works together with client agents to monitor the logging on and off of the client computers and which user is working on each. A file-system-watcher mechanism is used to indicate any change in files. The results were presented
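The file-change monitoring can be approximated with a simple snapshot-and-diff poller. The system described most likely uses an event-driven watcher (such as .NET's FileSystemWatcher), so this stdlib-only sketch is a hypothetical stand-in, not the paper's mechanism.

```python
import os

def snapshot(root):
    """Map each file under root to its last-modified time."""
    state = {}
    for dirpath, _, files in os.walk(root):
        for f in files:
            path = os.path.join(dirpath, f)
            try:
                state[path] = os.stat(path).st_mtime
            except OSError:
                pass  # file vanished between walk and stat
    return state

def diff(old, new):
    """Report created, deleted and modified files between two snapshots."""
    created = sorted(set(new) - set(old))
    deleted = sorted(set(old) - set(new))
    modified = sorted(p for p in old.keys() & new.keys() if old[p] != new[p])
    return created, deleted, modified
```

Polling trades latency for portability; an agent would take snapshots periodically and forward only the diffs, keeping management traffic small.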
The present research aimed to build a test of the imagination of children. The sample consisted of (400) children, selected randomly from four directorates (first Resafe, second Resafe, first Alkarkh, second Alkarkh). To achieve the objective of the research, the two researchers built a test of imagination and extracted its face validity, item-discrimination indices, and item-difficulty coefficients; the final test consists of (32) items. The statistical methods used were (Pearson correlation coefficient, item-difficulty coefficient, item discrimination, the correlation equation, and the standard-error equation). The two researchers offered a number of recommendations and proposals.
The basic solution to the difficult issues raised by the huge size of digital images is to recruit image compression techniques to reduce image size for efficient storage and fast transmission. In this paper, a new pixel-based scheme is proposed for grayscale image compression that implicitly utilizes hybrid techniques: a spatial-modelling-based technique of minimum residual together with the transform-based technique of the Discrete Wavelet Transform (DWT), and that also mixes lossless and lossy techniques to ensure high performance in terms of compression ratio and quality. The proposed technique has been applied to a set of standard test images, and the results obtained are significantly encouraging compared with the Joint Photographic Experts Group (JPEG) standard
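The transform stage can be illustrated with one level of the Haar wavelet, the simplest DWT: averages (low-pass) and differences (high-pass) are taken along rows, then columns, splitting the image into LL, LH, HL, HH bands. The paper's actual wavelet, its minimum-residual spatial predictor, and the hybrid lossless/lossy pipeline are not reproduced here; this is only a sketch of the DWT building block.

```python
import numpy as np

def haar2d(img):
    """One level of the 2-D Haar wavelet transform (rows, then columns)."""
    a = img.astype(float)
    lo = (a[:, 0::2] + a[:, 1::2]) / 2      # row averages  (low-pass)
    hi = (a[:, 0::2] - a[:, 1::2]) / 2      # row differences (high-pass)
    row = np.hstack([lo, hi])
    lo = (row[0::2, :] + row[1::2, :]) / 2  # column averages
    hi = (row[0::2, :] - row[1::2, :]) / 2  # column differences
    return np.vstack([lo, hi])              # [[LL LH], [HL HH]] layout

def ihaar2d(coef):
    """Inverse of haar2d: undo the column step, then the row step."""
    h, w = coef.shape
    lo, hi = coef[:h // 2, :], coef[h // 2:, :]
    row = np.empty((h, w))
    row[0::2, :] = lo + hi
    row[1::2, :] = lo - hi
    lo, hi = row[:, :w // 2], row[:, w // 2:]
    img = np.empty((h, w))
    img[:, 0::2] = lo + hi
    img[:, 1::2] = lo - hi
    return img
```

The transform is exactly invertible, which supports the lossless side of a hybrid scheme; quantizing the high-pass bands before entropy coding gives the lossy side.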
Objectives: To tailor polyethylene–hydroxyapatite composites to function as a new intracanal post for the restoration of endodontically treated teeth (ETT). Methods: Silanated hydroxyapatite (HA) and zirconium dioxide (ZrO2) filled low-density polyethylene (LDPE) composites were fabricated by a melt-extrusion process and characterised using infrared spectroscopy (FTIR), differential scanning calorimetry (DSC), and dynamic mechanical analysis (DMA). The flexural strength and modulus were determined in the dry state and after ageing in simulated body fluid, and fractured surfaces were analysed by SEM. The water uptake and radiographic appearance of the experimental composites were also measured and compared with a commercially available endodontic fibre post