Over the years, the field of medical imaging has gained considerable importance, and the number of neuroimaging studies conducted using functional magnetic resonance imaging (fMRI) has grown rapidly. An fMRI study gives rise to large amounts of noisy data with a complex spatiotemporal correlation structure, and statistics plays a central role in clarifying the features of these data and producing results that neuroscientists can use and interpret. Several types of artifacts can arise in an fMRI scanner because of software or hardware problems, physical limitations, or human physiological phenomena. Some of these artifacts degrade diagnostic image quality and can be confused with various pathologies. An artifact is a feature that appears in an image but is not present in the actual object; it is therefore necessary to recognize artifacts based on a basic understanding of their origin, especially those that mimic pathology, because they can lead to an incorrect medical diagnosis and, as a result, to serious harm to the patient's health. In this paper we discuss the analysis of fMRI data and highlight important problems in which statistics already plays a major role. We include a sequence of programs for processing, analysing, and presenting fMRI data. Of special interest to statisticians may be the use of functions from statistical software packages. The conventional tools are FSL and SPM; the most widely used is SPM (Statistical Parametric Mapping), a collection of MATLAB functions for pre-processing, analysing, and displaying fMRI data.
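The statistical core of an SPM-style analysis is a general linear model fitted independently at every voxel, followed by a contrast t-statistic. The following is a minimal numpy sketch of that idea with fully synthetic data; the regressor, dimensions, and effect size are illustrative assumptions, and this is not SPM's actual interface.

```python
import numpy as np

# Hedged sketch of the voxelwise GLM behind statistical parametric mapping.
# All data and design choices here are synthetic illustrations.
rng = np.random.default_rng(0)

n_scans, n_voxels = 100, 500
# Hypothetical boxcar task regressor: 10 scans on, 10 scans off, repeated.
task = np.tile(np.r_[np.ones(10), np.zeros(10)], 5)
X = np.column_stack([task, np.ones(n_scans)])   # design matrix [task, intercept]

# Synthetic data: noise everywhere, a task effect in the first 50 voxels.
Y = rng.standard_normal((n_scans, n_voxels))
Y[:, :50] += 2.0 * task[:, None]

# Ordinary least squares fit for every voxel at once.
beta, res, *_ = np.linalg.lstsq(X, Y, rcond=None)
dof = n_scans - X.shape[1]
sigma2 = res / dof                              # residual variance per voxel

# t-statistic for the task regressor (contrast c = [1, 0]).
c = np.array([1.0, 0.0])
var_c = c @ np.linalg.inv(X.T @ X) @ c
t = (c @ beta) / np.sqrt(sigma2 * var_c)
print(t[:50].mean() > t[50:].mean())            # task voxels score higher
```

In SPM the same computation is wrapped in MATLAB routines and followed by random-field corrections for the massive multiple-comparison problem, which this sketch omits.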
Longitudinal data are becoming increasingly common, especially in the medical and economic fields, and a variety of methods have been developed to analyze this type of data.
In this research, the focus was on grouping and analyzing such data: cluster analysis plays an important role in identifying and grouping co-expressed subprofiles over time, which are then modeled with a nonparametric smoothing cubic B-spline. The cubic B-spline provides continuous first and second derivatives, resulting in a smoother curve with fewer abrupt changes in slope; it is also more flexible and can capture more complex patterns and fluctuations in the data.
The balanced longitudinal data profiles were compiled into subgroups.
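The smoothing step described above can be sketched with scipy's B-spline routines: a cubic spline (k=3) fitted to one noisy longitudinal profile, with the smoothing parameter s chosen near the expected residual sum of squares. The signal, noise level, and s value below are illustrative assumptions, not values from the study.

```python
import numpy as np
from scipy.interpolate import splev, splrep

# Hedged sketch: smooth one noisy longitudinal profile with a cubic B-spline.
# Cubic splines have continuous first and second derivatives, which is what
# gives the smoother curve with fewer abrupt slope changes.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 60)                    # observation times
y = np.sin(2 * np.pi * t) + 0.2 * rng.standard_normal(t.size)

tck = splrep(t, y, k=3, s=t.size * 0.04)         # s ~ n * noise variance
y_hat = splev(t, tck)                            # fitted smooth curve
curvature = splev(t, tck, der=2)                 # continuous 2nd derivative

# The smoothed curve should track the true signal better than the raw data.
print(np.mean((y_hat - np.sin(2 * np.pi * t)) ** 2)
      < np.mean((y - np.sin(2 * np.pi * t)) ** 2))
```

In a clustering workflow, each subgroup's mean profile (or each individual profile) would be smoothed this way before or after grouping, depending on the chosen pipeline.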
Two-dimensional meso-scale concrete modeling was used in the finite element analysis of a plain concrete beam subjected to bending. Plane-stress 4-noded quadrilateral elements were used to model the coarse aggregate and cement mortar. The effects of the aggregate fraction distribution, and of the pore percentage of the total area (resulting from air voids entrapped in the concrete during placement), on the flexural behavior of the plain concrete beam were examined. Aggregate size fractions were randomly distributed across the profile area of the beam. The Extended Finite Element Method (XFEM) was employed to treat the discontinuity problems resulting from the two phases of concrete and from the cracking encountered during the finite element analysis of the beam. Crac
The current research presents a comparative analysis of the estimation of Meixner process parameters via the wavelet packet transform. Of particular relevance, it compares the moment method and the wavelet packet estimator for the four parameters of the Meixner process. The research focuses on finding the best threshold value using the square-root-log and modified square-root-log methods with wavelet packets in the presence of noise, to enhance the efficiency and effectiveness of denoising the financial asset market signal. In this regard, a simulation study compares the performance of moment estimation and wavelet packets for different sample sizes. The results show that wavelet p
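The square-root-log rule mentioned above is the universal threshold t = sigma * sqrt(2 log n) applied by soft thresholding to wavelet (or wavelet packet) coefficients. The following numpy-only sketch illustrates the rule on a single level of Haar detail coefficients with synthetic data; a full wavelet-packet denoiser (e.g. with PyWavelets) would apply the same thresholding at every packet node.

```python
import numpy as np

# Hedged illustration of the "square root log" (universal) threshold with
# soft thresholding, applied to one level of Haar detail coefficients.
rng = np.random.default_rng(2)

n = 512
clean = np.sin(np.linspace(0, 4 * np.pi, n))      # synthetic smooth signal
noisy = clean + 0.3 * rng.standard_normal(n)

# One-level Haar transform: approximation and detail coefficients.
a = (noisy[0::2] + noisy[1::2]) / np.sqrt(2)
d = (noisy[0::2] - noisy[1::2]) / np.sqrt(2)

sigma = np.median(np.abs(d)) / 0.6745             # robust noise estimate
t = sigma * np.sqrt(2 * np.log(n))                # universal threshold
d_soft = np.sign(d) * np.maximum(np.abs(d) - t, 0.0)

# Inverse Haar transform from the thresholded coefficients.
denoised = np.empty(n)
denoised[0::2] = (a + d_soft) / np.sqrt(2)
denoised[1::2] = (a - d_soft) / np.sqrt(2)

print(np.mean((denoised - clean) ** 2) < np.mean((noisy - clean) ** 2))
```

The modified square-root-log variants adjust this threshold; the soft-thresholding and reconstruction steps stay the same.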
A modified version of the generalized standard addition method (GSAM) was developed and used for the quantitative determination of arginine (Arg) and glycine (Gly) in the arginine acetylsalicylate-glycine complex. According to this method, two linear equations are solved to obtain the amounts of Arg and Gly. The first equation was obtained by spectrophotometric measurement of the total absorbance of the colored complexes of Arg and Gly with ninhydrin. The second equation was obtained by measuring the total acid consumed by the amino groups of Arg and Gly; the titration was carried out in non-aqueous media using perchloric acid in glacial acetic acid as the titrant. The developed metho
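The two-equation step described above amounts to solving a 2x2 linear system: one equation from the additive total absorbance, one from the total acid consumed. The sketch below uses made-up illustrative coefficients (not measured constants for Arg or Gly) and assumes, purely for illustration, a 1:1 acid consumption per amino group.

```python
import numpy as np

# Hedged sketch of solving the two GSAM equations for two analytes.
# All numbers are hypothetical, chosen so the known answer is recoverable.
eps_arg, eps_gly = 0.80, 0.55      # hypothetical molar absorptivities
A_total = 0.80 * 2.0 + 0.55 * 3.0  # simulated total absorbance
n_acid = 2.0 + 3.0                 # simulated acid consumed (1:1 per amine, assumed)

# Solve:  eps_arg * c_arg + eps_gly * c_gly = A_total
#         1.0     * c_arg + 1.0     * c_gly = n_acid
M = np.array([[eps_arg, eps_gly],
              [1.0, 1.0]])
b = np.array([A_total, n_acid])
c_arg, c_gly = np.linalg.solve(M, b)
print(round(c_arg, 6), round(c_gly, 6))   # recovers 2.0 and 3.0
```

The system is solvable whenever the two measurements respond to the two analytes with linearly independent coefficient vectors, which is the condition GSAM relies on.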
Background: Oocytes are susceptible to alterations in the various fatty acid contents of follicular fluid (FF), which may influence maturation and embryogenesis. Different fatty acids exert various effects on intracytoplasmic sperm injection (ICSI) outcomes, and further studies are needed to uncover the mechanisms involved. Objectives: To assess FF fatty acids in women undergoing ICSI and to correlate them with ICSI parameters, namely the total count of aspirated oocytes, oocyte maturation rate, fertilization rate, and percentage of good-quality embryos. Methods: Fifty women undergoing ICSI were enrolled in this cross-sectional study. FF samples were collected during oocyte retrieval and analyzed for fatty acids using gas chromatography. Fa
The estimation of the model and the selection of significant variables are crucial steps in semi-parametric modeling. At the beginning of the modeling process there are often many explanatory variables, and to avoid losing any that may be important, the selection of significant variables becomes necessary. Variable selection is intended not only to simplify model complexity and interpretation but also to improve prediction. This research uses semi-parametric methods (LASSO-MAVE, MAVE, and the proposed Adaptive LASSO-MAVE) for variable selection and estimation of the semi-parametric single-index model (SSIM) at the same time.
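The LASSO component of these estimators selects variables by shrinking irrelevant coefficients exactly to zero. The following numpy-only sketch illustrates just that sparsity mechanism via proximal gradient descent (ISTA) on synthetic data; the MAVE and Adaptive LASSO-MAVE estimators in the text add a single-index structure on top, which this sketch does not attempt.

```python
import numpy as np

# Hedged sketch: LASSO variable selection by proximal gradient (ISTA).
# Data, penalty, and iteration count are illustrative assumptions.
rng = np.random.default_rng(3)

n, p = 200, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [3.0, -2.0, 1.5]                  # only 3 active variables
y = X @ beta_true + 0.1 * rng.standard_normal(n)

lam = 0.1 * n                                     # penalty strength
step = 1.0 / np.linalg.norm(X, ord=2) ** 2        # 1/L, L = Lipschitz constant
beta = np.zeros(p)
for _ in range(2000):
    grad = X.T @ (X @ beta - y)                   # gradient of squared loss
    z = beta - step * grad
    # Soft-thresholding: the proximal operator of the L1 penalty.
    beta = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)

selected = np.flatnonzero(np.abs(beta) > 1e-8)
print(selected)                                   # active set should be {0, 1, 2}
```

The adaptive variant reweights the penalty per coefficient (larger penalties on coefficients that look small in a pilot fit), which reduces the bias on the truly active variables.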
Abstract
Metal cutting processes still represent the largest class of manufacturing operations, and turning is the most commonly employed material removal process. This research focuses on analysis of the thermal field of the oblique machining process. The finite element method (FEM) software DEFORM 3D V10.2 was used together with experimental work carried out using infrared imaging equipment, covering both the hardware and the software simulations. The thermal experiments were conducted with AA6063-T6 using different tool obliquities, cutting speeds, and feed rates. The results show that the temperature decreased as tool obliquity increased at the different cutting speeds and feed rates; also it
The study aims to analyze the content of the preparatory-stage computer textbooks with respect to logical thinking. The researcher followed the descriptive analytical research approach (content analysis), adopting the explicit idea as the unit of analysis. A content analysis tool designed around the mental processes employed during logical thinking was used to obtain the results. The findings revealed that logical thinking skills constituted 52% of the fourth-year preparatory textbook and 47% of the fifth-year preparatory textbook.
BN Rashid