Purpose – Cloud computing (CC) and its services have enabled the information centers of organizations to adapt their informatics and technological infrastructure, making it more suitable for developing flexible information systems that respond to the informational and knowledge needs of their users. In this context, cloud-data governance has become more complex and dynamic, requiring an in-depth understanding of the data management strategy at these centers in terms of organizational structure and regulations, people, technology, processes, and roles and responsibilities. Therefore, our paper discusses these dimensions as challenges facing information centers with respect to their data governance and the impa…
This research deals with the most important indicators used to measure the phenomenon of financial depth, looking beyond the traditional (quantitative) indicators, which have proven inadequate for showing the facts accurately and may even produce counterfactual results, although they are relied upon in econometric studies on this subject.
Therefore, this research has sought to put forward alternative indicators: structural indicators, financial prices, availability of financial instruments, and the cost of concluded transactions, in order to measure the phenomenon of financial depth.
After using and analyzing data collected from countries, the research…
The objective of this paper is to present a modern class of open sets, namely -open sets. Some functions based on this concept are studied, together with the relationships among continuous functions, strongly -continuous functions, -irresolute functions, and -continuous functions.
The aim of this paper is to design fast neural networks to approximate periodic functions, that is, to design fully connected networks containing links between all nodes in adjacent layers, which can speed up approximation, reduce approximation failures, and increase the possibility of obtaining the globally optimal approximation. We train the suggested networks with the Levenberg-Marquardt training algorithm, then speed them up by choosing the most suitable activation function (transfer function), which has a very fast convergence rate for reasonably sized networks. In all algorithms, the gradient of the performance function (energy function) is used to determine how to…
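The training setup described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it fits a one-hidden-layer fully connected network with tanh transfer functions to the periodic function sin(x), and, for brevity, uses plain gradient descent on the mean-squared error rather than Levenberg-Marquardt; the network size, learning rate, and step count are assumptions.

```python
# Minimal sketch: one-hidden-layer fully connected network approximating
# a periodic function. Plain gradient descent stands in for the paper's
# Levenberg-Marquardt training; sizes and rates are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(x)                              # periodic target function

n_hidden = 20
W1 = rng.normal(0, 1, (1, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.1, (n_hidden, 1)); b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)               # hidden layer (transfer function)
    return h, h @ W2 + b2                  # linear output layer

lr = 0.05
for step in range(5000):
    h, pred = forward(x)
    err = pred - y                         # gradient of 0.5*MSE w.r.t. pred
    gW2 = h.T @ err / len(x); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)       # backpropagate through tanh
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((forward(x)[1] - y) ** 2))
print(f"final MSE: {mse:.4f}")
```

Levenberg-Marquardt would replace the update step with a damped Gauss-Newton step on the Jacobian of the residuals, which typically converges in far fewer iterations for networks of this size.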
In this paper we show that a function $f \in L_{p,\alpha}(I)$, $0 < p < 1$, where $I = [-1, 1]$, can be approximated by an algebraic polynomial with an error not exceeding $\omega^{\varphi}_{k}\left(f, \tfrac{1}{n}\right)_{p,\alpha}$, where $\omega^{\varphi}_{k}\left(f, \tfrac{1}{n}\right)_{p,\alpha}$ is the Ditzian–Totik modulus of smoothness of an unbounded function in $L_{p,\alpha}(I)$.
Objective: This study aimed to investigate the relationship between high blood pressure and
different variables, such as (weight, smoking, amount of salt and water taken daily, and number
of hours of natural sleep per person) for young people.
Methodology: The study was conducted on students of the Technical Institute in Baquba and the University of Diyala during the period from September 2015
until June 2016. The patients ranged in age from 18-24 years. All data were collected through
a questionnaire that included the main reasons and periodic follow-up of the disease.
Results: The total number of samples was 450. The results showed that 33% of all samples
had high blood pressure. The rel…
Permeability estimation is a vital step in reservoir engineering due to its effect on reservoir characterization, planning for perforations, and the economic efficiency of reservoirs. Core and well-logging data are the main sources for measuring and calculating permeability, respectively. There are multiple methods to predict permeability, such as classical, empirical, and geostatistical methods. In this research, two statistical approaches have been applied and compared for permeability prediction, Multiple Linear Regression and Random Forest, for the (M) reservoir interval in the (BH) Oil Field in the northern part of Iraq. The dataset was separated into two subsets, training and testing, in order to cross-validate the accuracy…
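The workflow above (split the data, fit both models, compare on held-out samples) can be sketched with scikit-learn. The log curves (GR, RHOB, NPHI), the synthetic relationship, and all numbers below are assumptions for illustration only; the study uses real core and log measurements from the BH field.

```python
# Hedged sketch: Multiple Linear Regression vs. Random Forest for
# permeability prediction. Features and data are simulated stand-ins
# for real well-log curves.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 300
GR = rng.uniform(20, 120, n)        # gamma ray (API) -- assumed range
RHOB = rng.uniform(2.0, 2.8, n)     # bulk density (g/cc) -- assumed range
NPHI = rng.uniform(0.05, 0.35, n)   # neutron porosity (v/v) -- assumed range
# synthetic log-permeability with a mild nonlinearity plus noise
logk = 2.0 - 0.01 * GR - 1.5 * RHOB + 8.0 * NPHI ** 2 + rng.normal(0, 0.1, n)

X = np.column_stack([GR, RHOB, NPHI])
X_tr, X_te, y_tr, y_te = train_test_split(X, logk, test_size=0.3, random_state=0)

mlr = LinearRegression().fit(X_tr, y_tr)
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

print("MLR R2:", round(r2_score(y_te, mlr.predict(X_te)), 3))
print("RF  R2:", round(r2_score(y_te, rf.predict(X_te)), 3))
```

Holding out a test subset, as in the paper, guards against the overfitting that tree ensembles can exhibit when judged on training data alone.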
Spatial data analysis is performed in order to remove skewness, a measure of the asymmetry of a probability distribution, and to improve normality, a key statistical concept based on the bell-shaped normal distribution, for properties such as porosity, permeability, and saturation, which can be visualized using histograms. Three steps of spatial analysis are involved here: exploratory data analysis, variogram analysis, and finally distributing the properties using geostatistical algorithms. Mishrif Formation (unit MB1) in Nasiriya Oil Field was chosen to analyze and model the data for the first eight wells. The field is an anticline structure with northwest–south…
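The skewness check in the exploratory step can be illustrated with a short sketch. The data here are simulated from a lognormal distribution, a common assumption for permeability, rather than taken from the Mishrif wells; the sketch shows why a log transform is often applied before variogram modelling.

```python
# Sketch (simulated data): permeability is typically right-skewed, so a
# log transform is commonly applied to approach normality before
# geostatistical modelling. Sample skewness is computed with NumPy.
import numpy as np

rng = np.random.default_rng(2)
perm = rng.lognormal(mean=2.0, sigma=1.0, size=5000)  # skewed, like permeability

def skewness(a):
    """Sample skewness: third standardized moment of the data."""
    a = np.asarray(a, float)
    m, s = a.mean(), a.std()
    return float(np.mean(((a - m) / s) ** 3))

print("raw skewness:", round(skewness(perm), 2))          # strongly positive
print("log skewness:", round(skewness(np.log(perm)), 2))  # near zero
```

A histogram of the raw values would show the long right tail; after the log transform it approaches the bell shape that variogram-based algorithms assume.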
In this paper, we estimate the parameters and the related probability functions (survival function, cumulative distribution function, hazard function or failure rate, and probability density function, pdf) for the two-parameter Birnbaum-Saunders distribution, which fits the complete data for patients with lymph-gland cancer. The parameters (shape and scale) are estimated using maximum likelihood, regression quantile, and shrinkage methods; the values of the mentioned probability functions are then computed from a sample of real data describing the survival durations of patients suffering from lymph-gland cancer, based on the diagnosis of the disease or the patients' admission to a hospital for a perio…
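Of the three estimation methods above, maximum likelihood is readily available in SciPy, where the Birnbaum-Saunders distribution is exposed as `fatiguelife`. The sketch below fits simulated survival times (an assumption; the paper uses real patient data) and evaluates the survival and hazard functions at a sample time point.

```python
# Hedged sketch: maximum-likelihood fit of a two-parameter
# Birnbaum-Saunders distribution (SciPy's `fatiguelife`) to simulated
# survival times, then evaluation of survival and hazard functions.
import numpy as np
from scipy import stats

# simulate durations from fatiguelife(shape=0.8, scale=12); location
# is fixed at 0 so only shape and scale are free, as in the paper
data = stats.fatiguelife.rvs(0.8, loc=0, scale=12, size=400, random_state=3)

shape, loc, scale = stats.fatiguelife.fit(data, floc=0)  # MLE, 2 free params

t = 10.0
S = stats.fatiguelife.sf(t, shape, loc, scale)           # survival S(t)
h = stats.fatiguelife.pdf(t, shape, loc, scale) / S      # hazard = pdf / S

print(f"shape={shape:.2f} scale={scale:.2f} S({t})={S:.3f} hazard={h:.4f}")
```

The regression-quantile and shrinkage estimators compared in the paper are not part of SciPy and would need to be implemented separately.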