This research studies dimension-reduction methods that overcome the curse of dimensionality, which arises when traditional methods fail to provide good parameter estimates, so the problem must be dealt with directly. Two approaches were used to handle high-dimensional data: the non-classical sliced inverse regression (SIR) method, together with the proposed weighted standard SIR (WSIR) method, and principal component analysis (PCA), the standard general-purpose dimension-reduction method. Both SIR and PCA form linear combinations of a subset of the original explanatory variables, which may suffer from heterogeneity and from multicollinearity among most of the explanatory variables. The new linear combinations produced by the two methods reduce the explanatory variables to one or more new dimensions, called the effective dimension. The root mean squared error was used to compare the methods, and a simulation study was conducted. The simulation results showed that the proposed weighted standard SIR method performed best.
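The linear-combination idea behind PCA can be sketched in miniature. The sketch below is purely illustrative and uses hypothetical data, not data from the study; it finds the first principal direction of two correlated explanatory variables by closed-form eigendecomposition of their 2x2 covariance matrix.

```python
import math

# Toy data: two correlated explanatory variables (hypothetical values).
x1 = [2.5, 0.5, 2.2, 1.9, 3.1, 2.3, 2.0, 1.0, 1.5, 1.1]
x2 = [2.4, 0.7, 2.9, 2.2, 3.0, 2.7, 1.6, 1.1, 1.6, 0.9]

def mean(v):
    return sum(v) / len(v)

def cov(a, b):
    ma, mb = mean(a), mean(b)
    return sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / (len(a) - 1)

# 2x2 sample covariance matrix [[s11, s12], [s12, s22]].
s11, s22, s12 = cov(x1, x1), cov(x2, x2), cov(x1, x2)

# Largest eigenvalue of a symmetric 2x2 matrix (closed form).
tr, det = s11 + s22, s11 * s22 - s12 * s12
lam1 = tr / 2 + math.sqrt(tr * tr / 4 - det)

# The corresponding eigenvector gives the first principal direction:
# the linear combination w1*x1 + w2*x2 with maximal variance.
w1, w2 = s12, lam1 - s11
norm = math.hypot(w1, w2)
w1, w2 = w1 / norm, w2 / norm

print(f"first PC direction: ({w1:.3f}, {w2:.3f}), variance {lam1:.3f}")
```

SIR works with the same kind of linear combinations but slices the response to recover directions informative about the regression, which is why both methods can be compared on the same effective-dimension footing.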
Steganography is a technique of concealing secret data within other everyday files of the same or a different type. Hiding data has become essential to digital information security. This work aims to design a stego method that can effectively hide a message inside the frames of a video file. A video steganography model is proposed by training a model to hide a video (or images) within another video using convolutional neural networks (CNNs). Using a CNN in this approach achieves two main goals of any steganographic method. The first is increased security (difficulty of being observed and broken by steganalysis programs), which was achieved in this work because the weights and architecture are randomized. Thus,
The corrosion behavior of carbon steel at different temperatures (100, 120, 140, and 160 °C) under different pressures (7, 10, and 13 bar) was investigated using the weight-loss method, in pure distilled water and after adding three types of oxygen scavengers (hydroquinone, ascorbic acid, and monoethanolamine) at concentrations of 40, 60, and 80 ppm. The carbon steel specimens were immersed in water containing 8.2 ppm dissolved oxygen (DO) using an autoclave. It was found that the corrosion behavior of carbon steel was greatly influenced by temperature combined with high pressure. The corrosion rate decreased when any one of the oxygen scavengers was added, and the best results were obtained at a concentration of 80 ppm of each scavenger. It was observed that
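The weight-loss method converts the measured mass loss of a coupon into a corrosion rate. A minimal sketch follows, using the ASTM G31-style formula CR = (K * W) / (A * T * D); the coupon dimensions and mass loss below are hypothetical, not values from the study.

```python
def corrosion_rate_mm_per_year(mass_loss_g, area_cm2, time_hours, density_g_cm3):
    """Weight-loss corrosion rate in mm/year (ASTM G31-type calculation):
    CR = (K * W) / (A * T * D), with K = 8.76e4 for these units."""
    K = 8.76e4
    return K * mass_loss_g / (area_cm2 * time_hours * density_g_cm3)

# Hypothetical example: 0.05 g lost from a 10 cm^2 carbon-steel coupon
# (density ~7.86 g/cm^3) exposed for 24 hours.
rate = corrosion_rate_mm_per_year(0.05, 10.0, 24.0, 7.86)
print(f"corrosion rate = {rate:.3f} mm/y")
```

Comparing such rates with and without a scavenger, at each temperature and pressure, is what quantifies the inhibition effect reported above.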
In this research we study the variance component model, one of the most important models widely used in data analysis. It is one type of multilevel model and is considered a linear model. There are three types of linear variance component models: the fixed-effect model, the random-effect model, and the mixed-effect model. In this paper we examine the mixed-effect linear variance component model with a one-way random effect. The mixed model is a mixture of fixed and random effects in the same model, containing the overall mean (μ) and the treatment effect (τᵢ), which has
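For a balanced one-way random-effects layout, the classical ANOVA method of moments estimates the two variance components from the between- and within-group mean squares. The sketch below is a generic illustration with hypothetical data, not the estimator or data of the paper.

```python
# Hypothetical balanced one-way random-effects data: a groups, n obs each.
# Model: y_ij = mu + tau_i + e_ij, tau_i ~ N(0, s2_tau), e_ij ~ N(0, s2_e).
data = [
    [6.0, 7.0, 6.5],   # group 1
    [8.0, 9.0, 8.5],   # group 2
    [5.0, 5.5, 4.5],   # group 3
]
a, n = len(data), len(data[0])

grand = sum(sum(g) for g in data) / (a * n)
group_means = [sum(g) / n for g in data]

# Between-group and within-group mean squares (ANOVA method of moments).
ssb = n * sum((m - grand) ** 2 for m in group_means)
ssw = sum((y - m) ** 2 for g, m in zip(data, group_means) for y in g)
msb, msw = ssb / (a - 1), ssw / (a * (n - 1))

# Variance-component estimates for the balanced design
# (truncated at zero, since (MSB - MSW)/n can go negative by chance).
s2_e = msw
s2_tau = max((msb - msw) / n, 0.0)

print(f"sigma2_e = {s2_e:.3f}, sigma2_tau = {s2_tau:.3f}")
```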
A large number of researchers have attempted to identify the functional relationship between fertility on one side and the economic and social characteristics of the population on the other, along with the strength of each effect. This research aims to monitor and analyze changes in the level of fertility over time and space in recent decades, in addition to estimating fertility levels in Iraq for the period 1977-2011 and then forecasting the level of fertility in Iraq at the national level (except for the Kurdistan region) for the period 2012-2031. To achieve this goal, the Lee-Carter model was used to estimate and forecast fertility rates. As this model often has been familiar
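The Lee-Carter model decomposes the log of age-specific rates as ln f(x,t) = a_x + b_x * k_t. Below is a simplified moment-based first-step fit, not the full SVD estimation the model is usually fitted with, and the rates are hypothetical illustration values, not Iraqi data.

```python
import math

# Hypothetical age-specific fertility rates f[x][t]: 3 age groups, 4 periods.
rates = [
    [0.110, 0.105, 0.098, 0.092],
    [0.180, 0.172, 0.160, 0.150],
    [0.090, 0.088, 0.083, 0.078],
]
X, T = len(rates), len(rates[0])
logm = [[math.log(v) for v in row] for row in rates]

# a_x: average log-rate by age (the Lee-Carter age profile).
a = [sum(row) / T for row in logm]

# k_t: period index, first-step approximation summing the centred log-rates
# (the full Lee-Carter fit extracts k_t from an SVD instead).
k = [sum(logm[x][t] - a[x] for x in range(X)) for t in range(T)]

# b_x: age sensitivity, estimated by regressing centred log-rates on k_t.
kk = sum(kt * kt for kt in k)
b = [sum((logm[x][t] - a[x]) * k[t] for t in range(T)) / kk for x in range(X)]

# Reconstructed log-rates: ln f_xt ~ a_x + b_x * k_t; forecasting then
# amounts to extrapolating the single time series k_t.
fit = [[a[x] + b[x] * k[t] for t in range(T)] for x in range(X)]
print("k_t:", [round(v, 3) for v in k])
```

Because all the time variation is carried by k_t, forecasting fertility reduces to forecasting that one index, which is the appeal of the model noted in the abstract.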
In this paper, we provide a proposed method for estimating missing values of the explanatory variables in a non-parametric multiple regression model and compare it with the arithmetic-mean imputation method. The idea of the method is to employ the causal relationship between the variables to find an efficient estimate of the missing value. We rely on the kernel estimate given by the Nadaraya-Watson estimator, use least-squares cross-validation (LSCV) to choose the bandwidth, and use a simulation study to compare the two methods.
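A minimal sketch of the two ingredients named above, the Nadaraya-Watson estimator and an LSCV bandwidth choice, follows; the data, the bandwidth grid, and the missing point are all hypothetical, and this is a one-covariate illustration rather than the paper's multiple-regression setting.

```python
import math

def gauss(u):
    """Gaussian kernel."""
    return math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)

def nw_estimate(x0, xs, ys, h):
    """Nadaraya-Watson kernel regression estimate at x0 with bandwidth h."""
    w = [gauss((x0 - x) / h) for x in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

def lscv(h, xs, ys):
    """Least-squares cross-validation score: leave-one-out squared error."""
    err = 0.0
    for i in range(len(xs)):
        xi = xs[:i] + xs[i + 1:]
        yi = ys[:i] + ys[i + 1:]
        err += (ys[i] - nw_estimate(xs[i], xi, yi, h)) ** 2
    return err / len(xs)

# Hypothetical observed pairs; suppose the value at x = 2.5 is missing.
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0.1, 0.9, 2.1, 2.9, 4.2, 4.8]

# Pick the bandwidth minimising the LSCV score over a small grid,
# then impute the missing value with the fitted kernel regression.
h_star = min([0.3, 0.5, 0.8, 1.2, 2.0], key=lambda h: lscv(h, xs, ys))
imputed = nw_estimate(2.5, xs, ys, h_star)
print(f"h* = {h_star}, imputed value = {imputed:.2f}")
```

Imputing through the fitted regression, rather than with the arithmetic mean, is what lets the method exploit the relationship between the variables.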
The bi-level programming problem is to minimize or maximize an objective function while another objective function is nested within the constraints. This problem has received a great deal of attention in the programming community due to the proliferation of applications and the use of evolutionary algorithms in addressing this kind of problem. Two non-linear bi-level programming methods are used in this paper. The goal is to reach the optimal solution through simulation, using the Monte Carlo method with different small and large sample sizes. The research found that the Branch and Bound algorithm was preferable in solving the non-linear bi-level programming problem because it produced better results.
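The nested structure can be illustrated on a toy problem: the outer (leader) objective is evaluated only at the inner (follower) optimum. The problem, bounds, and sample sizes below are hypothetical and chosen for illustration; this is a naive Monte Carlo search, not the paper's algorithms.

```python
import random

# Toy non-linear bi-level problem (hypothetical, for illustration only):
#   outer:  min_x  F(x, y*) = x**2 + (y* - 1)**2
#   inner:  y* = argmin_y  f(x, y) = (y - x)**2      (so y* = x)
# The analytic optimum is x = 0.5 with F = 0.5.

def inner_argmin(x, samples=400):
    """Approximate the follower's response y*(x) by Monte Carlo sampling."""
    best_y, best_f = None, float("inf")
    for _ in range(samples):
        y = random.uniform(-2.0, 2.0)
        f = (y - x) ** 2
        if f < best_f:
            best_y, best_f = y, f
    return best_y

def solve_bilevel(samples=400, seed=0):
    """Outer Monte Carlo loop: each leader sample pays the cost of an
    inner optimisation, which is what makes bi-level problems hard."""
    random.seed(seed)
    best_x, best_F = None, float("inf")
    for _ in range(samples):
        x = random.uniform(-2.0, 2.0)
        y = inner_argmin(x)
        F = x ** 2 + (y - 1.0) ** 2
        if F < best_F:
            best_x, best_F = x, F
    return best_x, best_F

x_star, F_star = solve_bilevel()
print(f"x* = {x_star:.2f}, F* = {F_star:.2f}")
```

The sample-size dependence is visible here too: smaller inner sample sizes give noisier follower responses, degrading the outer estimate, which mirrors the small-versus-large-sample comparison in the abstract.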
Drilling fluid loss during drilling operations is an undesirable, expensive, and potentially hazardous problem.
The Nasiriyah oil field is one of the Iraqi oil fields that suffers from the lost circulation problem. It is known that the Dammam, um-Radoma, Tayarat, Shiranish, and Hartha formations are the layers where lost circulation is detected. Different types of lost circulation materials (LCMs), ranging from granular and flake to fibrous, were previously used to treat this problem.
This study presents the application of rice as a lost circulation material used to mitigate and stop the loss problem when partial or total losses occur.
The experim
The main problem when dealing with fuzzy data is that a model representing the data cannot be formed by the Fuzzy Least Squares Estimator (FLSE) method, which gives false estimates and is invalid when multicollinearity is present. To overcome this problem, the Fuzzy Bridge Regression Estimator (FBRE) method was relied upon to estimate a fuzzy linear regression model with triangular fuzzy numbers. Moreover, multicollinearity in the fuzzy data can be detected by using the Variance Inflation Factor when the input variables of the model are crisp and the output variable and parameters are fuzzy. The results were compared usin
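The Variance Inflation Factor diagnostic mentioned above has a particularly simple form when there are exactly two crisp predictors: VIF = 1 / (1 - r^2), with r their sample correlation. The sketch below uses this two-predictor special case with hypothetical data, not the paper's fuzzy data.

```python
import math

# With exactly two crisp explanatory variables, the variance inflation
# factor reduces to VIF = 1 / (1 - r^2), where r is their correlation.
# Data below are hypothetical, for illustration only.
x1 = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
x2 = [1.1, 2.1, 2.9, 4.2, 4.8, 6.1]   # nearly collinear with x1

def corr(a, b):
    """Pearson sample correlation."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    sab = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return sab / (sa * sb)

r = corr(x1, x2)
vif = 1.0 / (1.0 - r * r)
print(f"r = {r:.4f}, VIF = {vif:.1f}")  # VIF > 10 signals severe multicollinearity
```

A VIF this large is exactly the situation in which a penalised estimator such as bridge regression is preferred over least squares.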
Abstract:
Research Topic: Ruling on the sale of big data
Its objectives: to state the nature, importance, sources, and ruling of the issue.
Methodology: inductive, comparative, and critical.
One of the most important results: big data is valuable property that may not be unlawfully appropriated, and it is permissible to sell big data as long as it does not contain data of users who have not consented to its sale.
Recommendation: follow up on studies dealing with the rulings on this issue.
Subject Terms: Ruling, Sale, Data, Big Data, Opinions, Jurists