In this research, a 4×4 factorial experiment was studied, applied in a randomized complete block design. The design of experiments is used to study the effect of treatments on experimental units and thus obtain data representing the experiment's observations. Applying these treatments under different environmental and experimental conditions causes noise that affects the observation values and thus increases the mean square error of the experiment. To reduce this noise, multilevel wavelet shrinkage was used as a filter for the observations, by proposing an improved threshold that accounts for the different transform levels based on the logarithm of the base. Several values of the proposed threshold were obtained, the Haar wavelet function was applied with the hard and mid threshold rules, and the results were compared according to several criteria.
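As a sketch of the filtering step, the numpy-only code below applies one level of the Haar transform and the hard-threshold rule. The abstract does not give the formula for the proposed level-dependent threshold or the mid-threshold rule, so a user-supplied threshold `t` stands in for them here; this is an illustration of the mechanism, not the paper's method.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar transform: approximation and detail coefficients."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # low-pass (approximation)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # high-pass (detail)
    return a, d

def haar_idwt(a, d):
    """Invert one Haar level (exact reconstruction)."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

def hard_threshold(d, t):
    """Hard rule: zero every detail coefficient whose magnitude is below t."""
    return np.where(np.abs(d) >= t, d, 0.0)

def denoise(x, t):
    """One-level Haar shrinkage: threshold the details, then reconstruct."""
    a, d = haar_dwt(x)
    return haar_idwt(a, hard_threshold(d, t))
```

In practice `t` would be replaced by a data-driven value (for example the universal threshold, sigma * sqrt(2 ln n)) computed per decomposition level, which is the role the paper's improved threshold plays.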
In many oil-recovery systems, relative permeabilities (kr) are essential flow factors that affect fluid dispersion and output from petroleum resources. Traditionally, obtaining these crucial reservoir properties requires taking rock samples from the reservoir and performing suitable laboratory studies. Although kr is a function of fluid saturation, it is now well established that pore shape and distribution, absolute permeability, wettability, interfacial tension (IFT), and saturation history all influence kr values. These rock/fluid characteristics vary greatly from one reservoir region to the next, and it would be impossible to make kr measurements in all of them. The unsteady-state approach was used to calculate the relative permeabilities.
The current study prepares a geometric proposal of the main parameters that must be worked within a seismic reflection survey to produce a three-dimensional subsurface image. This image represents the Siba oil field located in Basra, southern Iraq. The results were based on two options for selecting the approved elements to create a three-dimensional image of the Mishrif, Zubair and Yamama formations, as well as the Jurassic, the Permian Khuff, and the pre-Khuff reservoir area. In the first option (option-1), the geometry is 12 lines, 6 shots, and 216 channels. The receiver density is 66.67 receivers/km², so the shot density is the same. Total shots are 21000, which is the same as the number of receivers.
Abstract
The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, different initial values of the parameters, and different levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution in cases of both natural and contaminated data.
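To illustrate the estimation approach, the sketch below fits a two-parameter Weibull by minimising the negative log-likelihood with the Downhill Simplex (Nelder-Mead) algorithm as implemented in scipy. The two-parameter Weibull is a simpler stand-in chosen for the example, since the abstract does not give the four-parameter exponential Weibull-Poisson density; that density would replace `neg_log_lik` below.

```python
import numpy as np
from scipy.optimize import minimize

# Simulated sample from a Weibull with shape 1.5 and scale 2.0
rng = np.random.default_rng(0)
data = 2.0 * rng.weibull(1.5, size=500)

def neg_log_lik(theta):
    """Negative Weibull log-likelihood; infinite outside the feasible region
    so the simplex is pushed back toward positive parameters."""
    k, lam = theta
    if k <= 0 or lam <= 0:
        return np.inf
    z = data / lam
    return -np.sum(np.log(k / lam) + (k - 1) * np.log(z) - z ** k)

res = minimize(neg_log_lik, x0=[1.0, 1.0], method="Nelder-Mead")
print(np.round(res.x, 2))   # estimates should land near the true (1.5, 2.0)
```

Nelder-Mead needs no derivatives, which is why it is attractive for compound distributions whose likelihood is awkward to differentiate.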
Evaporation is one of the major components of the hydrological cycle in nature, so its accurate estimation is important in the planning and management of irrigation practices and in assessing water availability and requirements. The aim of this study is to investigate the ability of a fuzzy inference system to estimate monthly pan evaporation from meteorological data. The study was carried out using 261 monthly measurements of each of temperature (T), relative humidity (RH), and wind speed (W), available from the Emara meteorological station, southern Iraq. Three different fuzzy models comprising various combinations of monthly climatic variables (temperature, wind speed, and relative humidity) were developed.
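The mechanics of such a fuzzy inference system can be sketched as a toy two-rule Mamdani model. Everything here is hypothetical: the membership functions, the rule base, and the output universe are invented for illustration, since the paper's actual rules are not given in the abstract.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def evaporation_mamdani(temp_c, wind_ms):
    """Toy Mamdani inference: fuzzify inputs, fire two rules, defuzzify."""
    hot   = tri(temp_c, 25, 40, 55)   # membership of "hot"
    cool  = tri(temp_c, 0, 15, 30)    # membership of "cool"
    windy = tri(wind_ms, 2, 6, 10)    # membership of "windy"

    r_high = min(hot, windy)          # rule 1: hot AND windy -> high evaporation
    r_low  = cool                     # rule 2: cool          -> low evaporation

    # Output universe (mm/month); clip each consequent at its rule strength,
    # aggregate with max, then defuzzify by the centroid.
    y = np.linspace(0, 400, 401)
    mu = np.maximum(np.minimum(r_high, tri(y, 200, 300, 400)),
                    np.minimum(r_low,  tri(y, 0, 100, 200)))
    return np.sum(y * mu) / np.sum(mu) if mu.sum() > 0 else np.nan

print(round(evaporation_mamdani(38, 7), 1))   # hot, windy month -> high estimate
```

A real model would tune the memberships and rules against the 261 monthly records rather than hand-pick them as done here.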
Cloud storage provides scalable, low-cost resources featuring economies of scale based on a cross-user architecture. As the amount of outsourced data grows explosively, data deduplication, a technique that eliminates data redundancy, becomes essential. The most important cloud service is data storage. To protect the privacy of the data owner, data are stored in the cloud in encrypted form. However, encrypted data introduce new challenges for cloud data deduplication, which becomes crucial for data storage. Traditional deduplication schemes cannot work on encrypted data, and existing solutions for encrypted-data deduplication suffer from security weaknesses. This paper proposes a combined compressive sensing and video deduplication to maximize
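The core deduplication idea can be sketched as content-addressed storage: identical chunks hash to the same key and are kept only once. This sketch deliberately omits the paper's two distinctive pieces, compressive sensing and operation over encrypted data, and the chunk values are invented for the demo.

```python
import hashlib

def dedup_store(chunks, store=None):
    """Store each unique chunk once, keyed by the SHA-256 of its content;
    return per-upload references plus the backing store."""
    store = {} if store is None else store
    refs = []
    for chunk in chunks:
        key = hashlib.sha256(chunk).hexdigest()   # content fingerprint
        store.setdefault(key, chunk)              # keep only if unseen
        refs.append(key)
    return refs, store

refs, store = dedup_store([b"frame-A", b"frame-B", b"frame-A"])
print(len(store))   # prints 2: two unique chunks kept for three uploads
```

Encrypted-data deduplication is harder precisely because conventional encryption maps identical plaintext chunks to different ciphertexts, defeating this fingerprint match.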
A skip list data structure is really just a simulation of a binary search tree. Skip list algorithms are simpler, faster, and use less space. This data structure conceptually uses parallel sorted linked lists. The search procedure in a skip list is more involved than in a regular sorted linked list. Because a skip list is a two-dimensional data structure, it is implemented using a two-dimensional network of nodes with four pointers. The search, insert, and delete operations take an expected time of up to O(log n). The skip list can be modified to implement the order-statistic operations RANK and SEARCH BY RANK while maintaining the same expected time.
Keywords: skip list, parallel linked list, randomized algorithm, rank.
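The search and insert operations described above can be sketched with the classic Pugh-style skip list, where each node carries one forward pointer per level; the paper's four-pointer two-dimensional variant and the RANK operations are not reproduced here.

```python
import random

class Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)   # one forward pointer per level

class SkipList:
    MAX_LEVEL = 8

    def __init__(self):
        self.head = Node(None, self.MAX_LEVEL)
        self.level = 0

    def _random_level(self):
        """Coin flips decide a node's height: level i with probability 2^-i."""
        lvl = 0
        while random.random() < 0.5 and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def search(self, key):
        node = self.head
        for i in range(self.level, -1, -1):   # scan right, then drop a level
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
        node = node.forward[0]
        return node is not None and node.key == key

    def insert(self, key):
        update = [self.head] * (self.MAX_LEVEL + 1)
        node = self.head
        for i in range(self.level, -1, -1):   # remember the rightmost node per level
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = Node(key, lvl)
        for i in range(lvl + 1):              # splice the new node into each level
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new

sl = SkipList()
for k in (5, 1, 9, 3):
    sl.insert(k)
print(sl.search(3), sl.search(4))   # prints: True False
```

The randomized levels give the same expected O(log n) search path as a balanced binary search tree, without any rebalancing code.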
The distribution of the intensity of the comet Ison C/2013 is studied by taking its histogram. This distribution reveals four distinct regions corresponding to the background, tail, coma, and nucleus. A one-dimensional temperature distribution fit is achieved using two mathematical equations related to the coordinates of the comet's center. The quiver plot of the gradient of the comet shows very clearly that the arrows point towards the maximum intensity of the comet.
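The gradient-quiver observation can be reproduced on synthetic data: away from the brightest pixel, the intensity gradient points toward it. The Gaussian "comet" below is a stand-in for the observed image, which is not available here.

```python
import numpy as np

# Synthetic comet image: a 2-D Gaussian intensity peak standing in for the
# nucleus, with the peak placed at row 30, column 40.
y, x = np.mgrid[0:64, 0:64]
intensity = np.exp(-((x - 40) ** 2 + (y - 30) ** 2) / 50.0)

gy, gx = np.gradient(intensity)   # derivatives along rows (y) and columns (x)
peak = np.unravel_index(np.argmax(intensity), intensity.shape)

# At any pixel left of the peak, gx > 0: the gradient arrows head toward the
# maximum intensity, which is what the quiver plot visualises
# (matplotlib.pyplot.quiver(x, y, gx, gy) would draw them).
print(peak)   # prints (30, 40)
```

A histogram of `intensity.ravel()` would likewise separate the flat background from the bright central region, mirroring the four-region decomposition described above.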
Cryptographic applications demand much more of a pseudo-random-sequence
generator than do most other applications. Cryptographic randomness does not mean just
statistical randomness, although that is part of it. For a sequence to be cryptographically
secure pseudo-random, it must be unpredictable.
The random sequences should satisfy the basic randomness postulates; one of them is
the run postulate (sequences of the same bit). These sequences should have about the same
number of ones and zeros, about half the runs should be of length one, one quarter of length
two, one eighth of length three, and so on. The distribution of run lengths for zeros and ones
should be the same. These properties can be measured determinis
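The run postulate described above can be checked mechanically: split the sequence into maximal runs of identical bits and tally the run lengths, which for a good generator should roughly halve at each successive length.

```python
from itertools import groupby
from collections import Counter

def run_lengths(bits):
    """Count runs (maximal blocks of identical bits) by their length."""
    return Counter(len(list(g)) for _, g in groupby(bits))

# Toy 16-bit sequence purely for demonstration
bits = "1100010110100111"
runs = run_lengths(bits)
print(dict(runs))   # prints {2: 3, 3: 2, 1: 4} (4 runs of length 1, 3 of 2, 2 of 3)
```

On a long sequence from a cryptographically strong generator, roughly half the runs should have length one, a quarter length two, and so on, with matching distributions for runs of zeros and runs of ones.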
Abstract:
Research Topic: Ruling on the sale of big data
Its objectives: to state what big data is, its importance, its source, and its ruling.
The methodology is inductive, comparative, and critical.
Among the most important results: big data is valuable property that may not be infringed upon, and it is permissible to sell big data as long as it does not contain data of users who have not consented to its sale.
Recommendation: follow up studies dealing with the rulings on this issue.
Subject Terms
Judgment, Sale, Data, Mega, Sayings, Jurists
Abstract
The current research aims to examine the effectiveness of a training program, based on the Picture Exchange Communication System, for children with autism and their mothers to confront some basic disorders in a sample of children with autism. The study sample consisted of (16) children with autism and their mothers from different centers in Taif and Tabuk cities. The researcher used the quasi-experimental approach, in which two groups were employed: an experimental group and a control group. Children's ages ranged from (6-9) years. In addition, the following tools were used: a list for estimating basic disorders of a child with autism aged (6-9) years, and a training program for children with autism