Three seismic attributes are used to enhance or delineate geologic features that cannot be detected within the seismic resolution limit: the instantaneous amplitude, instantaneous phase, and instantaneous frequency attributes. These are applied along two picked surface horizons within 3D seismic data for an area in southern Iraq. Two geologic features are deduced: the first represents a complex channel system at the top of the Saadi Formation, and the second a submarine fan within the Mishrif Formation. The expression of these ancient geological features is dramatically enhanced by using a flattening technique.
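As a hedged illustration (not the authors' workflow), the three instantaneous attributes are conventionally derived from the analytic signal of a trace via the Hilbert transform; the sketch below does this on a synthetic trace, with the sample interval and trace shape being assumptions.

```python
# Minimal sketch: instantaneous amplitude, phase, and frequency of a single
# seismic trace via the analytic signal. The toy trace stands in for a real
# trace extracted along a picked horizon of a 3D volume.
import numpy as np
from scipy.signal import hilbert

dt = 0.002                                   # sample interval in s (assumed)
t = np.arange(0, 1, dt)
trace = np.sin(2 * np.pi * 30 * t) * np.exp(-3 * t)   # toy seismic trace

analytic = hilbert(trace)                    # complex analytic signal
inst_amplitude = np.abs(analytic)            # envelope (instantaneous amplitude)
inst_phase = np.unwrap(np.angle(analytic))   # instantaneous phase, radians
inst_frequency = np.diff(inst_phase) / (2 * np.pi * dt)   # Hz
```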
This paper addresses the estimation of the scale parameter of the weighted Rayleigh distribution using well-known classical and Bayesian methods of estimation. The proposed estimators were compared using Monte Carlo simulation based on the mean squared error (MSE) criterion, and all simulation results and comparisons are presented in tables.
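A minimal sketch of such a Monte Carlo MSE comparison follows, with two loud assumptions: the ordinary Rayleigh distribution stands in for the weighted Rayleigh, and the "Bayes" estimator shown is a generic posterior-mean-style form with hypothetical hyperparameters, not the paper's estimators.

```python
# Monte Carlo skeleton comparing two scale estimators by MSE (illustrative).
import numpy as np

rng = np.random.default_rng(0)
sigma_true, n, reps = 2.0, 30, 5000

def mle(x):
    # Rayleigh MLE of the scale: sqrt(sum(x^2) / (2n))
    return np.sqrt(np.sum(x**2) / (2 * len(x)))

def bayes_like(x, a=1.0, b=1.0):
    # Posterior-mean-style estimator under an inverse-gamma-type prior
    # on sigma^2 (hyperparameters a, b are hypothetical).
    return np.sqrt((b + np.sum(x**2) / 2) / (a + len(x) - 1))

est = {"MLE": [], "Bayes": []}
for _ in range(reps):
    x = rng.rayleigh(scale=sigma_true, size=n)
    est["MLE"].append(mle(x))
    est["Bayes"].append(bayes_like(x))

for name, vals in est.items():
    print(name, "MSE =", np.mean((np.array(vals) - sigma_true) ** 2))
```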
Economic performance is one of the most important indicators of economic activity. As the performance of the economy improves, sources of output diversify, economic growth rates and per capita national income rise, the business environment recovers, investment rates increase, and financial and monetary institutions and the credit market become more effective. This in turn raises employment rates, reduces unemployment, eliminates many social problems, and improves average per capita income as well as the level of national income.
Input-output tables are a mathematical technique that indicates economic performance …
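Input-output analysis is conventionally formalized through the Leontief model; the sketch below uses a hypothetical three-sector economy (the technical coefficients and final demand are invented for illustration, not taken from the study).

```python
# Leontief input-output model: total output x solves x = A x + d,
# i.e. x = (I - A)^(-1) d, where A holds the technical coefficients.
import numpy as np

# A[i, j] = input from sector i needed per unit of output of sector j
A = np.array([[0.10, 0.20, 0.05],
              [0.15, 0.05, 0.10],
              [0.05, 0.10, 0.15]])
d = np.array([100.0, 150.0, 80.0])   # final demand per sector

x = np.linalg.solve(np.eye(3) - A, d)
print("Required sectoral output:", np.round(x, 2))
```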
In this paper, Bayes estimators of the Poisson distribution have been derived using two loss functions: the squared error loss function and the exponential loss function proposed in this study. They are based on different priors: two informative prior distributions, the Erlang and inverse Levy distributions, and a non-informative prior for the shape parameter of the Poisson distribution. The maximum likelihood estimator (MLE) of the Poisson distribution has also been derived. A simulation study was carried out to compare the accuracy of the Bayes estimates with the corresponding MLE of the Poisson distribution based on the root mean squared error (RMSE) for different cases of the …
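For the conjugate case this comparison has a closed form: the Erlang prior (a gamma with integer shape) gives a gamma posterior, whose mean is the Bayes estimate under squared error loss. The sketch below shows that case only; the inverse Levy prior and the paper's proposed exponential loss are not reproduced, and the hyperparameters are hypothetical.

```python
# Poisson MLE vs. Bayes posterior mean under an Erlang(a, b) prior.
import numpy as np

rng = np.random.default_rng(1)
theta_true, n = 4.0, 25
x = rng.poisson(theta_true, size=n)

theta_mle = x.mean()                  # MLE of the Poisson mean

# Conjugacy: posterior is Gamma(a + sum(x), b + n); under squared error
# loss the Bayes estimate is the posterior mean.
a, b = 2, 1                           # hypothetical hyperparameters
theta_bayes = (a + x.sum()) / (b + n)

print(f"MLE: {theta_mle:.3f}, Bayes (posterior mean): {theta_bayes:.3f}")
```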
The research aims to clarify the COBIT 5 framework for IT governance and to develop a criterion based on the Balanced Scorecard that contributes to measuring the performance of IT governance. To achieve these goals, the researchers adopted the deductive approach in designing a Balanced Scorecard to measure IT governance at the Bank of Baghdad, which was chosen because it relies heavily on IT.
The research reached a number of conclusions, the most important of which is that the performance of the IT department at the Bank of Baghdad falls within the good level, which requires constant monitoring. The Balanced Scorecard items most adhered to by the Bank were the customer, internal operations, and growth items, and finally the financial item; IT …
In the present work, different remote sensing techniques have been used to analyze remote sensing data spectrally using ENVI software. Most of the algorithms used in spectral processing can be organized into target detection, change detection, and classification. In this paper, several target detection methods have been studied, such as the matched filter and constrained energy minimization.
Water body maps were obtained, and the results showed changes in the study area over the period 1995-2000. The results obtained from applying constrained energy minimization were also more accurate than those of the other method when compared with the real situation.
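As a hedged sketch of the second method (not the paper's ENVI workflow), constrained energy minimization builds a filter w = R⁻¹d / (dᵀR⁻¹d) from the scene's sample correlation matrix R and a known target signature d, so that the filter output is near 1 on target pixels and near 0 on background. The toy hyperspectral cube below is invented for illustration.

```python
# Constrained energy minimization (CEM) on a toy bands-by-pixels cube.
import numpy as np

rng = np.random.default_rng(2)
bands, pixels = 10, 1000
X = rng.normal(size=(bands, pixels))   # background spectra (toy data)
d = rng.normal(size=bands)             # known target signature (assumed)
X[:, :20] += d[:, None]                # plant the target in 20 pixels

R = X @ X.T / pixels                   # sample correlation matrix
Rinv_d = np.linalg.solve(R, d)
w = Rinv_d / (d @ Rinv_d)              # CEM filter, satisfies w.T @ d == 1

scores = w @ X                         # high scores flag target pixels
print("mean score on target pixels:", scores[:20].mean())
print("mean score on background:  ", scores[20:].mean())
```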
Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively, so it makes sense to pursue research on algorithms that use the available network most effectively. It is also important to consider security, since the transmitted data is vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption on the same set of data, performing the two operations simultaneously. This is achieved by embedding encryption into compression algorithms, since cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p…
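The paper embeds encryption inside the entropy coder itself; the sketch below only illustrates the simpler compress-then-encrypt pipeline on the same data, as a baseline for the idea. It assumes the third-party `cryptography` package is installed.

```python
# Compress-then-encrypt sketch (baseline, not the paper's embedded scheme).
import zlib
from cryptography.fernet import Fernet

key = Fernet.generate_key()
cipher = Fernet(key)

text = b"some plaintext to be compressed and encrypted" * 20
compressed = zlib.compress(text, 9)     # entropy coding step
token = cipher.encrypt(compressed)      # encryption step

# Receiver side: decrypt, then decompress.
recovered = zlib.decompress(cipher.decrypt(token))
assert recovered == text
print(f"original {len(text)} B -> transmitted {len(token)} B")
```

Note that the order matters: encrypting first would produce high-entropy data that no longer compresses, which is part of why the two operations invite being fused.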
In this study, we briefly review the ARIMA(p, d, q), EWMA, and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation structure of the data. We consider recursive estimation and prediction algorithms based on Bayes and Kalman filtering (KF) techniques for correlated observations. We investigate the effect on the MSE of these procedures and compare them using generated data.
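A minimal sketch of the recursive machinery involved is the scalar Kalman filter for a local level DLM, shown below on generated data; the noise variances and model are assumptions for illustration, not the study's setup.

```python
# Scalar Kalman filter for a random-walk-plus-noise (local level) model.
import numpy as np

rng = np.random.default_rng(3)
T, q, r = 100, 0.1, 1.0            # steps, state noise var, obs noise var

state = np.cumsum(rng.normal(scale=np.sqrt(q), size=T))   # latent level
y = state + rng.normal(scale=np.sqrt(r), size=T)          # observations

m, P = 0.0, 1.0                    # prior mean and variance
est = []
for yt in y:
    P = P + q                      # predict: propagate uncertainty
    K = P / (P + r)                # Kalman gain
    m = m + K * (yt - m)           # update with the new observation
    P = (1 - K) * P
    est.append(m)

print("MSE of filtered estimates:", np.mean((np.array(est) - state) ** 2))
```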
A skip list data structure is essentially a probabilistic simulation of a binary search tree. Skip list algorithms are simpler and faster and use less space. Conceptually, this data structure uses parallel sorted linked lists. Searching in a skip list is faster than searching in a regular sorted linked list. Because a skip list is a two-dimensional data structure, it is implemented using a two-dimensional network of nodes with four pointers. The search, insert, and delete operations take O(log n) expected time. The skip list can be modified to implement the order-statistic operations RANK and SEARCH BY RANK while maintaining the same expected time. Keywords: skip list, parallel linked list, randomized algorithm, rank.
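A hedged sketch of the core structure follows: it keeps only forward pointers per level (rather than the four-pointer nodes described above) and shows search and insert with randomized levels, which is what yields the O(log n) expected time.

```python
# Simplified skip list: randomized node levels, forward pointers per level.
import random

MAX_LEVEL, P = 16, 0.5

class Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)   # right pointers, one per level

class SkipList:
    def __init__(self):
        self.head = Node(float("-inf"), MAX_LEVEL)
        self.level = 0

    def _random_level(self):
        lvl = 0
        while random.random() < P and lvl < MAX_LEVEL:
            lvl += 1                          # coin flips pick the height
        return lvl

    def search(self, key):
        x = self.head
        for i in range(self.level, -1, -1):   # scan right, then drop down
            while x.forward[i] and x.forward[i].key < key:
                x = x.forward[i]
        x = x.forward[0]
        return x is not None and x.key == key

    def insert(self, key):
        update = [self.head] * (MAX_LEVEL + 1)
        x = self.head
        for i in range(self.level, -1, -1):
            while x.forward[i] and x.forward[i].key < key:
                x = x.forward[i]
            update[i] = x                     # last node visited on level i
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        node = Node(key, lvl)
        for i in range(lvl + 1):              # splice in at each level
            node.forward[i] = update[i].forward[i]
            update[i].forward[i] = node

sl = SkipList()
for k in [3, 7, 1, 9, 5]:
    sl.insert(k)
print(sl.search(7), sl.search(4))             # True False
```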