This paper presents finite strain results from seven oriented samples of Tertiary sandstone of the Muqdadiya Formation and 400 pebble and conglomerate samples of the Bai-Hassan Formation on the southwestern limb of the Al-Tib Anticline in southeastern Iraq. Finite strain was measured and analyzed in these rocks, which were deposited in a fluvio-lacustrine environment. The present study followed the Fry method: strain was computed, in the form of ellipses, on three mutually perpendicular planes prepared from each sample, and the center-to-center method was used to determine the strain ratios of the samples. Strain in the studied area is low, mainly because the sampled rocks underwent brittle deformation during folding. The calculated axial ratios (Rs), i.e. the three-dimensional orientations of the strain axes, are plotted on a diagram equivalent to the log-Flinn diagram normally used for strain ellipsoids. According to the stereographic projection method, the long axes of the strain ellipsoids are subparallel to the measured axis of the Al-Tib Anticline, while the short axes are approximately perpendicular to it. This orientation of the axes is related to the movement of the Arabian plate.
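The center-to-center (Fry) construction mentioned above can be illustrated with a short sketch: given the measured object centers, plotting every pairwise separation vector produces the classical Fry plot, in which an empty elliptical region appears whose axial ratio approximates the strain ratio Rs. This is a minimal illustration of the plotting step only, not the authors' full measurement procedure.

```python
import numpy as np

def fry_points(centers):
    """Return all pairwise center-to-center vectors (the Fry plot).

    centers: (n, 2) array of object-center coordinates.
    Returns an (n*(n-1), 2) array of separation vectors; in a Fry plot
    these cluster around an empty elliptical vacancy whose axial ratio
    approximates the finite-strain ratio Rs.
    """
    centers = np.asarray(centers, dtype=float)
    n = len(centers)
    diffs = centers[None, :, :] - centers[:, None, :]  # diffs[i, j] = c_j - c_i
    mask = ~np.eye(n, dtype=bool)                      # drop self-pairs
    return diffs[mask]

# usage: 4 centers give 4 * 3 = 12 separation vectors, in +/- pairs
pts = fry_points([[0, 0], [1, 0], [0, 1], [2, 2]])
print(pts.shape)  # (12, 2)
```

Because every pair contributes both c_j - c_i and c_i - c_j, the plot is symmetric about the origin, which is why the strain-ellipse vacancy is centered there.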
The aim of this paper is to propose an efficient three-step iterative method for finding the zeros of a nonlinear equation f(x) = 0. Starting with a suitably chosen initial approximation, the method generates a sequence of iterates converging to the root. A convergence analysis is given to establish its fifth order of convergence. Several examples illustrate the efficiency of the proposed method and compare it with other methods.
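The abstract does not give the scheme's formulas, but the general shape of a three-step method can be sketched as follows: each outer pass applies three corrections, here reusing the derivative from the first sub-step (a common device in multi-step methods to save derivative evaluations). This is an illustrative stand-in only, not the paper's fifth-order scheme.

```python
def three_step_iterate(f, df, x0, tol=1e-12, max_iter=50):
    """Generic three-step Newton-type iteration for f(x) = 0.

    Illustrative sketch: each pass chains three corrections that reuse
    the derivative evaluated at the start of the pass. NOT the authors'
    fifth-order method, whose formulas are not stated in the abstract.
    """
    x = x0
    for _ in range(max_iter):
        d = df(x)
        y = x - f(x) / d      # step 1: Newton step
        z = y - f(y) / d      # step 2: frozen-derivative correction
        x_new = z - f(z) / d  # step 3: second frozen-derivative correction
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# usage: find sqrt(2) as the positive root of x^2 - 2
root = three_step_iterate(lambda x: x * x - 2, lambda x: 2 * x, x0=1.5)
print(root)  # ~ 1.41421356...
```

Comparisons of such methods are usually made via the efficiency index p^(1/m), where p is the convergence order and m the number of function/derivative evaluations per pass.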
This research studies panel (paired) data models with mixed random parameters, which contain two types of parameters, one random and the other fixed. The random parameter arises from differences in the marginal slopes of the cross-sections, while the fixed parameter arises from differences in the fixed intercepts and the random errors of each section. The random errors exhibit heteroscedasticity in addition to first-order serial correlation. The main objective of this research is to use efficient methods appropriate for paired data in the case of small samples; to achieve this goal, the feasible generalized least squares
Non-uniform channelization is a crucial task in cognitive radio receivers for obtaining separate channels from the digitized wideband input signal at different intervals of time. The two main requirements of the channelizer are reconfigurability and low complexity. In this paper, a reconfigurable architecture based on a combination of the Improved Coefficient Decimation Method (ICDM) and the Coefficient Interpolation Method (CIM) is proposed. The proposed Hybrid Coefficient Decimation-Interpolation Method (HCDIM) based filter bank (FB) is able to realize the same number of channels as the ICDM, but with the maximum decimation factor divided by the interpolation factor (L), which leads to less deterioration in stop-band attenuation
This paper proposes a new method for functional non-parametric regression analysis with conditional expectation in the case where the covariates are functional; Principal Component Analysis was utilized to de-correlate the multivariate response variables. It uses the Nadaraya-Watson (K-Nearest Neighbour, KNN) estimator for prediction, with different types of semi-metrics (based on the second derivative and on Functional Principal Component Analysis (FPCA)) for measuring the closeness between curves. Root Mean Square Error is used to evaluate this model, which is then compared with the independent-response method. The R program is used for analysing the data. Then, when the cov
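The classical scalar form of the Nadaraya-Watson estimator that the functional version generalizes can be sketched briefly: the prediction at x is a kernel-weighted average of the training responses. In the paper's functional setting, the distance |x - x_i| is replaced by a semi-metric between curves; this sketch uses the ordinary scalar case with a Gaussian kernel.

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_query, h=1.0):
    """Nadaraya-Watson kernel regression estimate at x_query.

    yhat(x) = sum_i K((x - x_i)/h) * y_i / sum_i K((x - x_i)/h),
    with a Gaussian kernel K and bandwidth h. The functional variant
    in the paper replaces |x - x_i| by a semi-metric between curves.
    """
    x_train = np.asarray(x_train, dtype=float)
    y_train = np.asarray(y_train, dtype=float)
    w = np.exp(-0.5 * ((x_query - x_train) / h) ** 2)  # Gaussian weights
    return float(np.sum(w * y_train) / np.sum(w))

# usage: for y = x sampled symmetrically around 0.5, the weighted
# average at x = 0.5 recovers 0.5 exactly by symmetry
xs = np.linspace(0, 1, 11)
yhat = nadaraya_watson(xs, xs, 0.5, h=0.2)
print(yhat)  # 0.5
```

The bandwidth h (or the number of neighbours in the KNN variant) controls the bias-variance trade-off and is typically chosen by cross-validation.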
The searching process using a binary codebook that combines the Block Truncation Coding (BTC) method and Vector Quantization (VQ), i.e. a full codebook search for each input image vector to find the best-matched code word, requires a long time. Therefore, in this paper, after designing a small binary codebook, we adopted a new method of rotating each binary code word in this codebook from 90° to 270° in steps of 90°. Then, we systematized each code word according to its angle into four types of binary codebooks (Pour, Flat, Vertical, or Zigzag). The proposed scheme was used for decreasing the time of the coding process
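Generating the rotated variants of a binary code word is straightforward when the code word is viewed as a square 0/1 block: a single 90° clockwise rotation, applied repeatedly, yields the 90°, 180°, and 270° versions described above. A minimal sketch of that rotation step:

```python
def rotate90(block):
    """Rotate a square binary code word (list of rows) 90 degrees clockwise.

    Reversing the row order and transposing is the standard idiom for a
    clockwise quarter-turn of a 2D grid.
    """
    return [list(row) for row in zip(*block[::-1])]

# usage: repeated application yields the 90/180/270-degree variants
cw = [[1, 0],
      [0, 0]]
r90 = rotate90(cw)    # [[0, 1], [0, 0]]
r180 = rotate90(r90)  # [[0, 0], [0, 1]]
r270 = rotate90(r180) # [[0, 0], [1, 0]]
print(r90, r180, r270)
```

Classifying each code word by the orientation of its dominant pattern (as in the Pour/Flat/Vertical/Zigzag grouping) then lets the encoder search only the sub-codebook matching the input block's orientation instead of the full codebook.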
Sound forecasts are essential elements of planning, especially for dealing with seasonality, sudden changes in demand levels, strikes, large fluctuations in the economy, and price-cutting manoeuvres by competitors. Forecasting can help decision-makers manage these problems by identifying which techniques are appropriate for their needs. The proposed forecasting model extracts the trend and cyclical components individually by developing the Hodrick–Prescott filter technique. Then, fitted models of these two real components are estimated to predict the future behaviour of electricity peak load. Accordingly, the optimal model for the periodic component is estimated using spectral analysis and Fourier mod
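The trend/cycle decomposition behind the Hodrick–Prescott filter can be sketched in its standard textbook form: the trend minimizes a penalized least-squares problem whose solution is (I + λK'K)τ = y, with K the second-difference operator and the cycle y − τ. This is a generic sketch of the classical filter, not the authors' developed variant.

```python
import numpy as np

def hp_filter_trend(y, lam=1600.0):
    """Extract the Hodrick-Prescott trend of a series y.

    Solves (I + lam * K'K) tau = y, where K is the (n-2) x n
    second-difference operator; the cyclical component is y - tau.
    lam = 1600 is the conventional choice for quarterly data.
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    K = np.zeros((n - 2, n))          # second-difference matrix
    for i in range(n - 2):
        K[i, i:i + 3] = [1.0, -2.0, 1.0]
    trend = np.linalg.solve(np.eye(n) + lam * (K.T @ K), y)
    return trend

# usage: a purely linear series has zero second differences, so it is
# its own HP trend and the cyclical component is identically zero
y = np.arange(10, dtype=float)
trend = hp_filter_trend(y)
print(np.allclose(trend, y))  # True
```

Once the cycle y − τ is isolated, a periodic model (e.g. a Fourier-type fit guided by spectral analysis, as in the abstract) can be estimated on it separately from the trend.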
A sensitive turbidimetric method at 0-180° was used for the determination of mebeverine in drugs using two solar cells and six sources with C.F.I.A. The method was based on the formation of an ion pair, a pinkish banana-colored precipitate, by the reaction of mebeverine hydrochloride with phosphotungstic acid. Turbidity was measured via the reflection of incident light colliding with the surface particles of the precipitate at 0-180°. All variables were optimized. The linear range of mebeverine hydrochloride was 0.05-12.5 mmol L-1; the L.D. (S/N = 3) (3SB) was 521.92 ng/sample, depending on dilution for the minimum concentration, with correlation coefficient r = 0.9966, while the R.S.D.%
Feature extraction provides a quick process for extracting objects from remote sensing data (images), saving the urban planner or GIS user from digitizing hundreds of times by hand. In the present work, manual, rule-based, and classification methods were applied, using an object-based approach to classify imagery. The results show that each method is suitable for extraction depending on the properties of the object; for example, the manual method is convenient for objects that are clear and have sufficient area. The choice of scale and merge level also has a significant effect on the classification process and the accuracy of object extraction. The results also show that the rule-based method is more suitable for extracting
In this work, we employ a new normalized Bernstein basis for solving linear nonhomogeneous Fredholm fractional integro-differential equations of the second kind (LFFIDEs). We adopt the Petrov-Galerkin method (PGM) to approximate the solution of the LFFIDEs via the normalized Bernstein basis, which yields a linear system. Some examples are given and their results are shown in tables and figures; the Petrov-Galerkin method is very effective and convenient and overcomes the difficulty of traditional methods. We solve the LFFIDEs with the assistance of MATLAB 10.
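For reference, the classical Bernstein basis of degree n on [0, 1], on which the paper's normalized basis is built, is

```latex
B_{i,n}(x) = \binom{n}{i}\, x^{i} (1-x)^{n-i}, \qquad i = 0, 1, \dots, n,
\qquad \sum_{i=0}^{n} B_{i,n}(x) = 1 .
```

In a Petrov-Galerkin scheme the approximate solution is expanded in this (trial) basis while the residual is made orthogonal to a different test basis, which is what reduces the integro-differential equation to a linear system.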
Nanosilica was extracted from rice husk collected locally from an Iraqi mill in the Al-Mishikhab district, Najaf Governorate, Iraq. The precipitation method was used to prepare nanosilica powder from rice husk ash after treating it thermally at 700°C, followed by dissolving the silica in an alkaline solution to obtain a sodium silicate solution. Two samples of the final solution were collected to study the effect of filtration on the purity of the sample by X-ray fluorescence (XRF) spectrometry. The results show that the filtered samples have higher purity than the non-filtered sample. The structural analysis investigated by X-ray diffraction (XRD) found that the nanosilica powder has an amorphous structure.