In this study, plain concrete simply supported beams subjected to two-point loading were analyzed in flexure. The numerical model of the beam was constructed as a meso-scale representation of concrete as a two-phase material (aggregate and mortar). The fracture process of the concrete beams under loading was investigated both in the laboratory and with the numerical models. The Extended Finite Element Method (XFEM) was employed to treat the discontinuities that appear during the fracture process in concrete. The finite element method with the Standard/Explicit feature was used for the numerical analysis. Aggregate particles were assumed to be elliptical in shape. Other properties, such as the grading and sizes of the aggregate particles, were taken from standard laboratory tests conducted on aggregate samples. Two different concrete beams were investigated experimentally and numerically; the beams differed in the maximum size of their aggregate particles. The comparison between experimental and numerical results showed that the meso-scale model provides a good basis for representing concrete in a numerical approach. It was concluded that XFEM is a powerful technique for analyzing the fracture process and crack propagation in concrete.
Reflection cracking in asphalt concrete (AC) overlays is a common form of pavement deterioration that occurs when underlying cracks and joints in the pavement structure propagate through an overlay due to thermal and traffic-induced movement, ultimately degrading the pavement’s lifespan and performance. This study aims to determine how alterations in overlay thickness and temperature conditions, the incorporation of chopped fibers, and the use of geotextiles influence the overlay’s capacity to postpone the occurrence of reflection cracking. To achieve the above objective, a total of 36 prism specimens were prepared and tested using an overlay testing machine (OTM). The variables considered in this study were the thickness of the
The purpose of this paper is to identify the statistical indicators of the studied variables, to identify the relationship between the cognitive learning outcome and the performance of the two mastery skills of parallel spherical standing and equilibrium on the balance beam, and to identify the percentage contribution of the cognitive learning outcome to the performance of those two skills. The two researchers used the descriptive approach, with the survey method and correlational relations, as the most appropriate to the nature of the research problem. The research community consisted of second-stage students in the College of Physical Education and
Estimation of the unknown parameters of a 2-D sinusoidal signal model is an important and difficult problem. Because it is difficult to estimate all the parameters of this type of model at the same time, we propose a sequential non-linear least squares method and a sequential robust M method, developed by applying the sequential approach to the estimator suggested by Prasad et al., to estimate the unknown frequencies and amplitudes of the 2-D sinusoidal components. The methods rely on the Downhill Simplex Algorithm to solve the non-linear equations and obtain the non-linear parameter estimates, which represent the frequencies, and then use the least squares formula to estimate
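The 2-D sinusoidal model underlying this kind of estimation is conventionally written as follows; the exact notation here follows common formulations in the Prasad et al. line of work and is an assumption, since the abstract does not reproduce the model:

```latex
y(m,n) \;=\; \sum_{k=1}^{p} \Big[ A_k \cos(\lambda_k m + \mu_k n) \;+\; B_k \sin(\lambda_k m + \mu_k n) \Big] \;+\; e(m,n),
```

where $A_k, B_k$ are the (linear) amplitude parameters, $(\lambda_k, \mu_k)$ are the non-linear frequency pairs, and $e(m,n)$ is additive noise. Estimating the frequencies first and then recovering the amplitudes by linear least squares is what makes a sequential approach attractive.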
The aim of this paper is to propose an efficient three-step iterative method for finding the zeros of the nonlinear equation f(x)=0. Starting with a suitably chosen initial value, the method generates a sequence of iterates converging to the root. The convergence analysis is carried out to establish its fifth order of convergence. Several examples are given to illustrate the efficiency of the proposed new method and to compare it with other methods.
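The abstract does not spell out the authors' particular scheme, but the general shape of a fifth-order three-step method can be sketched with one classical construction (illustrative only, not necessarily the paper's method): a Newton step, a frozen-derivative step, and a corrector step that reuses the derivative at the intermediate point.

```python
def three_step_solve(f, fprime, x0, tol=1e-12, max_iter=50):
    """Illustrative three-step iteration of fifth order:
        y = x - f(x)/f'(x)      (Newton step)
        z = y - f(y)/f'(x)      (frozen-derivative step)
        x = z - f(z)/f'(y)      (corrector step)
    """
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        d = fprime(x)
        y = x - fx / d
        z = y - f(y) / d
        x = z - f(z) / fprime(y)
    return x

# Example: root of x^3 - 2x - 5 = 0 (classical test equation, root ~ 2.0945514815)
root = three_step_solve(lambda x: x**3 - 2*x - 5,
                        lambda x: 3*x**2 - 2,
                        x0=2.0)
print(root)
```

Because each iteration performs three correction sub-steps while evaluating only two derivatives, the per-iteration cost stays modest while the order climbs well past Newton's quadratic rate.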
Estimating multivariate location and scatter with both affine equivariance and positive breakdown has always been difficult. A well-known estimator satisfying both properties is the Minimum Volume Ellipsoid estimator (MVE). Computing the exact MVE is often not feasible, so one usually resorts to an approximate algorithm. In the regression setup, algorithms for positive-breakdown estimators such as Least Median of Squares typically recompute the intercept at each step to improve the result; this approach is called intercept adjustment. In this paper we show that a similar technique, called location adjustment, can be applied to the MVE. For this purpose we use the Minimum Volume Ball (MVB). In order
In this paper, an algorithm for steganography using the DCT for the cover image and the DWT for the hidden image, with an embedding-order key, is proposed. For more security and complexity, the cover image is converted from RGB to YIQ; the Y plane is used, divided into four equal parts, and then converted to the DCT domain. The four DWT coefficients of the hidden image are embedded into each part of the cover DCT, with the embedding order based on the order key, which is stored with the cover in a database table at both the sender and the receiver. Experimental results show that the proposed algorithm successfully hides information in the cover image. We use Microsoft Office Access 2003 as the DBMS; the hiding and extracting algo
The aim of the thesis is to estimate hard-to-reach and inaccessible population groups; it is a field study to estimate the number of drug users in the Baghdad governorate among males aged 15-60 years.
Because of the absence of data approved by government institutions, and because of the difficulty of estimating the numbers of these people with a traditional survey, in which the respondent reports on himself or, in some cases, on his family members, the Network Scale-Up Method (NSUM) was adopted. It is based mainly on asking respondents how many drug users they know in their personal networks.
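The basic NSUM estimator behind this design scales the reported hidden contacts by the respondents' total network sizes; the sketch below uses the standard ratio form of the estimator with invented numbers (the respondent counts, network sizes, and population total are placeholders, not the study's data).

```python
def nsum_estimate(hidden_counts, network_sizes, total_population):
    """Basic NSUM ratio estimator:
        N_hidden ~ N * (sum of reported hidden-population contacts)
                     / (sum of respondents' personal network sizes)
    """
    return total_population * sum(hidden_counts) / sum(network_sizes)

# Hypothetical illustration (invented survey values):
m = [2, 0, 1, 3, 1]               # drug users each respondent reports knowing
c = [250, 300, 200, 400, 350]     # each respondent's estimated network size
N = 4_000_000                     # placeholder size of the target male 15-60 population
estimate = nsum_estimate(m, c, N)
print(round(estimate))
```

In practice the network sizes c are themselves estimated, typically by asking respondents about contacts in subpopulations of known size, which is why the questionnaire design mentioned next is the critical step.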
Based on this principle, a statistical questionnaire was designed to
In this paper, the computational complexity is reduced using a revised version of the selected mapping (SLM) algorithm: a partial SLM is performed, reducing the mathematical operations by around 50%. Although the peak-to-average power ratio (PAPR) reduction gain is slightly degraded, the dramatic reduction in computational complexity is an outstanding achievement. Matlab simulation is used to evaluate the results, where the PAPR results show the capability of the proposed method.
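The partial-SLM modification itself is not detailed in the abstract, but the conventional SLM baseline it revises can be sketched as follows: generate several phase-rotated versions of the symbol vector, take each to the time domain, and transmit the candidate with the lowest PAPR. The naive inverse DFT and the small QPSK example are illustrative choices, not the paper's simulation setup.

```python
import cmath
import math
import random

def ifft_naive(X):
    """Naive inverse DFT (O(N^2)); fine for a small illustration."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def papr_db(x):
    """Peak-to-average power ratio of a time-domain signal, in dB."""
    powers = [abs(v) ** 2 for v in x]
    return 10 * math.log10(max(powers) / (sum(powers) / len(powers)))

def slm(symbols, num_candidates, rng):
    """Conventional SLM: keep the phase-rotated candidate with the lowest PAPR.
    Candidate 0 is the unrotated original, so SLM can never do worse than no SLM."""
    best = None
    for u in range(num_candidates):
        phases = ([1] * len(symbols) if u == 0
                  else [rng.choice([1, -1, 1j, -1j]) for _ in symbols])
        candidate = ifft_naive([s * p for s, p in zip(symbols, phases)])
        p = papr_db(candidate)
        if best is None or p < best[0]:
            best = (p, phases)
    return best

rng = random.Random(1)
qpsk = [rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) for _ in range(16)]
papr_plain = papr_db(ifft_naive(qpsk))
papr_slm, _ = slm(qpsk, num_candidates=8, rng=rng)
print(f"plain PAPR {papr_plain:.2f} dB, SLM PAPR {papr_slm:.2f} dB")
```

The cost of full SLM grows with the number of candidates, since each one needs its own IFFT; a partial SLM in the spirit of the abstract would compute only part of this work per candidate, trading a little PAPR gain for roughly half the operations.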
The basic solution to the difficulties posed by the huge size of digital images is to employ image compression techniques that reduce image size for efficient storage and fast transmission. In this paper, a new pixel-based scheme is proposed for grayscale image compression that implicitly utilizes hybrid techniques: a spatial modelling technique based on the minimum residual, along with the transform technique of the Discrete Wavelet Transform (DWT), mixing lossless and lossy techniques to ensure high performance in terms of compression ratio and quality. The proposed technique has been applied to a set of standard test images, and the results obtained are significantly encouraging compared with Joint P
Merging biometrics with cryptography has become more familiar, and a great scientific field has been born for researchers. Biometrics adds a distinctive property to security systems, because biometric features are unique and individual for every person. In this study, a new method is presented for ciphering data based on fingerprint features. This is done by placing the plaintext message, according to the positions of the minutiae extracted from a fingerprint, into a generated random text file, regardless of the size of the data. The proposed method can be explained in three scenarios. In the first scenario, the message is used inside the random text directly at the positions of the minutiae; in the second scenario, the message is encrypted with a chosen word before ciphering
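The first scenario can be sketched as below, under the assumption that the minutiae have already been extracted and flattened into 1-D offsets into the random cover text (the coordinates, cover length, and helper names are invented for illustration; minutiae extraction itself is outside this sketch).

```python
import random
import string

def embed_at_minutiae(message, minutiae_positions, text_length, rng):
    """Write the message's characters into a random cover text at the
    positions given by the fingerprint minutiae (first scenario)."""
    cover = [rng.choice(string.ascii_letters) for _ in range(text_length)]
    for ch, pos in zip(message, minutiae_positions):
        cover[pos % text_length] = ch
    return "".join(cover)

def extract_at_minutiae(cover, minutiae_positions, msg_len):
    """Recover the message by reading the same minutiae positions back."""
    return "".join(cover[pos % len(cover)] for pos in minutiae_positions[:msg_len])

# Hypothetical minutiae coordinates flattened to 1-D offsets (invented values)
positions = [17, 42, 88, 131, 205, 333, 470]
rng = random.Random(7)
cover = embed_at_minutiae("SECRET", positions, 600, rng)
print(extract_at_minutiae(cover, positions, len("SECRET")))  # -> SECRET
```

Only someone holding the same fingerprint (and hence the same minutiae positions) can locate the message characters inside the random text, which is what ties the cipher to the biometric.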