Gypseous soil covers approximately 30% of Iraqi land and is widely used as found in geotechnical and construction engineering. With the growing demand for residential complexes, one of the significant challenges in studying gypseous soil, given its unique behavior, is understanding its interaction with foundations such as strip and square footings, because experiments providing total displacement diagrams or failure envelopes, which are well established for non-problematic soils, are lacking. The aim is to develop a comprehensive understanding of the micromechanical properties of dry, saturated, and treated gypseous sandy soils and to analyze the interaction of a strip footing with this soil type using particle image velocimetry (PIV) measurements and Plaxis 3D simulation. High-resolution digital cameras captured the soil deformation, and PIV generated displacement fields and velocity vectors that helped identify distinct sand movement zones. Further, PIV showed punching shear failure in uncontaminated gypseous soil and general shear failure in soaked contaminated gypseous soil. Moreover, the Plaxis results corresponded well with the PIV measurements, bearing in mind that material behavior models are simplified representations of the actual behavior of footing and soil. Understanding soil deformation behavior is crucial for accurate engineering calculations and designs, making these findings valuable for geotechnical and construction engineering applications. DOI: 10.28991/CEJ-2024-010-07-016
This research aims to predict the maximum daily loss that the fixed-return securities portfolio of Qatar National Bank - Syria may suffer. For this purpose, data were collected on the risk factors affecting the portfolio's value, represented by the term structure of interest rates in the United States of America over the period 2017-2018, together with data on the composition of the bond portfolio of Qatar National Bank - Syria in 2017. Monte Carlo simulation models were then employed to predict the maximum loss to which this portfolio may be exposed in the future. The results of the Monte Carlo simulation showed the possibility of a decrease in the value at risk in the future due to the dec
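The abstract above does not give the simulation details, but the core idea of Monte Carlo value at risk can be sketched as follows. This is a minimal illustration, not the paper's model: the portfolio value, modified duration, and daily yield volatility are all hypothetical numbers, and daily yield-curve shifts are assumed normally distributed with a first-order duration approximation of the price response.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs (not from the paper): a bond portfolio whose value
# responds approximately linearly to parallel shifts in the yield curve.
portfolio_value = 1_000_000.0   # assumed portfolio value (USD)
duration = 4.5                  # assumed modified duration
daily_rate_vol = 0.0006         # assumed daily volatility of yields

n_sims = 100_000
rate_shocks = rng.normal(0.0, daily_rate_vol, n_sims)   # simulated daily yield moves
pnl = -portfolio_value * duration * rate_shocks         # duration approximation of P&L

# 99% one-day value at risk: the loss exceeded on only 1% of simulated days
var_99 = -np.percentile(pnl, 1)
print(f"1-day 99% VaR ~ {var_99:,.0f} USD")
```

In practice the shocks would be drawn from a model calibrated to the observed term-structure data rather than a single normal distribution.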
Regression analysis is a cornerstone of statistical knowledge and relies mostly on the ordinary least squares method. As is well known, however, this method requires several conditions in order to operate accurately; when they fail, the results can be unreliable, and the absence of certain conditions makes it impossible to complete the analysis. Among those conditions is the absence of multicollinearity, and we detect that problem among the independent variables using the Farrar-Glauber test. In addition, because the linearity requirement of the data and the last condition were not met, we resorted to the
Consumption occupies great importance in economic studies in both peace and war, because it is linked to the individual and society and is one of the indicators of the level of economic and social welfare. Controlling the movement of this behavioral and quantitative variable becomes even more important in wartime than in peacetime. In this research, statistical data were used on private consumption expenditure and per capita national income, in addition to the price index of
This paper determined the difference between a first, healthy image and a second, infected image using logic gates. The proposed algorithm was applied first to binary images, then to grayscale images, and finally to color images. The algorithm starts by processing the images: convolution is applied to zero-padded extensions of the images to obtain more visual information and features, after which the images are enhanced with an edge-detection filter (Laplacian operator) and smoothed with a mean filter. To determine the change between the original image and the injured one, logic gates, specifically XOR gates, are applied. Applying the technique to tooth decay, this comparison can locate inj
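The central XOR step of the pipeline above can be sketched as follows. This is a toy illustration on synthetic 5x5 binary images, not the paper's full pipeline (the convolution, Laplacian, and mean-filter stages are omitted); the "lesion" pixel is fabricated for demonstration.

```python
import numpy as np

def xor_difference(img_a: np.ndarray, img_b: np.ndarray) -> np.ndarray:
    """Return a binary map of the pixels that differ between two binary images."""
    return np.logical_xor(img_a.astype(bool), img_b.astype(bool))

# Toy "before" and "after" images; the after image has one extra pixel set.
before = np.zeros((5, 5), dtype=np.uint8)
after = before.copy()
after[2, 3] = 1                     # simulated lesion pixel

diff = xor_difference(before, after)
rows, cols = np.nonzero(diff)
print(list(zip(rows.tolist(), cols.tolist())))   # locations of change
```

For grayscale and color images, the same idea applies after thresholding each channel to binary form.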
Abstract
The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered; the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, different initial parameter values, and different levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution in both the natural and the contaminated data cases.
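The Downhill Simplex (Nelder-Mead) approach to likelihood maximization can be sketched on a simpler distribution. This is an illustration only: a plain two-parameter Weibull stands in for the compound exponential Weibull-Poisson distribution, whose density is not given in the abstract, and the data are simulated.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(1)

# Simulate data from a Weibull(shape=1.5, scale=2.0) distribution.
data = weibull_min.rvs(c=1.5, scale=2.0, size=500, random_state=rng)

def neg_log_lik(theta):
    shape, scale = theta
    if shape <= 0 or scale <= 0:    # keep the simplex in the valid region
        return np.inf
    return -weibull_min.logpdf(data, c=shape, scale=scale).sum()

# Downhill Simplex = Nelder-Mead: derivative-free minimization of the
# negative log-likelihood from an initial guess.
res = minimize(neg_log_lik, x0=[1.0, 1.0], method="Nelder-Mead")
shape_hat, scale_hat = res.x
print(shape_hat, scale_hat)
```

The same pattern applies to the four-parameter compound distribution: only `neg_log_lik` and the initial point change.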
Abstract
The objective of image fusion is to merge multiple source images in such a way that the final representation contains a higher amount of useful information than any single input. In this paper, a weighted-average fusion method is proposed. It depends on weights that are extracted from the source images using the contourlet transform: the approximated (low-pass) transform coefficients are set to zero, and the inverse contourlet transform is then taken to obtain the details of the images to be fused. The performance of the proposed algorithm has been verified on several grayscale and color test images and compared with some existing methods.
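The detail-driven weighting idea above can be sketched as follows. This is a simplified stand-in, not the paper's method: a mean filter replaces the contourlet transform's low-pass stage (no contourlet implementation ships with SciPy), and subtracting the low-pass result plays the role of zeroing the approximation coefficients and inverting the transform. The test images are random.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def fuse(img_a: np.ndarray, img_b: np.ndarray, size: int = 5) -> np.ndarray:
    """Weighted-average fusion driven by per-pixel detail strength."""
    detail_a = np.abs(img_a - uniform_filter(img_a, size))   # high-frequency content
    detail_b = np.abs(img_b - uniform_filter(img_b, size))
    w_a = detail_a / (detail_a + detail_b + 1e-12)           # weights in [0, 1]
    return w_a * img_a + (1.0 - w_a) * img_b                 # weighted average

rng = np.random.default_rng(2)
a = rng.random((32, 32))
b = rng.random((32, 32))
fused = fuse(a, b)
print(fused.shape)
```

Each output pixel is a convex combination of the two inputs, weighted toward whichever source carries more local detail at that pixel.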
The penalized least squares method is a popular method for dealing with high-dimensional data, where the number of explanatory variables is larger than the sample size. Its strengths are high prediction accuracy and the ability to perform estimation and variable selection at once. The penalized least squares method gives a sparse model, meaning a model with few variables, which can therefore be interpreted easily. Penalized least squares is not robust, however: it is very sensitive to the presence of outlying observations. To deal with this problem, we can use a robust loss function to obtain the robust penalized least squares method, and thereby obtain a robust penalized estimator and
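The robust-loss idea above can be sketched with an off-the-shelf estimator. This is an illustration, not the abstract's specific method: scikit-learn's `HuberRegressor` combines the Huber loss (which bounds the influence of outlying responses) with a squared-L2 penalty via its `alpha` parameter; swapping in an L1 penalty would additionally give the sparse variable selection the abstract describes. The data and outliers are simulated.

```python
import numpy as np
from sklearn.linear_model import HuberRegressor

rng = np.random.default_rng(3)

# Synthetic regression with three active variables and a few gross outliers.
n, p = 200, 10
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]                  # only three active variables
y = X @ beta + 0.1 * rng.standard_normal(n)
y[:5] += 20.0                                # inject a few gross outliers

# Huber loss downweights the outliers; alpha adds the penalty term.
robust = HuberRegressor(alpha=0.001).fit(X, y)
print(np.round(robust.coef_, 2))
```

An ordinary least squares fit on the same data would be pulled noticeably toward the five contaminated observations; the robust penalized fit is not.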