Improvement of Soil by Using Polymer Fiber Materials Underneath Square Footing

The change in project cost, or cost growth, arises from many factors, some of which are related to problematic soil conditions that may occur during construction and/or during the site investigation period. This paper describes a new, minimum-cost soil improvement method using polymer fiber materials, (3 cm) long in both directions and (2.5 mm) thick, distributed in uniform medium-dense sandy soil at different depths (B, 1.5B, and 2B) below the footings. Three square footings (5, 7.5, and 10 cm) were used to carry out the investigation, using a lever-arm loading system designed for this purpose.
The fibers were distributed from a depth of (0.1B) below the footing base down to the investigated depth. It was found that the initial vertical settlement of the footing was strongly affected in the early stage of loading due to the complex Soil-Fiber Mixture (SFM) below the footing. For any loading case, the failure load of the proposed model increased, compared with the unreinforced soil, as the depth of improvement below the footing increased. The Bearing Capacity Ratio (BCR) for the soil-fiber mixture increased by ratios of (1.4 to 2.5), (1.7 to 4.9), and (1.8 to 8) for the (5, 7.5, and 10 cm) footings, respectively. Yielding of the load-settlement curve for the soil-fiber mixture system started at a settlement of about 1.1% B, while the yield load in unreinforced soil started at a smaller percentage, which reflects the benefit of using such fiber materials for improving soil behavior. Comparison between the experimental and predicted (calculated) settlements below the footings showed that the differences were within accepted limits for foundation settlement design.
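The Bearing Capacity Ratio used above is simply the failure load of the fiber-reinforced soil divided by that of the unreinforced soil. A minimal sketch, with hypothetical load values chosen only for illustration (they are not from the paper):

```python
def bearing_capacity_ratio(q_reinforced, q_unreinforced):
    """BCR = ultimate bearing capacity of the fiber-reinforced soil
    divided by that of the unreinforced soil."""
    return q_reinforced / q_unreinforced

# Hypothetical failure loads (kPa), for illustration only; the paper
# reports BCR ranges of 1.4-2.5, 1.7-4.9 and 1.8-8 for its footings.
print(bearing_capacity_ratio(350.0, 140.0))  # 2.5
```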

Publication Date
Mon May 15 2017
Journal Name
Journal Of Theoretical And Applied Information Technology
Anomaly detection in text data that represented as a graph using dbscan algorithm

Anomaly detection remains a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data into a graph concept frame (CFG). As is well known, DBSCAN groups data points of the same kind into clusters, while points that fall outside cluster behavior are treated as noise or anomalies. DBSCAN can therefore detect abnormal points that lie beyond a certain threshold (extreme values). However, not all anomalies are cases that are unusual or far from a specific group; there is a type of data that does not recur but is still considered abnormal relative to the known group. The analysis showed DBSCAN using the …
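DBSCAN's noise labelling, which the abstract relies on, can be sketched in a few lines. This is a simplified 1-D toy (the function and data are illustrative), not the paper's graph-based variant:

```python
def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns a cluster label per point; -1 marks
    noise points, i.e. the anomalies."""
    labels = [None] * len(points)

    def neighbours(i):
        return [j for j in range(len(points))
                if abs(points[i] - points[j]) <= eps]

    cluster = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbours(i)
        if len(seeds) < min_pts:
            labels[i] = -1          # provisionally noise
            continue
        labels[i] = cluster
        queue = list(seeds)
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster  # border point rescued from noise
            if labels[j] is not None:
                continue
            labels[j] = cluster
            nb = neighbours(j)
            if len(nb) >= min_pts:   # core point: keep expanding
                queue.extend(nb)
        cluster += 1
    return labels

# Dense cluster around 0 plus one far-away outlier.
data = [0.0, 0.1, 0.2, 0.15, 10.0]
labels = dbscan(data, eps=0.5, min_pts=3)
print(labels)  # [0, 0, 0, 0, -1] -- the point at 10.0 is an anomaly
```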

Publication Date
Mon Jun 01 2020
Journal Name
Al-khwarizmi Engineering Journal
Developing a Prosthesis Design using A Gearbox to Replicate the Human Hand Mechanism

A prosthetic is an artificial device that replaces a part of the human frame that is absent because of illness, injury, or deformity. Current research activity in Iraq is drawing interest to the upper-limb discipline because of the growth in the number of amputees. It has therefore become necessary to increase research on this subject to help reduce patients' struggles. This paper describes the design and development of a prosthesis for persons with hand amputations. The design consists of a hand with five fingers driven by a gearbox mechanism, giving the artificial hand 5 degrees of freedom. This artificial hand works on the principle of …

Publication Date
Wed Dec 27 2017
Journal Name
Al-khwarizmi Engineering Journal
Human Face Recognition Using GABOR Filter And Different Self Organizing Maps Neural Networks

This work implements a face recognition system in two stages: a feature extraction stage and a classification stage. The feature extraction stage consists of Self-Organizing Maps (SOMs) in a hierarchical format, in conjunction with Gabor filters and local image sampling. Different types of SOMs were used, and the results obtained from them were compared.

The classification stage consists of a self-organizing map neural network; its goal is to find the stored image most similar to the input image. The proposed algorithm was implemented using C++ packages, and it is a successful classifier for a face database consisting of 20 …
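A Gabor filter of the kind used in the feature extraction stage is a Gaussian envelope modulating a sinusoidal carrier. A minimal NumPy sketch (parameter values are illustrative; a real system convolves the image with a bank of such filters at several orientations and scales):

```python
import numpy as np

def gabor_kernel(size, wavelength, theta, sigma):
    """Real part of a 2-D Gabor filter: a Gaussian envelope
    modulating a cosine carrier oriented at angle theta."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate coordinates so the carrier runs along direction theta.
    x_r = x * np.cos(theta) + y * np.sin(theta)
    y_r = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(x_r**2 + y_r**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * x_r / wavelength)
    return envelope * carrier

kernel = gabor_kernel(size=15, wavelength=8.0, theta=0.0, sigma=4.0)
print(kernel.shape)  # (15, 15)
```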

Publication Date
Tue Aug 10 2021
Journal Name
Design Engineering
Lossy Image Compression Using Hybrid Deep Learning Autoencoder Based On kmean Clustering

Image compression plays an important role in reducing the size and storage requirements of data while significantly increasing the speed of its transmission over the Internet. Image compression has been an important research topic for several decades, and recently, with the great successes achieved by deep learning in many areas of image processing, its use in image compression has been increasing gradually. Deep neural networks have likewise achieved great success in processing and compressing images of various sizes. In this paper, we present a structure for image compression based on a deep-learning Convolutional AutoEncoder (CAE), inspired by the diversity of the human eye …
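The k-means clustering named in the title can be illustrated as a scalar codebook quantizer, of the kind a lossy coder might apply to latent activations. This sketch (deterministic quantile initialisation, made-up data) is an assumption for illustration, not the paper's actual pipeline:

```python
import numpy as np

def kmeans_codebook(values, k, iters=20):
    """Lloyd's k-means on scalar values: returns a codebook of k
    centroids plus each value's nearest-centroid index."""
    # Deterministic initialisation: spread centroids over the quantiles.
    centroids = np.quantile(values, np.linspace(0, 1, k))
    for _ in range(iters):
        # Assign each value to its nearest centroid.
        labels = np.argmin(np.abs(values[:, None] - centroids[None, :]), axis=1)
        # Move each centroid to the mean of its assigned values.
        for c in range(k):
            if np.any(labels == c):
                centroids[c] = values[labels == c].mean()
    return centroids, labels

# Quantize hypothetical latent activations to a 2-entry codebook.
latent = np.array([0.1, 0.2, 0.15, 0.9, 1.0, 0.95])
codebook, idx = kmeans_codebook(latent, k=2)
print(sorted(codebook.round(2)))  # centres at 0.15 and 0.95
```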

Publication Date
Wed Jun 01 2022
Journal Name
Baghdad Science Journal
Variable Selection Using a Modified Gibbs Sampler Algorithm with Application on Rock Strength Dataset

Variable selection is an essential and necessary task in statistical modeling. Several studies have tried to develop and standardize the variable selection process, but it is difficult to do so. The first question researchers need to ask themselves is which variables are most significant for describing a given dataset's response. In this paper, a new method for variable selection using Gibbs sampler techniques has been developed. First, the model is defined and the posterior distributions of all the parameters are derived. The new variable selection method is tested using four simulated datasets. The new approach is compared with some existing techniques: Ordinary Least Squares (OLS), Least Absolute Shrinkage …
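The Gibbs sampler mechanics behind such a method, drawing each parameter in turn from its full conditional distribution, can be shown on a toy bivariate normal target. The target and parameters here are illustrative only, not the paper's selection model:

```python
import math
import random

def gibbs_bivariate_normal(rho, n_samples, burn_in=500, seed=1):
    """Gibbs sampling from a standard bivariate normal with
    correlation rho: alternately draw each coordinate from its
    conditional, x | y ~ N(rho*y, 1 - rho^2), and vice versa."""
    random.seed(seed)
    x = y = 0.0
    sd = math.sqrt(1 - rho**2)
    out = []
    for i in range(burn_in + n_samples):
        x = random.gauss(rho * y, sd)  # full conditional of x given y
        y = random.gauss(rho * x, sd)  # full conditional of y given x
        if i >= burn_in:               # discard warm-up draws
            out.append((x, y))
    return out

draws = gibbs_bivariate_normal(rho=0.8, n_samples=5000)
mean_x = sum(d[0] for d in draws) / len(draws)
print(round(mean_x, 1))  # close to the true mean of 0
```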

Publication Date
Sat Jun 26 2021
Journal Name
Asian Journal Of Civil Engineering
Using AHP to prioritize the corruption risk practices in the Iraqi construction sector

Publication Date
Fri Mar 31 2017
Journal Name
Al-khwarizmi Engineering Journal
Big-data Management using Map Reduce on Cloud: Case study, EEG Images' Data

A database is an organized, distributed collection of data arranged so that users can access the stored information in a simple and convenient way. However, in the era of big data, traditional data analytics methods may not be able to manage and process such large volumes of data. To develop an efficient way of handling big data, this work studies the use of the Map-Reduce technique to handle big data distributed on the cloud. The approach was evaluated using a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG r…
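The Map-Reduce pattern described above can be sketched without Hadoop: a mapper emits key-value pairs for each record, and a reducer aggregates them after grouping by key. The record format and keys below are hypothetical stand-ins for the EEG data:

```python
from collections import defaultdict

def map_phase(record):
    """Mapper: emit (channel, 1) pairs -- a stand-in for per-record
    feature extraction on EEG image data."""
    channel, _value = record
    return [(channel, 1)]

def reduce_phase(pairs):
    """Reducer: aggregate counts per key, as the framework would
    after the shuffle/group step."""
    grouped = defaultdict(int)
    for key, count in pairs:
        grouped[key] += count
    return dict(grouped)

records = [("C3", 0.4), ("C4", 0.1), ("C3", 0.9)]
intermediate = [pair for r in records for pair in map_phase(r)]
print(reduce_phase(intermediate))  # {'C3': 2, 'C4': 1}
```

In a real Hadoop job the mapper and reducer run in parallel across the cluster; the sequential version only illustrates the data flow.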

Publication Date
Thu Sep 18 2025
Journal Name
Sustainable Engineering And Innovation
Using fruit fly and dragonfly optimization algorithms to estimate the Fama-MacBeth model

This research proposes the application of the dragonfly and fruit fly algorithms to enhance estimates generated by the Fama-MacBeth model and compares their performance in this context for the first time. To improve the dragonfly algorithm's effectiveness specifically, three parameter-tuning approaches are investigated: manual parameter tuning (MPT), adaptive tuning by methodology (ATY), and a novel technique called adaptive tuning by performance (APT). Additionally, the study evaluates estimation performance using kernel weighted regression (KWR) and explores how the dragonfly and fruit fly algorithms can be employed to enhance KWR. All methods are tested using data from the Iraq Stock Exchange, based on the Fama-French three-f…
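The fruit fly algorithm belongs to the family of swarm searches that repeatedly sample around the best solution found so far. The sketch below is a heavily simplified variant on a toy objective (the real algorithm tracks a smell-concentration function and a full swarm state); it is intended only to show the iteration structure, not the paper's implementation:

```python
import random

def fruit_fly_optimize(objective, iters=200, pop=20, seed=42):
    """Simplified fruit-fly-style search: each generation, a swarm of
    `pop` flies takes random steps around the best location found so
    far, and the best-smelling (lowest-objective) point is kept."""
    random.seed(seed)
    best_x = random.uniform(-10, 10)
    best_val = objective(best_x)
    for _ in range(iters):
        for _ in range(pop):
            x = best_x + random.uniform(-1, 1)  # random flight step
            val = objective(x)
            if val < best_val:
                best_x, best_val = x, val
    return best_x, best_val

# Toy objective with its minimum at x = 3.
x, val = fruit_fly_optimize(lambda x: (x - 3) ** 2)
print(round(x, 2))  # close to 3
```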

Publication Date
Mon Aug 29 2022
Journal Name
Eurasian Chem. Commun.
Removing some alizarin dyes from an aqueous solution using a polyacrylic acid hydrogel

The present work utilizes polyacrylic acid (PAA) beads to remove Alizarin Yellow R (AYR) and Alizarin Red S (ARS) from solution. The adsorption isotherms were investigated, along with the factors that affect them, such as temperature, ionic strength, shaking, and the wetness of the PAA. The adsorption isotherm of ARS was found to obey the Freundlich equation, while that of AYR was found to obey the Langmuir equation. The adsorption process on PAA was investigated at various temperatures. According to our data, there is a positive correlation between the adsorption of ARS and AYR on the PAA and temperature (an endothermic process). The computation of the thermodynamic functions (ΔH, ΔG, and ΔS) is based on the foregoi…
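A Langmuir fit of the kind mentioned above is often done on the linearised form Ce/qe = Ce/q_max + 1/(K_L·q_max), so that a straight-line fit recovers both constants. A sketch on synthetic data (the Ce and qe values are made up for illustration, not the paper's measurements):

```python
import numpy as np

# Hypothetical equilibrium data: Ce (mg/L) generated from a Langmuir
# isotherm with known constants, so the fit can be checked.
Ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
q_max_true, K_L_true = 50.0, 0.05
qe = q_max_true * K_L_true * Ce / (1 + K_L_true * Ce)

# Linearised Langmuir: Ce/qe = Ce/q_max + 1/(K_L * q_max).
# The slope gives 1/q_max and the intercept gives 1/(K_L * q_max).
slope, intercept = np.polyfit(Ce, Ce / qe, 1)
q_max = 1 / slope
K_L = slope / intercept
print(round(q_max, 1), round(K_L, 3))  # recovers 50.0 and 0.05
```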

Publication Date
Wed Dec 23 2020
Journal Name
2020 International Conference On Advanced Science And Engineering (icoase)
A Comparative Study Using LZW with Wavelet or DCT for Compressing Color Images

Because of the significance of image compression in reducing the volume of data, the need for compression is permanent: compressed data can be transferred more quickly over communication channels and stored in less memory. In this study, an efficient compression system is suggested; it combines transform coding (the Discrete Cosine Transform or the bi-orthogonal (tap-9/7) wavelet transform) with the LZW compression technique. The suggested scheme was applied to color and gray models, with the transform coding applied to decompose each color and gray sub-band individually. A quantization step is performed, followed by LZW coding to compress the images. The suggested system was applied to a set of seven stand…
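The LZW stage can be sketched directly: the encoder grows a dictionary of previously seen substrings and replaces each longest match with a single integer code, which is how repeated patterns in the quantized sub-bands shrink:

```python
def lzw_encode(data):
    """Classic LZW: start with all single bytes in the dictionary,
    then emit each longest known substring as one integer code and
    add the substring-plus-next-byte as a new dictionary entry."""
    table = {bytes([i]): i for i in range(256)}
    current = b""
    codes = []
    for byte in data:
        candidate = current + bytes([byte])
        if candidate in table:
            current = candidate          # keep extending the match
        else:
            codes.append(table[current])  # emit the longest match
            table[candidate] = len(table)
            current = bytes([byte])
    if current:
        codes.append(table[current])
    return codes

codes = lzw_encode(b"ABABABA")
print(codes)  # [65, 66, 256, 258] -- repeats collapse into new codes
```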
