LDPC CODED MULTIUSER MC-CDMA PERFORMANCE OVER MULTIPATH RAYLEIGH FADING CHANNEL

This work presents the simulation of a Low Density Parity Check (LDPC) coding scheme with a multiuser Multi-Carrier Code Division Multiple Access (MC-CDMA) system over Additive White Gaussian Noise (AWGN) and multipath fading channels. The decoding technique used in the simulation was iterative decoding, since it gives maximum efficiency at ten iterations. The modulation schemes used are Phase Shift Keying variants (BPSK, QPSK and 16-PSK), along with Orthogonal Frequency Division Multiplexing (OFDM). Twelve pilot carriers were used in the estimator to compensate for the channel effect. The channel model used is the Long Term Evolution (LTE) channel, Technical Specification TS 25.101 v2.10, with 5 MHz bandwidth, including the indoor-to-outdoor/pedestrian and vehicular channels.
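
As an illustration of the physical-layer building blocks described above, the following is a minimal sketch (assuming uncoded BPSK over AWGN only) that measures bit error rate against Eb/N0; the LDPC coding, MC-CDMA spreading, OFDM, pilot-based estimation and LTE fading channels of the paper are not reproduced.

```python
# Minimal sketch of the BPSK-over-AWGN building block of such a simulation.
import numpy as np

rng = np.random.default_rng(0)
n_bits = 200_000

for ebn0_db in (0, 2, 4, 6, 8):
    bits = rng.integers(0, 2, n_bits)          # random payload bits
    symbols = 1 - 2 * bits                      # BPSK mapping: 0 -> +1, 1 -> -1
    ebn0 = 10 ** (ebn0_db / 10)
    noise_std = np.sqrt(1 / (2 * ebn0))         # Es = Eb = 1 for uncoded BPSK
    received = symbols + noise_std * rng.normal(size=n_bits)
    decided = (received < 0).astype(int)        # hard decision
    ber = np.mean(decided != bits)
    print(f"Eb/N0 = {ebn0_db:2d} dB  ->  BER ~ {ber:.5f}")
```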

Publication Date
Sun Feb 25 2024
Journal Name
Baghdad Science Journal
An Adaptive Harmony Search Part-of-Speech tagger for Square Hmong Corpus

Data-driven models perform poorly on part-of-speech tagging for the square Hmong language, a low-resource corpus. This paper designs a weight evaluation function to reduce the influence of unknown words. It proposes an improved harmony search algorithm utilizing roulette and local evaluation strategies to handle the square Hmong part-of-speech tagging problem. The experiments show that the average accuracy of the proposed model is 6% and 8% higher than that of the HMM and BiLSTM-CRF models, respectively, while its average F1 is 6% and 3% higher than that of the HMM and BiLSTM-CRF models, respectively.
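
For reference, a minimal sketch of classical harmony search on a toy continuous objective is given below; the paper's roulette and local-evaluation improvements and the square Hmong part-of-speech encoding are not reproduced.

```python
# Classical harmony search on a toy objective (sphere function).
import numpy as np

rng = np.random.default_rng(1)

def objective(x):
    return float(np.sum(x ** 2))

dim, hms, hmcr, par, bw, iters = 5, 20, 0.9, 0.3, 0.05, 2000
lo, hi = -5.0, 5.0

memory = rng.uniform(lo, hi, (hms, dim))            # harmony memory
scores = np.array([objective(h) for h in memory])

for _ in range(iters):
    new = np.empty(dim)
    for d in range(dim):
        if rng.random() < hmcr:                      # consider harmony memory
            new[d] = memory[rng.integers(hms), d]
            if rng.random() < par:                   # pitch adjustment
                new[d] += bw * rng.uniform(-1, 1)
        else:                                        # random re-initialisation
            new[d] = rng.uniform(lo, hi)
    new = np.clip(new, lo, hi)
    score = objective(new)
    worst = int(np.argmax(scores))
    if score < scores[worst]:                        # replace the worst harmony
        memory[worst], scores[worst] = new, score

print("best value:", scores.min())
```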

Publication Date
Thu Feb 01 2018
Journal Name
Journal Of Engineering
A Realistic Aggregate Load Representation for A Distribution Substation in Baghdad Network

Electrical distribution system loads are never fixed; they vary in magnitude and nature with time. Therefore, accurate consumer load data and models are required for system planning, operation, and analysis studies. Moreover, realistic consumer load data are vital for load management, service, and billing purposes. In this work, a realistic aggregate electric load model is developed and proposed for a sample operative substation in the Baghdad distribution network. The model involves the aggregation of hundreds of thousands of individual component devices such as motors, appliances, and lighting fixtures. The Sana’a substation in the Al-kadhimiya area supplies mainly residential-grade loads. Measurement-based …
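
As background, the sketch below shows a common polynomial (ZIP) static load representation, one standard way of expressing an aggregate load as a function of voltage; the coefficients are illustrative and are not the measurement-based values of the Sana’a substation study.

```python
# Illustrative polynomial (ZIP) static load representation of an aggregate load.
# The coefficients below are made up, not measurements from the study.
def zip_load(v_pu, p0_mw, q0_mvar,
             zp=0.4, ip=0.3, pp=0.3,      # P shares: constant impedance, current, power
             zq=0.5, iq=0.3, pq=0.2):     # Q shares
    """Active/reactive demand as a function of per-unit voltage."""
    p = p0_mw * (zp * v_pu ** 2 + ip * v_pu + pp)
    q = q0_mvar * (zq * v_pu ** 2 + iq * v_pu + pq)
    return p, q

# Demand of a 40 MW / 18 Mvar aggregate at 0.95 p.u. voltage
print(zip_load(0.95, 40.0, 18.0))
```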

Publication Date
Sun Jul 09 2023
Journal Name
Journal Of Engineering
Applying Cognitive Methodology in Designing On-Line Auto-Tuning Robust PID Controller for the Real Heating System

A novel design and implementation of a cognitive methodology for an on-line auto-tuning robust PID controller for a real heating system is presented in this paper. The aim of the proposed work is to construct a cognitive control methodology that gives the optimal control signal to the heating system and achieves fast and precise search efficiency in finding the on-line optimal PID controller parameters, so as to obtain the optimal output temperature response of the heating system. The cognitive methodology (CM) consists of three engines: a breeding engine based on the Routh-Hurwitz stability criterion, a search engine based on particle swarm optimization (PSO), and an aggregation knowledge engine based on a cultural algorithm (CA) …
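
To illustrate the PSO search-engine idea, the sketch below tunes (Kp, Ki, Kd) by minimizing the integral of squared error of a simulated step response; the first-order plant is a stand-in for the real heating system, and the Routh-Hurwitz breeding engine and cultural-algorithm knowledge engine are omitted.

```python
# PSO particles carry (Kp, Ki, Kd) and are scored by the ISE of a step response.
import numpy as np

rng = np.random.default_rng(2)

def ise(gains, dt=0.05, t_end=20.0, tau=5.0, k=2.0):
    """Integral of squared error for a PID driving a stand-in first-order plant."""
    kp, ki, kd = gains
    y = integ = 0.0
    prev_err = 1.0
    cost = 0.0
    for _ in range(int(t_end / dt)):
        err = 1.0 - y                         # unit step set-point
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv
        y += dt * (-y + k * u) / tau          # Euler step of tau*y' = -y + k*u
        if abs(y) > 1e6:                      # guard against unstable simulations
            return 1e9
        prev_err = err
        cost += err * err * dt
    return cost

n_particles, iters, dim = 20, 60, 3
lo, hi = 0.0, 10.0
pos = rng.uniform(lo, hi, (n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([ise(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    vals = np.array([ise(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("tuned (Kp, Ki, Kd):", np.round(gbest, 3), "ISE:", round(pbest_val.min(), 4))
```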

Publication Date
Fri Sep 30 2022
Journal Name
Journal Of Economics And Administrative Sciences
Estimation of Reliability through the Wiener Degradation Process Based on the Genetic Algorithm to Estimating Parameters

In this paper, the researcher suggests using the genetic algorithm to estimate the parameters of the Wiener degradation process, which is used to estimate the reliability of high-efficiency products, since estimating their reliability with traditional techniques that depend only on product failure times is difficult. Monte Carlo simulation has been applied to demonstrate the efficiency of the proposed method in estimating the parameters, and it was compared with the maximum likelihood estimation method. The results show that the genetic algorithm method is the best according to the AMSE comparison criterion; the reliability …
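
As a simplified stand-in for the proposed procedure, the sketch below simulates Wiener-process degradation increments and recovers the drift and diffusion parameters with a small genetic algorithm that minimizes the negative Gaussian log-likelihood; the GA operators and settings are illustrative, not those of the paper.

```python
# Simulate Wiener-process degradation and estimate (mu, sigma) with a tiny GA.
import numpy as np

rng = np.random.default_rng(3)
mu_true, sigma_true, dt, n_units, n_steps = 0.8, 0.3, 1.0, 30, 20

# degradation increments: dX ~ N(mu*dt, sigma^2*dt)
increments = rng.normal(mu_true * dt, sigma_true * np.sqrt(dt), (n_units, n_steps))

def neg_loglik(params):
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    var = sigma ** 2 * dt
    return float(np.sum(0.5 * np.log(2 * np.pi * var)
                        + (increments - mu * dt) ** 2 / (2 * var)))

pop_size, gens = 40, 80
pop = np.column_stack([rng.uniform(0, 2, pop_size), rng.uniform(0.05, 1, pop_size)])

for _ in range(gens):
    fitness = np.array([neg_loglik(ind) for ind in pop])
    order = np.argsort(fitness)
    parents = pop[order[: pop_size // 2]]                 # truncation selection
    children = []
    while len(children) < pop_size - len(parents):
        a, b = parents[rng.integers(len(parents), size=2)]
        w = rng.random()
        child = w * a + (1 - w) * b                       # arithmetic crossover
        child += rng.normal(0, 0.02, 2)                   # Gaussian mutation
        children.append(child)
    pop = np.vstack([parents, children])

best = pop[np.argmin([neg_loglik(ind) for ind in pop])]
print("true (mu, sigma):", (mu_true, sigma_true), " GA estimate:", np.round(best, 3))
```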

Publication Date
Mon Dec 18 2017
Journal Name
Al-khwarizmi Engineering Journal
Path Planning of an Autonomous Mobile Robot using Enhanced Bacterial Foraging Optimization Algorithm

This paper describes the problem of online autonomous mobile robot path planning, which consists of finding optimal paths or trajectories for an autonomous mobile robot from a starting point to a destination across a flat terrain map represented by a 2-D workspace. An enhanced algorithm for solving the path planning problem using the Bacterial Foraging Optimization algorithm is presented. This nature-inspired metaheuristic algorithm, which imitates the foraging behavior of E. coli bacteria, was used to find the optimal path from a starting point to a target point. The proposed algorithm was demonstrated by simulations in different static and dynamic environments. A comparative study was conducted between the developed algorithm …
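
The sketch below illustrates only the chemotaxis (tumble-and-swim) step of bacterial foraging optimization on a toy 2-D cost (distance to a goal plus an obstacle penalty); reproduction, elimination-dispersal and the paper's enhancements are omitted, and the goal and obstacle positions are made up.

```python
# Chemotaxis-only sketch of bacterial foraging optimisation on a toy 2-D cost.
import numpy as np

rng = np.random.default_rng(4)
goal = np.array([9.0, 9.0])
obstacle, radius = np.array([5.0, 5.0]), 1.5

def cost(p):
    d_goal = np.linalg.norm(p - goal)
    d_obs = np.linalg.norm(p - obstacle)
    penalty = 50.0 if d_obs < radius else 0.0      # hard penalty inside obstacle
    return d_goal + penalty

n_bacteria, chem_steps, swim_len, step = 20, 60, 4, 0.3
pos = rng.uniform(0, 10, (n_bacteria, 2))

for _ in range(chem_steps):
    for i in range(n_bacteria):
        direction = rng.normal(size=2)
        direction /= np.linalg.norm(direction)      # random tumble direction
        last = cost(pos[i])
        for _ in range(swim_len):                   # keep swimming while improving
            candidate = pos[i] + step * direction
            if cost(candidate) < last:
                pos[i], last = candidate, cost(candidate)
            else:
                break

best = pos[np.argmin([cost(p) for p in pos])]
print("best position found:", np.round(best, 2), "cost:", round(cost(best), 3))
```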

Publication Date
Mon May 15 2017
Journal Name
Journal Of Theoretical And Applied Information Technology
Anomaly detection in text data that represented as a graph using dbscan algorithm

Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all the data into a graph concept frame (CFG). As is well known, DBSCAN groups data points of the same kind into clusters, while points that fall outside the clusters' behavior are treated as noise or anomalies; in this way the DBSCAN algorithm can detect abnormal points that lie beyond a certain set threshold (extreme values). However, not all anomalies are cases that are unusual or far from a specific group; some data do not occur repeatedly yet are still considered abnormal with respect to the known group. The analysis showed that DBSCAN using the …
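
A minimal sketch of the underlying idea follows: points that DBSCAN cannot assign to any cluster (label -1) are reported as anomalies. The graph concept frame representation of the text and its distance measure are not reproduced; plain 2-D feature vectors are used instead.

```python
# DBSCAN noise points (label -1) flagged as anomalies on toy 2-D data.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(5)
normal = rng.normal(0, 0.5, (200, 2))               # two dense groups
shifted = rng.normal(4, 0.5, (200, 2))
outliers = rng.uniform(-8, 12, (10, 2))             # sparse anomalous points
X = np.vstack([normal, shifted, outliers])

labels = DBSCAN(eps=0.6, min_samples=5).fit_predict(X)
anomalies = X[labels == -1]
print(f"{len(anomalies)} points flagged as anomalies out of {len(X)}")
```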

Publication Date
Wed May 03 2017
Journal Name
Ibn Al-haitham Journal For Pure And Applied Sciences
Designing Feed Forward Neural Network for Solving Linear Volterra Integro-Differential Equations

The aim of this paper is to design a multilayer Feed Forward Neural Network (FFNN) to find the approximate solution of second-order linear Volterra integro-differential equations with boundary conditions. The design reduces the computation of the solution and is computationally attractive, and its application is demonstrated through illustrative examples.
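
The sketch below shows the structure of a one-hidden-layer feed-forward network used as a function approximator, trained here on a simple sample target; the paper's training against the residual of a Volterra integro-differential equation with boundary conditions is not reproduced.

```python
# One-hidden-layer feed-forward network fitted to a sample target by gradient descent.
import numpy as np

rng = np.random.default_rng(6)
x = np.linspace(0, 1, 50).reshape(-1, 1)
y = np.sin(2 * np.pi * x)                       # sample target function

n_hidden, lr, epochs = 20, 0.1, 20000
W1 = rng.normal(0, 2.0, (1, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 1.0, (n_hidden, 1)); b2 = np.zeros(1)

for _ in range(epochs):
    h = np.tanh(x @ W1 + b1)                    # hidden layer
    out = h @ W2 + b2                           # linear output layer
    err = out - y
    # backpropagation of the mean-squared-error gradient
    gW2 = h.T @ err / len(x);  gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = x.T @ dh / len(x);   gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

print("final MSE:", float(np.mean((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2)))
```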

Publication Date
Tue Jul 09 2024
Journal Name
Diagnostics
A Novel Hybrid Machine Learning-Based System Using Deep Learning Techniques and Meta-Heuristic Algorithms for Various Medical Datatypes Classification

Medicine is one of the fields where the advancement of computer science is making significant progress. Some diseases require an immediate diagnosis in order to improve patient outcomes. The use of computers in medicine improves precision and accelerates data processing and diagnosis. In order to categorize biological images, hybrid machine learning, a combination of various deep learning approaches, was utilized, and a meta-heuristic algorithm was provided in this research. In addition, two different medical datasets were introduced, one covering magnetic resonance imaging (MRI) of brain tumors and the other dealing with chest X-rays (CXRs) of COVID-19. These datasets were introduced to the combination network that contained deep learning …
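
As a simplified stand-in for letting a meta-heuristic choose the classifier configuration, the sketch below runs a random search over a small neural network's hyperparameters on a toy dataset; the paper's deep-learning feature extractors, its specific meta-heuristic and the MRI/CXR datasets are not reproduced.

```python
# Random-search "meta-heuristic" picking hyperparameters for a small classifier.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
import numpy as np

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

rng = np.random.default_rng(7)
best_score, best_cfg = -1.0, None
for _ in range(10):                              # candidate configurations
    cfg = {"hidden_layer_sizes": (int(rng.integers(16, 128)),),
           "alpha": float(10 ** rng.uniform(-5, -1))}
    model = MLPClassifier(max_iter=400, random_state=0, **cfg).fit(X_tr, y_tr)
    score = model.score(X_te, y_te)
    if score > best_score:
        best_score, best_cfg = score, cfg

print("best configuration:", best_cfg, "accuracy:", round(best_score, 3))
```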

Publication Date
Mon Dec 18 2017
Journal Name
Al-khwarizmi Engineering Journal
Optimization and Prediction of Process Parameters in SPIF that Affecting on Surface Quality Using Simulated Annealing Algorithm

Incremental sheet metal forming is a modern sheet metal forming technique in which a uniform sheet is locally deformed by the progressive action of a forming tool whose movement is governed by a CNC milling machine. In this way the tool locally deforms the sheet with pure stretching deformation. In the SPIF process, research concentrates on developing predictive models to estimate product quality. Surface quality in SPIF has been modeled using the simulated annealing algorithm (SAA). In developing this predictive model, spindle speed, feed rate, and step depth have been considered as model parameters, while maximum peak height (Rz) and arithmetic mean surface roughness (Ra) are used as response parameters to assess the …
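
A minimal simulated-annealing sketch is given below, searching spindle speed, feed rate and step depth for minimum predicted roughness; the surrogate used as the objective is a hypothetical placeholder, not the regression model fitted in the paper.

```python
# Simulated annealing over (spindle speed, feed rate, step depth) with a
# hypothetical Ra surrogate -- illustrative only.
import math
import random

random.seed(8)
bounds = {"speed_rpm": (500, 2000), "feed_mm_min": (200, 1000), "step_mm": (0.2, 1.0)}

def roughness(speed, feed, step):
    """Hypothetical Ra surrogate (micrometres)."""
    return (1.2 + 0.4 * (step - 0.2) ** 2 * 10
            + 0.0008 * (feed - 400) ** 2 / 100
            - 0.0003 * (speed - 500))

def random_neighbor(x):
    return {k: min(max(x[k] + random.uniform(-0.05, 0.05) * (hi - lo), lo), hi)
            for k, (lo, hi) in bounds.items()}

current = {k: random.uniform(lo, hi) for k, (lo, hi) in bounds.items()}
cur_val = roughness(current["speed_rpm"], current["feed_mm_min"], current["step_mm"])
best, best_val = dict(current), cur_val
temp = 1.0

for _ in range(5000):
    cand = random_neighbor(current)
    val = roughness(cand["speed_rpm"], cand["feed_mm_min"], cand["step_mm"])
    # accept better moves always, worse moves with Boltzmann probability
    if val < cur_val or random.random() < math.exp((cur_val - val) / temp):
        current, cur_val = cand, val
        if val < best_val:
            best, best_val = dict(cand), val
    temp *= 0.999                                 # geometric cooling schedule

print("best parameters:", {k: round(v, 2) for k, v in best.items()},
      "predicted Ra:", round(best_val, 3))
```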

Publication Date
Sun Feb 25 2024
Journal Name
Baghdad Science Journal
Optimizing Blockchain Consensus: Incorporating Trust Value in the Practical Byzantine Fault Tolerance Algorithm with Boneh-Lynn-Shacham Aggregate Signature

The consensus algorithm is the core mechanism of blockchain and is used to ensure data consistency among blockchain nodes. The PBFT (Practical Byzantine Fault Tolerance) consensus algorithm is widely used in alliance chains because it is resistant to Byzantine errors. However, the present PBFT still suffers from random master-node selection and complicated communication. This study proposes an enhanced consensus technique, IBFT, based on node trust values and the BLS (Boneh-Lynn-Shacham) aggregate signature. In IBFT, multi-level indicators are used to calculate the trust value of each node, and based on this calculation some nodes are selected to take part in network consensus. The master node is chosen …
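
The sketch below illustrates only the node-selection idea: several behaviour indicators are combined into a weighted trust value and the highest-trust nodes form the consensus committee; the indicator names and weights are made up, and BLS aggregate signing (which requires a pairing library) is not implemented.

```python
# Weighted trust values used to pick a consensus committee (illustrative only).
from dataclasses import dataclass

@dataclass
class NodeStats:
    node_id: str
    uptime: float          # fraction of time online, 0..1
    valid_votes: float     # fraction of consensus rounds with valid messages
    latency_score: float   # 1.0 = fastest responder, 0.0 = slowest

WEIGHTS = {"uptime": 0.4, "valid_votes": 0.4, "latency_score": 0.2}   # hypothetical

def trust_value(n: NodeStats) -> float:
    return (WEIGHTS["uptime"] * n.uptime
            + WEIGHTS["valid_votes"] * n.valid_votes
            + WEIGHTS["latency_score"] * n.latency_score)

def select_committee(nodes, committee_size):
    """Highest-trust nodes take part in consensus; the top node acts as master."""
    ranked = sorted(nodes, key=trust_value, reverse=True)
    return ranked[:committee_size]

nodes = [NodeStats("n1", 0.99, 0.97, 0.8), NodeStats("n2", 0.80, 0.90, 0.6),
         NodeStats("n3", 0.95, 0.99, 0.9), NodeStats("n4", 0.60, 0.70, 0.4)]
committee = select_committee(nodes, 3)
print("committee:", [n.node_id for n in committee],
      "master:", committee[0].node_id)
```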
