Gama Platform Survey for Agent-Based Modelling

Agent-based modeling is now used extensively to analyze complex systems. Its growth has been supported by its ability to represent distinct levels of interaction within detailed, complex environments. At the same time, agent-based models tend to become progressively more complex, so powerful modeling and simulation techniques are needed to cope with this rise in complexity. In recent years, a number of platforms for developing agent-based models have appeared. In practice, however, most models offer only a discrete representation of the environment and a single level of interaction, while two or three levels are rarely considered. The key issue is that modellers working in these areas are not supported by the simulation platforms. The GAMA modelling and simulation platform is therefore designed to address this challenge by facilitating the development of spatialized, multi-paradigm and multi-scale models. The platform enables modelers to build explicitly geographical (spatial) and multi-level models, and it notably includes effective data-mining tools and Geographic Information Systems (GIS) support that ease modeling and analysis. This study examines how the platform deals with these concerns and which tools it provides to modelers out of the box, and it shows its capabilities regarding the tight combination of 3D visualization, GIS data management, and multi-level modeling. Finally, several examples of GAMA-based projects built for complex models are presented.
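Since GAMA models are written in the platform's own GAML language, the following is only a generic plain-Python sketch of the pattern the survey contrasts GAMA with: a discrete grid environment with a single level of agent interaction. All names and rules here are illustrative assumptions, not GAMA code.

```python
# Illustrative only: a minimal spatialized agent-based model in plain Python,
# showing the "discrete environment + one interaction level" pattern.
import random

GRID_SIZE = 20          # discrete representation of the environment
STEPS = 50

class Agent:
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.energy = 10

    def step(self, world):
        # one level of interaction: agents only react to grid neighbours
        dx, dy = random.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
        self.x = (self.x + dx) % GRID_SIZE
        self.y = (self.y + dy) % GRID_SIZE
        neighbours = [a for a in world if a is not self
                      and abs(a.x - self.x) <= 1 and abs(a.y - self.y) <= 1]
        self.energy += len(neighbours)   # toy interaction rule

world = [Agent(random.randrange(GRID_SIZE), random.randrange(GRID_SIZE))
         for _ in range(30)]
for _ in range(STEPS):
    for agent in world:
        agent.step(world)
print("mean energy:", sum(a.energy for a in world) / len(world))
```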

View Publication
Publication Date
Tue Aug 10 2021
Journal Name
Design Engineering
Lossy Image Compression Using Hybrid Deep Learning Autoencoder Based On kmean Clustering

Image compression plays an important role in reducing the size and storage requirements of data while significantly increasing the speed of its transmission over the Internet. Image compression has been an important research topic for several decades, and recently, with the great successes achieved by deep learning in many areas of image processing, its use in image compression has been growing steadily. Deep neural networks have also achieved great success in processing and compressing images of various sizes. In this paper, we present a structure for image compression based on a deep Convolutional AutoEncoder (CAE), inspired by the diversity of the human eye
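As a rough illustration of the CAE idea only (not the paper's hybrid CAE + k-mean design, whose details are not given above), a minimal convolutional autoencoder in PyTorch might look like the sketch below; the layer sizes and 28x28 grayscale input are assumptions.

```python
# A minimal convolutional autoencoder sketch in PyTorch (illustrative only).
import torch
import torch.nn as nn

class CAE(nn.Module):
    def __init__(self):
        super().__init__()
        # encoder compresses a 1x28x28 image into a small latent tensor
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1),   # 16x14x14
            nn.ReLU(),
            nn.Conv2d(16, 8, kernel_size=3, stride=2, padding=1),   # 8x7x7
            nn.ReLU(),
        )
        # decoder reconstructs the image from the latent tensor
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(8, 16, kernel_size=3, stride=2,
                               padding=1, output_padding=1),        # 16x14x14
            nn.ReLU(),
            nn.ConvTranspose2d(16, 1, kernel_size=3, stride=2,
                               padding=1, output_padding=1),        # 1x28x28
            nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = CAE()
x = torch.rand(4, 1, 28, 28)           # dummy batch of grayscale images
loss = nn.MSELoss()(model(x), x)       # reconstruction (distortion) loss
loss.backward()
```

The narrow latent tensor is what gets stored or transmitted; the reconstruction loss controls how much distortion the lossy scheme accepts.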

Publication Date
Mon Mar 01 2021
Journal Name
Al-khwarizmi Engineering Journal
Building a High Accuracy Transfer Learning-Based Quality Inspection System at Low Costs

Products’ quality inspection is an important stage in every production route, in which the quality of the produced goods is estimated and compared with the desired specifications. With traditional inspection, the process relies on manual methods that generate various costs and consume a large amount of time. In contrast, today’s inspection systems that use modern techniques such as computer vision are more accurate and efficient. However, the amount of work needed to build a computer vision system based on classic techniques is relatively large, due to the need to manually select and extract features from digital images, which also incurs labor costs for the system engineers.
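A minimal sketch of the transfer-learning approach this points toward, assuming a PyTorch/torchvision (>= 0.13) setup with a pretrained ResNet-18 backbone kept frozen and only a small pass/fail head trained; the backbone choice, image size, and hyperparameters are illustrative, not the paper's.

```python
# Hedged transfer-learning sketch: reuse a pretrained ResNet-18 as a fixed
# feature extractor and train only a two-class (pass / fail) head.
import torch
import torch.nn as nn
from torchvision import models

backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)  # downloads ImageNet weights on first use
for param in backbone.parameters():
    param.requires_grad = False                 # freeze the pretrained features

backbone.fc = nn.Linear(backbone.fc.in_features, 2)   # new head: pass / fail

optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# dummy batch standing in for product images and their labels
images = torch.rand(8, 3, 224, 224)
labels = torch.randint(0, 2, (8,))

logits = backbone(images)
loss = criterion(logits, labels)
loss.backward()
optimizer.step()
```

Because only the small head is trained, no features have to be hand-selected or hand-extracted, which is the engineering cost the abstract argues against.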


View Publication Preview PDF
Scopus (2)
Scopus Crossref
Publication Date
Thu Jun 16 2022
Journal Name
Al-khwarizmi Engineering Journal
Path Planning and Obstacle Avoidance of a Mobile Robot based on GWO Algorithm

Path planning is among the most significant problems in the field of robotics research, as it is linked to finding a safe and efficient route for a wheeled mobile robot in a cluttered environment and is considered a significant prerequisite for any such mobile robot project to succeed. This paper proposes optimal path planning with collision avoidance for a wheeled mobile robot, using the grey wolf optimization (GWO) algorithm as a method for finding the shortest and safest path. The goal of this study is to identify the best path while taking into account the effect of the number of obstacles and of the design parameters on the algorithm's performance. The simulations are run in the MATLAB environment to test the
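A minimal NumPy sketch of the GWO update loop applied to a toy waypoint-based path cost; the obstacle model, bounds, and parameters are assumptions for illustration, and the paper's MATLAB implementation is not reproduced.

```python
# Minimal grey wolf optimization (GWO) sketch for a toy path-planning objective.
import numpy as np

rng = np.random.default_rng(0)
START, GOAL = np.array([0.0, 0.0]), np.array([10.0, 10.0])
OBSTACLES = np.array([[4.0, 4.0], [6.0, 7.0]])    # hypothetical obstacle centres
N_WAY, DIM = 3, 6                                 # 3 free waypoints -> 6 variables

def path_cost(flat):
    pts = np.vstack([START, flat.reshape(N_WAY, 2), GOAL])
    length = np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))
    # penalise waypoints that come too close to any obstacle
    d = np.linalg.norm(pts[:, None, :] - OBSTACLES[None, :, :], axis=2)
    penalty = np.sum(np.maximum(0.0, 1.5 - d)) * 50.0
    return length + penalty

wolves = rng.uniform(0.0, 10.0, size=(30, DIM))
for t in range(200):
    fitness = np.array([path_cost(w) for w in wolves])
    alpha, beta, delta = wolves[np.argsort(fitness)[:3]]   # three leading wolves
    a = 2.0 - 2.0 * t / 200                                # a decreases 2 -> 0
    for i in range(len(wolves)):
        new = np.zeros(DIM)
        for leader in (alpha, beta, delta):
            r1, r2 = rng.random(DIM), rng.random(DIM)
            A, C = 2 * a * r1 - a, 2 * r2
            new += leader - A * np.abs(C * leader - wolves[i])
        wolves[i] = new / 3.0                              # average of the three pulls
    wolves = np.clip(wolves, 0.0, 10.0)

best = wolves[np.argmin([path_cost(w) for w in wolves])]
print("best cost:", round(path_cost(best), 2))
```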

View Publication Preview PDF
Scopus (5)
Crossref (4)
Scopus Crossref
Publication Date
Sun Jan 14 2018
Journal Name
Journal Of Engineering
Optimum Design of Power System Stabilizer based on Improved Ant Colony Optimization Algorithm

This paper presents an improved technique based on the Ant Colony Optimization (ACO) algorithm. The procedure is applied to a Single Machine Infinite Bus (SMIB) system with a power system stabilizer (PSS) at three different loading regimes. The simulations are carried out using MATLAB software. The results show that by using the Improved Ant Colony Optimization (IACO) the system gives better performance with a smaller number of iterations compared with a previous modification of ACO. In addition, the probability of selecting an arc depends on the best ant's performance and the evaporation rate.
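A minimal Python sketch of the two standard ACO ingredients the abstract refers to, pheromone evaporation/deposit driven by the best ant and probabilistic arc selection; the paper's specific IACO modification and its encoding of the PSS gains are not reproduced here.

```python
# Standard ACO mechanisms only (illustrative): roulette-wheel arc selection and
# an elitist pheromone update with evaporation.
import random

def select_arc(pheromone, heuristic, alpha=1.0, beta=2.0):
    """Roulette-wheel choice of the next arc from its pheromone and heuristic values."""
    weights = [(tau ** alpha) * (eta ** beta) for tau, eta in zip(pheromone, heuristic)]
    total = sum(weights)
    r, acc = random.uniform(0.0, total), 0.0
    for idx, w in enumerate(weights):
        acc += w
        if r <= acc:
            return idx
    return len(weights) - 1

def update_pheromone(pheromone, best_arc, best_quality, rho=0.3):
    """Evaporate all arcs, then deposit on the best ant's arc (elitist update)."""
    for i in range(len(pheromone)):
        pheromone[i] *= (1.0 - rho)        # evaporation rate rho
    pheromone[best_arc] += best_quality    # reward the best ant's performance
    return pheromone

pheromone = [1.0, 1.0, 1.0, 1.0]
heuristic = [0.5, 0.9, 0.2, 0.7]
choice = select_arc(pheromone, heuristic)
pheromone = update_pheromone(pheromone, best_arc=choice, best_quality=0.8)
print(choice, [round(p, 2) for p in pheromone])
```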

 

View Publication Preview PDF
Publication Date
Tue Jun 01 2021
Journal Name
Baghdad Science Journal
Synthesis, Characterization and Gas Sensor Application of New Composite Based on MWCNTs:CoPc:Metal Oxide

The synthesis of a new substituted cobalt phthalocyanine (CoPc) was carried out using naphthalene-1,4,5,8-tetracarboxylic acid dianhydride (NDI) as the starting material, employing a dry-process method. A metal oxide (MO) alloy of (60% Ni3O4 - 40% Co3O4) was functionalized with multiwall carbon nanotubes (F-MWCNTs) to produce the (F-MWCNTs/MO) nanocomposite (E2), which was then mixed with CoPc to yield (F-MWCNT/CoPc/MO) (E3). These composites were investigated using different analytical and spectrophotometric methods such as 1H-NMR (0-18 ppm), FTIR spectroscopy in the range of 400-4000 cm⁻¹, powder X-ray diffraction (PXRD, 2θ = 10°-80°), Raman spectroscopy (0-4000 cm⁻¹), and UV-Visib

View Publication Preview PDF
Scopus (16)
Crossref (13)
Scopus Clarivate Crossref
Publication Date
Fri May 04 2018
Journal Name
Wireless Personal Communications
IFRS: An Indexed Face Recognition System Based on Face Recognition and RFID Technologies

View Publication
Scopus (10)
Crossref (8)
Scopus Clarivate Crossref
Publication Date
Wed Aug 28 2024
Journal Name
Mesopotamian Journal Of Cybersecurity
A Novel Anomaly Intrusion Detection Method based on RNA Encoding and ResNet50 Model

Cybersecurity refers to the actions that people and companies take to protect themselves and their information from cyber threats. Different security methods have been proposed for detecting abnormal network behavior, but some effective attacks are still a major concern in the computer community. Many security gaps, such as denial of service, spam, phishing, and other types of attacks, are reported daily, and the number of attacks is growing. Intrusion detection is a security protection method used to automatically detect and report any abnormal traffic that may affect network security, such as internal attacks, external attacks, and mal-operations. This paper proposes an anomaly intrusion detection method based on a
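Only as a hedged sketch of the general pattern (the paper's RNA-encoding step is not described above and is not reproduced): traffic feature vectors rendered as small pseudo-images and classified as normal/attack by a ResNet50 with a two-class head. All shapes and the encoding are assumptions for illustration.

```python
# Hedged sketch: tabular traffic features -> small pseudo-image -> ResNet50
# with a binary (normal / attack) head. Not the paper's RNA-encoding pipeline.
import torch
import torch.nn as nn
from torchvision import models

def features_to_image(vec, side=16):
    """Pad/reshape a 1-D feature vector into a 3 x side x side pseudo-image."""
    flat = torch.zeros(3 * side * side)
    n = min(len(vec), len(flat))
    flat[:n] = torch.as_tensor(vec[:n], dtype=torch.float32)
    return flat.view(3, side, side)

model = models.resnet50(weights=None)            # weights=... would load ImageNet
model.fc = nn.Linear(model.fc.in_features, 2)    # two classes: normal / attack

records = [torch.rand(100).tolist() for _ in range(4)]          # dummy flows
batch = torch.stack([features_to_image(r) for r in records])
logits = model(batch)
print(logits.shape)                              # torch.Size([4, 2])
```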

View Publication
Scopus (7)
Crossref (2)
Scopus Crossref
Publication Date
Wed Nov 01 2023
Journal Name
Journal Of Dentistry
The in-vitro development of novel enzyme-based chemo-mechanical caries removal agents

Objectives: Bromelain is a potent proteolytic enzyme with a unique functionality that makes it valuable for various therapeutic purposes. This study aimed to develop three novel bromelain-based formulations to be used as chemo-mechanical caries removal agents. Methods: The novel agents were prepared using different concentrations of bromelain (10–40 wt.%), with and without 0.1–0.3 wt.% chloramine T or 0.5–1.5 wt.% chlorhexidine (CHX). Based on the enzymatic activity test, three formulations were selected: 30% bromelain (F1), 30% bromelain-0.1% chloramine (F2) and 30% bromelain-1.5% CHX (F3). The assessments included molecular docking, Fourier-transform infrared spectroscopy (FTIR), and viscosity and pH measurements. The efficie

Scopus (9)
Crossref (9)
Scopus Clarivate Crossref
Publication Date
Sun Feb 25 2024
Journal Name
Baghdad Science Journal
An exploratory study of history-based test case prioritization techniques on different datasets

In regression testing, test case prioritization (TCP) is a technique for arranging all the available test cases. TCP techniques can improve fault detection performance, which is measured by the average percentage of fault detection (APFD). History-based TCP is one of the TCP techniques that considers the history of past data to prioritize test cases. The issue of assigning equal priority to test cases is a common problem for most TCP techniques; however, this problem has not been explored in history-based TCP techniques. To solve this problem in regression testing, most researchers resort to random sorting of test cases. This study aims to investigate equal priority in history-based TCP techniques. The first objective is to implement
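For reference, the standard APFD computation mentioned above, shown on a small hypothetical fault matrix: APFD = 1 - (TF1 + ... + TFm)/(n·m) + 1/(2n), where n is the number of test cases, m the number of faults, and TFi the 1-based position of the first test in the order that detects fault i.

```python
# Standard APFD metric on a small hypothetical example (test order + fault matrix).
def apfd(order, detects):
    """order: list of test ids; detects: {test id: set of fault ids it exposes}."""
    faults = set().union(*detects.values())
    n, m = len(order), len(faults)
    first_pos = {}
    for pos, test in enumerate(order, start=1):
        for fault in detects.get(test, set()):
            first_pos.setdefault(fault, pos)     # position of first detection
    tf_sum = sum(first_pos[f] for f in faults)
    return 1 - tf_sum / (n * m) + 1 / (2 * n)

detects = {"t1": {"f1"}, "t2": {"f1", "f2"}, "t3": set(), "t4": {"f3"}}
print(apfd(["t2", "t4", "t1", "t3"], detects))   # prioritized order -> ~0.79
print(apfd(["t3", "t1", "t2", "t4"], detects))   # a worse order      -> ~0.38
```

A higher APFD means faults are exposed earlier in the run, which is why prioritized orders score above the poorly sorted one here.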

View Publication Preview PDF
Scopus (2)
Crossref (1)
Scopus Crossref
Publication Date
Mon Mar 01 2021
Journal Name
Al-khwarizmi Engineering Journal
Building a High Accuracy Transfer Learning-Based Quality Inspection System at Low Costs

Products’ quality inspection is an important stage in every production route, in which the quality of the produced goods is estimated and compared with the desired specifications. With traditional inspection, the process relies on manual methods that generate various costs and consume a large amount of time. In contrast, today’s inspection systems that use modern techniques such as computer vision are more accurate and efficient. However, the amount of work needed to build a computer vision system based on classic techniques is relatively large, due to the need to manually select and extract features from digital images, which also incurs labor costs for the system engineers. In this research, we pr

Preview PDF
Scopus (2)
Scopus Crossref