Building a High Accuracy Transfer Learning-Based Quality Inspection System at Low Costs

Product quality inspection is an important stage in every production route, in which the quality of the produced goods is estimated and compared with the desired specifications. Traditional inspection relies on manual methods that generate various costs and consume a great deal of time. By contrast, today's inspection systems, built on modern techniques such as computer vision, are more accurate and efficient. However, the amount of work needed to build a computer vision system with classic techniques is relatively large, because features must be manually selected and extracted from digital images, which also incurs labor costs for the system engineers.

In this research, we present an approach based on convolutional neural networks to design a quality inspection system with a high level of accuracy at low cost. The system uses transfer learning to reuse layers from a previously trained model, together with a fully connected neural network that classifies the product's condition as healthy or damaged. Helical gears were used as the inspected object, and three cameras with differing resolutions were used to evaluate the system on colored and grayscale images. Experimental results showed high accuracy with colored images and even higher accuracy with grayscale images at every resolution, demonstrating that an inspection system can be built at low cost, with easy construction and automatic extraction of image features.
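As a minimal sketch of the transfer-learning idea described above (not the authors' implementation), the example below stands in for the transferred convolutional layers with a frozen random feature extractor and trains only a small fully connected head to separate synthetic "healthy" from "damaged" samples; all data, shapes, and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the transferred (frozen) convolutional layers: a fixed
# random projection followed by ReLU. In the paper's setting these
# weights would come from a pretrained CNN and stay frozen.
W_frozen = rng.normal(size=(64, 16))

def extract_features(images):
    return np.maximum(images @ W_frozen, 0.0)  # frozen, never updated

# Synthetic "healthy" (label 0) vs "damaged" (label 1) image vectors.
healthy = rng.normal(loc=0.0, size=(200, 64))
damaged = rng.normal(loc=1.0, size=(200, 64))
X = extract_features(np.vstack([healthy, damaged]))
y = np.array([0] * 200 + [1] * 200)

# Trainable fully connected head: logistic regression on frozen features.
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(500):
    z = np.clip(X @ w + b, -30, 30)
    p = 1.0 / (1.0 + np.exp(-z))               # sigmoid
    grad = p - y                                # cross-entropy gradient
    w -= 0.01 * X.T @ grad / len(y)
    b -= 0.01 * grad.mean()

pred = (np.clip(X @ w + b, -30, 30) > 0).astype(int)
accuracy = (pred == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

Only `w` and `b` are updated during training; `W_frozen` plays the role of the transferred layers, which is what keeps the labeling and training cost low.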

Publication Date: July 1, 2019
Journal Name: International Journal of Swarm Intelligence Research
A New Strategy Based on GSABAT to Solve Single Objective Optimization Problem

This article proposes a new strategy based on a hybrid method that combines the gravitational search algorithm (GSA) with the bat algorithm (BAT) to solve single-objective optimization problems. It first runs GSA, then runs BAT as the second step. The proposed approach relies on a parameter between 0 and 1 to address the problem of falling into local optima: because the lack of a local search mechanism increases search intensity while diversity remains high, the search easily falls into a local optimum. The improvement matches the speed of the original BAT while reaching the best solution faster, and all solutions in the population are updated before the proposed algorithm terminates. The diversification f…
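The two-phase pipeline the abstract describes (GSA first, then BAT seeded with GSA's population) can be sketched on a toy objective. The simplified update rules below follow the standard GSA and BAT formulations; every parameter value is an illustrative assumption rather than the authors' tuning, and the fixed 0.5 switch probability stands in for the paper's tunable parameter between 0 and 1.

```python
import numpy as np

rng = np.random.default_rng(1)

def sphere(x):                        # toy single-objective function
    return float(np.sum(x * x))

dim, pop, lo, hi = 5, 20, -5.0, 5.0
X = rng.uniform(lo, hi, (pop, dim))   # shared population
V = np.zeros((pop, dim))

# --- Phase 1: simplified gravitational search algorithm (GSA) ---
G0, T1 = 100.0, 60
for t in range(T1):
    fit = np.array([sphere(x) for x in X])
    m = (fit.max() - fit) / (fit.max() - fit.min() + 1e-12)  # raw masses
    M = m / (m.sum() + 1e-12)                                # normalized
    G = G0 * np.exp(-20.0 * t / T1)                          # decaying constant
    acc = np.zeros_like(X)
    for i in range(pop):
        for j in range(pop):
            if i != j:
                d = np.linalg.norm(X[j] - X[i]) + 1e-12
                acc[i] += rng.random() * G * M[j] * (X[j] - X[i]) / d
    V = rng.random((pop, dim)) * V + acc
    X = np.clip(X + V, lo, hi)

# --- Phase 2: simplified bat algorithm (BAT), seeded with GSA's result ---
fmin, fmax, loud, T2 = 0.0, 2.0, 0.9, 60
fit = np.array([sphere(x) for x in X])
best_x = X[fit.argmin()].copy()
V = np.zeros((pop, dim))
for t in range(T2):
    for i in range(pop):
        freq = fmin + (fmax - fmin) * rng.random()
        V[i] += (X[i] - best_x) * freq
        cand = np.clip(X[i] + V[i], lo, hi)
        if rng.random() > 0.5:                   # local walk around best
            cand = np.clip(best_x + 0.01 * rng.normal(size=dim), lo, hi)
        if sphere(cand) <= fit[i] and rng.random() < loud:
            X[i], fit[i] = cand, sphere(cand)
        if fit[i] < sphere(best_x):
            best_x = X[i].copy()

best_f = sphere(best_x)
print(f"best objective after GSA+BAT: {best_f:.6f}")
```

GSA's gravity-driven moves supply diversification early; BAT's small walks around the incumbent best supply the intensification GSA lacks.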

Publication Date: March 24, 2022
Journal Name: Arab World English Journal
Collocation Networks of Selected Words in Academic Writing: A Corpus-Based Study

This study aims to shed light on the linguistic significance of collocation networks in the academic writing context. Following Firth's principle that "you shall know a word by the company it keeps," the study examines the shared collocations of three selected nodes (research, study, and paper) in an academic context. This is achieved using the corpus-linguistic tool GraphColl in #LancsBox software version 5, announced in June 2020, to analyze the selected nodes. The study focuses on the academic writing of two corpora that were designed and collected especially to serve its purpose. The corpora consist of a collection of abstracts extracted from two different academic journals that publish for writ…
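The core operation behind a collocation network can be illustrated with a toy corpus: collect the words co-occurring with each node inside a fixed window, then intersect the three networks to find shared collocates. The four-line corpus, the window size, and the plain-frequency measure below are simplifications of what GraphColl computes.

```python
from collections import defaultdict

# Toy stand-in for the abstracts corpora; the real study used abstracts
# from two academic journals and GraphColl's statistical measures.
corpus = """
this research presents a corpus based study of academic writing
the study examines collocation patterns in academic abstracts
this paper reports a corpus based analysis of academic texts
the research aims to describe academic writing in abstracts
""".split()

def collocates(tokens, node, window=3):
    """Words co-occurring with `node` within +/- `window` tokens."""
    out = defaultdict(int)
    for i, tok in enumerate(tokens):
        if tok == node:
            for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
                if j != i:
                    out[tokens[j]] += 1
    return out

nodes = ["research", "study", "paper"]
nets = {n: collocates(corpus, n) for n in nodes}

# Shared collocations: words linked to all three nodes, i.e. the
# overlap of the three collocation networks.
shared = set(nets["research"]) & set(nets["study"]) & set(nets["paper"])
print(sorted(shared))  # -> ['a', 'academic', 'corpus']
```

In GraphColl the edges would additionally carry an association score (e.g. MI or log-likelihood) rather than raw counts.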

Publication Date: January 8, 2019
Journal Name: Iraqi Journal of Physics
Refractive index sensor based on a solid-core photonic crystal fiber interferometer

Photonic crystal fiber interferometers are widely used for sensing applications. In this work, a solid-core photonic crystal fiber based on a Mach-Zehnder modal interferometer for sensing refractive index is presented. The sensor was constructed by splicing short lengths of PCF on both sides to conventional single-mode fiber (SMF-28). To realize the modal interferometer, a collapsing technique based on fusion splicing was used to excite the LP01 and LP11 modes. A laser diode (1550 nm) was used as the pump light source, and a highly sensitive optical spectrum analyzer (OSA) was used to monitor and record the transmitted spectrum. The experimental work shows that the interference spectrum of the photonic crystal fiber interferometer…
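The two-mode interference underlying such a sensor can be sketched numerically with the standard Mach-Zehnder transmission model I(λ) = I1 + I2 + 2√(I1·I2)·cos(2π·Δn_eff·L/λ). The length, effective-index difference, and the assumption that ambient refractive index perturbs Δn_eff linearly are all illustrative, not the paper's measured values.

```python
import numpy as np

# Two-beam interference between the LP01 and LP11 core modes:
#   I(lam) = I1 + I2 + 2*sqrt(I1*I2) * cos(2*pi*dn_eff*L / lam)
L = 0.02                                    # PCF section length, m (assumed)
lam = np.linspace(1.54e-6, 1.57e-6, 3001)   # scan window near 1550 nm

def transmission(dn_eff, I1=1.0, I2=0.8):
    phase = 2 * np.pi * dn_eff * L / lam
    return I1 + I2 + 2 * np.sqrt(I1 * I2) * np.cos(phase)

def dip_wavelength(dn_eff):
    """Wavelength of the interference minimum inside the scan window."""
    return lam[np.argmin(transmission(dn_eff))]

# Toy sensing model: the ambient refractive index slightly shifts dn_eff,
# which moves the interference dip that the OSA would record.
dip_a = dip_wavelength(4.00e-3)
dip_b = dip_wavelength(4.02e-3)
print(f"dip moves from {dip_a*1e9:.2f} nm to {dip_b*1e9:.2f} nm")
```

Tracking the dip wavelength against the surrounding refractive index is what turns the interferometer into a sensor.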

Publication Date: January 2, 2023
Journal Name: International Journal of Nonlinear Analysis and Applications
Diagnostic COVID-19 based on chest imaging of COVID-19: A survey

Publication Date: December 30, 2020
Journal Name: Al-Kindy College Medical Journal
A Population-Based Study on Agreement between Actual and Perceived Body Image

Background: Obesity is increasingly common in modern societies and constitutes a significant public health problem, carrying an increased risk of cardiovascular disease.

Objective: This study aims to determine the agreement between actual and perceived body image in the general population.

Methods: A descriptive cross-sectional study was conducted with a sample size of 300. Data were collected from eight major populated areas of the Northern district of Karachi, Sindh, over a period of six months (10 January 2020 to 21 June 2020). The Figure Rating Scale (FRS) questionnaire was used to collect demographic data and perceptions about body weight. Body mass index (BMI) was used for ass…
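The kind of agreement analysis the methods describe can be illustrated by categorizing measured BMI, comparing it with the self-perceived category, and computing Cohen's kappa. The BMI cut-offs are the standard WHO ones; the participants, labels, and resulting kappa below are entirely made up, not study data.

```python
# Illustrative sketch: actual BMI category (from measured height/weight)
# vs. self-perceived category, with chance-corrected agreement.

def bmi_category(weight_kg, height_m):
    bmi = weight_kg / height_m ** 2        # BMI definition
    if bmi < 18.5:
        return "underweight"
    if bmi < 25.0:
        return "normal"
    if bmi < 30.0:
        return "overweight"
    return "obese"

def cohens_kappa(a, b):
    """Chance-corrected agreement between two label sequences."""
    cats = sorted(set(a) | set(b))
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n                    # observed
    pe = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)   # chance
    return (po - pe) / (1 - pe)

# Hypothetical participants: (weight kg, height m, perceived category).
sample = [
    (55, 1.70, "normal"),
    (82, 1.68, "normal"),        # under-perceives: actually overweight
    (95, 1.80, "overweight"),
    (45, 1.60, "underweight"),
    (70, 1.72, "normal"),
    (88, 1.65, "obese"),
]
actual = [bmi_category(w, h) for w, h, _ in sample]
perceived = [p for _, _, p in sample]
kappa = cohens_kappa(actual, perceived)
print(f"kappa = {kappa:.2f}")  # -> kappa = 0.77
```

A kappa near 1 would indicate that perceived body image closely tracks measured BMI; values well below 1 quantify the misperception the study looks for.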

Publication Date: January 27, 2020
Journal Name: Iraqi Journal of Science
A Framework of APT Detection Based on Packets Analysis and Host Destination

So far, APTs (advanced persistent threats) have been a constant concern for information security. Many approaches have been used to detect APT attacks, such as change control, sandboxing, and network traffic analysis; however, 100% success has not been achieved. Current studies have illustrated that APTs adopt many complex techniques to evade all types of detection. This paper describes and analyzes the APT problem by examining the most common techniques, tools, and pathways used by attackers. In addition, it highlights the weaknesses and strengths of the existing security solutions used from the threat's identification in 2006 until 2019. Furthermore, this research proposes a new framework that can be u…
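One concrete form of the "packets analysis and host destination" idea is screening outbound flows for unknown destinations contacted at suspiciously regular intervals, a beaconing pattern typical of APT command-and-control traffic. The sketch below is a toy of that idea only; the whitelist, flow records, and jitter threshold are hypothetical, not the paper's framework.

```python
from statistics import pstdev

# Toy destination-based traffic screening: flows to unknown destinations
# with near-constant inter-packet intervals resemble C2 beaconing.
KNOWN_DESTINATIONS = {"10.0.0.5", "10.0.0.9"}    # hypothetical whitelist

flows = {
    # dst_ip: timestamps (s) of outbound packets from one host
    "10.0.0.5":     [1.0, 4.2, 9.8, 17.5],       # known host, irregular
    "203.0.113.44": [10.0, 70.0, 130.1, 190.0],  # unknown, ~60 s beacon
    "198.51.100.7": [3.0, 3.4, 55.0],            # unknown, irregular
}

def is_suspicious(dst, times, jitter_s=1.0):
    if dst in KNOWN_DESTINATIONS or len(times) < 3:
        return False
    gaps = [b - a for a, b in zip(times, times[1:])]
    return pstdev(gaps) < jitter_s               # near-constant interval

alerts = [dst for dst, ts in flows.items() if is_suspicious(dst, ts)]
print(alerts)  # -> ['203.0.113.44']
```

A real framework would combine many such behavioural signals, since APTs deliberately randomize timing to defeat any single heuristic.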

Publication Date: February 1, 2018
Journal Name: IET Signal Processing
Signal compression and enhancement using a new orthogonal‐polynomial‐based discrete transform

Publication Date: December 1, 2009
Journal Name: Journal of Lightwave Technology
A Random Number Generator Based on Single-Photon Avalanche Photodiode Dark Counts

Publication Date: October 1, 2020
Journal Name: Engineering Science and Technology, an International Journal
Thermal performance improvement based on the hybrid design of a heat sink

Publication Date: July 31, 2021
Journal Name: Iraqi Journal of Science
A Parallel Clustering Analysis Based on Hadoop Multi-Node and Apache Mahout

The conventional procedures of clustering algorithms are incapable of managing and analyzing the rapid growth of data generated from different sources. Parallel clustering is one of the robust solutions to this problem. The Apache Hadoop architecture is one of the ecosystems that provides the capability to store and process data in a distributed and parallel fashion. In this paper, a parallel model is designed to run the k-means clustering algorithm in the Apache Hadoop ecosystem by connecting three nodes: one as the server (name) node and the other two as client (data) nodes. The aim is to speed up the time of managing the massive sc…
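The map/reduce split that such a deployment runs across Hadoop nodes can be mimicked on a single machine: each "data node" assigns its partition of points to the nearest centroid and emits per-centroid partial sums (map), and the "name node" merges them into new centroids (reduce). Everything below, from the synthetic data to the two-partition layout, is an illustrative stand-in for the three-node cluster, not the paper's Mahout configuration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two synthetic clusters standing in for the distributed dataset.
data = np.vstack([
    rng.normal((-5, -5), 0.5, (150, 2)),
    rng.normal((5, 5), 0.5, (150, 2)),
])
rng.shuffle(data)
partitions = np.array_split(data, 2)       # one chunk per "data node"

def map_assign(chunk, centroids):
    """Map step on one data node: per-centroid partial sums and counts."""
    dists = ((chunk[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    labels = np.argmin(dists, axis=1)
    k = len(centroids)
    sums = np.array([chunk[labels == c].sum(0) for c in range(k)])
    counts = np.array([(labels == c).sum() for c in range(k)])
    return sums, counts

centroids = np.array([[-1.0, -1.0], [1.0, 1.0]])  # simple initialisation
for _ in range(10):
    partials = [map_assign(chunk, centroids) for chunk in partitions]
    sums = sum(p[0] for p in partials)     # reduce step on the name node
    counts = sum(p[1] for p in partials)
    centroids = sums / np.maximum(counts, 1)[:, None]

print(np.round(centroids, 1))
```

Because each map step only ships k partial sums rather than raw points, adding data nodes scales the assignment work while the reduce step stays cheap, which is the source of the speed-up the paper measures.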
