Development of an ANN Model for RGB Color Classification using the Dataset Extracted from a Fabricated Colorimeter

 

Codes of red, green, and blue (RGB) data extracted from a lab-fabricated colorimeter device were used to build a proposed classifier with the objective of classifying the colors of objects into defined categories of fundamental colors. Primary, secondary, and tertiary colors, namely red, green, orange, yellow, pink, purple, blue, brown, grey, white, and black, were employed in machine learning (ML) by applying an artificial neural network (ANN) algorithm implemented in Python. The classifier, which was based on the ANN algorithm, required a definition of the eleven colors mentioned above in the form of RGB codes in order to acquire the capability of classification. One outcome of the proposed classifier is the software's capacity to predict the color class of the RGB code belonging to an object under detection. The work required the collection of about 5000 color codes, which were then used by the algorithms for training and testing. The open-source ML platform TensorFlow and the open-source neural network library Keras were used to construct the algorithm for the study. The results showed an acceptable efficiency of the built classifier, represented by an accuracy of 90%, which can be considered applicable, especially after future improvements to make it more effective as a trusted colorimeter.
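The abstract does not report the exact network architecture or training hyperparameters, so the following is only a minimal Keras sketch of the kind of classifier described: a small dense network mapping an RGB triple onto the eleven color classes. The layer sizes, optimizer settings, and the placeholder data are illustrative assumptions, not the configuration used in the paper.

```python
# Minimal sketch of an RGB-to-color-class ANN in Keras (TensorFlow backend).
# Layer sizes, optimizer, and the synthetic data are illustrative assumptions.
import numpy as np
import tensorflow as tf

CLASSES = ["red", "green", "orange", "yellow", "pink", "purple",
           "blue", "brown", "grey", "white", "black"]

# Placeholder dataset: random RGB codes with random labels, standing in for
# the roughly 5000 labelled codes collected from the colorimeter.
X = np.random.randint(0, 256, size=(5000, 3)).astype("float32") / 255.0
y = np.random.randint(0, len(CLASSES), size=(5000,))

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(3,)),           # one RGB triple per sample
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(len(CLASSES), activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Hold out 20% of the codes for testing, mirroring the train/test split
# mentioned in the abstract.
model.fit(X, y, epochs=10, validation_split=0.2, verbose=0)

# Predict the color class of a new RGB code from the colorimeter.
sample = np.array([[200, 30, 40]], dtype="float32") / 255.0
print(CLASSES[int(np.argmax(model.predict(sample, verbose=0)))])
```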

 

Publication Date
Tue Dec 05 2023
Journal Name
Baghdad Science Journal
Robust Color Image Encryption Scheme Based on RSA via DCT by Using an Advanced Logic Design Approach

Information security in data storage and transmission is increasingly important. On the other hand, images are used in many procedures, so preventing unauthorized access to image data is crucial, and images are encrypted to protect sensitive data or privacy. The methods and algorithms for masking or encoding images range from simple spatial-domain methods to frequency-domain methods, which are the most complex and reliable. In this paper, a new cryptographic system is proposed, based on a random-key-generator hybridization methodology that takes advantage of the properties of the Discrete Cosine Transform (DCT) to generate an indefinite set of random keys, and that takes advantage of the low-frequency region coefficients after the DCT stage to pass them to

... Show More
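The abstract is cut off before the key-generation step is fully described, so the sketch below is only a rough illustration of the general idea under stated assumptions: the low-frequency coefficients of a 2-D DCT are hashed into seed material for a key stream. The block size, the SHA-256 hashing step, and the key length are assumptions made here, not the scheme's actual design.

```python
# Rough sketch: derive key material from the low-frequency DCT coefficients
# of an image. Hashing those coefficients into a pseudo-random key stream is
# an assumption made for illustration, not the paper's actual scheme.
import hashlib
import numpy as np
from scipy.fft import dctn

def low_freq_key(image: np.ndarray, block: int = 8, n_bytes: int = 32) -> bytes:
    """Hash the top-left (low-frequency) block of the 2-D DCT into key bytes."""
    coeffs = dctn(image.astype(np.float64), norm="ortho")
    low = coeffs[:block, :block]                  # low-frequency region
    digest = hashlib.sha256(low.round(3).tobytes()).digest()
    # Expand the digest into the requested number of key bytes.
    stream, counter = b"", 0
    while len(stream) < n_bytes:
        stream += hashlib.sha256(digest + counter.to_bytes(4, "big")).digest()
        counter += 1
    return stream[:n_bytes]

# Example: a random grayscale "image" standing in for real pixel data.
img = np.random.randint(0, 256, size=(64, 64))
print(low_freq_key(img).hex())
```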
Publication Date
Thu Oct 01 2020
Journal Name
Journal Of Legal Sciences
Judicial tools in the development of civil law rules (France as a model)

Despite the principle of separation of powers brought by the French Revolution, which entrusted the task of drafting and amending legislation to the legislative authority and the task of settling disputes to the judiciary, the French judiciary has since played a major role in the development of French civil law, in spite of all the economic and social developments that have taken place in French society throughout these years, from its promulgation until February 2016, the date of Legislative Decree No. 131 of 2016, an amendment that is the largest in the history of the French Civil Code (one in which judicial precedents had a significant impact) and that was assisted by the French judic

... Show More
Publication Date
Fri Mar 01 2024
Journal Name
Baghdad Science Journal
Deep Learning Techniques in the Cancer-Related Medical Domain: A Transfer Deep Learning Ensemble Model for Lung Cancer Prediction

Problem: Cancer is regarded as one of the world's deadliest diseases. Machine learning and its newer branch, deep learning, can facilitate the way cancer is dealt with, especially in the field of cancer prevention and detection. Traditional ways of analyzing cancer data have their limits, and cancer data is growing quickly, which makes it possible for deep learning to move forward with its powerful abilities to analyze and process cancer data. Aims: In the current study, a deep-learning medical support system for the prediction of lung cancer is presented. Methods: The study uses three different deep learning models (EfficientNetB3, ResNet50, and ResNet101) with the transfer learning concept. The three models are trained using a

... Show More
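The abstract is truncated before the training details are given, so the following is only a hedged Keras sketch of transfer learning with the three backbones named above (EfficientNetB3, ResNet50, and ResNet101). The input size, the frozen ImageNet-pretrained feature extractors, the two-class head, and the probability-averaging ensemble are illustrative assumptions rather than the paper's reported setup.

```python
# Sketch of transfer learning with the three backbones named in the abstract.
# Input size, frozen backbones, the binary head, and the averaging ensemble
# are assumptions for illustration; per-backbone preprocessing is omitted.
import tensorflow as tf

def build_branch(backbone_fn, input_shape=(224, 224, 3), n_classes=2):
    base = backbone_fn(include_top=False, weights="imagenet",
                       input_shape=input_shape, pooling="avg")
    base.trainable = False                       # transfer learning: freeze features
    inputs = tf.keras.Input(shape=input_shape)
    x = base(inputs, training=False)
    outputs = tf.keras.layers.Dense(n_classes, activation="softmax")(x)
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

branches = [build_branch(fn) for fn in (tf.keras.applications.EfficientNetB3,
                                        tf.keras.applications.ResNet50,
                                        tf.keras.applications.ResNet101)]

# After each branch is trained on the lung-image data, one simple way to form
# an ensemble prediction is to average their class probabilities.
def ensemble_predict(models, images):
    probs = [m.predict(images, verbose=0) for m in models]
    return sum(probs) / len(probs)
```

Averaging softmax outputs is only one simple ensembling rule; weighted voting or stacking would be equally plausible readings of the "ensemble model" named in the title.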
Publication Date
Fri Oct 30 2020
Journal Name
Journal Of Economics And Administrative Sciences
Institutional Performance Assessment using a model of the European Foundation for Quality Management (EFQM): A case study at an organization

The study aims to use the European Excellence Model (EFQM) in assessing the institutional performance of the National Center for Administrative Development and Information Technology, in order to determine the gap between the actual performance of the Center and the standards adopted in the model, and to know the extent to which the Center seeks to achieve excellence in performance, improve the level of services provided, and adopt modern and contemporary management methods in the evaluation of its institutional performance.

The problem of the study was the absence of an institutional performance evaluation system at the Center whereby weaknesses (areas of improvement) and st

... Show More
Publication Date
Wed Mar 01 2017
Journal Name
2017 Annual Conference On New Trends In Information & Communications Technology Applications (ntict)
An efficient color quantization using color histogram

Publication Date
Sat Oct 28 2023
Journal Name
Baghdad Science Journal
A Comparative Study on Association Rule Mining Algorithms on the Hospital Infection Control Dataset

Administrative procedures in various organizations produce numerous crucial records and data. These records and data are also used in other processes like customer relationship management and accounting operations. It is incredibly challenging to use and extract valuable and meaningful information from these data and records because they are frequently enormous and continuously growing in size and complexity. Data mining is the act of sorting through large data sets to find patterns and relationships that might aid in the data analysis process of resolving business issues. Using data mining techniques, enterprises can forecast future trends and make better business decisions. The Apriori algorithm has bee

... Show More
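The abstract breaks off while introducing the Apriori algorithm, so purely as background, here is a compact pure-Python sketch of Apriori-style frequent-itemset mining and rule generation on a toy transaction list; the support and confidence thresholds and the example transactions are arbitrary and not taken from the hospital infection control dataset.

```python
# Compact, illustrative Apriori-style sketch on toy transactions.
# Thresholds and data are arbitrary; this is background, not the paper's code.
from itertools import combinations

transactions = [
    {"gloves", "mask", "sanitizer"},
    {"gloves", "mask"},
    {"mask", "sanitizer"},
    {"gloves", "mask", "sanitizer"},
    {"gloves", "sanitizer"},
]
MIN_SUPPORT, MIN_CONFIDENCE = 0.4, 0.7

def support(itemset):
    return sum(itemset <= t for t in transactions) / len(transactions)

# Level-wise search: extend frequent itemsets by one item at a time.
items = {frozenset([i]) for t in transactions for i in t}
frequent, level = {}, {s for s in items if support(s) >= MIN_SUPPORT}
while level:
    frequent.update({s: support(s) for s in level})
    level = {a | b for a in level for b in items
             if len(a | b) == len(a) + 1 and support(a | b) >= MIN_SUPPORT}

# Rules A -> B with confidence = support(A union B) / support(A).
for itemset, sup in frequent.items():
    for r in range(1, len(itemset)):
        for antecedent in map(frozenset, combinations(itemset, r)):
            conf = sup / frequent[antecedent]
            if conf >= MIN_CONFIDENCE:
                print(set(antecedent), "->", set(itemset - antecedent),
                      f"support={sup:.2f} confidence={conf:.2f}")
```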
Publication Date
Fri Apr 12 2019
Journal Name
Journal Of Economics And Administrative Sciences
Importance of Banking Merger To Promote Iraqi Banks Faltering and Slow Using The Logistic Regression Model

Abstract

The research examined the importance of banking mergers in addressing the situation of troubled banks in Iraq, through the use of a logistic regression model. The study attempted to present a conceptual aspect of banking mergers and logistic regression, as well as an applied aspect that includes a sample consisting of six private Iraqi banks. The hypothesis of the study is that promoting mergers among banks has positive impacts on improving the performance efficiency of troubled banks, which contributes to an increase in banking services, a rise in their financial indicators, and the high liquidity and profits of the new banking entity, as it is a way to overcome the prevailing banking crises.

... Show More
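The abstract names the logistic regression model as the analytical tool but does not list the financial indicators used, so the following scikit-learn sketch is purely illustrative: a logistic regression classifying banks as troubled or sound from hypothetical indicators on synthetic data, not the paper's actual sample of six Iraqi banks.

```python
# Illustrative only: logistic regression classifying a bank as troubled (1)
# or sound (0) from a few financial indicators. The indicator names and the
# random data are hypothetical, not the paper's sample.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Hypothetical indicators: capital adequacy, liquidity ratio, return on assets.
X = rng.normal(size=(60, 3))
y = (X[:, 0] + 0.5 * X[:, 1] - X[:, 2] + rng.normal(scale=0.5, size=60) < 0).astype(int)

model = LogisticRegression()
model.fit(X, y)

# Predicted probability that a new bank with given indicators is troubled.
new_bank = np.array([[-0.8, -0.3, 0.4]])
print(model.predict_proba(new_bank)[0, 1])
```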
Publication Date
Tue Mar 01 2011
Journal Name
Journal Of Economics And Administrative Sciences
Building a mathematical model for measuring and analyzing the general equilibrium in the Iraqi economy through the IS-LM-BP model

Overall balance in the economy requires equilibrium to be achieved in the different markets at the same time (the commodity market, the money market, the labor market, the balance of payments, and the public budget). No model has yet been provided from which the overall balance in the economy can be determined, given the difficulty of finding the interrelationships between all these markets and putting them into an applied form that allows the balance in all markets to be identified at once.

One of the best models that has dealt with this subject is the IS-LM-BP model, which studies the balance in the commodity market, the money market, and the balance of payments. Given the importance of this issue, this research tries to shed light on the reality

... Show More
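The abstract refers to the IS-LM-BP framework without stating its exact specification, so for orientation only, the standard textbook form of the three schedules is reproduced below; the functional forms are the generic ones, not necessarily those built into the paper's model for the Iraqi economy.

```latex
% Generic textbook IS-LM-BP schedules (not the paper's exact specification)
\begin{aligned}
\text{IS:}\quad & Y = C(Y - T) + I(r) + G + NX(e)   && \text{commodity-market equilibrium}\\
\text{LM:}\quad & \frac{M}{P} = L(r,\, Y)            && \text{money-market equilibrium}\\
\text{BP:}\quad & NX(e) + CF(r - r^{*}) = 0          && \text{balance-of-payments equilibrium}
\end{aligned}
```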
Publication Date
Sun Dec 07 2008
Journal Name
Baghdad Science Journal
Optimal Color Model for Information Hiding in Color Images

In the present work, effort has been put into finding the most suitable color model for the application of information hiding in color images. We test the most commonly used color models: RGB, YIQ, YUV, YCbCr1, and YCbCr2. The same procedures of embedding, detection, and evaluation were applied to find which color model is most appropriate for information hiding. What is new in this work is that we take into consideration the errors generated during transformations among color models. The results show that the YUV and YIQ color models are the best for information hiding in color images.
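The abstract's distinctive point is accounting for the errors generated when converting between color models, so as a hedged illustration only, the sketch below measures the round-trip error of converting 8-bit RGB pixels to YIQ and back with Python's standard colorsys module; the quantization step and the mean-absolute-error metric are assumptions made here, not the paper's evaluation procedure.

```python
# Illustrative measurement of the round-trip error from converting 8-bit RGB
# pixels to YIQ and back using Python's standard colorsys module. The rounding
# step and the error metric are assumptions, not the paper's exact measure.
import colorsys
import numpy as np

rng = np.random.default_rng(1)
pixels = rng.integers(0, 256, size=(1000, 3))

errors = []
for r, g, b in pixels:
    y, i, q = colorsys.rgb_to_yiq(r / 255.0, g / 255.0, b / 255.0)
    r2, g2, b2 = colorsys.yiq_to_rgb(y, i, q)
    # Quantize back to 8-bit values, as an embedding scheme would store them.
    restored = np.rint(np.array([r2, g2, b2]) * 255.0)
    errors.append(np.abs(restored - np.array([r, g, b], dtype=float)).mean())

print("mean absolute round-trip error (8-bit levels):", np.mean(errors))
```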

Publication Date
Tue Dec 01 2020
Journal Name
Baghdad Science Journal
A Modified Support Vector Machine Classifiers Using Stochastic Gradient Descent with Application to Leukemia Cancer Type Dataset

Support vector machines (SVMs) are supervised learning models that analyze data for classification or regression. For classification, SVM is widely used by selecting an optimal hyperplane that separates two classes. SVM has very good accuracy and is extremely robust compared with some other classification methods such as logistic regression, random forest, k-nearest neighbors, and the naïve model. However, working with large datasets can cause many problems, such as long computation times and inefficient results. In this paper, the SVM has been modified by using a stochastic gradient descent process. The modified method, stochastic gradient descent SVM (SGD-SVM), was checked by using two simulation datasets. Since the classification of different ca

... Show More
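The abstract does not spell out how the SVM objective is optimized stochastically, so the sketch below shows one common realization of the same idea, which is not necessarily the authors' modification: scikit-learn's SGDClassifier with hinge loss, i.e. a linear SVM fitted by stochastic gradient descent, evaluated on synthetic two-class data standing in for the simulation and leukemia datasets.

```python
# Illustrative SVM trained with stochastic gradient descent: SGDClassifier
# with hinge loss is a linear SVM fitted by SGD. The synthetic data stands in
# for the paper's datasets; this is not the authors' exact modification.
import numpy as np
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import train_test_split
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=2000, n_features=50, n_informative=10,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)

svm_sgd = SGDClassifier(loss="hinge", alpha=1e-4, max_iter=1000, tol=1e-3,
                        random_state=0)
svm_sgd.fit(X_train, y_train)
print("test accuracy:", svm_sgd.score(X_test, y_test))
```

The hinge loss is what makes this a linear SVM; the stochastic updates are what let it scale to large datasets, which is the motivation given in the abstract.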