Red, green, and blue (RGB) codes extracted from a lab-fabricated colorimeter device were used to build a classifier that assigns object colors to defined categories of fundamental colors. Eleven primary, secondary, and tertiary colors, namely red, green, orange, yellow, pink, purple, blue, brown, grey, white, and black, were employed in machine learning (ML) by applying an artificial neural network (ANN) algorithm implemented in Python. The ANN-based classifier required these eleven colors to be defined in the form of RGB codes in order to acquire the capability of classification. Among the outputs of the proposed classifier is its ability to forecast the color of the code belonging to an object under detection. The work required collecting about 5000 color codes, which were then used for training and testing the algorithm. The open-source ML platform TensorFlow and the open-source neural network library Keras were used to construct the algorithm for the study. The results showed an acceptable efficiency of the built classifier, with an accuracy of 90%, which can be considered applicable, especially after future improvements to make it more effective as a trusted colorimeter.
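To make the idea concrete, the sketch below trains a small one-hidden-layer network on RGB codes. It is illustrative only: the paper uses TensorFlow/Keras on ~5000 collected codes for eleven colors, whereas this NumPy stand-in uses tiny synthetic data for just three of the classes, with invented prototype values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical RGB prototypes (0-1 scale) for three classes: red, green, blue.
prototypes = np.array([[1.0, 0.0, 0.0],
                       [0.0, 1.0, 0.0],
                       [0.0, 0.0, 1.0]])
X = np.vstack([p + rng.normal(0, 0.05, (200, 3)) for p in prototypes])
y = np.repeat(np.arange(3), 200)

# One hidden layer (8 ReLU units) followed by a softmax over the color classes.
W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 3)); b2 = np.zeros(3)
lr = 0.5
for _ in range(300):
    H = np.maximum(0, X @ W1 + b1)                 # hidden activations
    Z = H @ W2 + b2
    P = np.exp(Z - Z.max(axis=1, keepdims=True))
    P /= P.sum(axis=1, keepdims=True)              # softmax probabilities
    G = P.copy(); G[np.arange(len(y)), y] -= 1.0   # dLoss/dZ for cross-entropy
    G /= len(y)
    GH = (G @ W2.T) * (H > 0)                      # backprop through ReLU
    W2 -= lr * (H.T @ G); b2 -= lr * G.sum(0)
    W1 -= lr * (X.T @ GH); b1 -= lr * GH.sum(0)

def predict(rgb):
    """Return the index of the predicted color class for one RGB code."""
    h = np.maximum(0, rgb @ W1 + b1)
    return int(np.argmax(h @ W2 + b2))

accuracy = float(np.mean([predict(xi) == ti for xi, ti in zip(X, y)]))
```

A Keras version would replace the manual loop with a `Sequential` model of two `Dense` layers; the training logic is the same.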
This study uses an environmentally friendly and low-cost synthesis method to manufacture zinc oxide nanoparticles (ZnO NPs) from zinc sulfate. Eucalyptus leaf extract serves as an effective chelating and capping agent for synthesizing the ZnO NPs. The structure, morphology, thermal behavior, chemical composition, and optical properties of the ZnO nanoparticles were studied using FT-IR, FE-SEM, EDAX, AFM, and zeta potential analysis. The FE-SEM images confirmed that the ZnO NPs, with a size range of 22-37 nm, were crystalline and spherical. Two methods were used to prepare the ZnO NPs: the first involved calcining the resulting ZnO NPs, while the second did not. The prepared ZnO NPs were used as adsorbents for removing acid black 210 dye.
Change detection is a technique for ascertaining the changes of specific features within a certain time interval. The use of remotely sensed images to detect changes in land use and land cover is widely preferred over other conventional survey techniques because this method is very efficient for assessing the change or degradation trends of a region. In this research, two remotely sensed images of Baghdad city, gathered by Landsat-7 and Landsat-8 ETM+ for two time periods, 2000 and 2014, have been used to detect the most important changes. Registration and rectification of the two original images were the first preprocessing steps applied in this work. Change detection using NDVI subtraction has then been computed.
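The NDVI-subtraction step above can be sketched in a few lines. The band values, grid size, and the 0.2 change threshold below are invented for illustration; they are not values from the study.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index, guarding against division by zero."""
    nir = nir.astype(float); red = red.astype(float)
    denom = nir + red
    return np.where(denom == 0, 0.0, (nir - red) / np.where(denom == 0, 1, denom))

# Hypothetical 3x3 NIR/Red reflectance grids for the two acquisition dates.
nir_2000 = np.array([[0.6, 0.6, 0.2], [0.6, 0.5, 0.2], [0.5, 0.5, 0.2]])
red_2000 = np.array([[0.1, 0.1, 0.2], [0.1, 0.1, 0.2], [0.1, 0.1, 0.2]])
nir_2014 = np.array([[0.2, 0.6, 0.2], [0.2, 0.5, 0.2], [0.5, 0.5, 0.2]])
red_2014 = np.array([[0.2, 0.1, 0.2], [0.2, 0.1, 0.2], [0.1, 0.1, 0.2]])

# NDVI subtraction: later date minus earlier date, pixel by pixel.
diff = ndvi(nir_2014, red_2014) - ndvi(nir_2000, red_2000)

# Pixels whose NDVI dropped by more than 0.2 are flagged as vegetation loss.
loss_mask = diff < -0.2
n_loss = int(loss_mask.sum())
```

On real Landsat scenes the same arithmetic is applied band-wise to the co-registered rasters after the registration and rectification steps described above.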
In this paper, we build a fuzzy classification system for classifying the nutritional status of children under 5 years old in Iraq using the Mamdani method, based on input variables such as weight and height. Classifying nutritional status is a difficult challenge in the medical field because of the uncertainty and ambiguity in the variables and attributes that define the nutritional-status categories of children. These categories are relied upon in medical diagnosis to determine the types of malnutrition problems, to identify the categories or groups suffering from malnutrition, and to determine the risks faced by each group or category of children.
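A minimal Mamdani-style sketch of such a system is shown below. The membership functions, rule base, and output levels are invented for illustration; the paper's actual variables and medical cut-offs are not reproduced here, and the weighted average over singleton consequents is a common simplification of full Mamdani centroid defuzzification.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def nutrition_score(weight_kg, height_cm):
    # Fuzzification: hypothetical ranges for a child under five.
    low_w = tri(weight_kg, 0, 6, 12);   ok_w = tri(weight_kg, 8, 14, 20)
    low_h = tri(height_cm, 40, 65, 90); ok_h = tri(height_cm, 75, 100, 125)

    # Rule evaluation (min as the AND operator), each rule firing toward a
    # singleton output level: 0 = malnourished, 1 = normal.
    r_mal = min(low_w, low_h)   # IF weight low AND height low THEN malnourished
    r_ok = min(ok_w, ok_h)      # IF weight ok AND height ok THEN normal

    # Weighted-average defuzzification over the singleton consequents.
    total = r_mal + r_ok
    return 0.5 if total == 0 else (r_mal * 0.0 + r_ok * 1.0) / total

status = nutrition_score(6.0, 62.0)   # low weight and height -> score near 0
```

A production system would add more linguistic terms (e.g. severe, moderate) and rules per WHO growth-standard cut-offs; the inference mechanics stay the same.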
One of the recent significant but challenging research topics in computational biology and bioinformatics is to unveil protein complexes from protein-protein interaction networks (PPINs). However, developing a reliable algorithm that detects more complexes with high quality is still an open problem in many studies. The main contribution of this paper is to improve the effectiveness of the well-known modularity density model when used as a single-objective optimization function in the framework of the canonical evolutionary algorithm (EA). To this end, the design of the EA is modified with a gene-ontology-based mutation operator, where the aim is to create a positive collaboration between the modularity density model and the proposed operator.
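For reference, a small sketch of the modularity density fitness is given below, in one common form (the sum over communities of (2 × internal edges − external edges) / community size). The toy graph and partitions are illustrative only, not taken from a real PPIN.

```python
def modularity_density(edges, communities):
    """Higher is better: dense inside communities, sparse between them."""
    D = 0.0
    for com in communities:
        com = set(com)
        internal = sum(1 for u, v in edges if u in com and v in com)
        external = sum(1 for u, v in edges if (u in com) != (v in com))
        D += (2 * internal - external) / len(com)
    return D

# Two triangles joined by a single bridge edge (2, 3).
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
good = [(0, 1, 2), (3, 4, 5)]   # the natural split into the two triangles
bad = [(0, 1), (2, 3, 4, 5)]    # a worse partition that breaks one triangle

d_good = modularity_density(edges, good)
d_bad = modularity_density(edges, bad)
```

An EA would evaluate candidate partitions with this function and let operators such as the proposed mutation move nodes between communities toward higher values.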
In this paper, we use frequentist and Bayesian approaches to the linear regression model to predict future observations of unemployment rates in Iraq. Parameters are estimated using the ordinary least squares method and, for the Bayesian approach, the Markov Chain Monte Carlo (MCMC) method. Calculations are done using the R program. The analysis showed that the linear regression model under the Bayesian approach is better and can be used as an alternative to the frequentist approach. Two criteria, the root mean square error (RMSE) and the median absolute deviation (MAD), were used to compare the performance of the estimates. The results obtained showed that unemployment rates will continue to increase over the next two decades.
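The paper's calculations are done in R; as a rough illustration of the frequentist side only, the NumPy sketch below fits OLS on invented data and computes the two comparison criteria it cites. The trend, noise level, and forecast horizon are synthetic, and the Bayesian/MCMC counterpart is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(20, dtype=float)                 # hypothetical yearly time index
y = 8.0 + 0.3 * t + rng.normal(0, 0.4, 20)     # synthetic "unemployment rate" series

X = np.column_stack([np.ones_like(t), t])      # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # ordinary least squares fit
fitted = X @ beta
resid = y - fitted

rmse = float(np.sqrt(np.mean(resid ** 2)))                  # root mean square error
mad = float(np.median(np.abs(resid - np.median(resid))))    # median absolute deviation

forecast = beta[0] + beta[1] * np.arange(20, 24)            # next four periods
```

A Bayesian fit would replace `lstsq` with posterior sampling over the intercept, slope, and noise variance, then compare RMSE and MAD of the two sets of predictions as the paper does.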
In this research, the Artificial Neural Networks (ANNs) technique was applied in an attempt to predict the water levels and some of the water quality parameters of the Tigris River in Wasit Governorate at five different sites. These predictions are useful in the planning, management, and evaluation of the water resources in the area. Spatial data along a river system or catchment area usually have missing measurements at some locations; hence an accurate prediction model to fill these missing values is essential.
The sites selected for water quality prediction were the Sewera, Numania, Kut u/s, Kut d/s, and Garaf observation sites. At these five sites, models were built for predicting the water level and water quality parameters.
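The missing-value prediction described above can be illustrated with a tiny feed-forward network. Everything below is a stand-in: the relation between upstream and downstream levels, the noise level, and the single-input architecture are invented, not the study's data or model.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic training pairs: upstream level (m) -> downstream level (m),
# following an assumed relation y = 0.8x + 1 plus measurement noise.
x = rng.uniform(0, 5, (300, 1))
y = 0.8 * x + 1.0 + rng.normal(0, 0.05, (300, 1))

# One hidden layer of 6 tanh units, trained by batch gradient descent on MSE.
W1 = rng.normal(0, 1, (1, 6)); b1 = np.zeros(6)
W2 = rng.normal(0, 1, (6, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(2000):
    H = np.tanh(x @ W1 + b1)
    pred = H @ W2 + b2
    g = 2 * (pred - y) / len(y)          # dMSE/dpred
    gh = (g @ W2.T) * (1 - H ** 2)       # backprop through tanh
    W2 -= lr * (H.T @ g); b2 -= lr * g.sum(0)
    W1 -= lr * (x.T @ gh); b1 -= lr * gh.sum(0)

def fill_missing(level):
    """Estimate a missing downstream reading from the upstream level."""
    h = np.tanh(np.array([[level]]) @ W1 + b1)
    return float(h @ W2 + b2)

estimate = fill_missing(2.5)   # the assumed relation puts this near 3.0
```

In practice each site's model would take several correlated inputs (neighboring sites, recent time steps, quality parameters) rather than a single upstream level.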
The electric submersible pump, also known as ESP, is a highly effective artificial lift method widely used in the oil industry due to its ability to deliver higher production rates compared to other artificial lift methods. In principle, an ESP is a multistage centrifugal pump that converts kinetic energy into the dynamic hydraulic pressure necessary to lift fluids at a higher rate with lower bottomhole pressure, especially in oil wells with certain bottomhole conditions, fluid properties, and reservoir characteristics. However, several factors and challenges can complicate the completion and optimum development of ESP-deployed wells, and these need to be addressed to optimize performance by maximizing efficiency and minimizing costs and uncertainties.
The current research aims to shed light on the Global Reporting Initiative (GRI), which helps economic units in general, and those listed on the Iraq Stock Exchange in particular, report financial and non-financial information. The research was based on the main premise that applying the Global Reporting Initiative (GRI) criteria would provide useful information to users to help them make appropriate decisions. To achieve the goal of the research, both descriptive and quantitative analysis methods were used. At the level of the descriptive method, a desk survey was conducted; the quantitative analysis relied on applied data gathered through a questionnaire form as a research tool.
Many painters have tried to mix colors with music, either through direct employment in colorful musical pieces or the use of multiple instruments and techniques, or vice versa. Among them is the French artist (Robert Stroben), who transferred a piece of music to be depicted on the painting: working from the music of (Johann Sebastian Bach), he dropped colors onto the lines of the musical scale so that, for example, the tone C ranges from brown to red, the tone La (A) from gray to orange, and so on. The presence of links and similarity factors between the world of music and the world of colors facilitated the process of linking musical notes with colors, the most famous example of which was presented by the scientist (Newton) in the circle of basic colors.
Support vector machine (SVM) is a popular supervised learning algorithm based on margin maximization. It has a high training cost and does not scale well to large numbers of data points. We propose a multiresolution algorithm, MRH-SVM, that trains an SVM on a hierarchical data aggregation structure, which also serves as a common data input to other learning algorithms. The proposed algorithm learns SVM models using high-level data aggregates and only visits data aggregates at more detailed levels where support vectors reside. In addition to performance improvements, the algorithm has advantages such as the ability to handle data streams and datasets with imbalanced classes. Experimental results show significant performance improvements.
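The coarse-to-fine idea can be sketched as follows: train on aggregates first, then revisit only the fine points whose aggregates sit near the margin. The grid-cell aggregation and the plain subgradient linear SVM below are simplifications chosen for brevity, not the paper's actual MRH-SVM hierarchy or solver.

```python
import numpy as np

rng = np.random.default_rng(3)
# Two synthetic Gaussian classes, well separated along the first axis.
X = np.vstack([rng.normal(-2.0, 0.6, (500, 2)), rng.normal(2.0, 0.6, (500, 2))])
y = np.concatenate([-np.ones(500), np.ones(500)])

def train_svm(X, y, lam=0.01, epochs=200):
    """Batch subgradient descent on the regularized hinge-loss objective."""
    w = np.zeros(X.shape[1]); b = 0.0
    for t in range(1, epochs + 1):
        margins = y * (X @ w + b)
        viol = margins < 1                                   # margin violators
        gw = lam * w - (y[viol, None] * X[viol]).sum(axis=0) / len(y)
        gb = -y[viol].sum() / len(y)
        eta = 1.0 / (lam * t)                                # decreasing step size
        w = w - eta * gw; b = b - eta * gb
    return w, b

# Level 1: aggregate fine points into unit grid cells and train on cell means.
cells = np.floor(X).astype(int)
_, inv = np.unique(cells, axis=0, return_inverse=True)
inv = inv.ravel()
n_cells = inv.max() + 1
agg_X = np.array([X[inv == k].mean(axis=0) for k in range(n_cells)])
agg_y = np.array([np.sign(y[inv == k].sum()) or 1.0 for k in range(n_cells)])
w, b = train_svm(agg_X, agg_y)

# Level 2: revisit only fine points whose aggregate lies near the margin,
# since that is where the support vectors reside.
near_cells = np.where(np.abs(agg_X @ w + b) < 1.5)[0]
fine = np.isin(inv, near_cells)
if fine.any() and len(np.unique(y[fine])) == 2:
    w, b = train_svm(X[fine], y[fine])

accuracy = float(np.mean(np.sign(X @ w + b) == y))
```

The payoff is that the expensive fine-level training touches only a fraction of the dataset; aggregates far from the boundary are never expanded.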