Computer vision seeks to mimic the human visual system and plays an essential role in artificial intelligence. It is based on different signal preprocessing techniques; therefore, developing efficient techniques becomes essential to achieving fast and reliable processing. Various signal preprocessing operations have been used in computer vision, including smoothing, signal analysis, resizing, sharpening, and enhancement, to reduce unwanted distortions and to support segmentation and image feature improvement. For example, to reduce the noise in a disturbed signal, smoothing kernels can be effectively used. This is achieved by convolving the disturbed signal with smoothing kernels. In addition, orthogonal moments (OMs) are a crucial technique in signal preprocessing, serving as key descriptors for signal analysis and recognition. OMs are obtained by projecting orthogonal polynomials (OPs) onto the signal domain. However, when dealing with 3D signals, the traditional approach of convolving kernels with the signal and then computing OMs significantly increases the computational cost of computer vision algorithms. To address this issue, this paper develops a novel mathematical model that embeds the kernel directly into the OP functions, seamlessly integrating the two processes into a more efficient and accurate approach. The proposed model allows the OMs of smoothed versions of 3D signals to be computed directly, thereby reducing computational overhead. Extensive experiments conducted on 3D objects demonstrate that the proposed method outperforms traditional approaches across various metrics. The average recognition accuracy improves to 83.85% when the polynomial order is increased to 10. Experimental results show that the proposed method exhibits higher accuracy and lower computational costs than the benchmark methods under various conditions and for a wide range of parameter values.
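The embedding idea behind this abstract can be illustrated in 1D with NumPy: by linearity, projecting a smoothed signal onto a polynomial basis gives the same moments as projecting the raw signal onto a pre-smoothed basis when the kernel is symmetric, so the kernel can be folded into the basis once. This is a minimal sketch, not the paper's 3D formulation; the Legendre basis, Gaussian kernel size, and test signal are all assumptions for demonstration.

```python
import numpy as np

def legendre_basis(order, n):
    """Legendre polynomials of degree 0..order sampled on [-1, 1] (illustrative, not normalised)."""
    x = np.linspace(-1.0, 1.0, n)
    return np.array([np.polynomial.legendre.Legendre.basis(d)(x) for d in range(order + 1)])

def gaussian_kernel(size=9, sigma=1.5):
    """Symmetric Gaussian smoothing kernel, normalised to sum to 1."""
    t = np.arange(size) - size // 2
    k = np.exp(-t**2 / (2 * sigma**2))
    return k / k.sum()

n = 256
signal = np.sin(np.linspace(0, 4 * np.pi, n)) + 0.3 * np.random.default_rng(0).normal(size=n)
P = legendre_basis(5, n)
k = gaussian_kernel()

# (a) traditional pipeline: smooth the signal first, then project onto the polynomials
moments_a = P @ np.convolve(signal, k, mode="same")

# (b) embedded pipeline: smooth the basis once (kernel is symmetric), project the raw signal
P_smooth = np.array([np.convolve(p, k, mode="same") for p in P])
moments_b = P_smooth @ signal

print(np.allclose(moments_a, moments_b))  # the two pipelines yield identical moments
```

Since the basis is smoothed once and reused, pipeline (b) avoids convolving every new signal, which is the source of the savings the paper claims for the 3D case.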
A resume is the first impression between you and a potential employer; therefore, its importance cannot be overstated. Selecting the right candidates for a job within a company can be a daunting task for recruiters when they have to review hundreds of resumes. To reduce time and effort, we can use NLTK and Natural Language Processing (NLP) techniques to extract essential data from a resume. NLTK is a free, open-source, community-driven project and the leading platform for building Python programs that work with human language data. To select the best resume according to the company's requirements, an algorithm such as KNN is used. To be selected from hundreds of resumes, your resume must be one of the best.
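A nearest-neighbour resume ranker of the kind described can be sketched in a few lines. The skill vocabulary, the regex tokenizer (a stand-in for NLTK's tokenizers), and the sample texts below are all hypothetical, assumed only for illustration:

```python
import math
import re
from collections import Counter

# Hypothetical skill vocabulary; a real pipeline would extract terms with NLTK's tokenizer/tagger.
SKILLS = ["python", "java", "sql", "excel", "nlp", "machine", "learning"]

def skill_vector(text):
    """Count known skill tokens in a text (stand-in for NLTK-based extraction)."""
    counts = Counter(re.findall(r"[a-z]+", text.lower()))
    return [counts[s] for s in SKILLS]

def cosine(u, v):
    """Cosine similarity between two count vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def rank_resumes(job_description, resumes, k=2):
    """Return the k resumes nearest to the job description (KNN-style ranking)."""
    jd = skill_vector(job_description)
    return sorted(resumes, key=lambda r: cosine(jd, skill_vector(r)), reverse=True)[:k]

job = "Looking for Python and SQL developer with NLP experience"
resumes = [
    "Java developer, Excel reporting",
    "Python engineer with SQL and NLP projects",
    "Machine learning researcher, Python",
]
best = rank_resumes(job, resumes, k=1)
print(best[0])  # the resume matching all three requested skills ranks first
```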
The objective of this study is to apply an Artificial Neural Network (ANN) to the heat transfer analysis of shell-and-tube heat exchangers widely used in power plants and refineries. Practical data were obtained from an industrial heat exchanger operating in the power generation department of the Dura refinery. The commonly used Back Propagation (BP) algorithm was used to train and test the networks, with the data divided into three samples (training, validation, and testing) to better approximate the actual case. Inputs of the neural network include inlet water temperature, inlet air temperature, and mass flow rate of air. Two outputs (exit water temperature to the cooling tower and exit air temperature to the second stage of the air compressor) were taken from the ANN.
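The 3-input, 2-output back-propagation setup described above can be sketched with a minimal NumPy network. The data here are synthetic stand-ins, not the refinery measurements, and the architecture (one tanh hidden layer of 8 units) is an assumption for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the plant data: 3 inputs -> 2 outputs (illustrative only).
X = rng.uniform(0.0, 1.0, size=(300, 3))
Y = np.column_stack([0.5 * X[:, 0] + 0.3 * X[:, 1], 0.4 * X[:, 2] + 0.2 * X[:, 0]])

# Train / validation / test split, mirroring the three-sample scheme in the study.
X_tr, X_va, X_te = X[:200], X[200:250], X[250:]
Y_tr, Y_va, Y_te = Y[:200], Y[200:250], Y[250:]

# One hidden layer, tanh activation, trained by plain back-propagation (batch gradient descent).
W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 2)); b2 = np.zeros(2)
lr = 0.1
for epoch in range(2000):
    H = np.tanh(X_tr @ W1 + b1)                     # forward pass
    pred = H @ W2 + b2
    err = pred - Y_tr                               # gradient of the MSE loss
    gW2 = H.T @ err / len(X_tr); gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1 - H**2)                  # back-propagate through tanh
    gW1 = X_tr.T @ dH / len(X_tr); gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

test_mse = np.mean((np.tanh(X_te @ W1 + b1) @ W2 + b2 - Y_te) ** 2)
print(f"test MSE: {test_mse:.4f}")
```

The held-out validation split would normally drive early stopping; it is left unused here to keep the sketch short.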
Regression Discontinuity (RD) denotes a study design that exposes a definite group to the effect of a treatment. The uniqueness of this design lies in classifying the study population into two groups based on a specific threshold limit, or cutoff point, determined in advance according to the terms of the study and its requirements. Thus, attention focused on finding a solution to the issue of workers' retirement and on proposing a scenario for granting an end-of-service reward to fill the gap (discontinuity point) if it had not been granted. The regression discontinuity method has been used to study and estimate the effect of the end-of-service reward at the cutoff for insured workers.
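A sharp RD estimate of the kind described can be sketched on simulated data: fit a local line on each side of the cutoff and compare the two intercepts at the threshold. The running variable, cutoff, bandwidth, and effect size below are assumptions, not the study's actual figures:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated workers: 'years' is the running variable; crossing the cutoff c grants the reward.
c = 15.0
years = rng.uniform(5, 25, 2000)
treated = years >= c
true_effect = 2.0                                    # hypothetical jump at the cutoff
outcome = 0.3 * years + true_effect * treated + rng.normal(0, 0.5, years.size)

# Sharp RD: fit a line on each side within bandwidth h and compare intercepts at c.
h = 5.0
left = (years >= c - h) & (years < c)
right = (years >= c) & (years <= c + h)
slope_l, intercept_l = np.polyfit(years[left] - c, outcome[left], 1)
slope_r, intercept_r = np.polyfit(years[right] - c, outcome[right], 1)
effect = intercept_r - intercept_l                   # estimated discontinuity at the cutoff
print(f"estimated discontinuity: {effect:.2f}")
```

The estimate recovers the simulated jump; in practice the bandwidth choice and placebo cutoffs would need to be checked.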
In this study, three reactive dyes (blue B, red R, and yellow Y) in single, binary, and ternary solutions were adsorbed by activated carbon (AC) in equilibrium and kinetic experiments. Surface area, bulk and real density, and porosity measurements were carried out for the activated carbon.
Batch experiments over pH (2.5-8.5) and initial concentration (5-100 mg/L) were carried out on single solutions of each dye. The effect of adsorbent dosage (0.1-1 g per 100 mL) was studied as a variable to evaluate uptake% and adsorption capacity for single dyes (5, 10 ppm) and for binary and ternary (10 ppm) mixture solutions of the dyes. The Langmuir and Freundlich models were used as equilibrium isotherm models for the single solutions, and the extended Langmuir and Freundlich models for the binary and ternary solutions.
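Fitting the two single-solution isotherm models mentioned above is usually done via their linearised forms. The equilibrium data below are generated from an assumed Langmuir isotherm purely for illustration, not taken from the study:

```python
import numpy as np

# Illustrative equilibrium data (Ce in mg/L, qe in mg/g); not the paper's measurements.
Ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0])
q_max, K_L = 25.0, 0.15
qe = q_max * K_L * Ce / (1 + K_L * Ce)          # generated from a Langmuir isotherm

# Langmuir linearisation: Ce/qe = Ce/q_max + 1/(K_L * q_max)
slope, intercept = np.polyfit(Ce, Ce / qe, 1)
q_max_fit = 1 / slope
K_L_fit = 1 / (intercept * q_max_fit)

# Freundlich linearisation: ln qe = ln K_F + (1/n) ln Ce
fr_slope, fr_intercept = np.polyfit(np.log(Ce), np.log(qe), 1)
K_F, n_inv = np.exp(fr_intercept), fr_slope

print(f"Langmuir fit: q_max={q_max_fit:.1f} mg/g, K_L={K_L_fit:.2f} L/mg")
print(f"Freundlich fit: K_F={K_F:.2f}, 1/n={n_inv:.2f}")
```

Comparing the correlation coefficients of the two linear fits is the usual way to decide which model describes a given dye-carbon system better.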
MB Mahmood, BN Dhannoon
CNC machines are widely used in production because they produce similar parts in minimum time, at higher speed, and with possibly minimum error. A control system was designed, implemented, and tested to control the operation of a laboratory CNC milling machine with three axes, each moved by a stepper motor. The control system includes two parts, hardware and software: the hardware part uses a PC (acting as the controller) connected to the CNC machine through its parallel port via a designed interface circuit, while the software part includes the algorithms needed to control the CNC. The sample to be machined is drawn using drawing software such as AUTOCAD or 3D MAX.
Matching between wind-site characteristics and wind-turbine characteristics was carried out for three selected sites in Iraq. Site-turbine matching for potential wind power applications in Iraq has not yet been well reported. Thus, in this study, five years of wind speed data for sites located in Baghdad (33.34N, 44.40E), Nasiriyah (31.05N, 46.25E), and Basrah (30.50N, 47.78E) were collected. A full wind energy analysis based on the measured data, the Weibull distribution function, and wind turbine characteristics was made. A code developed in MATLAB was used to analyse the wind energy and wind turbine models. The primary objective was to achieve a standard wind turbine-site matching based on the capacity factor.
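The capacity-factor computation at the heart of such matching is the expected turbine output under the site's Weibull wind-speed distribution. The Weibull parameters and the simplified cubic power curve below are assumptions for illustration, not the measured values for the Iraqi sites:

```python
import numpy as np

# Weibull parameters (illustrative): shape k and scale c in m/s.
k, c = 2.0, 6.0

# Simplified turbine power curve: cut-in 3, rated 12, cut-out 25 m/s, normalised rated power 1.
v_in, v_r, v_out = 3.0, 12.0, 25.0

def power(v):
    """Normalised power output at wind speed v (cubic ramp between cut-in and rated)."""
    p = np.where((v >= v_in) & (v < v_r), ((v - v_in) / (v_r - v_in)) ** 3, 0.0)
    return np.where((v >= v_r) & (v <= v_out), 1.0, p)

# Capacity factor = expected normalised power under the Weibull wind-speed pdf.
v = np.linspace(0.0, 30.0, 3001)
pdf = (k / c) * (v / c) ** (k - 1) * np.exp(-((v / c) ** k))
cf = np.sum(power(v) * pdf) * (v[1] - v[0])      # Riemann-sum approximation of the integral
print(f"capacity factor: {cf:.3f}")
```

Repeating this calculation for each candidate turbine at each site and ranking by capacity factor is the matching criterion the study describes.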
In this article, the nonlinear problem of Jeffery-Hamel flow has been solved analytically and numerically by using reliable iterative and numerical methods. Approximate solutions were obtained using the Daftardar-Jafari method (DJM), the Temimi-Ansari method (TAM), and the Banach contraction method (BCM). The obtained solutions are discussed numerically in comparison with other numerical solutions obtained from the fourth-order Runge-Kutta (RK4) and Euler methods and with previous analytic methods available in the literature. In addition, the convergence of the proposed methods is established based on the Banach fixed point theorem. The results reveal that the presented methods are reliable, effective, and applicable to other nonlinear problems.
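The RK4 benchmark used for comparison can be sketched on a toy scalar ODE with a known exact solution; the actual Jeffery-Hamel problem is a nonlinear third-order boundary value problem and would additionally need a shooting or collocation wrapper, so the equation below is an assumption for demonstration only:

```python
import math

def rk4(f, y0, x0, x1, n):
    """Classical fourth-order Runge-Kutta integrator (global error O(h^4))."""
    h = (x1 - x0) / n
    x, y = x0, y0
    for _ in range(n):
        k1 = f(x, y)
        k2 = f(x + h / 2, y + h * k1 / 2)
        k3 = f(x + h / 2, y + h * k2 / 2)
        k4 = f(x + h, y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        x += h
    return y

# Toy ODE y' = -2xy, y(0) = 1, with exact solution exp(-x^2).
approx = rk4(lambda x, y: -2.0 * x * y, 1.0, 0.0, 1.0, 100)
print(abs(approx - math.exp(-1.0)))   # discretisation error, small for h = 0.01
```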
The style of Free-form Geometry (FFG) has emerged in contemporary architecture over the last three decades around the world through the progress of digital design tools and the development of constructive materials. FFG reflects the hard efforts of several contemporary architects to release their products from familiar restrictions and to discover new and unfamiliar styles from the perspective of innovation. Many contemporary architects seek to recognize their forms and to facilitate dealing with them according to specific dimensional rules. The main research problem is the lack of knowledge in the field of architecture, in the previous literature, about the formation processes involved in achieving FFG.
This paper proposes two hybrid feature subset selection approaches based on the combination (union or intersection) of both supervised and unsupervised filter approaches before using a wrapper, aiming to obtain low-dimensional features with high accuracy, high interpretability, and low time consumption. Experiments with the proposed hybrid approaches were conducted on seven high-dimensional feature datasets. The classifiers adopted are support vector machine (SVM), linear discriminant analysis (LDA), and K-nearest neighbour (KNN). Experimental results demonstrate the advantages and usefulness of the proposed methods for feature subset selection in high-dimensional space in terms of the number of selected features and time spent.
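The union/intersection combination of filters can be sketched as follows. Here variance serves as the unsupervised filter and absolute correlation with the label as the supervised filter; the synthetic dataset, filter choices, and top-k size are assumptions for illustration, and the wrapper stage (re-evaluating candidate subsets with SVM/LDA/KNN) is omitted:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic data: 2 informative features hidden among 20 mostly-noise features.
n = 200
y = rng.integers(0, 2, n)
X = rng.normal(0.0, 1.0, (n, 20))
X[:, 3] += 2.0 * y          # informative feature
X[:, 7] += 1.5 * y          # informative feature

k = 4
variance = X.var(axis=0)                                               # unsupervised filter
corr = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(20)])      # supervised filter
top_var = set(np.argsort(variance)[-k:])
top_corr = set(np.argsort(corr)[-k:])

union = top_var | top_corr          # first hybrid: combination by union
intersection = top_var & top_corr   # second hybrid: combination by intersection
print("union:", sorted(union), "intersection:", sorted(intersection))
```

The intersection yields a smaller, more conservative subset to hand to the wrapper, while the union trades subset size for recall of informative features.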