Computer vision seeks to mimic the human visual system and plays an essential role in artificial intelligence. It is based on different signal preprocessing techniques; therefore, developing efficient techniques is essential for fast and reliable processing. Various signal preprocessing operations have been used for computer vision, including smoothing, signal analysis, resizing, sharpening, and enhancement, to reduce unwanted distortions, support segmentation, and improve image features. For example, to reduce the noise in a disturbed signal, smoothing kernels can be used effectively; this is achieved by convolving the disturbed signal with the smoothing kernels. In addition, orthogonal moments (OMs) are a crucial signal preprocessing technique, serving as key descriptors for signal analysis and recognition. OMs are obtained by projecting orthogonal polynomials (OPs) onto the signal domain. However, when dealing with 3D signals, the traditional approach of convolving kernels with the signal and then computing OMs significantly increases the computational cost of computer vision algorithms. To address this issue, this paper develops a novel mathematical model that embeds the kernel directly into the OP functions, seamlessly integrating the two processes into a more efficient and accurate approach. The proposed model allows the OMs of smoothed versions of 3D signals to be computed directly, thereby reducing computational overhead. Extensive experiments conducted on 3D objects demonstrate that the proposed method outperforms traditional approaches across various metrics. The average recognition accuracy improves to 83.85% when the polynomial order is increased to 10. Experimental results show that the proposed method achieves higher accuracy and lower computational cost than the benchmark methods under various conditions and for a wide range of parameter values.
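To see why a smoothing kernel can be folded into the polynomial basis, consider the following minimal 1-D NumPy sketch. The Legendre basis, the binomial kernel, and the use of circular convolution are illustrative assumptions rather than the paper's actual 3-D polynomials and kernels; the sketch only demonstrates the underlying identity that the moments of a smoothed signal equal the moments of the raw signal projected onto a pre-smoothed (kernel-embedded) basis.

```python
# Minimal 1-D sketch of the kernel-embedding identity. The Legendre basis,
# binomial kernel, and circular convolution are illustrative assumptions,
# not the paper's 3-D polynomials or smoothing kernels.
import numpy as np

rng = np.random.default_rng(0)
N = 64
signal = rng.normal(size=N)                      # "disturbed" 1-D signal
kernel = np.zeros(N)
kernel[:5] = np.array([1, 4, 6, 4, 1]) / 16.0    # binomial smoothing kernel

def circ_conv(f, g):
    """Circular convolution: (f * g)(x) = sum_k f(x - k) g(k)."""
    n = len(f)
    return np.array([sum(f[(x - k) % n] * g[k] for k in range(n)) for x in range(n)])

def circ_corr(p, g):
    """Circular cross-correlation: (p . g)(y) = sum_k p(y + k) g(k)."""
    n = len(p)
    return np.array([sum(p[(y + k) % n] * g[k] for k in range(n)) for y in range(n)])

# Discrete samples of Legendre polynomials as a stand-in orthogonal basis.
x = np.linspace(-1, 1, N)
basis = np.stack([np.polynomial.legendre.Legendre.basis(n)(x) for n in range(6)])

# Traditional pipeline: smooth the signal first, then project onto the basis.
moments_traditional = basis @ circ_conv(signal, kernel)

# Kernel-embedded pipeline: pre-correlate the basis with the kernel once,
# then project the raw signal directly.
embedded_basis = np.stack([circ_corr(p, kernel) for p in basis])
moments_embedded = embedded_basis @ signal

print(np.allclose(moments_traditional, moments_embedded))   # True
```

The same linearity argument carries over to three dimensions, which is why embedding the kernel in the basis functions removes the explicit smoothing pass from the moment computation.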
In this work, a study and calculation of the normal approach between two bodies, a sphere and a rough flat surface, was conducted with the aid of an image processing technique. Four kinds of metals with different work-hardening indices were used as the surface specimens, and by capturing images at a resolution of 0.006565 mm/pixel a good estimate of the normal approach could be obtained. The compression tests were carried out in the strength of materials laboratory of the mechanical engineering department, and a Monsanto tensometer was used to conduct the indentation tests. A light-section measuring microscope (BK 70x50) was used to calculate the surface texture parameters, such as the standard deviation of asperity peak heights and the centre line average.
Advanced national election technologies have moderately improved political systems. As electronic voting (e-voting) systems advance, security threats such as impersonation, ballot tampering, and result manipulation increase. These challenges are addressed through a review covering biometric authentication, watermarking, and blockchain technologies, each of which plays a crucial role in improving the security of e-voting systems. More precisely, biometric authentication is examined for its ability to identify voters and reduce the risk of impersonation. The study also explores blockchain technology to decentralize elections, enhance transparency, and prevent any unauthorized alteration.
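As a generic illustration of the tamper-evidence property that blockchain-based e-voting relies on (a sketch only, not any of the systems reviewed here), the following Python fragment links ballot records by hash so that altering a stored ballot invalidates every later link.

```python
# Generic tamper-evidence sketch (illustrative only, not any reviewed system):
# each ballot record is linked to the hash of the previous one, so altering a
# stored ballot breaks every later link in the chain.
import hashlib
import json

def build_chain(ballots):
    """Append ballots to a hash chain and return the linked records."""
    blocks, prev = [], "0" * 64
    for ballot in ballots:
        record = {"ballot": ballot, "prev": prev}
        prev = hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()
        blocks.append({**record, "hash": prev})
    return blocks

def verify_chain(blocks):
    """Recompute every link; any altered ballot breaks verification."""
    prev = "0" * 64
    for block in blocks:
        record = {"ballot": block["ballot"], "prev": prev}
        expected = hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != expected:
            return False
        prev = block["hash"]
    return True

ledger = build_chain(["candidate-A", "candidate-B", "candidate-A"])
print(verify_chain(ledger))            # True
ledger[1]["ballot"] = "candidate-C"    # unauthorized alteration
print(verify_chain(ledger))            # False
```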
Both the adsorption (Almetzaz) and extraction processes of the substance Alklanda from Maysan were studied with different amounts of Alcaúlan in a 70% alcohol solution, using the method at the wavelength
Motif templates are the input to many bioinformatics systems, such as codon finding, transcription, translation, sequential pattern mining, and bioinformatics database analysis. The size of a motif ranges from one base up to several megabases; therefore, typing errors increase with the size of the motif. In addition, when structured motifs are submitted to bioinformatics systems, the specifications of the motif components are required, i.e. the simple motifs, the gaps, and the lower and upper bound of each gap. Motifs can be DNA, RNA, or protein. In this research, a motif parser and visualization module is designed based on a proposed context-free grammar (CFG) and a color-based human recognition scheme. The CFG describes the motif components.
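A hypothetical sketch of such a parser is given below. The template syntax (simple motifs separated by [lower,upper] gap bounds) and the restriction to nucleotide alphabets are assumptions for illustration only and do not reproduce the proposed CFG.

```python
# Hypothetical structured-motif parser (the paper's actual CFG and motif
# syntax are not given here): split a template of simple motifs separated by
# bounded gaps into components, flagging typing errors such as invalid bases.
import re

TOKEN = re.compile(r"(?P<motif>[ACGTUacgtu]+)|\[(?P<lo>\d+),(?P<hi>\d+)\]")

def parse_template(template):
    """Return a list of ('motif', seq) and ('gap', lo, hi) components."""
    parts, pos = [], 0
    for m in TOKEN.finditer(template):
        if m.start() != pos:
            raise ValueError(f"unexpected character at position {pos}")
        if m.group("motif"):
            parts.append(("motif", m.group("motif").upper()))
        else:
            lo, hi = int(m.group("lo")), int(m.group("hi"))
            if lo > hi:
                raise ValueError("gap lower bound exceeds upper bound")
            parts.append(("gap", lo, hi))
        pos = m.end()
    if pos != len(template):
        raise ValueError(f"unexpected character at position {pos}")
    return parts

print(parse_template("ACGTT[2,5]TTGACA[0,3]GGC"))
# [('motif', 'ACGTT'), ('gap', 2, 5), ('motif', 'TTGACA'), ('gap', 0, 3), ('motif', 'GGC')]
```

A visualization layer could then assign a distinct colour to each component type, in the spirit of the colour-based recognition scheme described in the abstract.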
Rock type identification is a very important task in reservoir characterization, needed to construct robust reservoir models. Several approaches have been introduced to define rock types in reservoirs, and each approach should relate the geological and petrophysical properties, such that each rock type corresponds to a unique hydraulic flow unit. A hydraulic flow unit is a reservoir zone that is laterally and vertically similar in flow and bedding characteristics. Because of the effect of rock type on reservoir performance, many empirical and statistical approaches have been introduced. In this paper, a cluster analysis technique is used to identify the rock groups in the tertiary reservoir of the Khabaz oil field by analysing the variation of petrophysical properties.
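The following sketch illustrates the general idea of grouping samples into candidate rock types by cluster analysis. The synthetic porosity and permeability data and the choice of k-means with four clusters are assumptions for illustration, not the Khabaz field measurements or the paper's exact procedure.

```python
# Illustrative rock-typing sketch on synthetic data (not the Khabaz field data):
# cluster porosity and log-permeability with k-means to form candidate rock groups.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
porosity = rng.uniform(0.05, 0.30, 200)                        # fraction
permeability = 10 ** (rng.normal(0, 1, 200) + 12 * porosity)   # mD, loosely tied to porosity

# Standardise features; permeability is log-transformed because it spans decades.
X = np.column_stack([porosity, np.log10(permeability)])
X = (X - X.mean(axis=0)) / X.std(axis=0)

labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
for k in range(4):
    phi = porosity[labels == k]
    logk = np.log10(permeability[labels == k])
    print(f"rock group {k}: n={np.sum(labels == k):3d}, "
          f"mean phi={phi.mean():.3f}, mean log10(k)={logk.mean():.2f}")
```

In practice each cluster would still need to be checked against flow behaviour, since the abstract requires that each rock type map to a single hydraulic flow unit.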
Learning the vocabulary of a language has a great impact on acquiring that language. Many scholars in the field of language learning emphasize the importance of vocabulary as part of the learner's communicative competence, considering it the heart of language. One of the best methods of learning vocabulary is to focus on words of high frequency. The present article is a corpus-based approach to the study of vocabulary, whereby the research data are analyzed quantitatively using the software program AntWordProfiler. This program analyses new input research data against already stored reliable corpora. The aim of this article is to find out whether the vocabulary used in the English textbook for Intermediate Schools in Iraq is consistent with these corpora.
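The coverage calculation underlying such profiling can be sketched as follows. The tiny word list and sample sentence are placeholders, not the Iraqi textbook corpus or AntWordProfiler's baseline lists, and the sketch does not use AntWordProfiler itself.

```python
# Placeholder lexical-coverage sketch (word list and text are illustrative,
# not the textbook corpus or AntWordProfiler's stored lists): what share of
# the running words falls inside a high-frequency baseline list?
import re
from collections import Counter

high_frequency_list = {"the", "a", "is", "are", "and", "in", "school",
                       "book", "read", "write", "student"}
textbook_text = "The students read the book and write in the school notebook."

tokens = re.findall(r"[a-z']+", textbook_text.lower())
counts = Counter(tokens)
covered = sum(n for word, n in counts.items() if word in high_frequency_list)

print(f"tokens: {len(tokens)}, coverage: {covered / len(tokens):.1%}")
print("off-list words:", [w for w in counts if w not in high_frequency_list])
```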
The aim of the current research is to identify the level of administrative applications of expert systems in educational leadership departments in light of the systems approach. To achieve the objectives of the research, the descriptive-analytical survey method was adopted. The results showed that the level of availability of the knowledge base for expert systems in educational leadership departments (as inputs) was low. The level of availability of resources and software for expert systems in educational leadership departments (as transformational processes) was also low, as was the level of availability of the user interface for expert systems in educational leadership departments (as outputs).
The topic of urban transformations has attracted the attention of researchers, as it is one of the basic issues through which cities can be transformed towards sustainability. Such transformation occurs at specific levels according to a philosophical concept known as crossing. This article relies on a specific methodology that aims to find a new approach to urban transformation based on the crossing concept. The concept derives from philosophical approaches based on the notions of being, process, becoming, and integration. Four levels have been defined for the crossing: normal, ascending, leap, and descending. Each of these levels has specific characteristics that distinguish it. The results showed that there is no descending