Computer vision seeks to mimic the human visual system and plays an essential role in artificial intelligence. It is based on various signal preprocessing techniques; developing efficient techniques is therefore essential to achieving fast and reliable processing. Signal preprocessing operations used in computer vision include smoothing, signal analysis, resizing, sharpening, and enhancement, applied to reduce unwanted distortions, support segmentation, and improve image features. For example, to reduce the noise in a disturbed signal, smoothing kernels can be used effectively; this is achieved by convolving the disturbed signal with the smoothing kernels. In addition, orthogonal moments (OMs) are a crucial signal preprocessing technique, serving as key descriptors for signal analysis and recognition. OMs are obtained by projecting orthogonal polynomials (OPs) onto the signal domain. However, when dealing with 3D signals, the traditional approach of convolving kernels with the signal before computing OMs significantly increases the computational cost of computer vision algorithms. To address this issue, this paper develops a novel mathematical model that embeds the kernel directly into the OP functions, seamlessly integrating the two processes into a more efficient and accurate approach. The proposed model allows the OMs of smoothed versions of 3D signals to be computed directly, thereby reducing computational overhead. Extensive experiments on 3D objects demonstrate that the proposed method outperforms traditional approaches across various metrics. The average recognition accuracy improves to 83.85% when the polynomial order is increased to 10. Experimental results show that the proposed method achieves higher accuracy and lower computational cost than the benchmark methods under various conditions and for a wide range of parameter values.
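The kernel-embedding idea described above rests on a standard identity: projecting a smoothed signal onto a polynomial basis equals projecting the raw signal onto a basis into which the kernel has been folded. A minimal 1D sketch (illustrative, not the paper's 3D model; the Legendre basis, signal, and kernel are all assumptions) using circular convolution with a symmetric kernel:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64
x = np.linspace(-1.0, 1.0, N)

# Noisy 1D test signal (stand-in for one axis of a 3D signal).
f = np.sin(3 * x) + 0.1 * rng.standard_normal(N)

# Symmetric circular smoothing kernel g (a small binomial filter),
# laid out so that g[j] == g[-j mod N].
g = np.zeros(N)
g[0], g[1], g[-1] = 0.5, 0.25, 0.25

def circ_conv(a, b):
    """Circular convolution via the FFT."""
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

# A few sampled Legendre polynomials as the orthogonal basis.
P = [np.polynomial.legendre.Legendre.basis(n)(x) for n in range(4)]

# Traditional route: smooth the signal, then project onto each P_n.
m_traditional = [np.dot(p, circ_conv(f, g)) for p in P]

# Embedded route: fold the kernel into the basis once, then project
# the *raw* signal -- the smoothing is absorbed into the polynomials.
m_embedded = [np.dot(circ_conv(p, g), f) for p in P]

print(np.allclose(m_traditional, m_embedded))  # → True
```

The saving comes from precomputing the kernel-embedded basis once and reusing it across many signals, instead of smoothing every signal before each moment computation.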
Machine learning offers significant advantages for many problems in the oil and gas industry, especially for resolving complex challenges in reservoir characterization. Permeability is one of the most difficult petrophysical parameters to predict using conventional logging techniques. Clarifications of the workflow methodology are presented alongside comprehensive models in this study. The purpose of this study is to provide a more robust technique for predicting permeability; previous studies on the Bazirgan field have attempted to do so, but their estimates have been vague, and the methods they employed are outdated and poorly suited to a rigorous permeability computation.
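The core idea, learning permeability from log-derived features, can be sketched on synthetic data. This is not the study's workflow or the Bazirgan data; the Timur-style relation, feature names, and noise level are all illustrative assumptions, with a simple least-squares fit standing in for the study's models:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Synthetic well-log features (illustrative, not Bazirgan field data):
# porosity (fraction) and irreducible water saturation (fraction).
phi = rng.uniform(0.05, 0.30, n)
swi = rng.uniform(0.10, 0.60, n)

# Ground-truth permeability from a Timur-style relation plus noise,
# k in millidarcy: k = 0.136 * phi**4.4 / swi**2.
log_k = np.log10(0.136 * phi**4.4 / swi**2) + 0.05 * rng.standard_normal(n)

# Fit log10(k) ~ a*log10(phi) + b*log10(swi) + c by least squares.
X = np.column_stack([np.log10(phi), np.log10(swi), np.ones(n)])
coef, *_ = np.linalg.lstsq(X, log_k, rcond=None)

pred = X @ coef
r2 = 1 - np.sum((log_k - pred) ** 2) / np.sum((log_k - log_k.mean()) ** 2)
print(coef[:2], r2)  # exponents near (4.4, -2.0), high R^2
```

Working in log space is the usual choice here because permeability spans several orders of magnitude.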
Radiotherapy is the medical use of ionizing radiation, commonly applied to cancerous tumors because of its ability to control cell growth. The amount of radiation used in photon radiation therapy is called the dose (measured in gray units), which depends on the type and stage of the cancer being treated. In our work, we studied the dose distribution delivered to the tumor at different depths (0 to 20 cm) treated with different field sizes (4×4 to 23×23 cm). Results show that the deeper the treated area, the lower the dose rate at the same beam quality and quantity. It was also noted that increasing the field size increases the depth dose at the same depth, even when the radiation energy is constant. The increase in radiation dose is attributed to scattered radiation.
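The depth-dependent falloff described can be illustrated, to first order, with simple exponential attenuation; this is a sketch only, the attenuation coefficient is an assumed round number, and real photon beams also show build-up and scatter contributions that this model deliberately ignores:

```python
import numpy as np

# First-order model: dose rate falls off exponentially with depth,
# D(z) = D0 * exp(-mu * z).  mu is an illustrative linear attenuation
# coefficient for water at megavoltage energies (assumed ~0.05 /cm);
# build-up and scatter, noted in the study, are not modelled here.
mu = 0.05                      # 1/cm, assumed
D0 = 100.0                     # surface dose rate, arbitrary units
depths = np.arange(0, 21, 5)   # 0-20 cm, the range studied

doses = D0 * np.exp(-mu * depths)
for z, d in zip(depths, doses):
    print(f"{z:5.1f} cm  ->  {d:6.1f}")
```

The monotonic decrease with depth matches the study's first observation; the field-size effect would require adding a scatter term, which is beyond this sketch.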
In the present work, a kinetic study was performed on the extraction of phosphate from Iraqi Akashat phosphate ore using an organic acid. Leaching with lactic acid was studied for the separation of calcareous materials (mainly calcite). Reaction conditions were 2 wt% acid concentration and a 5 ml/g ratio of acid volume to ore weight. Reaction times in the range of 2 to 30 minutes (in steps of 2 minutes) were used to determine the reaction rate constant k from the change in calcite concentration. A further investigation was carried out to determine the activation energy by varying the reaction temperature from 25 to 65 °C. The kinetic data showed that the selective leaching was controlled by the surface chemical reaction.
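The activation-energy step above follows the standard Arrhenius analysis; a minimal sketch with two temperatures from the study's range (the rate constants below are illustrative placeholders, not the study's measured values):

```python
import math

# Arrhenius: k = A * exp(-Ea / (R*T)).  From rate constants at two
# temperatures, Ea = R * ln(k2/k1) / (1/T1 - 1/T2).
R = 8.314                  # J/(mol K)
T1, T2 = 298.15, 338.15    # 25 C and 65 C, the study's range
k1, k2 = 0.012, 0.085      # assumed rate constants, 1/min (illustrative)

Ea = R * math.log(k2 / k1) / (1 / T1 - 1 / T2)
print(f"Ea = {Ea / 1000:.1f} kJ/mol")  # ~41 kJ/mol for these values
```

An activation energy well above roughly 40 kJ/mol is conventionally read as surface-reaction control rather than diffusion control, which is consistent with the study's conclusion.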
Big data analysis has important applications in many areas, such as sensor networks and connected healthcare. The high volume and velocity of big data bring many challenges to data analysis. One possible solution is to summarize the data and provide a manageable data structure that holds a scalable summarization for efficient and effective analysis. This research extends our previous work on developing an effective technique to create, organize, access, and maintain summarizations of big data, and develops algorithms for Bayes classification and entropy discretization of large data sets using the multi-resolution data summarization structure. Bayes classification and data discretization play essential roles in many learning algorithms.
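The core step of entropy discretization, choosing the cut point that minimizes the class-weighted entropy, can be sketched as follows. This is the generic textbook step, not the paper's multi-resolution algorithm, and the toy data is an assumption:

```python
import numpy as np

def entropy(labels):
    """Shannon entropy of a label array, in bits."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def best_split(values, labels):
    """Pick the cut point minimising weighted class entropy --
    the core step of entropy-based discretization."""
    order = np.argsort(values)
    v, y = values[order], labels[order]
    best_cut, best_e = None, np.inf
    for i in range(1, len(v)):
        if v[i] == v[i - 1]:
            continue
        w = i / len(v)
        e = w * entropy(y[:i]) + (1 - w) * entropy(y[i:])
        if e < best_e:
            best_cut, best_e = (v[i] + v[i - 1]) / 2, e
    return best_cut

# Toy data: the class flips around value 5, so the cut lands at 5.
vals = np.array([1.0, 2.0, 3.0, 4.0, 6.0, 7.0, 8.0, 9.0])
labs = np.array([0, 0, 0, 0, 1, 1, 1, 1])
print(best_split(vals, labs))  # → 5.0
```

Applied recursively with a stopping criterion, this yields the intervals that a Bayes classifier can then treat as discrete feature values; the paper's contribution is running such steps over a summarization structure instead of the raw data.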
The hydraulic conditions of a flow have previously been shown to change when large-scale geometric roughness elements are placed on the bed of an open channel; these elements impose additional resistance on the flow. The geometry of the roughness elements, their number, and their configuration are parameters that can affect the hydraulic flow characteristics. The aim is to use inclined block elements to control the salt wedge propagation observed in most estuaries and so prevent its negative effects. Computational fluid dynamics (CFD) software was used to simulate the two-phase flow in an estuary model. In this model, the block elements have 2 cm by 3 cm cross-sections with a face inclined in the flow direction.
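The salt-wedge dynamics being controlled here are commonly characterized by the densimetric Froude number; a back-of-the-envelope sketch (the velocity, depth, and densities below are illustrative, not the paper's model values):

```python
import math

# Densimetric Froude number for a salt-wedge estuary:
# Frd = U / sqrt(g' * h), with reduced gravity g' = g*(rho_s - rho_f)/rho_f.
# A wedge tends to be arrested as Frd approaches 1, which is what added
# bed roughness (raising effective resistance and mixing) can exploit.
# All numbers below are illustrative assumptions.
g = 9.81
rho_f, rho_s = 998.0, 1025.0   # fresh and salt water density, kg/m^3
U, h = 0.25, 0.30              # mean velocity (m/s) and flow depth (m)

g_red = g * (rho_s - rho_f) / rho_f
frd = U / math.sqrt(g_red * h)
print(f"Frd = {frd:.2f}")
```

In a laboratory-scale model such as this one, the roughness elements act by changing the effective U-h balance near the bed, shifting Frd toward the arrest condition.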
Data-Driven Requirements Engineering (DDRE) represents a vision for a shift from static, traditional methods of doing requirements engineering to dynamic, data-driven, user-centered methods. Given the data now available and the increasingly complex requirements of software systems, whose functions must adapt to changing needs to gain the trust of their users, such an approach is needed within a continuous software engineering process. This need drives the emergence of new challenges in the discipline of requirements engineering to meet the required changes. The problem addressed in this study was that discrepancies in the data hampered the needs elicitation process, so that the resulting software contained discrepancies and could not meet the needs of its users.