Three-dimensional (3D) image and medical image processing, which are considered big data analysis, have attracted significant attention in recent years, and efficient 3D object recognition techniques would benefit both. However, most existing methods for 3D object recognition suffer from high computational complexity, because complexity and execution time grow as the dimensions of the object increase. Finding a method that achieves high recognition accuracy with low computational complexity is therefore essential. To this end, this paper presents an efficient method for 3D object recognition with low computational complexity. Specifically, the proposed method uses a fast overlapped block-processing technique, which deals with higher-order polynomials and high-dimensional objects and reduces the computational complexity of feature extraction. The method also exploits Charlier polynomials and their moments together with a support vector machine (SVM). The presented method is evaluated on a well-known dataset, the McGill benchmark dataset, and compared with existing 3D object recognition methods. The results show that the proposed approach achieves high recognition rates under different noisy environments, has the potential to mitigate noise distortion, and outperforms existing methods in terms of computation time under both noise-free and noisy conditions.
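As a minimal sketch of the moment-based feature extraction named above, the discrete Charlier polynomials can be generated with their standard three-term recurrence and Poisson weight, and 2-D Charlier moments of an image block then follow as a pair of matrix products. The function names, the parameter `a`, and the block size are illustrative assumptions, not the paper's implementation (in particular, the paper's fast overlapped block scheme is not reproduced here).

```python
import math
import numpy as np

def charlier_basis(n_points, max_order, a):
    """Weighted, normalized Charlier polynomial values, rows n = 0..max_order,
    columns x = 0..n_points-1, via the standard three-term recurrence
        a*C_{n+1}(x) = (n + a - x)*C_n(x) - n*C_{n-1}(x).
    (Illustrative sketch; not the paper's exact algorithm.)"""
    x = np.arange(n_points, dtype=float)
    # Poisson weight w(x) = exp(-a) a^x / x!, computed in log space for stability
    log_w = -a + x * math.log(a) - np.array([math.lgamma(xi + 1.0) for xi in x])
    w = np.exp(log_w)
    C = np.zeros((max_order + 1, n_points))
    C[0] = 1.0
    if max_order >= 1:
        C[1] = (a - x) / a
    for n in range(1, max_order):
        C[n + 1] = ((n + a - x) * C[n] - n * C[n - 1]) / a
    # squared norm rho(n) = n!/a^n; weighting + normalizing makes rows orthonormal
    rho = np.array([math.factorial(n) / a**n for n in range(max_order + 1)])
    return C * np.sqrt(w) / np.sqrt(rho)[:, None]

def charlier_moments(block, max_order, a):
    """2-D Charlier moments M_{nm} of a square image block."""
    B = charlier_basis(block.shape[0], max_order, a)
    return B @ block @ B.T
```

The rows of `charlier_basis` are (numerically) orthonormal on the truncated domain, which is what makes the moment computation a stable change of basis rather than an ill-conditioned fit.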
Due to the difficulties that Iraqi students face when writing in English, this preliminary study aimed to improve students' writing skills remotely through online platforms. Sixty first-year students from Al-Furat Al-Awsat Technical University participated in the study. On these platforms, the researchers relied on stimuli such as images, icons, and short titles to elicit deeper and more accurate participation. Data were collected through corrections, observations, and feedback from the researchers and peers. In addition, pre- and post-tests were conducted. The quantitative data were analysed with SPSS statistical software, whereas the qualitative data were analysed using a Pivot table in an Excel sheet.
Steganography is a means of hiding information within a more obvious form of communication. It exploits host data to conceal a piece of information in such a way that it is imperceptible to a human observer. The major goals of effective steganography are high embedding capacity, imperceptibility, and robustness. This paper introduces a scheme for hiding secret images that may amount to as much as 25% of the host image data. The proposed algorithm applies the orthogonal discrete cosine transform to the host image, and a scaling factor (a) in the frequency domain controls the quality of the stego images. Experimental results of secret image recovery after applying JPEG coding to the stego images are included.
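The embedding idea described above can be sketched as follows: add the scaled secret image into a region of the host's 2-D DCT, then invert the transform. The corner placement, the non-blind extraction, and the function names are assumptions for illustration; the paper's exact coefficient selection and its JPEG-robustness experiments are not reproduced.

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II transform matrix T (satisfies T @ T.T = I)."""
    k = np.arange(n)[:, None]
    m = np.arange(n)[None, :]
    T = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * m + 1) * k / (2 * n))
    T[0, :] = np.sqrt(1.0 / n)
    return T

def embed(host, secret, alpha):
    """Add the scaled secret into the high-frequency corner of the host's
    2-D DCT; the scaling factor alpha controls stego-image quality.
    (Illustrative placement, not necessarily the paper's.)"""
    n = host.shape[0]
    T = dct_matrix(n)
    D = T @ host @ T.T
    s = secret.shape[0]
    D[n - s:, n - s:] += alpha * secret
    return T.T @ D @ T

def extract(stego, host, alpha, s):
    """Non-blind recovery: DCT of the difference image, rescaled by alpha."""
    n = host.shape[0]
    T = dct_matrix(n)
    D = T @ (stego - host) @ T.T
    return D[n - s:, n - s:] / alpha
```

With a secret of size n/2 × n/2 inside an n × n host, the payload is 25% of the host pixels, matching the capacity figure quoted in the abstract.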
A LiF (TLD-700) PTFE disc with a diameter of 13 mm and a thickness of 0.4 mm was used to study the response and sensitivity of this material to gamma and beta rays, using the TOLEDO system from the Pitman company. To calibrate the system and determine the calibration factor, discs were irradiated with gamma and beta rays and the readings were compared with the theoretical doses. The exposure range was between 15×10⁻² mGy and 1000×10⁻² mGy; these doses are within the range of a normal radiation field for workers.
This paper introduces a relation between the resultant and the Jacobian determinant by generalizing Sakkalis' theorem from two polynomials in two variables to the case of n polynomials in n variables. This leads us to study results of this type and to use this relation to attack the Jacobian problem. The last section presents our contribution to proving the conjecture.
This paper adapts a neural network to the estimation of the direction of arrival (DOA). It uses an unsupervised adaptive neural network with the generalized Hebbian algorithm (GHA) to extract the principal components, which are in turn used by the Capon method to estimate the DOA. Through the PCA neural network, only the signal subspace is retained and used in the Capon method; the noise subspace is ignored.
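The Capon spectrum underlying the approach above can be sketched for a uniform linear array as follows. Note the hedge: this minimal example estimates the sample covariance and inverts it directly, standing in for the paper's adaptive GHA subspace extraction; the array geometry, source angles, and SNR are illustrative assumptions.

```python
import numpy as np

def steering(theta_deg, m, d=0.5):
    """Uniform linear array steering vector; element spacing d in wavelengths."""
    theta = np.deg2rad(theta_deg)
    return np.exp(2j * np.pi * d * np.arange(m) * np.sin(theta))

rng = np.random.default_rng(0)
M, T = 8, 200                        # sensors, snapshots (assumed values)
doas = [-20.0, 30.0]                 # true directions in degrees (assumed)
A = np.column_stack([steering(t, M) for t in doas])
S = rng.standard_normal((2, T)) + 1j * rng.standard_normal((2, T))
N = 0.1 * (rng.standard_normal((M, T)) + 1j * rng.standard_normal((M, T)))
X = A @ S + N                        # array snapshots
R = X @ X.conj().T / T               # sample covariance (the paper learns its
Rinv = np.linalg.inv(R)              # principal subspace adaptively via GHA)

# Capon pseudo-spectrum P(theta) = 1 / (a^H R^{-1} a); peaks mark the DOAs
grid = np.arange(-90.0, 90.0, 0.5)
P = np.array([1.0 / np.real(steering(t, M).conj() @ Rinv @ steering(t, M))
              for t in grid])
est = grid[np.argmax(P)]
```

The peak of `P` over the angle grid lands at one of the simulated source directions; scanning for all local maxima recovers both.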
A median filter is adopted to match the noise statistics of the degradation, seeking good-quality smoothed images. Two methods are suggested in this paper: a Pentagonal-Hexagonal mask and a Scan Window mask. The study involves a modified median filter for improving noise suppression, with the modification aimed at more reliable results. The modified median filter with the Pentagonal-Hexagonal mask was found to give better results, both qualitatively and quantitatively, than classical median filters and the other suggested method (the Scan Window mask), but at the expense of the time required. However, when the noise is of the line type, the 3×3 cross filter is preferred over the Pentagonal-Hexagonal mask, with only slight variation.
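The 3×3 cross filter mentioned above is the simplest of the non-square masks discussed; a minimal sketch is shown below on synthetic impulse noise. The Pentagonal-Hexagonal and Scan Window masks are specific to the paper and are not reproduced; the test image and noise level are assumptions for illustration.

```python
import numpy as np

def cross_median(img):
    """3x3 cross (plus-shaped) median filter: the center pixel and its four
    4-connected neighbours. Border pixels are left unfiltered for simplicity."""
    out = img.astype(float).copy()
    h, w = img.shape
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            window = [img[i, j], img[i - 1, j], img[i + 1, j],
                      img[i, j - 1], img[i, j + 1]]
            out[i, j] = np.median(window)
    return out

rng = np.random.default_rng(1)
clean = np.tile(np.linspace(0.0, 255.0, 64), (64, 1))  # smooth test image
noisy = clean.copy()
mask = rng.random(clean.shape) < 0.05                  # 5% salt-and-pepper
noisy[mask] = rng.choice([0.0, 255.0], mask.sum())
den = cross_median(noisy)
```

On a smooth image, the five-sample median rejects isolated impulses while leaving the gradient intact, which is why cross-shaped masks can rival square ones at lower cost.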