Orthogonal polynomials and their moments play a significant role in image processing and computer vision. One such family is the discrete Hahn polynomials (DHaPs), which are used for compression and feature extraction. However, when the moment order becomes high, they suffer from numerical instability. This paper proposes a fast approach for computing high-order DHaPs. The work takes advantage of multithreading for the calculation of Hahn polynomial coefficients: to exploit the available processing capabilities, the independent calculations are divided among threads, and a distribution method is provided to achieve a more balanced processing burden across them. The proposed methods are tested for various DHaP parameter values, polynomial sizes, and numbers of threads. In comparison with the unthreaded case, the results demonstrate an improvement in processing time that grows with the polynomial size, reaching a maximum of 5.8× for a polynomial size and order of 8000 × 8000 (matrix size). Furthermore, continuously raising the number of threads to enhance performance is not always effective and becomes invalid at some point, when the performance improvement falls below the maximum. The number of threads that achieves the highest improvement differs with the size, ranging from 8 to 16 threads for a 1000 × 1000 matrix, whereas for the 8000 × 8000 case it ranges from 32 to 160 threads.
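The abstract does not include an implementation, but the thread-level splitting it describes can be illustrated with a minimal Python sketch. The function names (`hahn_column`, `hahn_matrix_threaded`) and the per-column placeholder computation are hypothetical; only the balanced division of independent column computations among a fixed number of threads follows the idea described above.

```python
# Minimal sketch (not the paper's implementation): splitting the independent
# column computations of a polynomial coefficient matrix across a thread pool.
# hahn_column() is a placeholder stand-in for the actual DHaP recurrence.
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def hahn_column(x, order, a, b, N):
    """Placeholder: values of orders 0..order-1 at sample point x.
    A real implementation would evaluate the Hahn three-term recurrence here."""
    col = np.empty(order)
    col[0] = 1.0
    for n in range(1, order):
        col[n] = col[n - 1] * (x - n + a) / (N + b)   # illustrative only
    return col

def hahn_matrix_threaded(order, N, a, b, n_threads=8):
    """Each thread fills a contiguous block of columns; block sizes differ by at
    most one column, giving a simple balanced distribution of the workload."""
    H = np.empty((order, N))

    def fill_block(bounds):
        start, stop = bounds
        for x in range(start, stop):
            H[:, x] = hahn_column(x, order, a, b, N)

    # Balanced split: the first (N % n_threads) blocks get one extra column.
    base, extra = divmod(N, n_threads)
    blocks, start = [], 0
    for t in range(n_threads):
        stop = start + base + (1 if t < extra else 0)
        blocks.append((start, stop))
        start = stop

    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        list(pool.map(fill_block, blocks))
    return H

# Example: an 8000 x 8000 coefficient matrix computed with 32 threads.
# H = hahn_matrix_threaded(order=8000, N=8000, a=10, b=10, n_threads=32)
```

Note that in pure Python the global interpreter lock limits the gain from this sketch; genuine parallel speed-ups of the kind reported above would come from vectorized or compiled per-column routines, or from a process pool instead of threads.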
Interest in intellectual capital and its development is a necessity imposed by the requirements of the times: one cannot imagine an advanced society realizing its productive potential with an inefficient human capital base. Moreover, the work environment changes constantly, placing the management of financial companies before a continual challenge of coping with new developments in this changing environment, a challenge that cannot be met unless these companies possess qualified human resources and an appropriate organizational culture. This gives rise to the research problem, expressed in the following two questions:
- Does intellectual capital have a specific financial and …
Objectives: This paper is an attempt to evaluate the services provided by private hospitals and to identify the strengths and weaknesses in their performance. The results can be utilized in drawing conclusions and recommendations to improve and activate the role of the private medical sector in society.
Methodology: A questionnaire was designed for this purpose and distributed to (132) beneficiaries, mostly from private hospitals in Baghdad.
Results: The paper has come out with many important results, among them the following:
* Those who benefit from the services provided by private hospitals believe that the good performance of such hospitals is due not to the medical services alone but also to the scientific aspect.
Abstract
Research aims: The aim of the research is to evaluate the reality of the inspection teams' work in the health institutions belonging to the Dhi-Qar health office.
Purpose: This research seeks to present a point of view based on assessing the extent of health service quality in Dhi-Qar governorate and to discover the role of the inspection teams in enhancing the health service.
Design / Methodology / Approach: The experimental method has been used, and a questionnaire has been used to collect data in order to develop a reliable and valid measurement model for the research's variables. The research's hypotheses have been tested using several statistical treatments.
The subject of the Internet of Things is very important, especially at present, which is why it has attracted the attention of researchers and scientists due to its importance in human life: through it, a person can do many things easily, accurately, and in an organized manner. The research addresses important topics, most notably the concept of the Internet of Things, the history of its emergence and development, the reasons for the interest in it and its importance, and its most prominent advantages and characteristics. The research sheds light on the structure of the Internet of Things and its most important structural components. The research also deals with the most important search engines in the Internet …
Compressing speech reduces data storage requirements, thereby reducing the time needed to transmit digitized speech over long-haul links such as the Internet. To obtain the best performance in speech compression, wavelet transforms require filters that combine a number of desirable properties, such as orthogonality and symmetry. The MCT basis functions are derived from the GHM basis functions using 2D linear convolution. The fast computation methods introduced here add desirable features to the current transform. We further assess the performance of the MCT in the speech compression application. This paper discusses the effect of using the DWT and the MCT (one- and two-dimensional) on speech compression. DWT and MCT performances in terms of compression …
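As a rough illustration of the DWT-based compression the abstract evaluates, the sketch below thresholds the wavelet coefficients of a speech-like signal and reconstructs it. The MCT/GHM multiwavelet is not available in common libraries, so a standard Daubechies wavelet (`db4`) and the `keep_ratio` parameter are assumptions made for illustration only.

```python
# Sketch of DWT-based speech compression by coefficient thresholding.
import numpy as np
import pywt

def compress_speech(signal, wavelet="db4", level=4, keep_ratio=0.1):
    """Keep only (roughly) the largest `keep_ratio` fraction of DWT coefficients."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    flat = np.concatenate([np.abs(c) for c in coeffs])
    thresh = np.quantile(flat, 1.0 - keep_ratio)     # data-driven threshold
    return [pywt.threshold(c, thresh, mode="hard") for c in coeffs]

def reconstruct(coeffs, wavelet="db4"):
    return pywt.waverec(coeffs, wavelet)

# Example with a synthetic "speech" signal (1 s at 8 kHz):
fs = 8000
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 300 * t) + 0.3 * np.random.randn(fs)
x_hat = reconstruct(compress_speech(x))[: len(x)]
snr = 10 * np.log10(np.sum(x**2) / np.sum((x - x_hat) ** 2))
print(f"Reconstruction SNR: {snr:.1f} dB")
```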
The penalized least squares method is a popular method for dealing with high-dimensional data, where the number of explanatory variables is larger than the sample size. The properties of the penalized least squares method are high prediction accuracy and the ability to perform estimation and variable selection at once. The penalized least squares method gives a sparse model, that is, a model with few variables, which can therefore be interpreted easily. However, penalized least squares is not robust, meaning it is very sensitive to the presence of outlying observations. To deal with this problem, a robust loss function can be used to obtain a robust penalized least squares method, and thereby a robust penalized estimator, and …
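A small sketch of this idea, under assumptions not stated in the abstract (a Lasso fit standing in for ordinary penalized least squares, and scikit-learn's Huber-loss regressor with an L2 penalty standing in for the robust penalized estimator), shows how an outlier-contaminated, high-dimensional fit behaves under the two losses:

```python
# Compare an ordinary penalized (Lasso) fit with a robust alternative on data
# containing gross outliers; the abstract's exact loss/penalty may differ.
import numpy as np
from sklearn.linear_model import Lasso, HuberRegressor

rng = np.random.default_rng(0)
n, p = 50, 100                       # high-dimensional: p > n
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = [3, -2, 1.5, 2, -1]       # sparse true coefficients
y = X @ beta + 0.5 * rng.standard_normal(n)
y[:5] += 25                          # a few gross outliers

lasso = Lasso(alpha=0.1).fit(X, y)                          # penalized LS
huber = HuberRegressor(alpha=0.1, max_iter=500).fit(X, y)   # robust loss + penalty

for name, est in [("Lasso", lasso), ("Huber", huber)]:
    err = np.linalg.norm(est.coef_ - beta)
    print(f"{name:5s} coefficient error: {err:.2f}")
```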
Diabetes is one of the increasingly common chronic diseases, affecting millions of people around the world. Diabetes diagnosis, prediction, proper treatment, and management are therefore essential. Machine learning-based prediction techniques for diabetes data analysis can help in the early detection and prediction of the disease and of its consequences, such as hypo-/hyperglycemia. In this paper, we explore a diabetes dataset collected from the medical records of one thousand Iraqi patients. We apply three classifiers: the multilayer perceptron, KNN, and Random Forest. We carry out two experiments: the first experiment uses all 12 features of the dataset, where Random Forest outperforms the others with 98.8% accuracy. The second experiment uses only five attributes …
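The classifier comparison described above can be sketched with scikit-learn as follows; the file name `diabetes_records.csv`, the `CLASS` label column, the 80/20 split, and all hyperparameters are assumptions, since the abstract does not specify them.

```python
# Sketch of the comparison: MLP, KNN, and Random Forest on a tabular dataset.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

df = pd.read_csv("diabetes_records.csv")        # hypothetical file name
X, y = df.drop(columns="CLASS"), df["CLASS"]    # "CLASS" label column is assumed
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

models = {
    "MLP": make_pipeline(StandardScaler(), MLPClassifier(max_iter=1000, random_state=42)),
    "KNN": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=42),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: accuracy = {accuracy_score(y_te, model.predict(X_te)):.3f}")
```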
