Mersing is one of the locations with potential for wind power development in Malaysia, and researchers often suggest it as an ideal site for generating electricity from wind. However, before a location is chosen, several factors need to be considered: analyzing the site in advance avoids wasting resources and maximizes returns for the parties involved. This study identifies the wind speed distribution of Mersing and determines the optimal averaging interval for wind speed. The study matters because the wind speed data of any region has its own distribution, which changes daily and by season, and no standard has been established for selecting the averaging interval used in wind studies. The wind speed data were averaged over 1, 10, 30, and 60 minutes to find the optimal averaging interval, and goodness of fit was assessed with the Kolmogorov-Smirnov and Chi-Square tests. The findings show that the wind speed distribution in Mersing varies with the averaging interval used and that the best-fit distribution is the Generalized Gamma, while the optimal averaging interval is 10 minutes, as it gives the results most similar to the 1-minute data. These choices affect the reliability of the findings, the accuracy of the estimation, and the decisions made; characterizing the wind distribution of a particular area this way therefore makes it more accurate.
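A minimal sketch of the procedure the abstract describes: resample a wind-speed series to each averaging interval, fit a Generalized Gamma distribution, and score the fit with a Kolmogorov-Smirnov test. The synthetic series, the column name, and the fitted parameters are illustrative assumptions, not the Mersing data.

```python
# Sketch: fit Gen. Gamma to wind-speed averages and run a KS test.
import numpy as np
import pandas as pd
from scipy import stats

# Hypothetical 1-minute wind-speed series indexed by timestamp.
idx = pd.date_range("2020-01-01", periods=7 * 24 * 60, freq="min")
speed = pd.Series(
    stats.gengamma.rvs(a=2.0, c=1.5, scale=3.0, size=len(idx), random_state=0),
    index=idx, name="speed")

# Average the series to the intervals compared in the study.
for minutes in (1, 10, 30, 60):
    avg = speed.resample(f"{minutes}min").mean().dropna()
    params = stats.gengamma.fit(avg, floc=0)          # fit Gen. Gamma
    ks = stats.kstest(avg, "gengamma", args=params)   # goodness of fit
    print(f"{minutes:>2}-min average: KS statistic = {ks.statistic:.4f}")
```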
Service companies, including insurance companies, constantly work to increase their competitiveness and efficiency in order to survive, grow, and develop now and in the future, especially given the sharp rise in competition after the emergence of private companies from 2003 onward. Human capital has become the focus of recent organizational studies, because an organization's success, excellence, and achievement of its objectives depend on people. This does not mean that the other components are unimportant, but they too rely on people, which has increased interest in finding ways to guide employees' behaviors and values in line with the organization's strategy …
Purpose: The research aims to estimate models representing phenomena that follow the logic of circular (angular) data, accounting for the 24-hour periodicity in measurement. Theoretical framework: The regression model is developed to account for the periodic nature of the circular scale, considering the periodicity in the dependent variable y, the explanatory variables x, or both. Design/methodology/approach: Two estimation methods were applied: a parametric model, represented by the Simple Circular Regression (SCR) model, and a nonparametric model, represented by the Nadaraya-Watson Circular Regression (NW) model. The analysis used real data from 50 patients at Al-Kindi Teaching Hospital in Baghdad. Findings: The Mean Circular Error …
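A minimal sketch of a Nadaraya-Watson estimator with a von Mises kernel for a circular predictor, the nonparametric approach the abstract names; kappa plays the role of the smoothing parameter. The synthetic angles and response below stand in for the hospital data, which is not reproduced here.

```python
# Sketch: Nadaraya-Watson regression on a circular (angular) scale.
import numpy as np

def nw_circular(theta_grid, theta_obs, y_obs, kappa=5.0):
    """Nadaraya-Watson estimate of E[y | theta] with a von Mises kernel."""
    # Von Mises weights: exp(kappa * cos(angular difference)).
    diffs = theta_grid[:, None] - theta_obs[None, :]
    w = np.exp(kappa * np.cos(diffs))
    return (w @ y_obs) / w.sum(axis=1)

# Hypothetical data: 24-hour clock times mapped to angles on [0, 2*pi).
rng = np.random.default_rng(1)
theta = rng.uniform(0, 2 * np.pi, size=50)           # 50 "patients"
y = np.sin(theta) + rng.normal(scale=0.2, size=50)   # response

grid = np.linspace(0, 2 * np.pi, 200)
y_hat = nw_circular(grid, theta, y, kappa=8.0)       # smoothed curve
```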
The influence of culture on accounting systems and practices, including financial reports and accounting information, operates through the values identified by Gray and derived from social-cultural values; the four accounting values derived from generally accepted accounting principles are Conservatism, Uniformity, Secrecy, and Professionalism. These values are important and significant in maximizing financial performance, and this research measures the extent of their role in improving financial performance through attention to the values of accounting culture …
In data mining, classification is a form of data analysis that can be used to extract models describing important data classes. Two well-known algorithms used in data mining classification are the Backpropagation Neural Network (BNN) and Naïve Bayesian (NB). This paper investigates the performance of these two classification methods using the Car Evaluation dataset. Two models were built, one for each algorithm, and the results were compared. Our experimental results indicated that the BNN classifier yields higher accuracy than the NB classifier but is less efficient, since it is time-consuming and difficult to analyze due to its black-box implementation.
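A minimal sketch of the comparison using scikit-learn, assuming the UCI Car Evaluation data has been saved locally as "car.csv" with its standard seven columns; the hidden-layer size and split are illustrative choices, not the paper's settings.

```python
# Sketch: backpropagation network vs. Naive Bayes on Car Evaluation.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import OrdinalEncoder
from sklearn.naive_bayes import CategoricalNB
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

cols = ["buying", "maint", "doors", "persons", "lug_boot", "safety", "class"]
df = pd.read_csv("car.csv", names=cols)     # assumed local copy of the data

X = OrdinalEncoder().fit_transform(df[cols[:-1]])   # categorical -> codes
y = df["class"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [
    ("Naive Bayes", CategoricalNB()),
    ("Backprop NN", MLPClassifier(hidden_layer_sizes=(32,),
                                  max_iter=500, random_state=0)),
]:
    model.fit(X_tr, y_tr)
    acc = accuracy_score(y_te, model.predict(X_te))
    print(f"{name}: accuracy = {acc:.3f}")
```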
Anomaly detection is still a difficult task. To address this problem, we propose to strengthen the DBSCAN algorithm by converting all data into a concept frame graph (CFG). As is well known, DBSCAN groups data points of the same kind into clusters, while points that fall outside the behavior of any cluster are treated as noise or anomalies. DBSCAN can thus detect abnormal points that lie beyond a set distance threshold (extreme values). However, not all anomalies are cases that are unusual or far from a specific group; there is a type of data that does not recur, yet is considered abnormal relative to the known group. The analysis showed that DBSCAN using the …
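A minimal sketch of the baseline behavior the abstract builds on: DBSCAN labels points belonging to no cluster as noise (label -1), and those points are flagged as anomalies. The data, eps, and min_samples below are illustrative, not the paper's configuration, and the CFG extension is not shown.

```python
# Sketch: treat DBSCAN noise points as anomalies.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(2)
# Two dense clusters plus a few scattered outliers.
cluster_a = rng.normal(loc=0.0, scale=0.3, size=(100, 2))
cluster_b = rng.normal(loc=5.0, scale=0.3, size=(100, 2))
outliers = rng.uniform(low=-3, high=8, size=(5, 2))
X = np.vstack([cluster_a, cluster_b, outliers])

labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(X)
anomalies = X[labels == -1]        # points assigned to no cluster
print(f"{len(anomalies)} points flagged as anomalies")
```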
This paper presents a fast and robust approach to English text encryption and decryption based on the Pascal matrix. The technique encrypts Arabic or English text, or both, and shows the result of applying the method to plain text (the original message): the intelligible plain text is made unintelligible in order to secure information from unauthorized access and theft. An encryption scheme usually uses a pseudo-random encryption key generated by an algorithm; here, all of this is done using the Pascal matrix. Encryption and decryption are implemented in MATLAB, with Notepad++ used to write the input text.
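A minimal Python sketch of the underlying idea: character codes are grouped into blocks and multiplied by a Pascal matrix, whose determinant is 1, so it has an exact integer inverse and decryption is lossless. The block size and zero-padding are illustrative assumptions; this is a toy rendering of the concept, not the paper's MATLAB scheme.

```python
# Sketch: block encryption of text with a Pascal matrix.
import numpy as np
from scipy.linalg import pascal, invpascal

N = 4                                                # block size (assumption)
P = pascal(N, kind="lower").astype(np.int64)         # integer Pascal matrix
P_inv = invpascal(N, kind="lower").astype(np.int64)  # exact integer inverse

def encrypt(text):
    codes = [ord(c) for c in text]
    codes += [0] * (-len(codes) % N)     # zero-pad to a full block
    blocks = np.array(codes, dtype=np.int64).reshape(-1, N)
    return blocks @ P.T                  # integer cipher blocks

def decrypt(cipher):
    blocks = cipher @ P_inv.T            # exact inverse transform
    return "".join(chr(int(v)) for v in blocks.ravel() if v != 0)  # drop padding

cipher = encrypt("Pascal")
print(decrypt(cipher))                   # -> "Pascal"
```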
In data transmission, a change in a single bit of the received data may lead to misunderstanding or a disaster. Each bit in the sent information has high priority, especially for information such as the address of the receiver. Detecting an error on every single bit change is therefore a key issue in the data transmission field.
The ordinary single-parity detection method can detect an odd number of errors efficiently but fails with an even number of errors. Other detection methods, such as two-dimensional parity and checksum, showed better results but still failed to cope with an increasing number of errors.
Two novel methods were suggested to detect binary bit-change errors when transmitting data over a noisy medium. Those methods were: the 2D-Checksum method …
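A minimal sketch of the contrast the abstract sets up: single-bit parity misses an even number of flipped bits, while 2D (row/column) parity still catches them. The 8x8 block layout and the injected error pattern are illustrative assumptions, not the paper's suggested methods.

```python
# Sketch: single parity vs. 2D parity on a bit block.
import numpy as np

def single_parity(bits):
    """Even-parity bit over the whole message."""
    return int(np.sum(bits) % 2)

def two_d_parity(block):
    """Per-row and per-column even-parity bits for a 2D bit block."""
    return block.sum(axis=1) % 2, block.sum(axis=0) % 2

rng = np.random.default_rng(3)
block = rng.integers(0, 2, size=(8, 8))
rows, cols = two_d_parity(block)

# Flip two bits: single parity cannot see an even number of errors...
corrupted = block.copy()
corrupted[1, 2] ^= 1
corrupted[4, 6] ^= 1
print(single_parity(block.ravel()) == single_parity(corrupted.ravel()))  # True: undetected

# ...but the affected row and column parities change, so 2D parity detects it.
r2, c2 = two_d_parity(corrupted)
print(not (np.array_equal(rows, r2) and np.array_equal(cols, c2)))       # True: detected
```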
Well drilling rarely proceeds without problems; the solution lies in managing and evaluating these problems and developing strategies to control and scale them down. Non-productive time (NPT) is one of the main causes of delayed drilling operations: many events can halt drilling operations or marginally slow the advancement of drilling, and this lost time is called NPT. Reducing NPT has an important impact on total expenditure, since time and cost are among the most important success factors in the oil industry. In other words, steps must be taken to investigate and eliminate the loss of time, that is, non-productive time on the drilling rig, in order to save time and cost and reduce waste. The data of …
It is well known that a distinguished scholarly journal is a crucial cornerstone contributing to the scientific integrity of an academic institution. The establishment of the Al-Kindy College of Medicine (AKCM), University of Baghdad, in 1998 created the need to issue the Al-Kindy College Medical Journal (KCMJ).