Mersing is one of the locations in Malaysia with potential for wind power development, and researchers often suggest it as an ideal place for generating electricity from wind. However, before a location is chosen, several factors need to be considered. By analyzing the location ahead of time, resource waste can be avoided and maximum profitability for the various parties can be realized. This study focuses on identifying the wind speed distribution of Mersing and determining the optimal averaging interval for wind speed. The study is important because the wind speed data of any region has its own distribution, which changes daily and seasonally, and no standard has been established for selecting the averaging interval used in wind studies. The wind speed data are averaged over 1, 10, 30, and 60 minutes to find the optimal averaging interval. The Kolmogorov-Smirnov and Chi-Square tests are used to assess goodness of fit. The findings show that the wind speed distribution in Mersing varies with the averaging interval used and that the best-fitting distribution is the generalized Gamma. The optimal averaging interval is 10 minutes, as it gives results most similar to the 1-minute data. These choices affect the reliability of the findings, the accuracy of the estimates, and the decisions made. Implementing this approach is therefore significant for characterizing the wind distribution of a particular area more accurately.
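As an illustration of the distribution-fitting step described above, the hedged sketch below fits a generalized Gamma distribution to a wind speed series and checks it with the Kolmogorov-Smirnov test using SciPy. The synthetic series and the 10-minute reshaping are assumptions for illustration, not the study's actual Mersing dataset.

```python
import numpy as np
from scipy import stats

# Hypothetical 1-minute wind speed series (m/s); the real study uses Mersing data.
rng = np.random.default_rng(0)
wind_1min = stats.gengamma.rvs(a=2.0, c=1.5, scale=3.0, size=10_000, random_state=rng)

# Average the 1-minute series over a 10-minute window, as in the study design.
wind_10min = wind_1min[: len(wind_1min) // 10 * 10].reshape(-1, 10).mean(axis=1)

# Fit a generalized Gamma distribution and evaluate it with the Kolmogorov-Smirnov test.
params = stats.gengamma.fit(wind_10min, floc=0)
ks_stat, p_value = stats.kstest(wind_10min, "gengamma", args=params)
print(f"Gen. Gamma fit parameters: {params}")
print(f"KS statistic = {ks_stat:.4f}, p-value = {p_value:.4f}")
```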
This paper considers the maximum number of weekly cases and deaths caused by the COVID-19 pandemic in Iraq from its outbreak in February 2020 until 1 July 2022. Several probability distributions were fitted to the data, maximum likelihood estimates were obtained, and goodness-of-fit tests were performed. Results revealed that the maximum weekly cases were best fitted by the Dagum distribution, which was accepted by three goodness-of-fit tests. The generalized Pareto distribution best fitted the maximum weekly deaths and was also accepted by the goodness-of-fit tests. The statistical analysis was carried out using the Easy-Fit software and Microsoft Excel 2019.
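A minimal sketch of the maximum-likelihood fitting and goodness-of-fit check for the weekly-maxima series, using SciPy's generalized Pareto distribution (the Dagum distribution is not in SciPy, so only the deaths model is illustrated). The numbers below are placeholder values, not the Iraqi surveillance data.

```python
import numpy as np
from scipy import stats

# Hypothetical weekly maxima of COVID-19 deaths; the study uses Iraqi data for 2020-2022.
weekly_max_deaths = np.array([12, 35, 48, 73, 110, 95, 60, 150, 210, 180,
                              130, 90, 75, 55, 40, 160, 240, 200, 120, 80])

# Fit a generalized Pareto distribution by maximum likelihood and test the fit.
shape, loc, scale = stats.genpareto.fit(weekly_max_deaths)
ks_stat, p_value = stats.kstest(weekly_max_deaths, "genpareto", args=(shape, loc, scale))
print(f"GPD fit: shape={shape:.3f}, loc={loc:.1f}, scale={scale:.1f}")
print(f"KS statistic = {ks_stat:.4f}, p-value = {p_value:.4f}")
```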
In the last two decades, networks have changed in response to rapidly changing requirements. Current Data Center Networks contain large numbers of hosts (tens of thousands) with demanding bandwidth needs as cloud networking and multimedia content computing grow. Conventional Data Center Networks (DCNs) are strained by the increasing number of users and bandwidth requirements and consequently face many implementation limitations. Current networking devices, with their coupled control and forwarding planes, result in network architectures that are not suitable for dynamic computing and storage needs. Software Defined Networking (SDN) was introduced to change this notion of traditional networks by decoupling control and
Data encryption translates data into another form or symbol so that only people with access to the secret key or a password can read it. Encrypted data are generally referred to as ciphertext, while unencrypted data are known as plaintext. Entropy can be used as a measure of the number of bits needed to encode the data of an image; as the pixel values of an image are spread over more gray levels, the entropy increases. The aim of this research is to compare CAST-128 with a proposed adaptive key against RSA encryption for video frames, to determine which method is more accurate and yields the higher entropy. The first method is achieved by applying the "CAST-128" and
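A minimal sketch of the entropy measure described above: Shannon entropy over the gray-level histogram of an image, computed with NumPy. The random frame is a stand-in assumption, not one of the study's video frames.

```python
import numpy as np

def image_entropy(gray_image: np.ndarray) -> float:
    """Shannon entropy (bits per pixel) of an 8-bit grayscale image."""
    counts = np.bincount(gray_image.ravel(), minlength=256)
    probs = counts / counts.sum()
    probs = probs[probs > 0]                     # ignore empty gray levels
    return float(-(probs * np.log2(probs)).sum())

# Hypothetical 8-bit frame; a uniform random frame approaches the maximum of 8 bits/pixel.
frame = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)
print(f"Entropy: {image_entropy(frame):.3f} bits/pixel")
```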
Data compression offers an attractive approach to reducing communication costs by using the available bandwidth effectively, so it makes sense to pursue research on developing algorithms that use the available network most effectively. It is also important to consider security, since the data being transmitted are vulnerable to attacks. The basic aim of this work is to develop a module that combines compression and encryption on the same data set, performing the two operations simultaneously. This is achieved by embedding encryption into compression algorithms, since cryptographic ciphers and entropy coders bear a certain resemblance in the sense of secrecy. First, in the secure compression module, the given text is p
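For contrast with the embedded scheme the abstract describes, here is a hedged sketch of the naive sequential baseline (compress first, then encrypt) using the standard-library zlib module and the `cryptography` package's Fernet cipher; it is not the paper's combined module, only a reference point under assumed tooling.

```python
import zlib
from cryptography.fernet import Fernet

# Naive sequential baseline: compress first, then encrypt the compressed bytes.
# (The paper's module instead embeds encryption inside the compression step itself.)
key = Fernet.generate_key()
cipher = Fernet(key)

plain_text = b"some repetitive sample text " * 100
compressed = zlib.compress(plain_text, level=9)
cipher_text = cipher.encrypt(compressed)

# Receiver side: decrypt, then decompress.
recovered = zlib.decompress(cipher.decrypt(cipher_text))
assert recovered == plain_text
print(f"original={len(plain_text)} B, compressed={len(compressed)} B, encrypted={len(cipher_text)} B")
```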
In this study, we briefly review the ARIMA(p, d, q), EWMA, and DLM (dynamic linear modelling) procedures in order to accommodate the autocorrelation structure of the data. We consider recursive estimation and prediction algorithms based on Bayes and Kalman filtering (KF) techniques for correlated observations. We investigate the effect of these procedures on the MSE and compare them using generated data.
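To make the recursive estimation step concrete, the sketch below runs the standard one-dimensional Kalman filter predict/update recursion on a simulated random-walk-plus-noise series and reports the MSE against the true state; the model and noise variances are illustrative assumptions, not those of the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated correlated observations: a local-level (random-walk) state observed with noise.
n, q_var, r_var = 200, 0.05, 1.0
state = np.cumsum(rng.normal(0, np.sqrt(q_var), n))      # latent random-walk state
obs = state + rng.normal(0, np.sqrt(r_var), n)           # noisy observations

# One-dimensional Kalman filter: predict then update at each time step.
x_hat, p = 0.0, 1.0
estimates = []
for y in obs:
    # Predict step (random-walk transition).
    x_pred, p_pred = x_hat, p + q_var
    # Update step with the new observation.
    k_gain = p_pred / (p_pred + r_var)
    x_hat = x_pred + k_gain * (y - x_pred)
    p = (1.0 - k_gain) * p_pred
    estimates.append(x_hat)

mse = np.mean((np.array(estimates) - state) ** 2)
print(f"Filtered MSE against the true state: {mse:.4f}")
```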
Using the photo-electrochemical etching (PEC) technique, porous silicon (PS) layers were produced on n-type silicon (Si) wafers with (111) orientation. Etching times of 5, 10, and 15 min were investigated. X-ray diffraction experiments revealed differences between the surface of the as-received wafer and the synthesized porous silicon; the largest crystallite size is 30 nm and the smallest is 28.6 nm. Atomic Force Microscopy (AFM) and Field Emission Scanning Electron Microscopy (FESEM) were used to study the morphology of the porous silicon layer. As etching time increased, AFM findings showed that the root mean square (RMS) roughness and po
In this paper, the time-fractional Fisher's equation (TFFE) is considered to examine the analytical solution using the Laplace q-homotopy analysis method (Lq-HAM). The Lq-HAM is a combined form of the q-homotopy analysis method (q-HAM) and the Laplace transform. The aim of utilizing the Laplace transform is to overcome the shortcomings mainly caused by unfulfilled conditions in other analytical methods. The results show that the analytical solution converges very rapidly to the exact solution.
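For reference, a commonly used form of the time-fractional Fisher's equation with a Caputo time derivative of order α is sketched below; the exact form, coefficients, and initial condition treated in the paper may differ.

```latex
% Time-fractional Fisher's equation (Caputo derivative of order 0 < \alpha \le 1);
% the specific form and initial condition used in the paper may differ.
\[
  {}^{C}D_{t}^{\alpha} u(x,t) \;=\; \frac{\partial^{2} u(x,t)}{\partial x^{2}}
  \;+\; u(x,t)\,\bigl(1 - u(x,t)\bigr),
  \qquad 0 < \alpha \le 1 .
\]
```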
A skip list data structure is essentially a probabilistic simulation of a binary search tree. Skip list algorithms are simpler, faster, and use less space. This data structure conceptually uses parallel sorted linked lists. Searching in a skip list is more involved than searching in a regular sorted linked list, because a skip list is a two-dimensional data structure, implemented as a two-dimensional network of nodes with four pointers. The search, insert, and delete operations take expected time of up to O(log n). The skip list can be modified to implement the order-statistic operations RANK and SEARCH-BY-RANK while maintaining the same expected time. Keywords: skip list, parallel linked list, randomized algorithm, rank.
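As an illustration of the layered-linked-list idea the abstract describes, here is a hedged Python sketch of a skip list with probabilistic level promotion and an expected O(log n) search; it is a generic textbook implementation, not the paper's four-pointer variant.

```python
import random

class Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)   # one forward pointer per level

class SkipList:
    MAX_LEVEL, P = 16, 0.5

    def __init__(self):
        self.head = Node(None, self.MAX_LEVEL)
        self.level = 0

    def _random_level(self):
        lvl = 0
        while random.random() < self.P and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def insert(self, key):
        update = [self.head] * (self.MAX_LEVEL + 1)
        node = self.head
        for i in range(self.level, -1, -1):           # walk down from the top level
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = Node(key, lvl)
        for i in range(lvl + 1):                      # splice the new node in
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new

    def search(self, key):
        node = self.head
        for i in range(self.level, -1, -1):           # expected O(log n) descent
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
        node = node.forward[0]
        return node is not None and node.key == key

sl = SkipList()
for k in [3, 7, 1, 9, 4]:
    sl.insert(k)
print(sl.search(7), sl.search(5))   # True False
```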
The intensity distribution of the comet Ison C/2013 is studied by taking its histogram. This distribution reveals four distinct regions corresponding to the background, tail, coma, and nucleus. One-dimensional temperature distribution fitting is achieved using two mathematical equations related to the coordinates of the comet's center. The quiver plot of the comet's intensity gradient clearly shows the arrows pointing towards the maximum intensity of the comet.
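A hedged sketch of the histogram and gradient-quiver analysis described above, applied to a synthetic 2-D Gaussian intensity peak as a stand-in for the comet image (the real analysis uses the observed comet frames).

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic stand-in for the comet image: a 2-D Gaussian intensity peak.
y, x = np.mgrid[0:200, 0:200]
intensity = np.exp(-((x - 120) ** 2 + (y - 90) ** 2) / (2 * 25.0 ** 2))

# Histogram of the intensity values (analogue of the distribution analysis).
counts, edges = np.histogram(intensity.ravel(), bins=64)

# Quiver plot of the intensity gradient; arrows point towards the peak intensity.
gy, gx = np.gradient(intensity)
step = 10
plt.quiver(x[::step, ::step], y[::step, ::step],
           gx[::step, ::step], gy[::step, ::step])
plt.title("Gradient field of the synthetic intensity map")
plt.show()
```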