A simple setup for a random number generator is proposed. The random number generation is based on shot-noise fluctuations in a p-i-n photodiode. These fluctuations, known as shot noise, form a stationary random process whose statistical properties reflect the Poisson statistics of the photon stream. Shot noise originates in the quantum nature of light and is related to vacuum fluctuations. Two photodiodes were used and their shot-noise fluctuations were subtracted; the difference was applied to a comparator to obtain the random sequence.
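As a rough illustration of this scheme, the following is a minimal numerical sketch, assuming Poisson photon-counting statistics for each photodiode and a zero-threshold comparator; the mean photon count and sample size are illustrative, and a software pseudo-random generator only stands in for the physical noise source.

```python
import numpy as np

rng = np.random.default_rng(1)  # classical PRNG, used here only to emulate the physics

def shot_noise_bits(n_bits, mean_photons=1e4):
    """Emulate the described scheme: two independent shot-noise-limited
    photocurrents are subtracted and the sign of the difference is sampled
    by a comparator, yielding one bit per sample."""
    i1 = rng.poisson(mean_photons, n_bits)   # photodiode 1 photon counts
    i2 = rng.poisson(mean_photons, n_bits)   # photodiode 2 photon counts
    diff = i1.astype(np.int64) - i2.astype(np.int64)
    # Comparator output; exact ties (diff == 0) would introduce a small bias
    # and in practice would be discarded or resolved by the electronics.
    return (diff > 0).astype(np.uint8)

bits = shot_noise_bits(100_000)
print("fraction of ones:", bits.mean())      # should be close to 0.5
```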
The present paper addresses the problem of estimating system reliability in the stress-strength model, where stress and strength are independent, non-identically distributed, and follow the Lomax distribution. Several shrinkage estimation methods were employed in this context, based on the maximum likelihood method, the method of moments, and shrinkage weight factors, and they were evaluated by Monte Carlo simulation. Comparisons among the suggested estimators were made using the mean absolute percentage error (MAPE) criterion, implemented in a MATLAB program.
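To make the setting concrete, here is a minimal Monte Carlo sketch of stress-strength reliability estimation under the Lomax distribution. It assumes a common known scale parameter, so that R = P(stress < strength) = α_stress/(α_strength + α_stress), uses maximum-likelihood shape estimates, and applies a fixed shrinkage weight toward prior guesses; the shape values, weight, and sample sizes are illustrative choices, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(7)

def lomax_sample(alpha, lam, size):
    # Inverse-CDF sampling from F(x) = 1 - (1 + x/lam)**(-alpha)
    u = rng.uniform(size=size)
    return lam * ((1.0 - u) ** (-1.0 / alpha) - 1.0)

def shape_mle(x, lam):
    # MLE of the Lomax shape when the scale lam is known
    return len(x) / np.sum(np.log1p(x / lam))

def reliability(a_strength, a_stress):
    # R = P(stress < strength) for Lomax variates sharing the same scale
    return a_stress / (a_strength + a_stress)

def mape(estimates, truth):
    return np.mean(np.abs(np.asarray(estimates) - truth) / truth) * 100

# illustrative settings: true shapes, scale, sample size, shrinkage weight, prior guesses
a1, a2, lam, n, w, a1_0, a2_0 = 3.0, 1.5, 1.0, 30, 0.6, 2.5, 2.0
R_true = reliability(a1, a2)

est, est_sh = [], []
for _ in range(2000):
    x = lomax_sample(a1, lam, n)                  # strength sample
    y = lomax_sample(a2, lam, n)                  # stress sample
    a1_hat, a2_hat = shape_mle(x, lam), shape_mle(y, lam)
    est.append(reliability(a1_hat, a2_hat))
    # shrink the MLEs toward the prior guesses with a fixed weight w
    est_sh.append(reliability(w * a1_hat + (1 - w) * a1_0,
                              w * a2_hat + (1 - w) * a2_0))

print(f"R_true={R_true:.3f}  MAPE(MLE)={mape(est, R_true):.2f}%  "
      f"MAPE(shrinkage)={mape(est_sh, R_true):.2f}%")
```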
The Weibull distribution is the Type-III case of the Generalized Extreme Value (GEV) family, and it plays a crucial role in modeling extreme events in various fields such as hydrology, finance, and the environmental sciences. Bayesian methods play a strong role in estimating the parameters of the GEV distribution because of their ability to incorporate prior knowledge and to handle small sample sizes effectively. In this research, we compare several shrinkage Bayesian estimation methods based on the squared error and linear exponential (LINEX) loss functions; they were adopted and compared by Monte Carlo simulation. The performance of these methods is assessed in terms of their accuracy and computational efficiency in estimating …
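For reference, the two loss functions named above lead to standard point estimates once posterior draws of a parameter are available: the posterior mean under squared error loss, and -(1/a) log E[exp(-aθ)] under LINEX loss with shape a. The sketch below illustrates this with generic posterior draws; the Gamma draws and the LINEX shape value are placeholders, not the paper's model.

```python
import numpy as np

def bayes_estimates(theta_draws, a=1.0):
    """Point estimates from posterior draws of a parameter theta.

    Squared error loss   -> posterior mean.
    LINEX loss (shape a) -> -(1/a) * log E[exp(-a * theta)].
    """
    theta_draws = np.asarray(theta_draws)
    sel = theta_draws.mean()
    linex = -np.log(np.mean(np.exp(-a * theta_draws))) / a
    return sel, linex

# illustrative posterior draws for, say, a GEV parameter
draws = np.random.default_rng(0).gamma(shape=2.0, scale=0.1, size=10_000)
print(bayes_estimates(draws, a=0.5))
```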
Free-Space Optical (FSO) links can provide high-speed communications when the effect of turbulence is not severe, and Space-Time Block Coding (STBC) is a good candidate for mitigating that turbulence. This paper proposes a hybrid of Optical Code Division Multiple Access (OCDMA) and STBC in FSO communication for last-mile solutions where access to remote areas is complicated. The main weakness affecting an FSO link is atmospheric turbulence, and STBC is employed within OCDMA to mitigate its effects. The current work evaluates the Bit Error Rate (BER) performance of OCDMA operating under scintillation, where this effect is described by the gamma-gamma model. The most obvious finding to emerge from the analysis …
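As a point of reference for the channel model mentioned above, the gamma-gamma model represents the normalized irradiance as the product of two independent unit-mean Gamma variates. The sketch below averages a conditional error rate over such fading draws; the conditional BER expression Q(sqrt(SNR)·I), the turbulence parameters, and the SNR value are illustrative assumptions, not the paper's exact system.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

def gamma_gamma_irradiance(alpha, beta, size):
    """Gamma-gamma turbulence: the normalized irradiance is modelled as the
    product of two independent unit-mean Gamma variates (large- and
    small-scale eddies)."""
    x = rng.gamma(alpha, 1.0 / alpha, size)
    y = rng.gamma(beta, 1.0 / beta, size)
    return x * y

def avg_ber_ook(alpha, beta, snr_db, n=200_000):
    # Illustrative conditional BER Q(sqrt(SNR) * I) for intensity-modulated OOK,
    # averaged over the fading draws (Monte Carlo).
    snr = 10 ** (snr_db / 10)
    irradiance = gamma_gamma_irradiance(alpha, beta, n)
    return norm.sf(np.sqrt(snr) * irradiance).mean()

# illustrative moderate-turbulence parameters
print(avg_ber_ook(alpha=4.0, beta=1.9, snr_db=15))
```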
Symmetric cryptography forms the backbone of secure data communication and storage by relying on the strength and randomness of cryptographic keys; strong, random keys increase complexity, enhance the overall robustness of cryptographic systems, and improve resistance to various attacks. The present work proposes a hybrid model based on the Latin square matrix (LSM) and subtractive random number generator (SRNG) algorithms for producing random keys. The hybrid model enhances the security of the cipher key against different attacks and increases the degree of diffusion, and keys of different lengths can be generated by the algorithm without compromising security. The model comprises two phases. The first phase generates a seed value that depends on producing a random …
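To give a feel for the two ingredients named above, here is a minimal sketch combining a generic lagged subtractive generator (Knuth-style lags 24/55) with a cyclic Latin square to derive key bytes. The lags, square order, and key-derivation walk are illustrative and are not the paper's construction; a toy sketch like this is not cryptographically secure on its own.

```python
def subtractive_rng(seed, m=2**32, lags=(24, 55)):
    """Generic lagged subtractive generator: s[i] = (s[i-24] - s[i-55]) mod m,
    yielding one value per step."""
    short, long_ = lags
    state = [(seed * (i + 1) * 2654435761) % m for i in range(long_)]
    i = 0
    while True:
        state[i % long_] = (state[(i - short) % long_] - state[(i - long_) % long_]) % m
        yield state[i % long_]
        i += 1

def latin_square(n):
    # Cyclic Latin square of order n: row r is the base sequence shifted by r
    return [[(r + c) % n for c in range(n)] for r in range(n)]

def derive_key(seed, key_len=16, n=16):
    """Walk the Latin square along positions chosen by the subtractive
    generator and collect one byte per step as key material."""
    gen, square = subtractive_rng(seed), latin_square(n)
    key = []
    for _ in range(key_len):
        r, c = next(gen) % n, next(gen) % n
        key.append((square[r][c] * 16 + next(gen) % 16) & 0xFF)
    return bytes(key)

print(derive_key(123456).hex())
```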
An edge dominating set X of a graph G is said to be an odd (even) sum degree edge dominating set (osded (esded)-set) of G if the sum of the degrees of all edges in X is an odd (even) number. The odd (even) sum degree edge domination number is the minimum cardinality taken over all odd (even) sum degree edge dominating sets of G, and is defined as zero if no such set exists in G. In this paper, the odd (even) sum degree edge domination concept is extended to the co-dominating set E-T of a graph G, where T is an edge dominating set of G. The corresponding parameters co-odd (even) sum degree edge dominating set, co-odd (even) sum degree edge domination number and co-odd (even) sum degree edge domin…
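As a small illustration of the basic definition, the sketch below brute-forces the minimum size of an edge dominating set whose total edge degree is odd, assuming the usual edge degree deg(u) + deg(v) - 2; the example graph is a path on four vertices and is purely illustrative.

```python
from itertools import combinations

def edge_degree(G, e):
    # Standard edge degree of e = (u, v): number of edges adjacent to e
    u, v = e
    return len(G[u]) + len(G[v]) - 2

def is_edge_dominating(G, edges, X):
    # Every edge not in X must share an endpoint with some edge of X
    covered = {w for (u, v) in X for w in (u, v)}
    return all(u in covered or v in covered
               for (u, v) in edges if (u, v) not in X)

def osded_number(G):
    """Minimum cardinality over edge dominating sets whose total edge degree
    is odd (0 if no such set exists), found by exhaustive search."""
    edges = [(u, v) for u in G for v in G[u] if u < v]
    for k in range(1, len(edges) + 1):
        for X in combinations(edges, k):
            if is_edge_dominating(G, edges, set(X)) and \
               sum(edge_degree(G, e) for e in X) % 2 == 1:
                return k
    return 0

# small example: a path a-b-c-d
G = {'a': {'b'}, 'b': {'a', 'c'}, 'c': {'b', 'd'}, 'd': {'c'}}
print(osded_number(G))   # prints 2 for this path
```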
The tax base is one of the foundations of the technical organization of taxes, and a good choice of tax base affects both the yield of the tax and its fairness. With the expansion of the tax range, a dangerous phenomenon called tax evasion arises; it has come to threaten the economies of countries and prevents the state from achieving the economic, political and social objectives it pursues, which requires resolving the phenomenon, identifying all the human and material potential involved, and recognizing the real reasons that lie behind it. The researcher found that the tax authorities are weak in terms of technical, material and financial capabilities, and the analysis of the data shows that there is a significant revenue …
The problem of waste generated by the implementation of construction projects has been aggravated recently by the construction activity experienced worldwide, and especially in Iraq, which is going through a period of reconstruction. Construction waste represents 20-40% of total generated waste and has a negative effect on the environment and on the economic side of a project. In addition, the construction industry is estimated to consume about 40% of natural resources, so it has become necessary to reduce waste and to manage it well. This study aims to identify the key factors affecting waste management through the various phases of a project, and this is accomplished …
Automatic Programming Assessment (APA) has been gaining considerable attention among researchers, mainly as a way to support the systematic automated grading and marking of students' programming assignments and exercises. APA is commonly identified as a method that can enhance accuracy, efficiency and consistency, as well as provide instant feedback on students' programming solutions. In achieving APA, the test data generation process is very important, since it enables dynamic testing of students' assignments. In the software testing field, much of the research focusing on test data generation has demonstrated the successful adoption of Meta-Heuristic Search Techniques (MHST) to enhance the procedure of deriving adequate test data for efficient t…
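To illustrate the kind of search-based test data generation referred to above, here is a minimal hill-climbing sketch that minimizes a Korel-style branch distance to find inputs exercising a target branch; the target condition, step sizes, and iteration budget are illustrative, and the specific MHST used in the paper may differ.

```python
import random

def branch_distance(x, y):
    """Branch distance for an illustrative target branch `x == y` in the
    program under test: 0 when the branch is taken, otherwise a measure of
    how far the inputs are from taking it."""
    return abs(x - y)

def hill_climb(fitness, dim=2, low=-1000, high=1000, max_iter=10_000):
    # Simple neighbourhood search: accept any neighbour that lowers the fitness
    point = [random.randint(low, high) for _ in range(dim)]
    best = fitness(*point)
    for _ in range(max_iter):
        if best == 0:
            break
        cand = [v + random.choice((-1, 1)) * random.randint(1, 10) for v in point]
        f = fitness(*cand)
        if f < best:
            point, best = cand, f
    return point, best

test_input, score = hill_climb(branch_distance)
print("generated test data:", test_input, "fitness:", score)
```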
Recommender systems are tools for making sense of the huge amount of data available in the internet world. Collaborative filtering (CF) is one of the most widely used knowledge discovery methods in recommender systems. Memory-based collaborative filtering emphasizes using facts about existing users to predict new items for the target user. Similarity measures are the core operations in collaborative filtering, and prediction accuracy depends largely on the similarity calculations. In this study, a combination of weighted parameters and traditional similarity measures is used to calculate the relationships among users over the MovieLens dataset rating matrix. The advantages and disadvantages of each measure are identified. From the study, a n…
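For context on the memory-based approach described above, the sketch below implements a small user-based collaborative filtering predictor with Pearson similarity and a similarity-weighted deviation-from-mean prediction rule over a toy rating matrix; the matrix, the choice of Pearson similarity, and the omission of the study's weighted parameters are illustrative simplifications.

```python
import numpy as np

def pearson_sim(a, b):
    """Pearson correlation over the items both users rated (0 means unrated)."""
    mask = (a > 0) & (b > 0)
    if mask.sum() < 2:
        return 0.0
    x, y = a[mask] - a[mask].mean(), b[mask] - b[mask].mean()
    denom = np.sqrt((x ** 2).sum() * (y ** 2).sum())
    return float(x @ y / denom) if denom else 0.0

def predict(R, user, item):
    """Memory-based CF: similarity-weighted average of the neighbours'
    deviations from their own mean rating."""
    mu = R[user][R[user] > 0].mean()
    num = den = 0.0
    for v in range(R.shape[0]):
        if v == user or R[v, item] == 0:
            continue
        s = pearson_sim(R[user], R[v])
        num += s * (R[v, item] - R[v][R[v] > 0].mean())
        den += abs(s)
    return mu + num / den if den else mu

# toy rating matrix (rows = users, cols = movies, 0 = unrated)
R = np.array([[5, 3, 0, 1],
              [4, 0, 4, 1],
              [1, 1, 5, 4],
              [0, 1, 5, 4]], dtype=float)
print(round(predict(R, user=0, item=2), 2))
```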