Symmetric cryptography forms the backbone of secure data communication and storage by relying on the strength and randomness of cryptographic keys. Strong, random keys increase complexity, enhance the overall robustness of cryptographic systems, and improve resistance to various attacks. The present work proposes a hybrid model based on the Latin square matrix (LSM) and subtractive random number generator (SRNG) algorithms for producing random keys. The hybrid model enhances the security of the cipher key against different attacks and increases the degree of diffusion. Different key lengths can also be generated by the algorithm without compromising security. The model comprises two phases. The first phase generates a seed value by producing a random predefined set of key numbers of size n via Donald E. Knuth's SRNG algorithm (the subtractive method). The second phase uses the output key (or seed value) from the previous phase as input to the Latin square matrix (LSM) to formulate a new random key. To increase the complexity of the generated key, it is XORed with another new random key of the same length that fulfills Shannon's principles of confusion and diffusion. Four test keys for each of the 128, 192, 256, 512, and 1024-bit lengths are used to evaluate the strength of the proposed model. The experimental results and security analyses revealed that all test keys met the statistical standards of the National Institute of Standards and Technology (NIST) and had entropy values exceeding 0.98. The key length of the proposed model for n bits is 25*n, which is large enough to withstand brute-force attacks. Moreover, the generated keys are very sensitive to the initial values, which increases the complexity against different attacks.
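The two-phase pipeline can be illustrated with a minimal sketch. This is not the authors' exact algorithm: it implements a simplified Knuth-style subtractive (lagged-Fibonacci) generator for the seed stream and the final XOR with a second random key, while the Latin-square diffusion step is omitted for brevity. The constants `MBIG` and `MSEED` and the helper `make_key` are illustrative assumptions.

```python
import secrets

MBIG = 1000000000   # modulus for the subtractive generator (illustrative)
MSEED = 161803398   # conventional seed constant for subtractive generators

def knuth_subtractive(seed, count):
    """Simplified Knuth subtractive generator: a lagged-Fibonacci
    recurrence with subtraction over a 55-element state table."""
    # Initialise the state table from the seed.
    mj = (MSEED - seed) % MBIG
    state = [0] * 55
    state[54] = mj
    mk = 1
    for i in range(1, 55):
        ii = (21 * i) % 55 - 1
        state[ii] = mk
        mk = (mj - mk) % MBIG
        mj = state[ii]
    # Warm up the table a few times.
    for _ in range(4):
        for i in range(55):
            state[i] = (state[i] - state[(i + 31) % 55]) % MBIG
    # Emit outputs by subtracting lagged entries.
    out = []
    i, j = 0, 31
    for _ in range(count):
        state[i] = (state[i] - state[j]) % MBIG
        out.append(state[i])
        i, j = (i + 1) % 55, (j + 1) % 55
    return out

def make_key(seed, nbits):
    """Derive an nbits key: one bit per generator output, then XOR
    with a second random key of the same length (confusion step)."""
    words = knuth_subtractive(seed, nbits)
    stream = ''.join(str(w & 1) for w in words)
    mask = format(secrets.randbits(nbits), f'0{nbits}b')
    return ''.join(str(int(a) ^ int(b)) for a, b in zip(stream, mask))
```

The subtractive stream is deterministic given the seed, so small changes to the seed produce an entirely different key after the XOR masking step.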
A condensed study was conducted to compare ordinary estimators, in particular the maximum likelihood estimator and the robust estimator, for estimating the parameters of the first-order mixed model, namely the ARMA(1,1) model.
A simulation study was performed for a variety of model settings using small, moderate, and large sample sizes, where some new results were obtained. The mean absolute percentage error (MAPE) was used as the statistical criterion for comparison.
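The simulation setup can be sketched as follows: generate an ARMA(1,1) series and score fitted values with MAPE. The parameter values and the function names are illustrative assumptions, not the study's actual design.

```python
import random

def simulate_arma11(phi, theta, n, sigma=1.0, seed=0):
    """Generate an ARMA(1,1) series:
    x_t = phi * x_{t-1} + e_t + theta * e_{t-1}, e_t ~ N(0, sigma^2)."""
    rng = random.Random(seed)
    x, x_prev, e_prev = [], 0.0, 0.0
    for _ in range(n):
        e = rng.gauss(0.0, sigma)
        x_t = phi * x_prev + e + theta * e_prev
        x.append(x_t)
        x_prev, e_prev = x_t, e
    return x

def mape(actual, predicted):
    """Mean absolute percentage error: 100 * mean(|(a - p) / a|).
    Assumes no actual value is zero."""
    return 100.0 * sum(abs((a - p) / a)
                       for a, p in zip(actual, predicted)) / len(actual)
```

A small sample size would correspond to, say, `n = 30`, with moderate and large sizes scaled up from there; the estimator with the lower MAPE across replications would be preferred.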
Finding communities of connected individuals in complex networks is challenging, yet crucial for understanding different real-world societies and their interactions. Recently, attention has turned to discovering the dynamics of such communities. However, detecting accurate community structures that evolve over time adds additional challenges. Almost all the state-of-the-art algorithms are designed based on seemingly the same principle, treating the problem as a coupled optimization model to simultaneously identify community structures and their evolution over time. Unlike all these studies, the current work aims to individually consider these three measures, i.e. intra-community score, inter-community score, and evolution of communities over time.
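The first two measures can be given simple illustrative definitions; these are a minimal sketch, not the scores used by the surveyed algorithms: the intra-community score here is the fraction of edges falling inside communities, and the inter-community score is the fraction crossing community boundaries.

```python
def community_scores(edges, labels):
    """Illustrative partition-quality scores.
    edges: list of (u, v) pairs; labels: dict mapping node -> community id.
    Returns (intra_score, inter_score), which sum to 1."""
    intra = sum(1 for u, v in edges if labels[u] == labels[v])
    total = len(edges)
    return intra / total, (total - intra) / total

# Two triangles joined by a single bridge edge.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
labels = {0: 0, 1: 0, 2: 0, 3: 1, 4: 1, 5: 1}
intra, inter = community_scores(edges, labels)
```

A good static partition maximizes the intra-community score while minimizing the inter-community score; tracking how a partition's scores change between network snapshots is one simple way to quantify evolution over time.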
For a given loading, the stiffness of a plate or shell structure can be increased significantly by the addition of ribs or stiffeners. Hitherto, optimization techniques have focused mainly on the sizing of the ribs; the more important issue of identifying the optimum locations of the ribs has received little attention. In this investigation, finite element analysis has been carried out to determine the optimum locations of the ribs for a given set of design constraints. In the conclusion, the author underlines the optimum positions of the ribs or stiffeners that give the best results.
Simulation Study
Abstract:
Robust statistics is known for its resistance to errors resulting from deviations from the assumed statistical hypotheses, offering properties such as approximate unbiasedness and efficiency for data drawn from a wide range of probability distributions, whether a normal distribution or a mixture of distributions with different standard deviations.
The power spectrum function plays a principal role in the analysis of stationary random processes ordered in time, whose random variables may be discrete or continuous, by measuring the distribution of total power as a function of frequency.
Estimation methods share with
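The classical starting point for power spectrum estimation is the periodogram. The following is a minimal sketch of that standard estimator (not necessarily the estimation method studied in this abstract), using a direct discrete Fourier transform for clarity rather than speed.

```python
import cmath
import math

def periodogram(x):
    """Periodogram estimate of the power spectrum of a discrete
    stationary series: I(f_k) = |DFT(x)_k|^2 / N for k = 0..N-1.
    By Parseval's theorem, the estimates sum to the total power
    sum(x_t^2)."""
    n = len(x)
    spec = []
    for k in range(n):
        # Direct DFT at frequency k/n (O(n^2) overall; fine for a sketch).
        s = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        spec.append(abs(s) ** 2 / n)
    return spec
```

For a constant series all the power concentrates at frequency zero, while a white-noise series spreads power roughly evenly across frequencies.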
A novel fractal design scheme has been introduced in this paper to generate microstrip bandpass filter designs with miniaturized sizes for wireless applications. The presented fractal scheme is based on a Minkowski-like prefractal geometry. The space-filling property and self-similarity of this fractal geometry have been found to produce reduced-size symmetrical structures corresponding to the successive iteration levels. The resulting filter designs have sizes suitable for use in modern wireless communication systems. The performance of each of the generated bandpass filter structures up to the 2nd iteration has been analyzed using a method of moments (MoM) based software, IE3D, which is widely adopted in microwave research and in
The Broyden update is one of the rank-one updates used to solve the unconstrained optimization problem, but this update does not guarantee the positive definiteness and symmetry of the Hessian matrix approximation.
In this paper, the positive definiteness and symmetry of the Hessian matrix approximation are guaranteed by updating the vector that represents the difference between the next gradient and the current gradient of the objective function, which is assumed to be twice continuously differentiable. Numerical results are reported comparing the proposed method with the Broyden method on standard problems.
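The classical Broyden rank-one update, and the symmetry loss it causes, can be shown concretely. The sketch below implements only the standard update; the paper's modification of the gradient-difference vector is not reproduced here.

```python
def broyden_update(B, s, y):
    """Broyden's rank-one update of a Hessian/Jacobian approximation B:
        B+ = B + (y - B s) s^T / (s^T s),
    where s is the step and y the gradient difference. B+ satisfies the
    secant condition B+ s = y, but is generally neither symmetric nor
    positive definite, which motivates the paper's modification."""
    n = len(s)
    Bs = [sum(B[i][j] * s[j] for j in range(n)) for i in range(n)]
    r = [y[i] - Bs[i] for i in range(n)]          # residual y - B s
    ss = sum(v * v for v in s)                    # s^T s
    return [[B[i][j] + r[i] * s[j] / ss for j in range(n)]
            for i in range(n)]
```

Starting from the identity with `s = [1, 0]` and `y = [0, 1]`, the updated matrix satisfies the secant condition yet is visibly asymmetric, which is exactly the defect the proposed method removes.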
The rise of edge-cloud continuum computing is a result of the growing significance of edge computing, which has become a complementary or substitute option for traditional cloud services. The convergence of networking and computing presents a notable challenge due to their distinct historical development. Task scheduling is a major challenge in the context of edge-cloud continuum computing. The selection of the execution location of tasks is crucial in meeting the quality-of-service (QoS) requirements of applications. An efficient scheduling strategy for distributing workloads among virtual machines in the edge-cloud continuum data center is mandatory to ensure the fulfilment of QoS requirements for both the customer and the service provider.
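The scheduling problem can be illustrated with a minimal greedy heuristic: assign each task to the virtual machine that would finish it earliest given its current load. This is a simple baseline sketch, not the paper's strategy; the VM names and speed values are hypothetical.

```python
def greedy_schedule(tasks, vms):
    """Earliest-finish-time greedy scheduler.
    tasks: list of (name, length) pairs; vms: dict mapping VM name ->
    processing speed. Returns (assignment, per-VM completion times)."""
    loads = {vm: 0.0 for vm in vms}
    plan = {}
    for task, length in tasks:
        # Pick the VM whose current load plus this task's runtime is smallest.
        vm = min(vms, key=lambda v: loads[v] + length / vms[v])
        loads[vm] += length / vms[vm]
        plan[task] = vm
    return plan, loads

# Hypothetical workload: one slow edge VM, one fast cloud VM.
tasks = [("t1", 4.0), ("t2", 4.0)]
vms = {"edge": 1.0, "cloud": 2.0}
plan, loads = greedy_schedule(tasks, vms)
```

Real edge-cloud schedulers must also weigh network latency, data locality, and energy, which is why the problem is substantially harder than this load-only sketch suggests.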
In this research, we present an idea for setting up the same split-plot experiments in many locations and many periods using a Latin square design. These cases represent a modest contribution to the area of design and analysis of experiments. We have written (theoretically) the general plans and the mathematical models for these experiments, and derived the expected mean squares (EMS) for each source of variation in the analysis-of-variance tables used for the statistical analysis of these experiments.
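The layout device underlying such designs is the Latin square itself: an n-by-n arrangement in which each treatment occurs exactly once per row and once per column. A minimal randomized construction (cyclic square with shuffled row and column offsets; an illustrative sketch, not the paper's plans) looks like this:

```python
import random

def latin_square(treatments, seed=None):
    """Build a randomized n x n Latin square over the given treatments.
    Uses the cyclic square L[i][j] = (r[i] + c[j]) mod n with shuffled
    offsets r and c, which preserves the once-per-row and
    once-per-column property."""
    n = len(treatments)
    rng = random.Random(seed)
    rows, cols = list(range(n)), list(range(n))
    rng.shuffle(rows)
    rng.shuffle(cols)
    return [[treatments[(rows[i] + cols[j]) % n] for j in range(n)]
            for i in range(n)]
```

In the multi-location, multi-period setting described above, rows and columns of the square would absorb, for example, period and location effects, while the split-plot structure is nested within each cell.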
In this research, a group of gray texture images from the Brodatz database was studied by building a feature database of the images using the gray level co-occurrence matrix (GLCM), where the distance between pixels was one unit, for four angles (0°, 45°, 90°, 135°). The k-means classifier was used to classify the images into a group of classes, starting from two up to eight classes, for all angles used in the co-occurrence matrix. The distribution of the images over the classes was examined by comparing every two methods (projection of one class onto another), where the distribution of images was uneven, with one class being dominant. The classification results were studied for all cases using the confusion matrix between every
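The GLCM construction at distance one can be sketched directly; this is a minimal pure-Python version with a single contrast feature, not the study's full feature set, and the example image is hypothetical.

```python
def glcm(image, dx, dy, levels):
    """Gray level co-occurrence matrix for pixel offset (dx, dy) at
    distance one: (1,0) ~ 0 deg, (1,1) ~ 45 deg, (0,1) ~ 90 deg,
    (-1,1) ~ 135 deg. image is a 2-D list of gray levels in
    [0, levels)."""
    h, w = len(image), len(image[0])
    m = [[0] * levels for _ in range(levels)]
    for y in range(h):
        for x in range(w):
            x2, y2 = x + dx, y + dy
            if 0 <= x2 < w and 0 <= y2 < h:
                m[image[y][x]][image[y2][x2]] += 1
    return m

def contrast(m):
    """GLCM contrast feature: sum over (i, j) of p(i, j) * (i - j)^2,
    where p is the normalized co-occurrence matrix."""
    total = sum(sum(row) for row in m)
    n = len(m)
    return sum(m[i][j] * (i - j) ** 2
               for i in range(n) for j in range(n)) / total

# Hypothetical 2x2 two-level texture: alternating vertical stripes.
img = [[0, 1],
       [0, 1]]
m0 = glcm(img, 1, 0, levels=2)   # 0-degree offset
```

Features such as contrast, computed for each of the four angles, form the vectors that the k-means step then clusters into classes.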