Microservice architecture offers many advantages, especially for business applications, due to its flexibility, expandability, and loosely coupled structure, which eases maintenance. However, several disadvantages stem from these same features: the independent nature of microservices can hinder meaningful communication and make data synchronization more challenging. This paper addresses these issues by proposing containerized microservices in an asynchronous event-driven architecture. The architecture encloses microservices in containers and implements an event manager that keeps track of all events in an event log to reduce errors in the application. Experimental results show a decline in response time compared with two other benchmark architectures, as well as a reduction in error rate.
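The event-manager idea described above can be sketched as follows. This is only a minimal illustration, not the paper's implementation: the class name `EventManager`, the topic string, and the payload shape are all hypothetical. Every published event is first appended to an event log, then fanned out asynchronously to subscribers.

```python
import asyncio
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class EventManager:
    """Minimal asynchronous event manager (illustrative sketch): every
    published event is appended to an event log before subscribers run."""
    log: List[tuple] = field(default_factory=list)
    subscribers: Dict[str, List[Callable]] = field(default_factory=dict)

    def subscribe(self, topic: str, handler: Callable) -> None:
        self.subscribers.setdefault(topic, []).append(handler)

    async def publish(self, topic: str, payload: dict) -> None:
        self.log.append((topic, payload))           # durable record of the event
        for handler in self.subscribers.get(topic, []):
            await handler(payload)                  # asynchronous fan-out

async def demo():
    mgr = EventManager()
    received = []

    async def on_order(evt):
        received.append(evt)

    mgr.subscribe("order.created", on_order)
    await mgr.publish("order.created", {"id": 1})
    return mgr.log, received

log, received = asyncio.run(demo())
```

Because the log is written before any handler runs, a failed subscriber can be replayed from the log rather than losing the event.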
Mixture experiments model a response variable as a function of the proportions of the components of the mixture. In our research we compare the Scheffé model with the Kronecker model for mixture experiments, especially when the experimental region is restricted. Mixture data suffer from high correlation and multicollinearity among the explanatory variables, which affects the calculation of the Fisher information matrix of the regression model. To estimate the parameters of the mixture model, we used the generalized inverse and the stepwise regression procedure.
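For orientation (these formulas are standard forms, not taken from the abstract), the second-degree Scheffé and Kronecker representations for a mixture of $q$ components with proportions $x_i$ are commonly written as:

```latex
% Scheffé second-degree mixture model, subject to the mixture constraint
E(y) = \sum_{i=1}^{q} \beta_i x_i + \sum_{i<j} \beta_{ij} x_i x_j,
\qquad \sum_{i=1}^{q} x_i = 1
% Kronecker (homogeneous quadratic) representation of the same surface
E(y) = \sum_{i=1}^{q} \sum_{j=1}^{q} \theta_{ij} x_i x_j
```

The two models span the same response surface but parameterize it differently, which matters for conditioning when the experimental region is restricted.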
This paper is concerned with pre-test single and double stage shrunken estimators for the mean (θ) of a normal distribution when a prior estimate (θ0) of the actual value (θ) is available, using specified shrinkage weight factors ψ(·) as well as a pre-test region (R). Expressions for the Bias [B(θ)], Mean Squared Error [MSE(θ)], Efficiency [EFF(θ)] and Expected Sample Size [E(n/θ)] of the proposed estimators are derived. Numerical results and conclusions are drawn about the selection of the different constants included in these expressions. Comparisons between the suggested estimators and the classical estimators, in the sense of Bias and Relative Efficiency, are given. Furthermore, comparisons with earlier existing works are drawn.
The need to exchange large amounts of real-time data is constantly increasing in wireless communication. While traditional radio transceivers are not cost-effective and their components are difficult to integrate, software-defined radio (SDR) has opened up a new class of wireless technologies with high security. This study designs an SDR transceiver built using one type of modulation, 16-QAM, and adds a security subsystem using one type of chaotic map, the logistic map: a very simple nonlinear dynamical equation that generates a random key, which is XORed with the original transmitted data to protect it during transmission. …
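The logistic-map keystream idea can be sketched as below. This is an illustrative toy, not the paper's design: the seed `x0 = 0.41`, the control parameter `r = 3.99`, and the byte quantization are assumptions chosen for the sketch (in the chaotic regime `r` is near 4).

```python
def logistic_keystream(x0: float, r: float, n: int) -> bytes:
    """Generate n key bytes by iterating the logistic map
    x_{k+1} = r * x_k * (1 - x_k) and quantizing each state to one byte."""
    key, x = bytearray(), x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        key.append(int(x * 256) % 256)   # quantize the chaotic state
    return bytes(key)

def xor_crypt(data: bytes, x0: float = 0.41, r: float = 3.99) -> bytes:
    """XOR the data with the chaotic keystream; the same call decrypts,
    because XOR is its own inverse when the same key is regenerated."""
    key = logistic_keystream(x0, r, len(data))
    return bytes(d ^ k for d, k in zip(data, key))

cipher = xor_crypt(b"hello SDR")   # transmit this
plain = xor_crypt(cipher)          # receiver with the same (x0, r) recovers it
```

Sender and receiver only need to share the pair (x0, r); the keystream itself is never transmitted.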
The estimation of the parameters of linear regression is usually based on the Least Squares method, which rests on several basic assumptions; the accuracy of the estimated parameters therefore depends on the validity of these assumptions, among them the homogeneity of the variance and the normal distribution of the errors. These assumptions are not achievable when studying a specific problem that may involve complex data drawn from more than one model, and the use of the model then becomes unrealistic. The most successful technique for this purpose proved to be the robust estimation method that minimizes the maximum likelihood estimator (MM-estimator). …
A reliability system of the multi-component stress-strength model R(s,k) is considered in the present paper, where the stress and strength are independent and non-identically distributed, following the Exponentiated Family of Distributions (FED) with unknown shape parameter α, known scale parameter λ equal to two, and parameter θ equal to three. Different estimation methods for R(s,k) are introduced, corresponding to the maximum likelihood and shrinkage estimators. Comparisons among the suggested estimators are made by simulation based on the mean squared error (MSE) criterion.
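As background (a standard formulation, not quoted from this abstract), the multi-component stress-strength reliability is usually defined as the probability that at least $s$ of the $k$ strength components exceed the common stress. Writing $F$ for the strength c.d.f. and $G$ for the stress c.d.f.:

```latex
R_{s,k} = P(\text{at least } s \text{ of } k \text{ strengths exceed the stress})
        = \sum_{i=s}^{k} \binom{k}{i}
          \int_{-\infty}^{\infty} \left[1 - F(y)\right]^{i}
          \left[F(y)\right]^{k-i} \, dG(y)
```

Estimating R(s,k) then reduces to estimating the unknown shape parameter α that enters $F$ and $G$.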
This paper is concerned with preliminary-test single stage shrinkage estimators for the mean (θ) of a normal distribution with known variance σ² when a prior estimate (θ0) of the actual value (θ) is available, using a specified shrinkage weight factor ψ(·) as well as a pre-test region (R). Expressions for the Bias, Mean Squared Error [MSE(·)] and Relative Efficiency [R.Eff.(·)] of the proposed estimators are derived. Numerical results and conclusions are drawn about the selection of the different constants included in these expressions. Comparisons between the suggested estimators and the usual estimators, in the sense of Relative Efficiency, are given. Furthermore, comparisons with earlier existing works are drawn. …
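For orientation, a common form of a pre-test single stage shrinkage estimator (a generic textbook form, not necessarily the exact estimator of this paper) shrinks the sample estimator $\hat{\theta}$ toward the prior estimate $\theta_0$ only when $\hat{\theta}$ falls in the pre-test region $R$:

```latex
\tilde{\theta} =
\begin{cases}
\psi(\hat{\theta})\,(\hat{\theta} - \theta_0) + \theta_0, & \hat{\theta} \in R,\\[4pt]
\hat{\theta}, & \hat{\theta} \notin R,
\end{cases}
\qquad 0 \le \psi(\cdot) \le 1
```

The weight ψ(·) controls how strongly the prior estimate is trusted; ψ = 0 returns θ0 and ψ = 1 returns the classical estimator.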
The intellectual property of digital documents has been protected using many digital watermarking methods. Digital documents have many advantages over printed documents: they are less expensive and easier to store, transport, and search. But they have their own limitations too: a simple image editor can be used to modify a document and produce a forgery, since digital documents are easily tampered with. To utilize the full benefits of digital documents, these limitations have to be overcome by embedding some text or logo sequence that identifies the owner of the document.
In this research the LSB (least significant bit) technique has been used.
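The LSB idea can be sketched as below. This is a generic illustration of the technique, not the paper's algorithm: the function names, the stand-in pixel buffer, and the owner string are all hypothetical. Each bit of the watermark message replaces the least significant bit of one pixel byte, changing each carrier byte by at most 1.

```python
def embed_lsb(pixels: bytearray, message: bytes) -> bytearray:
    """Hide each bit of `message` in the least-significant bit of one pixel byte."""
    out = bytearray(pixels)
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(out):
        raise ValueError("cover image too small for the message")
    for idx, bit in enumerate(bits):
        out[idx] = (out[idx] & 0xFE) | bit   # overwrite only the lowest bit
    return out

def extract_lsb(pixels: bytearray, n_bytes: int) -> bytes:
    """Recover n_bytes hidden by embed_lsb, reading 8 LSBs per message byte."""
    msg = bytearray()
    for b in range(n_bytes):
        value = 0
        for i in range(8):
            value = (value << 1) | (pixels[b * 8 + i] & 1)
        msg.append(value)
    return bytes(msg)

cover = bytearray(range(200, 0, -1))      # stand-in for grayscale pixel data
stego = embed_lsb(cover, b"owner:ACME")   # watermark identifying the owner
```

Because only the lowest bit of each byte changes, the visual difference between cover and stego image is imperceptible, which is what makes LSB embedding attractive for ownership marks.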
This paper deals with constructing a mixed probability distribution from an exponential distribution with scale parameter (β) and a Gamma distribution with parameters (2, β), where the mixing proportions are ( ). First, the probability density function (p.d.f.), the cumulative distribution function (c.d.f.), and the reliability function are obtained. The parameters of the mixed distribution ( , β) are estimated by three different methods: maximum likelihood, the method of moments, and a proposed method, the Differential Least Squares Method (DLSM). The comparison is done using a simulation procedure, and all the results are presented in tables.
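As a hedged sketch (the actual mixing proportions are elided in the abstract, so a generic weight $p$ is assumed here, along with the rate parameterization of both components), the p.d.f. of such a two-component mixture would take the form:

```latex
f(x) = p\,\beta e^{-\beta x} + (1 - p)\,\beta^{2} x\, e^{-\beta x},
\qquad x > 0,\; 0 \le p \le 1
```

The first term is the Exponential(β) density and the second is the Gamma(2, β) density; the reliability function follows by integrating $f$ from $x$ to infinity.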
In this paper, an algorithm for binary codebook design is used in the vector quantization (VQ) technique to improve the acceptability of the absolute moment block truncation coding (AMBTC) method. VQ is used to compress the bitmap output of the first method (AMBTC). The binary codebook can be generated for many images by randomly choosing code vectors from a set of binary image vectors, and this codebook is then used to compress all bitmaps of these images. A bitmap is selected for compression with this codebook based on the criterion of the average bitmap replacement error (ABPRE). This approach is suitable for reducing bit rates. …
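The AMBTC stage that produces the bitmap can be sketched as follows; this is a generic illustration of AMBTC on one flattened pixel block (the function names and the 4-pixel example block are hypothetical), not the paper's exact implementation.

```python
def ambtc_block(block: list) -> tuple:
    """Encode one pixel block with Absolute Moment Block Truncation Coding:
    keep two reconstruction levels (low/high mean) plus a binary bitmap."""
    mean = sum(block) / len(block)
    bitmap = [1 if p >= mean else 0 for p in block]   # this bitmap feeds the VQ stage
    highs = [p for p, b in zip(block, bitmap) if b]
    lows = [p for p, b in zip(block, bitmap) if not b]
    high = round(sum(highs) / len(highs)) if highs else 0
    low = round(sum(lows) / len(lows)) if lows else 0
    return low, high, bitmap

def ambtc_decode(low: int, high: int, bitmap: list) -> list:
    """Reconstruct the block from its two levels and bitmap."""
    return [high if b else low for b in bitmap]

low, high, bitmap = ambtc_block([10, 12, 200, 198])
```

The bitmaps produced this way are exactly what the paper's binary codebook then quantizes, replacing each bitmap by its closest code vector to lower the bit rate further.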
The contemporary business environment is witnessing increasing calls for modifications to the traditional cost system, and a trend towards adopting cost management techniques that provide appropriate financial and non-financial information to senior and executive departments. Among these is the Resource Consumption Accounting (RCA) technique in question, which classifies costs into fixed and variable to support the decision-making process. Moreover, RCA combines two approaches to cost estimation, the first based on activity-based costing (ABC) and the second on the German cost accounting method (GPK). The research aims to provide a conceptual vision for resource consumption accounting, considering it as an accounting technique. …