Microservice architecture offers many advantages, especially for business applications, thanks to its flexibility, extensibility, and loosely coupled structure, which eases maintenance. However, it also has disadvantages that stem from the same features: because microservices are independent by nature, meaningful communication between them is harder and data synchronization becomes more challenging. This paper addresses these issues by proposing containerized microservices in an asynchronous event-driven architecture. The architecture encloses each microservice in a container and implements an event manager that records every event in an event log, reducing errors in the application. Experimental results show a reduction in response time compared with two benchmark architectures, as well as a lower error rate.
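As a hedged illustration of the event-manager idea described above, the following minimal in-process sketch records every published event in an append-only log before dispatching it to subscribers. The class and topic names are illustrative; the paper's actual system runs each service in its own container behind an asynchronous broker, which this single-process sketch does not attempt to reproduce.

```python
import time
from dataclasses import dataclass, field

@dataclass
class Event:
    topic: str
    payload: dict
    timestamp: float = field(default_factory=time.time)

class EventManager:
    """Append-only event log plus topic-based dispatch."""
    def __init__(self):
        self.log = []            # every published event is recorded here
        self.subscribers = {}    # topic -> list of handler callables

    def subscribe(self, topic, handler):
        self.subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic, payload):
        event = Event(topic, payload)
        self.log.append(event)   # log first, so failed handlers can be replayed
        for handler in self.subscribers.get(topic, []):
            handler(event)
        return event

# Example: an "orders" service emits an event consumed by an "inventory" service.
manager = EventManager()
received = []
manager.subscribe("order.created", lambda e: received.append(e.payload["id"]))
manager.publish("order.created", {"id": 42})
```

Logging before dispatch is what lets the event log serve as the source of truth for recovery: a handler that fails can be retried from the log rather than losing the event.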
In this paper, Bayes estimators of the parameter of the Maxwell distribution are derived alongside the maximum likelihood estimator. Two non-informative priors, the Jeffreys prior and an extension of it, are considered under two loss functions, the squared error loss function and the modified squared error loss function, for comparison purposes. A simulation study is developed to gain insight into performance on small, moderate, and large samples, and the performance of the estimators is explored numerically under different conditions. Efficiency is compared according to the mean squared error (MSE). The comparison by MSE shows that the efficiency of the Bayes est
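A sketch of the kind of simulation study described, under an assumed parameterization f(x; θ) = (4θ^{3/2}/√π) x² e^{−θx²} (the paper's exact parameterization, prior constant, and loss functions may differ): since X² ~ Gamma(3/2, rate θ), the sufficient statistic T = Σ X_i² ~ Gamma(3n/2, rate θ), the MLE is (3n/2)/T, and under an extended Jeffreys prior ∝ θ^{−c} with squared error loss the Bayes estimator is the posterior mean (3n/2 − c + 1)/T. The choice c = 2 below is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps, c = 2.0, 20, 5000, 2.0   # true parameter, sample size, MC replications, prior constant

se_mle, se_bayes = 0.0, 0.0
for _ in range(reps):
    # T = sum of X_i^2 ~ Gamma(3n/2, rate theta); simulate it directly
    t = rng.gamma(shape=1.5 * n, scale=1.0 / theta)
    se_mle += (1.5 * n / t - theta) ** 2                 # MLE: (3n/2)/T
    se_bayes += ((1.5 * n - c + 1.0) / t - theta) ** 2   # posterior mean: (3n/2 - c + 1)/T

mse_mle, mse_bayes = se_mle / reps, se_bayes / reps
print(f"MSE(MLE) = {mse_mle:.4f}, MSE(Bayes) = {mse_bayes:.4f}")
```

Because both estimators are functions of the same simulated T, the comparison is paired, which sharply reduces Monte Carlo noise in the MSE difference.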
Researchers have shown increased interest in recent years in determining the optimum sample size that yields sufficient estimation accuracy and high-precision parameters, in order to evaluate a large number of tests in the field of diagnosis at the same time. In this research, two methods were used to determine the optimum sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. The nonlinear logistic regression model is estimated with the sample size given by each method in high-dimensional data using an artificial intelligence technique, the artificial neural network (ANN), as it gives a high-precision estimate commensurate with the dat
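One way the Bennett inequality yields a sample size, sketched under standard assumptions (i.i.d. observations bounded by b around the mean, with variance at most σ²; the paper's actual derivation may differ): inverting the bound P(|X̄ − μ| ≥ ε) ≤ 2 exp(−(nσ²/b²) h(bε/σ²)), with h(u) = (1+u)ln(1+u) − u, gives the smallest n guaranteeing a target confidence.

```python
import math

def bennett_sample_size(b, sigma2, eps, alpha):
    """Smallest n such that Bennett's inequality guarantees
    P(|sample mean - mu| >= eps) <= alpha for i.i.d. observations
    with |X_i - mu| <= b and Var(X_i) <= sigma2."""
    u = b * eps / sigma2
    h = (1.0 + u) * math.log(1.0 + u) - u        # Bennett's h(u)
    return math.ceil(b * b * math.log(2.0 / alpha) / (sigma2 * h))

# Observations bounded within +/-1 of the mean, variance <= 0.25,
# target accuracy +/-0.1 at 95% confidence:
n = bennett_sample_size(b=1.0, sigma2=0.25, eps=0.1, alpha=0.05)
print(n)
```

Because Bennett's bound uses the variance and not just the range, it typically requires far fewer observations than Hoeffding's inequality when the variance is small relative to b².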
This paper is concerned with pre-test single- and double-stage shrunken estimators for the mean (μ) of a normal distribution when a prior estimate (μ0) of the actual value (μ) is available, using specified shrinkage weight factors ψ(μ) as well as a pre-test region (R). Expressions for the bias B(μ), mean squared error MSE(μ), efficiency Eff(μ), and expected sample size E(n/μ) of the proposed estimators are derived. Numerical results and conclusions are drawn about the selection of the different constants appearing in these expressions. Comparisons between the suggested estimators and the classical estimators, in the sense of bias and relative efficiency, are given, and comparisons with earlier work are also drawn.
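A hedged sketch of the single-stage version of this idea, with an assumed constant shrinkage weight φ and a two-sided z-test defining the pre-test region (the paper's ψ(μ) and R are more general): if the sample mean falls in R, the estimate is shrunk toward the prior estimate μ0; otherwise the classical estimator is kept.

```python
import math

def pretest_shrunken_mean(x, mu0, sigma, phi=0.5, z=1.96):
    """Single-stage pre-test shrinkage estimator of a normal mean
    (known sigma). If mu0 survives a two-sided z-test (the sample mean
    lies in the pre-test region R), shrink toward mu0 with weight
    1 - phi; otherwise return the classical estimator."""
    n = len(x)
    xbar = sum(x) / n
    if abs(xbar - mu0) <= z * sigma / math.sqrt(n):   # pre-test region R
        return phi * xbar + (1.0 - phi) * mu0
    return xbar

inside = pretest_shrunken_mean([0.1, -0.1, 0.05, -0.05], mu0=0.2, sigma=1.0)   # R accepts: shrink
outside = pretest_shrunken_mean([10.0, 10.0], mu0=0.0, sigma=1.0)              # R rejects: keep xbar
```

The bias/MSE trade-off the paper analyzes comes precisely from this two-branch structure: shrinking gains efficiency when μ0 is close to μ and pays in bias when it is not.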
Iraq enjoys enormous human resources, being a demographically young country, yet it suffers from a human capital crisis fed by a crisis in education. Since education is the most prominent component of human capital, this has weighed heavily on Iraq's human capital index. From this standpoint, and given the major role that public spending plays in any field, this study investigates "public spending on education and its role in improving the educational human capital indicators in Iraq", where the aim of th
The research aims to identify the role of the dimensions of financial inclusion in achieving competitive advantage, through an exploratory study of the views of a sample of customers of 20 Algerian commercial banks, and the relationship between its dimensions (access, usage, and quality) and competitive advantage. The research follows the descriptive analytical approach; a questionnaire was adopted as the main tool for collecting data and information from the sample of 377.
The research yielded several results, the most important of which is a strong correlation between the three dimensions of financial inclusion combined and the competitive advantage of the Algerian commercial banks, and explained t
The main goal of this paper is to introduce and study a new concept, named d*-supplemented modules, which can be considered a generalization of W-supplemented modules and d-hollow modules. We also introduce d*-supplement submodules. Many relationships of d*-supplemented modules are studied. In particular, we give characterizations of d*-supplemented modules and relate this kind of module to other kinds; for example, every d-hollow (d-local) module is d*-supplemented, and an example shows that the converse is not true.
This paper deals with constructing a mixed probability distribution from an exponential distribution with scale parameter (β) and a Gamma distribution with parameters (2, β), with mixing proportions ( ). First, the probability density function (p.d.f.), the cumulative distribution function (c.d.f.), and the reliability function are obtained. The parameters of the mixed distribution, ( , β), are estimated by three different methods: maximum likelihood, the method of moments, and a proposed method, the Differential Least Square Method (DLSM). The comparison is carried out using a simulation procedure, and all the results are presented in tables.
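A sketch of the mixture's density and reliability function under the stated components, assuming both share the scale β and an illustrative mixing weight p on the exponential component (the paper's actual mixing proportions are not given here): f(x) = p·(1/β)e^{−x/β} + (1−p)·(x/β²)e^{−x/β}, and R(t) = [p + (1−p)(1 + t/β)]e^{−t/β}.

```python
import math

def mix_pdf(x, p, beta):
    """Density of the mixture: p * Exponential(scale beta)
    + (1 - p) * Gamma(shape 2, scale beta)."""
    expo = math.exp(-x / beta) / beta
    gam = x * math.exp(-x / beta) / beta ** 2
    return p * expo + (1.0 - p) * gam

def mix_reliability(t, p, beta):
    """R(t) = 1 - F(t), combining the two components' survival functions."""
    return (p + (1.0 - p) * (1.0 + t / beta)) * math.exp(-t / beta)

p, beta = 0.3, 2.0   # illustrative values
# sanity check: the density should integrate to ~1 (midpoint rule on [0, 50])
area = sum(mix_pdf(0.005 + 0.01 * k, p, beta) * 0.01 for k in range(5000))
```

The reliability function is the weighted sum of the component survival functions, so R(0) = 1 and R'(t) = −f(t), which the numerical check below confirms.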
Estimation of the parameters of linear regression usually relies on the ordinary least squares method, which rests on several basic assumptions, so the accuracy of the parameter estimates depends on the validity of these assumptions. Among them are homogeneity of the variance and normality of the errors, and these assumptions become unrealistic and unattainable when the problem under study involves complex data arising from more than one model. The most successful technique in this situation has been the robust MM-estimation method (minimizing a maximum-likelihood-type objective function), which has proved its efficiency for this purpose. To
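A simplified sketch of the robust-regression idea: iteratively reweighted least squares with Huber weights and a MAD residual scale. This is only the M-step; a full MM-estimator additionally starts from a high-breakdown S-estimate of scale, which is omitted here, so this is an illustration rather than the paper's estimator.

```python
import numpy as np

def huber_irls(X, y, c=1.345, iters=50):
    """Robust regression via IRLS with Huber weights and a MAD scale
    (illustrative M-step only; a full MM-estimator first computes a
    high-breakdown S-estimate of the residual scale)."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]          # OLS start
    for _ in range(iters):
        r = y - X @ beta
        mad = np.median(np.abs(r - np.median(r)))
        s = 1.4826 * mad if mad > 0 else 1.0             # robust residual scale
        u = np.abs(r / s)
        w = np.minimum(1.0, c / np.maximum(u, 1e-12))    # Huber weight psi(u)/u
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
    return beta

# line y = 1 + 2x, with one gross outlier at x = 9
x = np.arange(10.0)
y = 1.0 + 2.0 * x
y[9] = 100.0
X = np.column_stack([np.ones_like(x), x])
ols = np.linalg.lstsq(X, y, rcond=None)[0]   # slope badly pulled by the outlier
robust = huber_irls(X, y)                    # slope recovered near 2
```

The single outlier drags the OLS slope far from 2, while the reweighted fit progressively downweights it and recovers the underlying line.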
The research aims to identify the importance of applying resource consumption accounting in the Iraqi industrial environment in general, and in the oil sector in particular, and its role in reducing the costs of activities by excluding and isolating idle-capacity costs. The research problem is that the company faces deficiencies and challenges in applying strategic cost tools. The research is based on the hypothesis that applying resource consumption accounting provides the company with appropriate information by allocating costs properly and thus reduces the costs of activities. To test this hypothesis, the Light Derivatives Authority - Al-Dora Refin
In this paper, an algorithm for binary codebook design is used in the vector quantization (VQ) technique to improve the acceptability of the absolute moment block truncation coding (AMBTC) method. Vector quantization is used to compress the bitmap output by the first stage (AMBTC). The binary codebook is generated for many images by randomly choosing code vectors from a set of binary image vectors, and this codebook is then used to compress the bitmaps of all these images. The bitmap of an image is selected for compression with this codebook according to the criterion of the average bitmap replacement error (ABPRE). This approach is suitable for reducing bit rates
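A compact sketch of the two stages described above: AMBTC encodes each block as a mean-threshold bitmap plus two moment-preserving reconstruction levels, and the VQ step then replaces a bitmap by its nearest binary codeword. The nearest-codeword search here uses plain Hamming distance as an illustrative stand-in for the paper's ABPRE-based selection, and the tiny 2x2 block is for demonstration only.

```python
import numpy as np

def ambtc_encode(block):
    """AMBTC for one image block: mean-threshold bitmap plus the two
    absolute-moment-preserving reconstruction levels (high/low means)."""
    m = block.mean()
    bitmap = block >= m
    hi = block[bitmap].mean() if bitmap.any() else m
    lo = block[~bitmap].mean() if (~bitmap).any() else m
    return bitmap, hi, lo

def ambtc_decode(bitmap, hi, lo):
    """Reconstruct the block from its bitmap and two levels."""
    return np.where(bitmap, hi, lo)

def nearest_codeword(bitmap, codebook):
    """The VQ step: map a bitmap to the closest binary code vector
    (Hamming distance)."""
    flat = bitmap.ravel()
    return min(codebook, key=lambda c: int(np.count_nonzero(c != flat)))

block = np.array([[10.0, 10.0], [200.0, 200.0]])
bitmap, hi, lo = ambtc_encode(block)
recon = ambtc_decode(bitmap, hi, lo)

codebook = [np.array([False, False, True, True]), np.ones(4, dtype=bool)]
coded = nearest_codeword(bitmap, codebook)
```

Storing only a codebook index per bitmap, instead of one bit per pixel, is what lowers the bit rate beyond plain AMBTC, at the cost of the replacement error the ABPRE criterion controls.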