Microservice architecture offers many advantages, especially for business applications, thanks to its flexibility, extensibility, and loosely coupled structure, which eases maintenance. However, several disadvantages stem from these same features: the independent nature of microservices can hinder meaningful communication between services and makes data synchronization more challenging. This paper addresses these issues by proposing containerized microservices in an asynchronous event-driven architecture. The architecture encloses each microservice in a container and implements an event manager that records all events in an event log, reducing errors in the application. Experimental results show a decline in response time compared with two benchmark architectures, as well as a reduction in error rate.
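The event-manager idea described above can be sketched in a few lines (a minimal single-process sketch; the class, topic, and handler names are illustrative assumptions, not the paper's implementation):

```python
import asyncio
from collections import defaultdict

class EventManager:
    """Minimal asynchronous event manager: every published event is
    appended to an append-only event log before subscribers run."""
    def __init__(self):
        self.log = []                          # append-only event log
        self.subscribers = defaultdict(list)   # topic -> handlers

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    async def publish(self, topic, payload):
        self.log.append((topic, payload))        # record the event first
        for handler in self.subscribers[topic]:  # then notify services
            await handler(payload)

async def main():
    bus = EventManager()
    received = []

    async def order_service(payload):   # a hypothetical microservice
        received.append(payload)

    bus.subscribe("order.created", order_service)
    await bus.publish("order.created", {"id": 1})
    return bus.log, received

log, received = asyncio.run(main())
print(log)       # [('order.created', {'id': 1})]
print(received)  # [{'id': 1}]
```

Because the log entry is written before handlers run, a failed consumer can be replayed from the log, which is one way such an architecture reduces error rates.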
The research aims to identify the role of the dimensions of financial inclusion in achieving competitive advantage, through an exploratory study of the views of a sample of customers of 20 Algerian commercial banks, and to examine the relationship between its dimensions (access, usage, and quality) and competitive advantage. The research follows the descriptive analytical approach. A questionnaire was adopted as the main tool for collecting data and information from a sample of 377 customers.
The research showed several results, the most important of which is a strong correlation between the three dimensions of financial inclusion combined and the competitive advantage of the Algerian commercial banks, and explained t…
The research aims to identify the importance of applying resource consumption accounting in the Iraqi industrial environment in general, and in the oil sector in particular, and its role in reducing the costs of activities by excluding and isolating idle-capacity costs, as the research problem is that the company faces deficiencies and challenges in applying strategic cost tools. The research was based on the hypothesis that applying resource consumption accounting provides the company with appropriate information through proper cost allocation, and thereby reduces the costs of activities. To test this hypothesis, the Light Derivatives Authority - Al-Dora Refinery…
In recent years, researchers have shown increased interest in determining the optimum sample size needed to obtain sufficient accuracy of estimation and high-precision parameters, in order to evaluate a large number of tests in the field of diagnosis at the same time. In this research, two methods were used to determine the optimum sample size for estimating the parameters of high-dimensional data: the Bennett inequality method and the regression method. The nonlinear logistic regression model is then estimated using the sample size given by each method, with an artificial neural network (ANN) applied to the high-dimensional data, as it gives a high-precision estimate commensurate with the data…
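The Bennett-inequality route to a sample size can be sketched as follows. This is a generic sketch, not the paper's exact method: it assumes observations bounded by b with variance sigma2, and the textbook bound P(|mean − mu| ≥ eps) ≤ 2·exp(−(n·σ²/b²)·h(b·eps/σ²)) with h(u) = (1+u)ln(1+u) − u; all parameter values are illustrative.

```python
import math

def bennett_sample_size(eps, delta, sigma2, b):
    """Smallest n such that Bennett's bound
    2*exp(-(n*sigma2/b**2) * h(b*eps/sigma2)) <= delta,
    where h(u) = (1+u)*log(1+u) - u."""
    u = b * eps / sigma2
    h = (1 + u) * math.log(1 + u) - u
    return math.ceil(b ** 2 / (sigma2 * h) * math.log(2 / delta))

# Sample size for accuracy eps = 0.1 with confidence 95%,
# for variables bounded by 1 with variance 0.25:
n = bennett_sample_size(eps=0.1, delta=0.05, sigma2=0.25, b=1.0)
print(n)  # 208
```

Tightening the required accuracy eps (or the risk delta) increases n, which is the trade-off such a method quantifies.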
The present paper deals with estimation of the scale parameter θ of the Inverted Gamma (IG) distribution when the shape parameter α is known (α = 1), by preliminary-test single-stage shrinkage estimators using a suitable shrinkage weight factor and pre-test region. Expressions for the Bias and Mean Squared Error (MSE) of the proposed estimators are derived. Comparisons of the considered estimator with the usual estimator (MLE) and with an existing estimator are performed. The results are presented in attached tables.
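The shape of such a single-stage shrinkage estimator can be sketched as follows. The weight ψ, the pre-test region, and the acceptance rule are illustrative assumptions, not the paper's derived choices; only the MLE formula (θ̂ = n / Σ 1/xᵢ for IG with α = 1) follows from the stated model.

```python
import math, random

def shrinkage_estimate(x, theta0, psi=0.5, region=(0.8, 1.2)):
    """Single-stage shrinkage estimator (sketch) for the scale theta of
    an Inverted Gamma distribution with known shape alpha = 1.
    MLE: theta_hat = n / sum(1/x_i).  If theta_hat/theta0 lies in the
    pre-test region R, shrink toward the prior guess theta0."""
    n = len(x)
    theta_mle = n / sum(1.0 / xi for xi in x)
    if region[0] <= theta_mle / theta0 <= region[1]:
        return psi * theta_mle + (1 - psi) * theta0
    return theta_mle

# If Y ~ Exponential(rate=theta) then X = 1/Y ~ IG(alpha=1, scale=theta),
# so a sample can be simulated by inverting exponential draws:
random.seed(0)
sample = [1.0 / random.expovariate(2.0) for _ in range(50)]
print(round(shrinkage_estimate(sample, theta0=2.0), 3))
```

When the MLE falls near the prior guess the estimator borrows strength from θ₀, which is where the MSE gains over the plain MLE come from.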
This study includes estimating the scale parameter, location parameter, and reliability function of the Extreme Value (EXV) distribution by two methods, namely:
- Maximum Likelihood Method (MLE).
- Probability Weighted Moments Method (PWM).
Simulation was used to generate the required samples of different sizes (n = 10, 25, 50, 100) for estimating the parameters and the reliability function, with given real values for the parameters, and the simulation experiments were replicated (RP = 1000) times.
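Both estimation methods can be sketched in pure Python, assuming the EXV distribution here is the Gumbel (Type I) distribution; the PWM formulas and the fixed-point MLE iteration below are the textbook ones, and all simulation settings are illustrative.

```python
import math, random

EULER = 0.5772156649015329  # Euler-Mascheroni constant

def gumbel_pwm(x):
    """Probability Weighted Moments estimates (mu, sigma):
    b0 = sample mean, b1 = (1/n) * sum((i-1)/(n-1) * x_(i)) over the
    ascending order statistics; sigma = (2*b1 - b0)/ln 2, mu = b0 - gamma*sigma."""
    xs = sorted(x)
    n = len(xs)
    b0 = sum(xs) / n
    b1 = sum(i * xi for i, xi in enumerate(xs)) / (n * (n - 1))
    sigma = (2 * b1 - b0) / math.log(2)
    return b0 - EULER * sigma, sigma

def gumbel_mle(x, iters=200):
    """Maximum likelihood via the standard fixed-point iteration
    sigma <- mean(x) - sum(x_i e^{-x_i/sigma}) / sum(e^{-x_i/sigma}),
    started from the moment estimate of sigma."""
    n = len(x)
    mean = sum(x) / n
    sigma = math.sqrt(sum((xi - mean) ** 2 for xi in x) / n) * math.sqrt(6) / math.pi
    for _ in range(iters):
        w = [math.exp(-xi / sigma) for xi in x]
        sigma = mean - sum(xi * wi for xi, wi in zip(x, w)) / sum(w)
    mu = -sigma * math.log(sum(math.exp(-xi / sigma) for xi in x) / n)
    return mu, sigma

def reliability(t, mu, sigma):
    """R(t) = 1 - F(t) for the Gumbel distribution."""
    return 1 - math.exp(-math.exp(-(t - mu) / sigma))

# Simulated sample: if U ~ Uniform(0,1) then mu - sigma*log(-log U) ~ Gumbel
random.seed(1)
data = [1.0 - 0.5 * math.log(-math.log(random.random())) for _ in range(100)]
print(gumbel_pwm(data))
print(gumbel_mle(data))
```

Repeating this over many replications and sample sizes, as the study does, lets the MSE of the two methods be compared.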
This paper is concerned with pre-test single and double stage shrunken estimators for the mean (μ) of a normal distribution when a prior estimate (μ0) of the actual value (μ) is available, using specified shrinkage weight factors ψ(·) as well as a pre-test region (R). Expressions for the Bias, Mean Squared Error (MSE), Efficiency (Eff), and Expected sample size (E(n)) of the proposed estimators are derived. Numerical results and conclusions are drawn about the selection of the different constants included in these expressions. Comparisons of the suggested estimators with classical estimators, in the sense of Bias and Relative Efficiency, are given. Furthermore, comparisons with earlier existing works are drawn.
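A schematic sketch of a double-stage shrunken procedure follows. The weight ψ, region half-width r, and stopping rule here are illustrative assumptions, not the paper's derived scheme; the sketch only shows why the expected sample size E(n) is a quantity of interest.

```python
import random, statistics

def double_stage_shrunken_mean(draw, n1, n2, mu0, psi=0.5, r=0.5):
    """Stage 1: take n1 observations; if the stage-1 mean lies in the
    pre-test region R = [mu0 - r, mu0 + r], stop and return the shrunken
    estimate psi*xbar1 + (1 - psi)*mu0 (sample size n1).  Otherwise take
    n2 more observations and return the pooled mean (sample size n1+n2)."""
    stage1 = [draw() for _ in range(n1)]
    xbar1 = statistics.fmean(stage1)
    if abs(xbar1 - mu0) <= r:
        return psi * xbar1 + (1 - psi) * mu0, n1
    stage2 = [draw() for _ in range(n2)]
    return statistics.fmean(stage1 + stage2), n1 + n2

random.seed(2)
est, n_used = double_stage_shrunken_mean(
    lambda: random.gauss(5.0, 1.0), n1=10, n2=20, mu0=5.0)
print(est, n_used)
```

When the prior guess μ0 is good, the procedure usually stops after stage 1, so E(n) is well below n1 + n2.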
In this paper, Bayes estimators of the parameter of the Maxwell distribution have been derived along with the maximum likelihood estimator. The non-informative priors, Jeffreys and the extension of Jeffreys prior information, have been considered under two different loss functions, the squared error loss function and the modified squared error loss function, for comparison purposes. A simulation study has been developed in order to gain insight into the performance on small, moderate, and large samples. The performance of these estimators has been explored numerically under different conditions, and their efficiency was compared according to the mean squared error (MSE). The results of comparison by MSE show that the efficiency of the Bayes estimators…
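The MLE-versus-Bayes MSE comparison can be sketched under one common parametrization, f(x) ∝ x² e^(−θx²). This is a sketch only: the paper's exact parametrization, loss functions, and prior exponent are not shown here, and the constants below are assumptions. Under this parametrization the posterior for the extended Jeffreys prior p(θ) ∝ θ^(−c) is Gamma(3n/2 − c + 1, rate = Σxᵢ²), and c = 1 (Jeffreys) makes the posterior mean coincide with the MLE.

```python
import math, random

def maxwell_sample(theta, n, rng):
    """Draw n values with density proportional to x^2 * exp(-theta*x^2):
    the speed of a 3-D isotropic normal with sigma^2 = 1/(2*theta)."""
    s = math.sqrt(1.0 / (2.0 * theta))
    return [s * math.sqrt(rng.gauss(0, 1) ** 2 + rng.gauss(0, 1) ** 2
                          + rng.gauss(0, 1) ** 2) for _ in range(n)]

def theta_mle(x):
    # MLE for this parametrization: 3n / (2 * sum x_i^2)
    return 1.5 * len(x) / sum(xi * xi for xi in x)

def theta_bayes(x, c):
    # Posterior mean under p(theta) ~ theta^{-c}:
    # Gamma(3n/2 - c + 1, rate = sum x_i^2)
    return (1.5 * len(x) - c + 1.0) / sum(xi * xi for xi in x)

rng = random.Random(3)
theta, n, reps = 2.0, 20, 2000
mse_mle = mse_bayes = 0.0
for _ in range(reps):
    x = maxwell_sample(theta, n, rng)
    mse_mle += (theta_mle(x) - theta) ** 2
    mse_bayes += (theta_bayes(x, c=2.0) - theta) ** 2
mse_mle /= reps
mse_bayes /= reps
print(mse_mle, mse_bayes)
```

With c = 2 the Bayes estimator shrinks slightly below the MLE and attains a lower simulated MSE at this sample size, the kind of comparison the simulation study carries out.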
In mixture experiments, the response variable depends on the proportions of the components of the mixture. In our research we compare the Scheffé model with the Kronecker model for mixture experiments, especially when the experimental region is restricted.
Because mixture experiments suffer from high correlation and multicollinearity between the explanatory variables, which affects the computation of the Fisher information matrix of the regression model, the parameters of the mixture model were estimated using the generalized inverse and the stepwise regression procedure.
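The generalized-inverse step can be sketched for a Scheffé quadratic model on a 3-component mixture (a sketch only: the design points, coefficients, and use of NumPy's Moore-Penrose pseudoinverse are illustrative assumptions, not the paper's data or exact procedure):

```python
import numpy as np

def scheffe_design(X):
    """Scheffé quadratic design for a 3-component mixture: columns
    x1, x2, x3, x1*x2, x1*x3, x2*x3 (no intercept, since sum x_i = 1)."""
    x1, x2, x3 = X.T
    return np.column_stack([x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])

# Simplex-lattice {3,2} design points (each row of proportions sums to 1)
X = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
              [.5, .5, 0], [.5, 0, .5], [0, .5, .5]], float)
beta_true = np.array([1.0, 2.0, 3.0, 0.5, -0.5, 1.5])
Z = scheffe_design(X)
y = Z @ beta_true

# Mixture constraints make the normal equations ill-conditioned or
# singular in restricted regions, so estimate with the Moore-Penrose
# generalized inverse instead of a plain matrix inverse:
beta_hat = np.linalg.pinv(Z) @ y
print(np.round(beta_hat, 6))
```

The pseudoinverse returns the minimum-norm least-squares solution even when the design matrix is rank-deficient, which is why it suits restricted mixture regions.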
The need to exchange large amounts of real-time data is constantly increasing in wireless communication. While traditional radio transceivers are not cost-effective and their components must be integrated, software-defined radio (SDR) has opened up a new class of wireless technologies with high security. This study aims to design an SDR transceiver built with one modulation type, 16-QAM, adding a security subsystem based on one type of chaos map, the logistic map, because it is a very simple nonlinear dynamical equation that generates a random key which is XORed with the originally transmitted data to protect it during transmission. At th…
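The logistic-map keystream idea can be sketched as follows (a sketch only: the seed x0 and parameter r are illustrative, and the SDR/16-QAM modulation chain itself is not reproduced here):

```python
def logistic_keystream(x0, r, n):
    """Generate n key bytes from the logistic map x_{k+1} = r*x_k*(1-x_k).
    With r near 4 the orbit is chaotic; each state is quantized to a byte."""
    x, key = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        key.append(int(x * 256) % 256)
    return bytes(key)

def chaos_xor(data, x0=0.6, r=3.99):
    """XOR data with the chaotic keystream; applying it twice with the
    same (x0, r) recovers the plaintext, since XOR is its own inverse."""
    ks = logistic_keystream(x0, r, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))

msg = b"16-QAM payload"
ct = chaos_xor(msg)         # "encrypt" before modulation
print(chaos_xor(ct) == msg) # True: receiver with the same key recovers it
```

Because the map is extremely sensitive to x0 and r, the pair acts as a shared secret key between transmitter and receiver.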
The intellectual property of digital documents has been protected by many digital watermarking methods. Digital documents have many advantages over printed documents: they are less expensive and easier to store, transport, and search. But they have their own limitations too. A simple image editor can be used to modify a document and produce a forgery, so digital documents are easily tampered with. In order to enjoy the full benefits of digital documents, these limitations must be overcome by embedding some text or logo sequence that identifies the owner of the document.
In this research, the LSB (least significant bit) technique has been used.
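The LSB embedding idea can be illustrated on a toy list of grayscale pixel values (a minimal sketch; the function names and 2-byte owner mark are illustrative, not the paper's scheme):

```python
def embed_lsb(pixels, message):
    """Embed message bytes in the least significant bits of pixel values.
    Each pixel carries one bit; capacity is len(pixels) // 8 bytes."""
    bits = [(byte >> (7 - i)) & 1 for byte in message for i in range(8)]
    assert len(bits) <= len(pixels), "cover image too small"
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit   # overwrite only the LSB
    return out

def extract_lsb(pixels, n_bytes):
    """Read back n_bytes from the pixel LSBs (MSB-first per byte)."""
    data = []
    for b in range(n_bytes):
        byte = 0
        for i in range(8):
            byte = (byte << 1) | (pixels[b * 8 + i] & 1)
        data.append(byte)
    return bytes(data)

cover = [128] * 64               # toy 8x8 grayscale "image"
stego = embed_lsb(cover, b"IP")  # embed a 2-byte owner mark
print(extract_lsb(stego, 2))     # b'IP'
```

Each pixel changes by at most 1 gray level, so the watermark is imperceptible while remaining recoverable by the owner.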