This research studies panel (longitudinal) data models with mixed random parameters, which contain two types of parameters: the first random and the other fixed. The random parameters arise from differences in the marginal slopes across the cross-sections, while the fixed parameters arise from differences in the fixed intercepts, and the random errors of each cross-section exhibit heteroscedasticity as well as first-order serial correlation. The main objective of this research is to use efficient methods suited to panel data in the case of small samples. To achieve this goal, the feasible generalized least squares (FGLS) method and the mean group (MG) method were used, the efficiency of the resulting estimators was compared under mixed random parameters, and the method giving the more efficient estimator was selected. The methods were applied to real data comprising per capita electricity consumption (Y) for five countries, representing the cross-sections (N = 5), over nine years (T = 9), so the number of observations is n = 45; the explanatory variables are the consumer price index (X1) and per capita GDP (X2). To evaluate the performance of the FGLS and MG estimators of the general model, the mean absolute percentage error (MAPE) was used to compare the efficiency of the estimators. The results showed that the mean group (MG) method estimates the parameters better than the FGLS method, and MG also proved to be the better method for estimating the sub-parameters of each cross-section (country).
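As an illustration of the comparison criterion only, the sketch below computes MAPE for two hypothetical sets of fitted values; the series, noise levels, and variable names are assumptions made for demonstration and are not the paper's data or results.

```python
import numpy as np

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

# Hypothetical observed values and fitted values from the two estimators
# (assumed for illustration; not the paper's electricity-consumption data).
rng = np.random.default_rng(0)
y = rng.uniform(1.0, 5.0, size=45)              # observed dependent variable
y_hat_fgls = y + rng.normal(0, 0.40, size=45)   # assumed FGLS fitted values
y_hat_mg   = y + rng.normal(0, 0.25, size=45)   # assumed Mean Group fitted values

print("MAPE (FGLS):", round(mape(y, y_hat_fgls), 2))
print("MAPE (MG)  :", round(mape(y, y_hat_mg), 2))
# The estimator with the smaller MAPE is judged the more efficient one.
```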
In this article, we present a quasi-contraction mapping approach for the D iteration and prove that this iteration converges at the same rate as the modified SP iteration. On the other hand, we prove that the D iteration approach for quasi-contraction maps is faster than certain current leading iteration methods, such as the Mann and Ishikawa iterations. A numerical example is also given.
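For reference, the sketch below implements the classical Mann and Ishikawa schemes for a simple contraction and reports the residual error after a given number of steps; the mapping, step sizes, and starting point are assumptions for illustration, and the D iteration defined in the article can be benchmarked in exactly the same way.

```python
# Illustrative comparison of two classical fixed-point iterations for a
# simple contraction T (not the paper's example).
T = lambda x: 0.5 * x + 1.0   # contraction with fixed point x* = 2
alpha = beta = 0.7            # assumed constant step sequences

def mann(x, n):
    for _ in range(n):
        x = (1 - alpha) * x + alpha * T(x)
    return x

def ishikawa(x, n):
    for _ in range(n):
        y = (1 - beta) * x + beta * T(x)
        x = (1 - alpha) * x + alpha * T(y)
    return x

x0, x_star = 10.0, 2.0
for n in (5, 10, 20):
    print(n, abs(mann(x0, n) - x_star), abs(ishikawa(x0, n) - x_star))
```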
This paper studies a novel technique based on two effective methods, a modified Laplace variational iteration method (MLVIM) and a modified variational iteration method (MVIM), to solve PDEs with variable coefficients. The present modification, the MLVIM, is based on coupling the variational iteration method (VIM) with the Laplace transform (LT). In our proposal there is no need to calculate the Lagrange multiplier. The Laplace transform is applied to the problem, and the nonlinear terms are handled using the homotopy perturbation method (HPM). Several examples are worked to compare the results of the two methods and to verify the reliability of the present methods.
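For context, the standard VIM correction functional is sketched below in generic notation (linear operator L, nonlinear operator N, source term g, Lagrange multiplier λ); this is the textbook form, not the paper's specific variable-coefficient formulation, and the Laplace-based modification is what removes the need to determine λ explicitly.

\[
  u_{n+1}(x,t) = u_n(x,t)
  + \int_0^{t} \lambda(\tau)\,
    \bigl( L\,u_n(x,\tau) + N\,\tilde{u}_n(x,\tau) - g(x,\tau) \bigr)\, d\tau .
\]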
The research problem crystallized in light of these capabilities: the level of performance depends on the application of modern training methods based on actual experimentation. Those methods aim to develop the components of achievement in this competition, including the amount of speed-strength exerted by the arms and legs, which is reflected in good skill performance, because the skill of shooting by jumping forward and upward plays a major role in scoring goals during the competition and qualifying the team to win. Through the researcher's follow-up in the practical and academic fields, a weakness was noticed in some physical abilities, which affects performance and skill level.
Information systems and data exchange between government institutions are growing rapidly around the world, and with them the threats to information within government departments are growing. In recent years, research into the development and construction of secure information systems in government institutions appears to have been very effective. Based on information-system principles, this study proposes a model for providing and evaluating security for all departments of government institutions. The requirements of any information system begin with the organization's surroundings and objectives. Most prior techniques did not take into account the organizational component on which the information system runs, despite the relevance of …
Corpus linguistics is a methodology for studying language through corpus-based research. It differs from the traditional approach to studying a language (the prescriptive approach) in its insistence on the systematic study of authentic examples of language in use (the descriptive approach). A "corpus" is a large body of machine-readable, structurally collected, naturally occurring linguistic data, either written texts or a transcription of recorded speech, which can be used as a starting point for linguistic description or as a means of verifying hypotheses about a language. In the past decade, interest has grown tremendously in the use of language corpora for language education. The ways in which corpora have been employed in language pedagogy …
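As a minimal illustration of the descriptive, corpus-based approach, the sketch below counts how often competing constructions actually occur in a body of text rather than prescribing which is correct; the sample text, token pattern, and queried bigrams are assumptions for demonstration, not a real reference corpus.

```python
from collections import Counter
import re

# Tiny corpus-style frequency query: observe usage rather than prescribe it.
corpus = (
    "The data are ready. The data is ready for analysis. "
    "Whom did you see? Who did you see at the meeting?"
)
tokens = re.findall(r"[a-z']+", corpus.lower())
bigrams = Counter(zip(tokens, tokens[1:]))

# Observed frequencies of the competing forms in this (assumed) sample.
print(bigrams[("data", "are")], bigrams[("data", "is")])
```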
Cloud computing provides a huge amount of space for storing data, but as the number of users and the size of their data increase, the cloud storage environment faces serious problems such as saving storage space, managing this large volume of data, and preserving the security and privacy of the data. One important method of saving space in cloud storage is data deduplication, a compression technique that keeps only one copy of the data and eliminates the extra copies. To offer security and privacy for sensitive data while still supporting deduplication, this work identifies attacks that exploit hybrid-cloud deduplication, allowing an attacker to gain access to the files of other users based on very small hash signatures of …
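The sketch below shows the basic idea of content-hash deduplication that such schemes build on; the storage map, function names, and sample contents are assumptions for illustration, and the hybrid-cloud scheme and attack analysed in the work are not reproduced here.

```python
import hashlib

# Minimal content-hash deduplication: one physical copy per distinct digest.
store = {}  # hash digest -> stored content

def upload(data: bytes) -> str:
    digest = hashlib.sha256(data).hexdigest()
    if digest not in store:      # only the first copy is physically stored
        store[digest] = data
    return digest                # later uploads just reuse the reference

a = upload(b"quarterly-report.pdf contents")
b = upload(b"quarterly-report.pdf contents")  # duplicate: no extra storage used
print(a == b, len(store))        # True 1
# Note: dedup keyed on hashes alone is exactly what side-channel attacks probe --
# an attacker who can guess a file may learn whether someone already stored it.
```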
In this research, a 4×4 factorial experiment applied in a randomized complete block design was studied with a specified number of observations. The design of experiments is used to study the effect of treatments on experimental units and thus obtain data representing the experiment's observations. Applying these treatments under different environmental and experimental conditions introduces noise that affects the observed values and thus increases the mean squared error of the experiment. To reduce this noise, multiple wavelet shrinkage was used as a filter for the observations, by proposing an improved threshold that takes the different decomposition levels into account based on the logarithm of the b…
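To show the kind of wavelet filtering involved, the sketch below denoises a simulated set of observations with soft thresholding; the paper's improved, level-dependent threshold is not reproduced, so the classical universal (VisuShrink) threshold is used as a stand-in, and the signal, wavelet, and noise level are assumptions.

```python
import numpy as np
import pywt

# Illustrative wavelet denoising of noisy experiment observations.
rng = np.random.default_rng(1)
signal = np.repeat([5.0, 9.0, 7.0, 11.0], 16)        # assumed treatment means
noisy = signal + rng.normal(0, 1.0, signal.size)     # observations with noise

coeffs = pywt.wavedec(noisy, 'db4', level=3)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # noise estimate, finest level
thr = sigma * np.sqrt(2 * np.log(noisy.size))        # universal threshold (stand-in)
den_coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode='soft') for c in coeffs[1:]]
denoised = pywt.waverec(den_coeffs, 'db4')[:noisy.size]

print("MSE before:", np.mean((noisy - signal) ** 2).round(3))
print("MSE after :", np.mean((denoised - signal) ** 2).round(3))
```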
The Tor (The Onion Routing) network was designed to enable users to browse the Internet anonymously. It is known for the anonymity and privacy protection it offers against agents who wish to observe users' locations or trace their browsing habits. This anonymity stems from the encryption and decryption of Tor traffic: the client's traffic is encrypted and decrypted before being sent and received, which leads to delay and even interruption in the data flow. The exchange of cryptographic keys between network devices plays a pivotal and critical role in facilitating secure communication and ensuring the integrity of cryptographic procedures. This essential process is time-consuming, which causes delay …
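The sketch below illustrates the layered ("onion") encryption idea that underlies this overhead: the client wraps the message once per relay and each relay peels exactly one layer. It is a toy model only; real Tor negotiates circuit keys with a dedicated protocol, whereas here Fernet symmetric keys and a three-relay path are assumed for demonstration.

```python
from cryptography.fernet import Fernet

# One assumed symmetric key per relay on a three-hop path.
relay_keys = [Fernet.generate_key() for _ in range(3)]

def wrap(message: bytes, keys) -> bytes:
    # The client encrypts for the exit relay first and the entry relay last.
    for key in reversed(keys):
        message = Fernet(key).encrypt(message)
    return message

def unwrap(cell: bytes, keys) -> bytes:
    # Each relay removes exactly one layer, in path order.
    for key in keys:
        cell = Fernet(key).decrypt(cell)
    return cell

cell = wrap(b"GET / HTTP/1.1", relay_keys)
print(unwrap(cell, relay_keys))   # b'GET / HTTP/1.1'
```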