Empirical and statistical methodologies have been established to achieve accurate permeability identification and reservoir characterization based on rock type and reservoir performance. Rock facies are usually identified either by visually interpreting lithofacies from core analysis or indirectly from well-log data. Traditional facies prediction from well-log data carries uncertainty and can be time-consuming, particularly for large datasets; machine learning can detect such patterns more efficiently at that scale. Taking the electrofacies distribution into account, this work predicts permeability for four wells, FH1, FH2, FH3, and FH19, in the Yamama reservoir of the Faihaa Oil Field, southern Iraq. The framework first calculates permeability for uncored wells using both the classical porosity-based method and the Flow Zone Indicator (FZI) method. Topological mapping of the input space into clusters is then achieved with the self-organizing map (SOM), an unsupervised machine-learning technique: leveraging data from the four wells, the SOM forecasts the number of electrofacies present within the reservoir. According to the findings, permeability calculated with the classical method, which relies exclusively on porosity, departs from the measured values because of the heterogeneity of carbonate reservoirs, whereas the FZI method tracks the measured values more closely and offers the best correlation coefficient. The SOM model and cluster analysis then reveal five distinct electrofacies groups.
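The FZI workflow mentioned above follows the standard hydraulic flow unit relations (RQI = 0.0314·sqrt(k/φ), φz = φ/(1−φ), FZI = RQI/φz). The sketch below, using hypothetical porosity and permeability values rather than the Faihaa well data, shows how permeability can be back-predicted once an FZI value is assigned; it illustrates the general method only, not the authors' actual workflow.

```python
import numpy as np

def fzi(phi, k):
    """Flow Zone Indicator from porosity (fraction) and permeability (mD)."""
    rqi = 0.0314 * np.sqrt(k / phi)   # Reservoir Quality Index, microns
    phi_z = phi / (1.0 - phi)         # normalized porosity
    return rqi / phi_z

def perm_from_fzi(phi, fzi_val):
    """Invert the FZI relation to predict permeability (mD) from porosity."""
    return (fzi_val ** 2) * phi ** 3 / (0.0314 ** 2 * (1.0 - phi) ** 2)

# Hypothetical core points: porosity (fraction) and permeability (mD)
phi = np.array([0.12, 0.18, 0.25])
k = np.array([5.0, 40.0, 300.0])

f = fzi(phi, k)                 # FZI of each sample
k_back = perm_from_fzi(phi, f)  # round trip recovers the input permeability
```

The round trip is exact by construction, which is what makes FZI useful: samples with similar FZI share one permeability-porosity transform.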
Ensuring reliable data transmission in a Network on Chip (NoC) is one of the most challenging tasks, especially in noisy environments, as crosstalk, interference, and radiation have increased with manufacturers' growing tendency to reduce area, increase frequencies, and reduce voltages. Many Error Control Codes (ECC) have therefore been proposed, with different error detection and correction capacities and varying degrees of complexity. The Code with Crosstalk Avoidance and Error Correction (CCAEC) for network-on-chip interconnects uses simple parity check bits as its main technique to achieve high error correction capacity. In this work, this coding scheme corrects up to 12 random errors, representing a high correction capacity.
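CCAEC's reliance on parity check bits rests on the basic parity principle: one appended bit makes the count of 1s even, so any single flipped bit becomes detectable. The minimal sketch below illustrates that principle only, not the CCAEC scheme itself.

```python
def add_parity(bits):
    """Append one even-parity bit so the total count of 1s is even."""
    return bits + [sum(bits) % 2]

def parity_ok(codeword):
    """A single flipped bit makes the count of 1s odd, exposing the error."""
    return sum(codeword) % 2 == 0

word = [1, 0, 1, 1]
sent = add_parity(word)      # three 1s -> parity bit 1 is appended
corrupted = sent.copy()
corrupted[2] ^= 1            # one bit flipped in transit
```

A single parity bit only detects an odd number of errors; multi-bit correction schemes such as CCAEC combine many such checks so that the pattern of failing checks locates the erroneous bits.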
The quality of the auditing profession sits at the top of the concerns of the international business community and international institutions, particularly following the impact of the several failures and financial hardships suffered by major companies in the recent collapse of money markets in some countries of the world, and the fear of their recurrence in the future. An observer of local and international rules and standards (or principles) finds that they carry implications with direct or indirect effects on the performance of the accountant and auditor, who should raise their professional performance in these services to a high level of quality in line with those requirements and principles.
Developing a future mechanism for sustainable development in Iraq that meets current and future challenges requires an analysis of the indicators of sustainable development. This research presents and analyzes the social care aspect and highlights the important role of taxes, with a focus on social sustainable development, to determine the extent and direction of change and the level of progress. The researcher concludes that the financial allocations to the Ministry of Labor and Social Affairs are weak relative to the large number of people who meet the conditions and controls, and recommends the participation of all segments of society, across the public and private sectors.
Error control schemes have become a necessity in network-on-chip (NoC) design to improve reliability, as on-chip interconnect errors increase with the continuous shrinking of geometry. Accordingly, many researchers are trying to devise multi-bit error correction coding schemes that deliver high error correction capability with the simplest possible design, to minimize area and power consumption. A recent scheme, Multi-bit Error Correcting Coding with Reduced Link Bandwidth (MECCRLB), showed a huge reduction in area and power consumption compared to a well-known scheme, namely Hamming product code (HPC) with Type-II HARQ. Moreover, its authors showed that the scheme can correct 11 random errors, which is considered a high correction capability.
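HPC builds a product of component codes such as Hamming(7,4), protecting rows and columns of a data grid. As an illustration of the component code only (not MECCRLB or the full product construction), the sketch below encodes 4 data bits with 3 parity bits and corrects a single flipped bit via the syndrome.

```python
def hamming74_encode(d):
    """Hamming(7,4): 3 parity bits protect 4 data bits (single-error correction)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """The syndrome equals the 1-based position of a single flipped bit."""
    c = c[:]
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s4
    if pos:
        c[pos - 1] ^= 1        # correct the erroneous bit
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
cw = hamming74_encode(data)
cw[4] ^= 1                     # inject a single error on the link
recovered = hamming74_decode(cw)
```

A product of two such codes corrects one error per row and per column, which is how HPC reaches multi-bit correction at the cost of many parity bits.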
The Weibull distribution belongs to the family of Generalized Extreme Value (GEV) distributions and plays a crucial role in modeling extreme events in various fields, such as hydrology, finance, and the environmental sciences. Bayesian methods play a strong, decisive role in estimating the parameters of the GEV distribution due to their ability to incorporate prior knowledge and handle small sample sizes effectively. In this research, several shrinkage Bayesian estimation methods based on the squared error and the linear exponential (LINEX) loss functions are adopted and compared by the Monte Carlo simulation method. The performance of these methods is assessed by their accuracy and computational efficiency in estimation.
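As a simplified illustration of how the two loss functions yield different Bayes estimators, the sketch below uses a conjugate exponential-gamma model (not the GEV model studied here): under squared error loss the estimator is the posterior mean, while under LINEX loss with parameter c it is -(1/c)·log E[exp(-c·theta)], which for a Gamma posterior has a closed form.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: exponential data with rate theta, Gamma(a0, b0) prior.
theta_true, n = 2.0, 50
a0, b0 = 1.0, 1.0
x = rng.exponential(1.0 / theta_true, size=n)

# Conjugate posterior: Gamma(a0 + n, b0 + sum(x))  (rate parameterization)
a_post, b_post = a0 + n, b0 + x.sum()

# Squared error loss -> posterior mean
theta_sel = a_post / b_post

# LINEX loss with parameter c -> -(1/c) * log E[exp(-c * theta)]
# For a Gamma(a, b) posterior this is (a / c) * log(1 + c / b).
c = 0.5
theta_linex = (a_post / c) * np.log(1.0 + c / b_post)
```

With c > 0 the LINEX estimator penalizes overestimation and so sits below the posterior mean, which is the shrinkage effect the loss function is chosen for.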
In this paper, we present a comparison of double informative priors assumed for the parameter of the inverted exponential distribution. To estimate the parameter by Bayes estimation, two different kinds of information are used: two different priors are selected for the parameter, with the Chi-squared - Gamma, Chi-squared - Erlang, and Gamma - Erlang distributions assumed as double priors. The results are the derivations of these estimators under the squared error loss function with the three different double priors; the maximum likelihood estimation method is additionally considered for comparison.
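To illustrate the mechanics behind such derivations, the sketch below uses a single gamma prior (not the paper's double-prior constructions) for the inverted exponential parameter: the likelihood is proportional to theta**n * exp(-theta * sum(1/x_i)), so a Gamma(a, b) prior yields a Gamma(a + n, b + sum(1/x_i)) posterior, whose mean is the squared-error-loss Bayes estimator.

```python
import numpy as np

rng = np.random.default_rng(1)

# Inverted exponential: if Y ~ Exp(rate theta) then X = 1/Y has density
# f(x; theta) = (theta / x**2) * exp(-theta / x),  x > 0.
theta_true, n = 1.5, 80
x = 1.0 / rng.exponential(1.0 / theta_true, size=n)

# Gamma(a, b) prior is conjugate: posterior is Gamma(a + n, b + sum(1/x)).
a, b = 2.0, 1.0
s = (1.0 / x).sum()
theta_bayes = (a + n) / (b + s)   # posterior mean = SEL Bayes estimator
theta_mle = n / s                 # maximum likelihood, for contrast
```

As n grows the prior's contribution (a, b) is swamped by the data, so the Bayes and maximum likelihood estimates converge.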
The purpose of this work is to use a development technique requiring a mathematical procedure of high quality and sufficiency for solving complex problems, namely dynamic programming with recursive methods (forward and backward), by finding a series of associated decisions for the reliability function of the Pareto distribution, estimated using two approaches, maximum likelihood and the method of moments, to conclude an optimal policy.
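The two estimation approaches named above can be sketched for the Pareto reliability function R(t) = (x_m/t)**alpha: maximum likelihood gives alpha_hat = n / sum(log(x_i / x_m)), while the method of moments solves the mean relation mean = alpha * x_m / (alpha - 1). The values below are hypothetical, and the dynamic-programming recursion itself is not shown.

```python
import numpy as np

rng = np.random.default_rng(2)

# Pareto(scale xm, shape alpha): reliability R(t) = (xm / t)**alpha, t >= xm.
xm, alpha_true, n = 1.0, 3.0, 200
# numpy's pareto draws Lomax samples; add 1 and scale for classical Pareto.
x = xm * (1.0 + rng.pareto(alpha_true, size=n))

alpha_mle = n / np.log(x / xm).sum()     # maximum likelihood estimator
alpha_mom = x.mean() / (x.mean() - xm)   # method of moments (valid for alpha > 1)

def reliability(t, alpha):
    """Probability of survival beyond time t under the fitted Pareto model."""
    return (xm / t) ** alpha

r_mle = reliability(2.0, alpha_mle)
r_mom = reliability(2.0, alpha_mom)
```

Either estimate of alpha plugs into R(t), giving the reliability values that a forward or backward recursion would then optimize over.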
In this study, the researcher analyzes the content of the physics textbook for the 3rd intermediate grade against the criteria for designing and producing infographics. The research community consists of the content of the physics textbook for the 3rd intermediate grade for the academic year 2021-2022. The researcher adopted an analysis instrument comprising a number of criteria for designing and producing infographics. The results revealed randomness in the percentages of the criteria included in the content of the textbook, which were not compatible with the criteria proposed by the experts.