Abstract
The research compared two methods for estimating the four parameters of the compound exponential Weibull-Poisson distribution: the maximum likelihood method and the Downhill Simplex algorithm. Two data cases were considered: the first assumed the original (uncontaminated) data, while the second assumed data contamination. Simulation experiments were conducted for different sample sizes, different initial parameter values, and different levels of contamination. The Downhill Simplex algorithm was found to be the best method for estimating the parameters, the probability function, and the reliability function of the compound distribution for both natural and contaminated data.
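As a hedged illustration of the workflow described above, the following Python sketch fits a multi-parameter distribution by minimizing a negative log-likelihood with the Downhill Simplex (Nelder-Mead) algorithm. Because the abstract does not reproduce the compound exponential Weibull-Poisson pdf, scipy's exponentiated Weibull (`exponweib`) is used purely as a stand-in, and the sample size and starting values are assumptions.

```python
# Hypothetical sketch: maximum-likelihood-style fitting via the Downhill Simplex
# (Nelder-Mead) algorithm. exponweib is a stand-in for the compound exponential
# Weibull-Poisson distribution, whose pdf is not given in the abstract.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)
true_params = (1.5, 2.0, 0.0, 1.0)            # a, c, loc, scale of the stand-in model
data = stats.exponweib.rvs(*true_params, size=500, random_state=rng)

def neg_log_likelihood(theta, x):
    a, c, scale = theta                        # loc fixed at 0 to keep the sketch short
    if a <= 0 or c <= 0 or scale <= 0:
        return np.inf                          # reject invalid parameter proposals
    return -np.sum(stats.exponweib.logpdf(x, a, c, loc=0.0, scale=scale))

# Downhill Simplex search started from rough initial values
result = optimize.minimize(neg_log_likelihood, x0=[1.0, 1.0, 1.0],
                           args=(data,), method="Nelder-Mead")
print("estimated (a, c, scale):", result.x)
```

Nelder-Mead is derivative-free, which is why it is often paired with likelihoods whose gradients are awkward to derive.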
Wireless sensor networks (WSNs) are emerging in various applications such as military, area monitoring, health monitoring, industry monitoring, and many more. A key challenge for a successful WSN application is energy consumption, since the small, portable batteries integrated into the sensor chips cannot be recharged easily from an economical point of view. This work focuses on prolonging the network lifetime of WSNs by reducing and balancing energy consumption during the routing process from a hop-number point of view. In this paper, a performance simulation was carried out between two protocols, LEACH, which uses a single-hop path, and MODLEACH, which uses a multi-hop path, on an Intel Core i3 CPU (2.13 GHz) laptop with MATLAB (R2014a).
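The hop-number trade-off mentioned above can be illustrated with the first-order radio energy model commonly used in LEACH-style studies. The sketch below is not the paper's MATLAB simulation; the energy constants, packet size, and distances are assumed values for illustration only.

```python
# Illustrative sketch (not the paper's simulation): first-order radio energy
# model comparing one long single hop against several shorter hops.
E_ELEC = 50e-9        # J/bit, electronics energy (assumed value)
EPS_AMP = 100e-12     # J/bit/m^2, amplifier energy for a d^2 path-loss model (assumed)

def tx_energy(bits, distance):
    """Energy to transmit `bits` over `distance` metres."""
    return E_ELEC * bits + EPS_AMP * bits * distance ** 2

def rx_energy(bits):
    """Energy to receive `bits`."""
    return E_ELEC * bits

def multi_hop_energy(bits, total_distance, hops):
    """Total energy when the path is split into `hops` equal segments."""
    d = total_distance / hops
    # every hop transmits once; each intermediate node also receives once
    return hops * tx_energy(bits, d) + (hops - 1) * rx_energy(bits)

packet = 4000          # bits per packet (assumed)
distance = 200.0       # metres from source to sink (assumed)
for hops in (1, 2, 4):
    print(hops, "hop(s):", multi_hop_energy(packet, distance, hops), "J")
```

With a quadratic path-loss term, splitting a long link into shorter hops reduces amplifier energy but adds per-hop electronics and reception costs, which is the balance routing protocols such as MODLEACH try to exploit.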
There is evidence that channel estimation plays a crucial role in recovering the transmitted data in communication systems. In recent years, there has been increasing interest in solving problems of channel estimation and equalization, especially when the channel impulse response follows a fast time-varying Rician fading distribution, meaning that the channel impulse response changes rapidly. Therefore, optimal channel estimation and equalization are required to recover the transmitted data. This paper attempts to compare the epsilon normalized least mean square (ε-NLMS) and recursive least squares (RLS) algorithms by evaluating their ability to track fast time-varying Rician fading channels with different Doppler values
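For readers unfamiliar with the two estimators being compared, the following minimal Python sketch shows the ε-NLMS and RLS updates tracking a toy time-varying channel. The filter length, step size, forgetting factor, and random-walk channel model are assumptions for illustration and do not reproduce the paper's Rician-fading setup.

```python
# Minimal sketch of the two adaptive estimators compared in the abstract,
# tracking a toy time-varying channel. All parameter values are assumptions.
import numpy as np

rng = np.random.default_rng(1)
N, taps = 2000, 4
x = rng.standard_normal(N)                     # known training symbols

def regressor(x, n, taps):
    """Most recent `taps` inputs up to time n, newest first, zero-padded."""
    u = np.zeros(taps)
    seg = x[max(0, n - taps + 1):n + 1][::-1]
    u[:len(seg)] = seg
    return u

# toy time-varying channel: taps drift as a slow random walk
h = np.cumsum(np.vstack([rng.standard_normal(taps),
                         0.01 * rng.standard_normal((N - 1, taps))]), axis=0)
d = np.array([h[n] @ regressor(x, n, taps) for n in range(N)])
d += 0.01 * rng.standard_normal(N)             # received signal with noise

def nlms(x, d, taps, mu=0.5, eps=1e-6):
    w, err = np.zeros(taps), np.zeros(len(x))
    for n in range(len(x)):
        u = regressor(x, n, taps)
        err[n] = d[n] - w @ u
        w += mu * err[n] * u / (eps + u @ u)   # ε-NLMS update
    return err

def rls(x, d, taps, lam=0.98, delta=100.0):
    w, err = np.zeros(taps), np.zeros(len(x))
    P = delta * np.eye(taps)
    for n in range(len(x)):
        u = regressor(x, n, taps)
        k = P @ u / (lam + u @ P @ u)          # gain vector
        err[n] = d[n] - w @ u
        w += k * err[n]                        # RLS update
        P = (P - np.outer(k, u @ P)) / lam
    return err

for name, algo in (("ε-NLMS", nlms), ("RLS", rls)):
    e = algo(x, d, taps)
    print(name, "MSE over last 500 samples:", np.mean(e[-500:] ** 2))
```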
Immune-mediated hepatitis is a severe threat to human health, and no effective treatment is currently available. Therefore, new, safe, low-cost therapies are desperately required. Berbamine (BE), a natural substance obtained primarily from
The Tanuma and Zubair formations are known as the most problematic intervals in the Zubair Oilfield, causing wellbore instability due to possible shale-fluid interaction. This results in a vast loss of time spent dealing with various downhole problems (e.g., stuck pipe), which increases the overall well cost through their consequences (e.g., fishing and sidetracking). This paper aims to test shale samples with various laboratory tests for shale evaluation and drilling mud development. The shale's physical properties are described using a stereomicroscope, and its structures are observed with a scanning electron microscope. The shale reactivity and behavior are analyzed using cation exchange capacity testing, and the capillary suction test is
This study dealt with the state of the groundwater before sulfur production in the Al-Mishraq field-1 and after production stopped in the field, by measuring the groundwater table for 44 wells in 2021 and comparing it with the groundwater table measured by the Polish company Centrozap in 1971. The groundwater table ranged between 187.71 m and 205.8 m in 1971, whereas in 2021 it ranged between 188.51 m and 196.55 m.
Maps of the groundwater movement and water table were created using these data. They show little change in the direction of groundwater flow; in both cases, the flow is from the west and northwest towards the east, with a slight slope toward the southeast and the Tigris River. As for the hydraulic properties, it w
Feature selection (FS) constitutes a series of processes used to decide which relevant features/attributes to include and which irrelevant features to exclude for predictive modeling. It is a crucial task that helps machine learning classifiers reduce error rates, computation time, and overfitting, and improve classification accuracy. It has demonstrated its efficacy in numerous domains, ranging from text classification (TC) and text mining to image recognition. While there are many traditional FS methods, recent research efforts have been devoted to applying metaheuristic algorithms as FS techniques for the TC task. However, there are few literature reviews concerning TC. Therefore, a comprehensive overview was systematically
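As a generic illustration of the kind of metaheuristic wrapper FS technique surveyed here (not a specific method from the review), the sketch below runs a small binary genetic algorithm that searches feature subsets and scores each subset with a cross-validated classifier. The synthetic dataset, classifier, and GA settings are arbitrary assumptions.

```python
# Generic illustration of metaheuristic wrapper feature selection: a tiny
# binary genetic algorithm whose fitness is cross-validated accuracy on the
# selected features. All settings here are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
X, y = make_classification(n_samples=300, n_features=30, n_informative=8,
                           random_state=0)

def fitness(mask):
    if not mask.any():
        return 0.0                                        # empty subsets are useless
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

pop = rng.integers(0, 2, size=(20, X.shape[1]), dtype=bool)   # initial population
for generation in range(15):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]                    # keep the best half
    children = []
    for _ in range(len(pop) - len(parents)):
        a, b = parents[rng.integers(len(parents), size=2)]
        cut = rng.integers(1, X.shape[1])                      # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(X.shape[1]) < 0.05                   # bit-flip mutation
        children.append(child ^ flip)
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected features:", np.flatnonzero(best), "accuracy:", fitness(best))
```

The same wrapper structure applies when the search operator is swapped for other metaheuristics (e.g., particle swarm or grey wolf optimizers) commonly applied to text classification.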