This paper proposes a new estimator for the parameters of the linear regression model under Big Data conditions. The diversity of Big Data variables raises many challenges for researchers seeking new and novel methods to estimate the parameters of the linear regression model. The data were collected by the Central Statistical Organization of Iraq, and child labor in Iraq was chosen as the case study. Child labor is a vital phenomenon from which both society and education suffer, and it affects the future of the next generation. Two methods were selected to estimate the parameters of the linear regression model, including One Covariate at a Time Multiple Testing (OCMT). Moreover, the Euclidean distance was used as a comparison criterion among the methods.
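The abstract names the Euclidean distance as the comparison criterion between estimators. As a minimal, hypothetical sketch (the coefficient values below are illustrative, not from the paper), the criterion can be computed between an estimated coefficient vector and a benchmark vector:

```python
import numpy as np

# Hypothetical illustration: Euclidean distance between an estimated
# coefficient vector (beta_hat) and a reference vector (beta_ref),
# used as a criterion to compare competing estimators.
def euclidean_distance(beta_hat, beta_ref):
    return float(np.linalg.norm(np.asarray(beta_hat) - np.asarray(beta_ref)))

# Example with made-up coefficients:
d = euclidean_distance([1.0, 2.0, 3.0], [1.5, 2.0, 2.0])
# d = sqrt(0.25 + 0 + 1) = sqrt(1.25)
```

The estimator whose coefficients lie closest (smallest distance) to the benchmark would be preferred under this criterion.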
Ultimate oil recovery and displacement efficiency at the pore scale are controlled by rock wettability; thus, there is growing interest in the wetting behaviour of reservoir rocks, as production from fractured oil-wet or mixed-wet limestone formations has remained a key challenge. Conventional waterflooding methods are inefficient in such formations due to poor spontaneous imbibition of water into the oil-wet rock capillaries. However, altering the wettability to water-wet could yield significant amounts of additional oil; thus, this study investigates the influence of nanoparticles on wettability alteration. The efficiency of various formulated zirconium-oxide (ZrO2) based nanofluids at different nanoparticle concentrations (0
Numeral recognition is considered an essential preliminary step for optical character recognition, document understanding, and other applications. Although several handwritten numeral recognition algorithms have been proposed so far, achieving adequate recognition accuracy and execution time remains challenging to date. In particular, recognition accuracy depends on the feature-extraction mechanism. As such, a fast and robust numeral recognition method is essential, one that meets the desired accuracy by extracting the features efficiently while maintaining a fast implementation time. Furthermore, most existing studies to date have evaluated their methods only in clean environments, thus limiting understanding of their potential a
Carbon-fiber-reinforced polymer (CFRP) is widely acknowledged as a leading advanced structural material, offering superior properties compared to traditional materials, and has found diverse applications in several industrial sectors, such as automobiles, aircraft, and power plants. However, the production of CFRP composites is prone to fabrication problems, leading to structural defects arising from cycling and aging processes. Identifying these defects at an early stage is crucial to prevent service issues that could result in catastrophic failures. Hence, routine inspection and maintenance are essential to prevent system collapse. To achieve this objective, conventional nondestructive testing (NDT) methods are utilized to i
One of the principal concepts in understanding any hydrocarbon field is the heterogeneity scale; this becomes particularly challenging in supergiant oil fields with medium to low lateral connectivity and carbonate reservoir rocks.
The main objective of this study is to quantify the heterogeneity for any well in question and propagate it to the full reservoir. This is quite useful, particularly prior to conducting detailed waterflooding or full-field development studies, in order to prepare a design and exploitation plan that fits the level of heterogeneity of the formation.
Computer vision seeks to mimic the human visual system and plays an essential role in artificial intelligence. It is based on different signal preprocessing techniques; therefore, developing efficient techniques is essential to achieving fast and reliable processing. Various signal preprocessing operations have been used in computer vision, including smoothing, signal analysis, resizing, sharpening, and enhancement, to reduce unwanted distortions, perform segmentation, and improve image features. For example, to reduce the noise in a disturbed signal, smoothing kernels can be used effectively. This is achieved by convolving the disturbed signal with smoothing kernels. In addition, orthogonal moments (OMs) are a cruc
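The smoothing-by-convolution step described above can be sketched minimally in NumPy; the box (moving-average) kernel, kernel size, and test signal below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Sketch: denoise a 1-D signal by convolving it with a normalized
# box (moving-average) smoothing kernel. Kernel choice is an assumption.
def smooth(signal, kernel_size=5):
    kernel = np.ones(kernel_size) / kernel_size  # normalized smoothing kernel
    return np.convolve(signal, kernel, mode="same")

# Synthetic demonstration: a sine wave corrupted by Gaussian noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 200)
noisy = np.sin(t) + 0.3 * rng.standard_normal(t.size)
smoothed = smooth(noisy, kernel_size=9)
# The smoothed signal lies closer to the clean sine wave than the noisy input.
```

Larger kernels suppress more noise but also attenuate genuine signal detail, which is the usual trade-off when choosing the kernel size.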
ECG is an important tool for the primary diagnosis of heart diseases, as it shows the electrophysiology of the heart. In our method, a single maternal abdominal ECG signal is taken as the input signal, and the maternal P-QRS-T complexes of the original signal are averaged and repeated to form a reference signal. LMS and RLS adaptive filter algorithms are then applied. The results showed that the fetal ECGs were successfully detected. The accuracy on the DaISy database was up to 84% for LMS and 88% for RLS, while on PhysioNet it was up to 98% and 96% for LMS and RLS, respectively.
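The LMS stage of such an adaptive canceller can be sketched as follows; this is a generic textbook LMS sketch under assumed parameters (tap count, step size, synthetic signals), not the paper's implementation, and the "maternal"/"fetal" labels below are illustrative:

```python
import numpy as np

# Sketch of LMS noise cancellation (assumed parameters): the reference
# signal is adaptively filtered to match the primary signal; the error
# (residual) is the part of the primary the reference cannot explain --
# in the paper's setting, the fetal ECG left after cancelling the
# maternal complexes.
def lms_cancel(primary, reference, taps=8, mu=0.01):
    w = np.zeros(taps)
    residual = np.zeros_like(primary)
    for n in range(taps, len(primary)):
        x = reference[n - taps:n][::-1]  # most recent reference samples
        y = w @ x                        # filter output ("maternal" estimate)
        e = primary[n] - y               # residual ("fetal" estimate)
        w += 2 * mu * e * x              # LMS weight update
        residual[n] = e
    return residual

# Synthetic check: primary = scaled reference + small "fetal" component.
rng = np.random.default_rng(1)
ref = np.sin(np.linspace(0, 40 * np.pi, 4000))  # stand-in maternal reference
fetal = 0.1 * rng.standard_normal(4000)         # stand-in fetal component
primary = 0.8 * ref + fetal
res = lms_cancel(primary, ref)
# After convergence, the residual power approaches the fetal power.
```

RLS follows the same cancellation structure but replaces the gradient update with a recursive least-squares weight update, trading higher per-sample cost for faster convergence, which is consistent with the accuracy differences reported above.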