Penalized regression models have received considerable attention in variable selection, which plays an essential role in dealing with high-dimensional data. The arctangent (Atan) penalty has recently been used as an efficient method for both estimation and variable selection. However, the Atan penalty is very sensitive to outliers in the response variable and to heavy-tailed error distributions, whereas least absolute deviation (LAD) is a good method for obtaining robustness in regression estimation. The specific objective of this research is to propose a robust Atan estimator that combines these two ideas. Simulation experiments and real-data applications show that the proposed LAD-Atan estimator has superior performance compared with other estimators.
The physical and elastic characteristics of rocks generally determine rock strength. Rock strength is frequently assessed using porosity well logs such as neutron and sonic logs. The essential parameters for estimating rock mechanics in petroleum engineering research are uniaxial compressive strength and elastic modulus. Indirect estimation from well-log data is necessary to measure these variables. This study attempts to create a single regression model that can accurately forecast rock mechanical characteristics for the Harth Carbonate Formation in the Fauqi oil field. According to the findings of this study, petrophysical parameters are reliable indexes for determining rock mechanical properties, having good performance …
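The abstract does not give the fitted model; the following is a generic illustration of the approach it describes, fitting a multiple linear regression of uniaxial compressive strength (UCS) on well-log predictors. All data and coefficients here are synthetic and hypothetical, not the Harth Formation values:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 60
phi = rng.uniform(0.05, 0.30, n)   # neutron porosity (fraction), synthetic
dt = rng.uniform(50, 100, n)       # sonic transit time (us/ft), synthetic
# Hypothetical ground truth: UCS (MPa) falls with porosity and transit time.
ucs = 120 - 200 * phi - 0.5 * dt + rng.normal(scale=3, size=n)

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(n), phi, dt])
coef, *_ = np.linalg.lstsq(A, ucs, rcond=None)
pred = A @ coef
r2 = 1 - np.sum((ucs - pred) ** 2) / np.sum((ucs - ucs.mean()) ** 2)
print(np.round(coef, 2), round(r2, 3))
```

In practice the predictors would come from the wells' log curves and the response from core tests, and model quality would be judged on held-out wells rather than in-sample R².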
A metal mandrel was designed for manufacturing the cathodes of a high-power electron tube (tetrode) used in broadcasting transmitting tubes of types TH558 and CQS200. The cathodes were manufactured in the present work from thoriated tungsten wires (2% ThO2-W) with different diameters. These cathodes were carbonized in a sequence of processes to determine the carbonization parameters (temperature, pressure, time, current, and voltage). The dimensions of the carbonized cathodes were then accurately measured to determine the deviation due to the high-temperature distortion effect at about 1800 °C. Cathodes distorted by the carbonization process were treated by placing them inside the vacuum chamber and heat treating them again. The carbonized cat…
The use of parametric models and the associated estimation methods requires that many primary conditions be met for those models to represent the population under study adequately. This has prompted researchers to search for models more flexible than parametric ones, namely nonparametric models.
In this manuscript, the so-called Nadaraya-Watson (NW) estimator was compared in two cases (fixed versus variable bandwidth) through simulation with different models and sample sizes. The simulation results showed that for the first and second models, NW with a fixed bandwidth was preferred …
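For reference, the fixed-bandwidth Nadaraya-Watson estimator is the kernel-weighted average m(x0) = sum_i K((x0 - x_i)/h) y_i / sum_i K((x0 - x_i)/h). A minimal sketch with a Gaussian kernel on synthetic data (the regression function, bandwidth, and sample size below are illustrative, not the manuscript's simulation designs):

```python
import numpy as np

def nw_estimate(x0, x, y, h):
    """Nadaraya-Watson estimate at x0 with a Gaussian kernel, fixed bandwidth h."""
    w = np.exp(-0.5 * ((x0 - x) / h) ** 2)   # kernel weights
    return np.sum(w * y) / np.sum(w)         # locally weighted mean of y

rng = np.random.default_rng(1)
x = rng.uniform(0, 1, 200)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.2, size=200)

est = nw_estimate(0.25, x, y, h=0.05)
print(round(est, 3))  # should be near sin(pi/2) = 1
```

A variable-bandwidth version would replace the scalar h with a local h(x0), e.g. based on the distance to the k-th nearest design point.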
Today, problems of spatial data integration have been further complicated by the rapid development of communication technologies and the increasing number of data sources available on the World Wide Web. Web-based geospatial data sources can be managed by different communities, and the data themselves can vary in quality, coverage, and purpose. Integrating such multiple geospatial datasets remains a challenge for geospatial data consumers. This paper concentrates on the integration of geometric and classification schemes for official data, such as Ordnance Survey (OS) national mapping data, with volunteered geographic information (VGI), such as data derived from the OpenStreetMap (OSM) project. Useful descriptions o…
Big data analysis is essential for modern applications in areas such as healthcare, assistive technology, intelligent transportation, and environment and climate monitoring. Traditional algorithms in data mining and machine learning do not scale well with data size. Mining and learning from big data require time- and memory-efficient techniques, albeit at the cost of a possible loss in accuracy. We have developed a data aggregation structure to summarize data with a large number of instances and data generated from multiple sources. Data are aggregated at multiple resolutions, and the resolution provides a trade-off between efficiency and accuracy. The structure is built once, updated incrementally, and serves as a common data input for multiple mining an…
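The abstract does not detail the structure; one plausible minimal sketch of multi-resolution aggregation is a pyramid of per-bin (count, sum) summaries, where each coarser level merges adjacent bins of the finer one. All names and parameters here are hypothetical:

```python
import numpy as np

def aggregate_levels(x, base_bins=8, levels=3):
    """Summarize data as (count, sum) per bin at several resolutions.

    Coarser levels are derived from the finest level, not from raw data,
    so the pyramid is built in one pass over x. Bin means (sum/count)
    get less accurate but cheaper to store as resolution decreases.
    """
    lo, hi = x.min(), x.max()
    counts, _ = np.histogram(x, bins=base_bins, range=(lo, hi))
    sums, _ = np.histogram(x, bins=base_bins, range=(lo, hi), weights=x)
    pyramid = [(counts, sums)]
    for _ in range(levels - 1):
        # Merge adjacent bins: counts and sums are additive.
        counts = counts.reshape(-1, 2).sum(axis=1)
        sums = sums.reshape(-1, 2).sum(axis=1)
        pyramid.append((counts, sums))
    return pyramid

x = np.linspace(0.0, 1.0, 64)
pyr = aggregate_levels(x)
# Every level preserves the total count and total sum of the data.
```

Because counts and sums are additive, the same merge step also supports incremental updates: new instances only touch the finest level, and coarser levels can be refreshed lazily.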
The purpose of this paper is to apply robustness in linear programming (LP) to eliminate the uncertainty problem in constraint parameters and find the robust optimal solution that maximizes the profits of the General Productive Company of Vegetable Oils for the year 2019. A linear programming model in which some parameters have uncertain values is modified and processed using the robust counterpart of linear programming, so that the results are protected against random changes in the uncertain values, assuming these values belong to an uncertainty set and selecting the values that cause the worst results …
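As a toy illustration of the worst-case idea (the numbers are invented, not the company's data): with nonnegative decision variables and interval uncertainty a_ij ± delta_ij in a "<=" constraint, the box robust counterpart simply replaces each coefficient by its largest value a_ij + delta_ij.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: profit per unit of two products and resource usage.
c = np.array([40.0, 30.0])           # objective: maximize c @ x
A = np.array([[2.0, 1.0],            # nominal constraint coefficients
              [1.0, 3.0]])
b = np.array([100.0, 90.0])
delta = 0.1 * A                      # coefficients may deviate up to 10%

# Box robust counterpart: with x >= 0, the worst case of
# (A + D) @ x <= b over |D| <= delta is attained at A + delta.
A_robust = A + delta

nominal = linprog(-c, A_ub=A, b_ub=b, bounds=[(0, None)] * 2)
robust = linprog(-c, A_ub=A_robust, b_ub=b, bounds=[(0, None)] * 2)
print(-nominal.fun, -robust.fun)  # robust profit is lower but guaranteed feasible
```

The price of robustness is visible directly: the robust optimum cannot exceed the nominal one, but it remains feasible for every coefficient realization in the uncertainty set. Ellipsoidal or budgeted uncertainty sets give less conservative counterparts than this box model.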
This article aims to explore the importance of estimating a semiparametric regression function. We suggest a new estimator alongside other combined estimators and then compare them using simulation. The simulation results show that the suggested estimator is the best for the first and second models, whereas for the third model the Burman and Chaudhuri (B&C) estimator is the best.
This research aims to analyze and simulate real biochemical test data to uncover the relationships among the tests and how each of them impacts the others. The data were acquired from an Iraqi private biochemical laboratory. However, these data have many dimensions, a high rate of null values, and a large number of patients. Several experiments were applied to these data, beginning with unsupervised techniques such as hierarchical clustering and k-means, but the results were not clear. A preprocessing step was then performed to make the dataset analyzable by supervised techniques such as Linear Discriminant Analysis (LDA), Classification And Regression Tree (CART), Logistic Regression (LR), K-Nearest Neighbor (K-NN), and Naïve Bayes (NB…
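Of the supervised techniques listed, K-NN is the simplest to sketch. A minimal majority-vote implementation on synthetic two-class data (standing in for the patient records, which are not available here):

```python
import numpy as np

rng = np.random.default_rng(3)
# Two well-separated synthetic clusters as stand-ins for two diagnostic classes.
class0 = rng.normal([0.0, 0.0], 0.5, size=(30, 2))
class1 = rng.normal([3.0, 3.0], 0.5, size=(30, 2))
X = np.vstack([class0, class1])
y = np.array([0] * 30 + [1] * 30)

def knn_predict(query, X, y, k=5):
    """Majority vote among the k nearest training points (Euclidean distance)."""
    d = np.linalg.norm(X - query, axis=1)
    nearest = y[np.argsort(d)[:k]]
    return np.bincount(nearest).argmax()

print(knn_predict(np.array([0.2, -0.1]), X, y))  # near the class-0 cluster
print(knn_predict(np.array([2.8, 3.1]), X, y))  # near the class-1 cluster
```

On data like the abstract describes, the preprocessing step matters: K-NN is distance-based, so null values must be imputed and features scaled before distances are meaningful.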
Prediction of formation pore and fracture pressure before constructing a drilling well program is crucial, since it helps prevent several drilling issues including lost circulation, kick, pipe sticking, and blowout. IP (Interactive Petrophysics) software is used to calculate pore and fracture pressure. The Eaton, Matthews and Kelly, Modified Eaton, and Barker and Wood equations are used to calculate fracture pressure, whereas only the Eaton method is used to estimate pore pressure. These approaches are based on log data obtained from six wells: three from the north dome (BUCN-52, BUCN-51, BUCN-43) and three from the south dome (BUCS-49, BUCS-48, BUCS-47). Along with the overburden pressur…
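For context, Eaton's sonic pore-pressure relation and fracture-gradient equation can be sketched as follows. The input values below (gradients in psi/ft, transit times in us/ft, Poisson's ratio) are illustrative, not the Buzurgan wells' data:

```python
def eaton_pore_gradient(obg, pn, dt_n, dt, exponent=3.0):
    """Eaton sonic method: Pp = OBG - (OBG - Pn) * (dt_normal / dt_observed)**3.

    Observed transit time dt slower than the normal-compaction trend dt_n
    indicates undercompaction, i.e. pore pressure above the normal gradient Pn.
    """
    return obg - (obg - pn) * (dt_n / dt) ** exponent

def eaton_fracture_gradient(obg, pp, nu):
    """Eaton fracture gradient: FG = (nu / (1 - nu)) * (OBG - Pp) + Pp."""
    return (nu / (1 - nu)) * (obg - pp) + pp

pp = eaton_pore_gradient(obg=1.0, pn=0.465, dt_n=90.0, dt=110.0)
fg = eaton_fracture_gradient(obg=1.0, pp=pp, nu=0.3)
print(round(pp, 3), round(fg, 3))
```

The ordering Pn < Pp < FG < OBG is the expected mud-weight window: the drilling fluid gradient must sit above the pore-pressure gradient but below the fracture gradient.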