Crude oil is one of the most important sources of energy in the world. To extract its multiple components, we need oil refineries. Refineries consist of multiple parts, including heat exchangers, furnaces, and others. One of the initial operations in a refinery is gradually raising the temperature of the crude oil to 370 degrees centigrade or higher; hence, this investigation focuses on the furnaces and the corrosion in their tubes. The investigation was accomplished by measuring the thickness of the tubes every two years over the period from 2008 to 2020, counted from their introduction into service, where the thickness was measured at more than one point on each tube in the sa
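The biennial thickness readings described above support a standard wall-loss calculation. The sketch below shows the idea with invented thickness values and an assumed minimum allowable thickness; none of these numbers come from the study.

```python
# Hedged sketch: corrosion rate from periodic wall-thickness readings.
# All numeric values are illustrative assumptions, not data from the study.

def corrosion_rate(t_initial_mm, t_current_mm, years):
    """Average wall loss per year (mm/yr) over the inspection interval."""
    return (t_initial_mm - t_current_mm) / years

def remaining_life(t_current_mm, t_minimum_mm, rate_mm_per_yr):
    """Years until the wall reaches its minimum allowable thickness."""
    return (t_current_mm - t_minimum_mm) / rate_mm_per_yr

rate = corrosion_rate(8.0, 6.8, 12)    # 12 years of service (2008-2020)
life = remaining_life(6.8, 4.5, rate)  # 4.5 mm: assumed retirement thickness
print(f"rate = {rate:.2f} mm/yr, remaining life = {life:.1f} yr")
```

In practice the rate would be taken per measurement point, with the worst (highest) rate governing the tube's retirement date.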
Chromium-tanned leather wastes (CTLW) and vegetable-tanned leather wastes (VTLW) were used as adsorbent materials to remove Biebrich scarlet (BS), an anionic dye, from wastewater by an adsorption method. The effects of various factors, such as the weight of leather waste, shaking time, initial Biebrich scarlet concentration, temperature, and pH, were studied. The adsorption process was described using the Langmuir and Freundlich isotherm models. The obtained results agreed well with the Langmuir model, and the maximum adsorption capacities of CTLW and VTLW were 73.5294 and 78.1250 mg.g⁻¹, respectively, suggesting a monolayer adsorption process. The adsorption kinetics were found to follow a pseudo-second-order kinetic
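The Langmuir model invoked above has the form qe = qmax·KL·Ce / (1 + KL·Ce), saturating at the monolayer capacity qmax. A minimal sketch, using the qmax values reported in the abstract but an assumed placeholder value for the Langmuir constant KL:

```python
# Langmuir isotherm: qe = qmax * KL * Ce / (1 + KL * Ce).
# qmax values (73.5294, 78.1250 mg/g) are from the abstract;
# KL = 0.1 L/mg is an assumed illustrative value, not a fitted constant.

def langmuir_qe(ce, qmax, kl):
    """Equilibrium uptake qe (mg/g) at equilibrium concentration ce (mg/L)."""
    return qmax * kl * ce / (1.0 + kl * ce)

for qmax, label in [(73.5294, "CTLW"), (78.1250, "VTLW")]:
    # At high Ce the uptake approaches qmax (monolayer saturation).
    print(label, round(langmuir_qe(1000.0, qmax, 0.1), 2))
```

The saturation behaviour is what distinguishes the Langmuir (monolayer) picture from the unbounded Freundlich power law.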
Today, the success or failure of organizations depends on the wisdom of their managers, which is key to organizational success in the business environment: making the right decisions and creating the ability to work and think toward differentiating the organization's products and services. This research investigates the relationship between wisdom management and the differentiation strategy for service operations. That relationship was tested in light of the results of analyzing data collected through a questionnaire distributed to a sample of (98) general managers, heads of departments, and heads of divisions in the General Establishment of Civil Aviation. The research used descriptive st
Permeability data are of major importance and should be handled in all reservoir simulation studies. Their importance increases in mature oil and gas fields owing to the sensitivity of some specific improved-recovery requirements to permeability. However, the industry holds a huge stock of air permeability measurements against only a small number of liquid permeability values, because special core analysis is relatively expensive.
The current study suggests a correlation to convert air permeability data, conventionally measured during laboratory core analysis, into liquid permeability. This correlation provides a feasible estimate in cases of data loss, in poorly consolidated formations, or in cas
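The study's own correlation is not reproduced here. As a generic illustration of the air-to-liquid conversion problem, the classical Klinkenberg relation k_air = k_liq · (1 + b / p_mean) can be inverted when the gas-slippage factor b and mean test pressure are known; this is a stand-in technique, not the abstract's proposed correlation.

```python
# Klinkenberg-style gas-slippage correction (illustrative stand-in for the
# study's correlation). Inputs below are invented example values.

def liquid_perm_klinkenberg(k_air_md, b_psi, p_mean_psi):
    """Liquid-equivalent permeability (mD) from air permeability,
    slippage factor b, and mean pore pressure during the gas test."""
    return k_air_md / (1.0 + b_psi / p_mean_psi)

print(liquid_perm_klinkenberg(100.0, 5.0, 20.0))  # → 80.0
```

The correction matters most at low pressures and low permeabilities, where gas slippage inflates the measured air permeability the most.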
A non-stationary series is a persistent problem in statistical analysis: as some theoretical work has explained, the properties of statistical regression analysis are lost when non-stationary series are used, yielding the slope of a spurious relation. A non-stationary series can be made stationary by adding a time variable to the multivariate analysis to remove the general trend, by adding seasonal dummy variables to remove the seasonal effect, by converting the data to exponential or logarithmic form, or by applying the difference operator d times, in which case the series is said to be integrated of order d. The theoretical side of the research is organized in parts; the first part covers the research methodology ha
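The differencing step described above can be sketched directly: a series with a deterministic linear trend is non-stationary, but its first differences are constant, so it is integrated of order 1. The series below is synthetic.

```python
# Differencing to remove a trend: a series integrated of order d becomes
# stationary after d applications of the difference operator.

def difference(series, d=1):
    """Apply d-th order differencing to a list of observations."""
    for _ in range(d):
        series = [b - a for a, b in zip(series, series[1:])]
    return series

trended = [2 * t + 1 for t in range(6)]  # linear trend: 1, 3, 5, 7, 9, 11
print(difference(trended))               # → [2, 2, 2, 2, 2]
```

A quadratic trend would require d = 2, which is exactly the sense in which the order of integration counts the differencing passes needed.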
This research aims to analyze and simulate real biochemical test data to uncover the relationships among the tests and how each of them impacts the others. The data were acquired from an Iraqi private biochemical laboratory. However, these data have many dimensions, a high rate of null values, and a large number of patients. Several experiments were then applied to these data, beginning with unsupervised techniques such as hierarchical clustering and k-means, but the results were not clear. A preprocessing step was then performed to make the dataset analyzable by supervised techniques such as Linear Discriminant Analysis (LDA), Classification And Regression Tree (CART), Logistic Regression (LR), K-Nearest Neighbor (K-NN), Naïve Bayes (NB
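To make one of the supervised techniques named above concrete, here is a minimal K-Nearest Neighbor classifier on invented toy data; the study itself applied K-NN (alongside LDA, CART, LR, and Naïve Bayes) to the preprocessed laboratory dataset.

```python
# Minimal K-NN: classify a query point by the majority label of its
# k nearest training points (squared Euclidean distance). Toy data only.
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of (features, label) pairs; returns the majority label."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(train, key=lambda fl: dist(fl[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

train = [((0, 0), "low"), ((0, 1), "low"), ((5, 5), "high"), ((6, 5), "high")]
print(knn_predict(train, (5, 6)))  # → high
```

On real biochemical data, features would be the individual test results, and the preprocessing step (imputing the null values, scaling dimensions) matters precisely because distance-based methods like K-NN are sensitive to both.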
Visual analytics has become an important approach for discovering patterns in big data. As visualization already struggles with the high dimensionality of data, issues like a concept hierarchy on each dimension add more difficulty and make visualization a prohibitive task. The data cube offers multi-perspective aggregated views of large data sets and has important applications in business and many other areas. It has high dimensionality, concept hierarchies, a vast number of cells, and special exploration operations such as roll-up, drill-down, slicing, and dicing. All these issues make data cubes very difficult to explore visually. Most existing approaches visualize a data cube in 2D space and require preprocessing steps. In this paper, we propose a visu
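The roll-up operation mentioned above aggregates cube cells upward along a concept hierarchy (e.g. city → country). A minimal sketch with an invented sales measure:

```python
# Roll-up: aggregate (dimension_value, measure) cells to the parent level
# of a concept hierarchy. Cities, countries, and sales are invented.
from collections import defaultdict

def roll_up(cells, hierarchy):
    """Sum cell measures into their parent concept-hierarchy level."""
    totals = defaultdict(int)
    for value, measure in cells:
        totals[hierarchy[value]] += measure
    return dict(totals)

cells = [("Baghdad", 10), ("Basra", 5), ("Paris", 7)]
hierarchy = {"Baghdad": "Iraq", "Basra": "Iraq", "Paris": "France"}
print(roll_up(cells, hierarchy))  # → {'Iraq': 15, 'France': 7}
```

Drill-down is the inverse navigation (back to the finer level), which is why a cube viewer must keep, or be able to recompute, the detailed cells.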
In recent years, data centre (DC) networks have improved their rapid exchange abilities. Software-defined networking (SDN) was introduced to change the conception of conventional networks by segregating the control plane from the data plane. SDN overcomes the limitations of traditional DC networks caused by the rapidly increasing number of apps, websites, data storage needs, etc. Software-defined networking data centres (SDN-DC), based on the OpenFlow (OF) protocol, are used to achieve superior behaviour when executing traffic load-balancing (LB) jobs. The LB function divides the traffic-flow demands among the end devices to avoid link congestion. In short, SDN is proposed to manage more operative configur
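The LB function described above can be sketched as a greedy assignment: each new flow demand goes to the currently least-loaded path, keeping link loads even. The paths and demand sizes below are illustrative, not from the SDN-DC testbed.

```python
# Greedy load balancing: send each incoming flow demand to the path with
# the smallest current load. Demands and path count are invented.

def assign_flows(demands, n_paths):
    """Return per-path load totals after greedy least-loaded assignment."""
    loads = [0] * n_paths
    for d in demands:
        loads[loads.index(min(loads))] += d
    return loads

print(assign_flows([4, 3, 2, 2, 1], 2))  # → [6, 6]
```

In a real SDN-DC, the controller would implement this policy by installing OpenFlow rules per flow; the sketch only captures the balancing decision itself.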
The aim of this study is to estimate the parameters and reliability function of the Kumaraswamy distribution with its two positive parameters (a, b > 0), a continuous probability distribution that shares many characteristics with the beta distribution while offering extra advantages.
The shape of the density function of this distribution and its most important characteristics are explained, and the two parameters (a, b) and the reliability function are estimated using the maximum likelihood method (MLE) and Bayes methods. Simulation experiments are conducted to illustrate the behaviour of the estimation methods for different sample sizes, depending on the mean squared error criterion; the results show that the Bayes is bet
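For reference, the Kumaraswamy(a, b) distribution on (0, 1) has density f(x) = a·b·x^(a−1)·(1 − x^a)^(b−1) and reliability R(x) = (1 − x^a)^b. A minimal sketch of these quantities and the log-likelihood that the MLE maximizes, with illustrative parameter values:

```python
# Kumaraswamy(a, b) on (0, 1): pdf, reliability, and the log-likelihood
# that maximum likelihood estimation maximizes over (a, b).
# Parameter values below are illustrative.
import math

def kuma_pdf(x, a, b):
    return a * b * x ** (a - 1) * (1 - x ** a) ** (b - 1)

def kuma_reliability(x, a, b):
    """R(x) = P(X > x) = (1 - x**a)**b."""
    return (1 - x ** a) ** b

def log_likelihood(data, a, b):
    """Joint log-likelihood of an i.i.d. sample; MLE maximizes this."""
    return sum(math.log(kuma_pdf(x, a, b)) for x in data)

print(kuma_reliability(0.5, 2.0, 3.0))  # → (1 - 0.25)**3 = 0.421875
```

Note that both the cdf and the reliability function are available in closed form, which is one of the distribution's practical advantages over the beta distribution.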