A database is an organized collection of data, structured and distributed so that users can access the stored information easily and conveniently. However, in the era of big data, traditional data-analytics methods may be unable to manage and process such large volumes of data. To develop an efficient way of handling big data, this work studies the use of the MapReduce technique to handle big data distributed on the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The results provide EEG researchers and specialists with an easy and fast method of handling EEG big data.
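The abstract does not include the job code itself; the following is a minimal Python sketch of the map/reduce pattern it describes, applied to hypothetical per-channel EEG samples (channel names and values are illustrative, not from the paper):

```python
from collections import defaultdict

# Hypothetical EEG records: (channel_name, sample_value) pairs.
records = [("Fp1", 10.0), ("Fp1", 14.0), ("Cz", 3.0), ("Cz", 5.0)]

def map_phase(record):
    """Map step: emit a (key, partial) pair, here channel -> (sum, count)."""
    channel, value = record
    return channel, (value, 1)

def reduce_phase(pairs):
    """Reduce step: aggregate partials per key to get a per-channel mean."""
    acc = defaultdict(lambda: [0.0, 0])
    for key, (s, c) in pairs:
        acc[key][0] += s
        acc[key][1] += c
    return {k: s / c for k, (s, c) in acc.items()}

means = reduce_phase(map_phase(r) for r in records)
print(means)  # {'Fp1': 12.0, 'Cz': 4.0}
```

In a real Hadoop deployment the map and reduce functions run on separate nodes over partitioned data; this single-process sketch only shows the key-grouping logic.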
Reliable estimation of critical parameters such as hydrocarbon pore volume, water saturation, and recovery factor is essential for accurate reserve assessment. The inherent uncertainties associated with these parameters span a reasonable range of estimated recoverable volumes for single accumulations or projects. Incorporating this uncertainty range allows a comprehensive understanding of potential outcomes and associated risks. In this study, we focus on an oil field located in the northern part of Iraq and employ a Monte Carlo based petrophysical uncertainty modeling approach. This method systematically considers various sources of error and utilizes effective interpretation techniques. Leveraging the current state of a
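The study's actual parameter distributions are not given in the abstract; as a minimal sketch of Monte Carlo volumetric uncertainty modeling, the example below samples illustrative (not field-specific) ranges for the volumetric inputs and reports P90/P50/P10 hydrocarbon pore volume:

```python
import random
import statistics

random.seed(42)

# Illustrative parameter ranges (uniform for simplicity; real studies
# would fit distributions to log and core data).
N = 10_000
samples = []
for _ in range(N):
    grv = random.uniform(1.0e8, 2.0e8)   # gross rock volume, m^3
    ntg = random.uniform(0.6, 0.9)       # net-to-gross ratio
    phi = random.uniform(0.12, 0.22)     # porosity
    sw  = random.uniform(0.25, 0.45)     # water saturation
    samples.append(grv * ntg * phi * (1.0 - sw))  # hydrocarbon pore volume

samples.sort()
# Petroleum convention: P90 is the low-side (90% chance of exceeding).
p90 = samples[int(N * 0.10)]
p50 = samples[int(N * 0.50)]
p10 = samples[int(N * 0.90)]
print(f"P90={p90:.3e}  P50={p50:.3e}  P10={p10:.3e} m^3")
```

Reporting the P90-P10 spread rather than a single deterministic value is what allows the "range of estimated recoverable volumes" the abstract refers to.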
The last ten years have witnessed an enormous scientific shift in the way professionals deal with cost accounting. This shift resulted from the increasingly competitive environment that accompanied the emergence of a modern manufacturing environment, long productive lives, and advanced information technology that places a central focus on the customer, together with large-scale growth in global markets.
The research aims to define the concept of cost awareness, the concept and methods of strategic cost management, and the role of cost awareness for managers of industrial units in strategic cost management.
This paper presents an experimental study showing the influence of the geometric characteristics of vortex generators (VGs) on the thickness of the boundary layer (δ) and the drag coefficient (CD) of a flat plate. Vortex generators work effectively at medium and high angles of attack, since they are "hidden" under the boundary layer and are practically ineffective at low angles.
The height of VGs relative to the thickness of the boundary layer enables us to study the efficacy of VGs in delaying boundary layer separation. The distance between two VGs also has an effect on the boundary layer if we take into
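The abstract measures δ experimentally; as a point of reference for sizing VGs against the boundary layer, the classical laminar flat-plate (Blasius) estimate δ ≈ 5x/√Re_x can be sketched as below. The flow values are illustrative, and VG effects themselves are not modeled:

```python
import math

def blasius_delta(x, u_inf, nu):
    """Laminar flat-plate boundary-layer thickness: delta ~ 5 x / sqrt(Re_x)."""
    re_x = u_inf * x / nu          # local Reynolds number
    return 5.0 * x / math.sqrt(re_x)

# Example: air at ~20 C (nu ~ 1.5e-5 m^2/s), free stream 10 m/s,
# 0.5 m downstream of the leading edge.
delta = blasius_delta(x=0.5, u_inf=10.0, nu=1.5e-5)
print(f"delta = {delta * 1000:.2f} mm")  # ~4.3 mm
```

A VG whose height is a known fraction of such an estimate makes the height-to-δ ratio in the study concrete; turbulent boundary layers follow a different (thicker) growth law.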
A procedure for the mutual derivatization and determination of thymol and Dapsone was developed and validated in this study. Dapsone was used as the derivatizing agent for the determination of thymol, and thymol was used as the derivatizing agent for the determination of Dapsone. An optimization study was performed for the derivatization reaction, i.e., the diazonium coupling reaction. Linear regression calibration plots for thymol and Dapsone in the direct reaction were constructed at 460 nm, within the concentration ranges of 0.3-7 μg ml-1 for thymol and 0.3-4 μg ml-1 for Dapsone, with limits of detection of 0.086 and 0.053 μg ml-1, respectively. Corresponding plots for the cloud point extraction of thymol and Dapsone were constructed
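The calculation behind a calibration plot and its detection limit can be sketched as follows. The concentration/absorbance pairs here are hypothetical (not the paper's data), and the LOD is taken as 3.3σ/slope with σ estimated from the fit residuals, one common convention:

```python
import statistics

# Hypothetical calibration data: concentration (ug/ml) vs absorbance at 460 nm.
conc = [0.3, 1.0, 2.0, 3.0, 5.0, 7.0]
absb = [0.021, 0.070, 0.139, 0.212, 0.351, 0.489]

# Ordinary least-squares line: absorbance = slope * conc + intercept.
n = len(conc)
mx, my = statistics.fmean(conc), statistics.fmean(absb)
slope = sum((x - mx) * (y - my) for x, y in zip(conc, absb)) \
        / sum((x - mx) ** 2 for x in conc)
intercept = my - slope * mx

# Residual standard deviation of the fit, then LOD = 3.3 * sigma / slope.
residuals = [y - (slope * x + intercept) for x, y in zip(conc, absb)]
sigma = (sum(r * r for r in residuals) / (n - 2)) ** 0.5
lod = 3.3 * sigma / slope
print(f"slope={slope:.4f} AU per ug/ml, LOD={lod:.3f} ug/ml")
```

The limit of quantitation follows the same pattern with a factor of 10 instead of 3.3.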
Recommender systems are tools for making sense of the huge amount of data available on the internet. Collaborative filtering (CF) is one of the knowledge-discovery methods used most successfully in recommendation systems. Memory-based collaborative filtering emphasizes using facts about existing users to predict new items for the target user. Similarity measures are the core operations in collaborative filtering, and prediction accuracy depends largely on the similarity calculations. In this study, a combination of weighted parameters and traditional similarity measures is used to calculate relationships among users over the MovieLens dataset rating matrix. The advantages and disadvantages of each measure are identified. From the study, a n
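One of the traditional similarity measures the abstract refers to, cosine similarity between two users over their co-rated items, can be sketched as below. The ratings dictionary is a tiny hypothetical example in the MovieLens style, not the actual dataset:

```python
import math

# Hypothetical user -> {item: rating} matrix (MovieLens-style).
ratings = {
    "alice": {"m1": 5, "m2": 3, "m3": 4},
    "bob":   {"m1": 4, "m2": 3, "m4": 5},
    "carol": {"m2": 1, "m3": 2, "m4": 4},
}

def cosine_sim(u, v):
    """Cosine similarity computed over items both users have rated."""
    common = ratings[u].keys() & ratings[v].keys()
    if not common:
        return 0.0
    dot = sum(ratings[u][i] * ratings[v][i] for i in common)
    norm_u = math.sqrt(sum(ratings[u][i] ** 2 for i in common))
    norm_v = math.sqrt(sum(ratings[v][i] ** 2 for i in common))
    return dot / (norm_u * norm_v)

sim = cosine_sim("alice", "bob")
print(f"sim(alice, bob) = {sim:.3f}")
```

A weighted variant of the kind the study combines with such measures might, for example, down-weight similarities computed from very few co-rated items; the exact weighting scheme is not specified in the abstract.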
In this work, functionally graded materials were synthesized by a centrifugal technique at different volume fractions (0.5, 1, 1.5, and 2% Vf) with a rotation speed of 1200 rpm and a constant rotation time, T = 6 min. The mechanical properties were characterized to study the graded and non-graded nanocomposites and the pure epoxy material. The mechanical tests showed that the added alumina (Al2O3) nanoparticles enhanced the properties of both graded and non-graded composites relative to pure epoxy. The maximum difference in impact strength occurred in the FGM loaded from the nano-alumina-rich side, where the maximum value, at 1% Vf, was 133.33% of that of the epoxy side of the sample. The flexural strength and Young's modulus of the fu
The increasing sophistication of attacks necessitates innovative intrusion detection systems (IDS) to safeguard critical assets and data. The risk of cyberattacks such as data breaches and unauthorised access has grown as cloud services are used more frequently. The aim of this project is to determine how Artificial Intelligence (AI) can enhance an IDS's ability to classify network traffic and identify anomalous activity. An IDS is required to keep networks secure, and efficient IDS must also be built for the cloud platform, since it is constantly growing and permeating more aspects of our daily life. However, using standard intrusion
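The abstract does not state which detection model is used; as a deliberately simple sketch of the anomaly-detection idea it describes, the example below flags connections whose traffic volume deviates far from a baseline profile using a z-score threshold (all traffic figures are hypothetical):

```python
import statistics

# Hypothetical baseline: bytes transferred per connection in normal traffic.
baseline = [500, 520, 480, 510, 495, 505, 530, 490, 515, 500]
mu = statistics.fmean(baseline)
sigma = statistics.stdev(baseline)

def is_anomalous(bytes_transferred, threshold=3.0):
    """Flag a connection whose z-score against the baseline exceeds the threshold."""
    return abs(bytes_transferred - mu) / sigma > threshold

print(is_anomalous(505))     # typical connection -> False
print(is_anomalous(50_000))  # suspiciously large transfer -> True
```

An AI-based IDS replaces this single-feature threshold with a model trained over many features (ports, flags, durations, byte counts), but the underlying contrast between a learned "normal" profile and deviations from it is the same.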