A database is an organized collection of data, structured and distributed so that users can access the stored information easily and conveniently. However, in the era of big data, traditional data-analytics methods may be unable to manage and process such large volumes of data. To develop an efficient way of handling big data, this work studies the use of the MapReduce technique on big data distributed in the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The obtained results provide EEG researchers and specialists with an easy and fast method of handling EEG big data.
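The MapReduce pattern the abstract refers to can be sketched in plain Python: a map phase emits key-value pairs, a shuffle groups them by key, and a reduce phase aggregates each group. This is only a minimal single-machine simulation, not the authors' Hadoop implementation; the EEG channel names and amplitude values are hypothetical.

```python
# Minimal single-process sketch of MapReduce (a real deployment would
# distribute the map and reduce tasks across a Hadoop cluster).
from collections import defaultdict

def map_phase(records):
    # Each record is (channel, amplitude); emit one (key, value) pair per sample.
    for channel, amplitude in records:
        yield channel, amplitude

def shuffle(pairs):
    # Group all values by key, as the framework does between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce each channel's samples to a mean amplitude.
    return {ch: sum(vals) / len(vals) for ch, vals in groups.items()}

# Hypothetical EEG samples: (electrode, microvolts).
eeg_records = [("Fp1", 10.0), ("Fp1", 14.0), ("Cz", 5.0), ("Cz", 7.0)]
means = reduce_phase(shuffle(map_phase(eeg_records)))
```

The same mapper and reducer bodies would be the user-supplied functions in a Hadoop Streaming or mrjob job; only the shuffle is handled by the framework there.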
The research deals with two very important subjects: computer-aided process planning (CAPP) and product quality, with its dimensions as defined by the producing organization. The goal of the research is to highlight the role of CAPP technology in improving the quality of a product (a rotor) in the engines factory of the General Company for Electrical Industries. The research follows a case-study approach, with direct visits by the researcher to the work site to apply the operational paths generated by a specialized computer program designed by the researcher. The research is divided into four axes: the first concerns the general structure of the research, the second the theoretical review, the t
The research examines the mechanism of applying (ISO 21001:2018) in the Energy Branch, Electromechanical Engineering, at the University of Technology to achieve quality of the educational service and to prepare the branch to obtain the certificate of conformity with the requirements of (ISO 21001:2018). The necessary data were collected using the checklist of (ISO 21001:2018), field interviews, and the records of the concerned department. The researchers reached a number of results, the most prominent of which were the leaders' adoption of high-quality leadership and their willingness to implement the standard's requirements, and that the university has a basic structure that qualifies it to implement the international standard, as
The Six Sigma methodology helps reduce defects by solving problems effectively, while Lean works to reduce losses through the flow of the manufacturing process. When these two methodologies (Lean and Six Sigma) are integrated, Lean Six Sigma forms an entry point for process optimization, increasing quality and reducing lead times and costs by focusing on the needs of the customer. This process uses statistical tools and techniques to analyze and improve processes.
We conducted this research in the General Company for Electrical Industries and adopted its product (a three-tap water-cooling machine) as the research sample. In order to determine t
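The statistical side of Six Sigma typically starts from a defect count. As a hedged illustration (not taken from the paper), the sketch below computes defects per million opportunities (DPMO) and converts it to a sigma level using the conventional 1.5-sigma shift; the defect and unit counts are purely illustrative.

```python
# Standard Six Sigma metrics: DPMO and sigma level (with the customary
# 1.5-sigma long-term shift). All numbers below are made up for illustration.
from statistics import NormalDist

def dpmo(defects, units, opportunities_per_unit):
    # Defects per million opportunities.
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_level(dpmo_value):
    # Convert long-term DPMO to short-term sigma via the inverse normal CDF.
    return NormalDist().inv_cdf(1 - dpmo_value / 1_000_000) + 1.5

d = dpmo(defects=5, units=1000, opportunities_per_unit=3)
s = sigma_level(d)
```

In a Lean Six Sigma project these figures would be computed before and after the improvement cycle to quantify the defect reduction.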
Digital tampering identification, which detects image manipulation, is a significant area of image-analysis research. Over the last five years this area has achieved exceptional precision using machine-learning and deep-learning-based strategies. Synthesis- and reinforcement-learning-based techniques must now evolve to keep pace with the research. However, before doing any experimentation, a scientist must first understand the current state of the art in that domain. Diverse paths, associated outcomes, and analysis lay the groundwork for successful experimentation and superior results. Before starting experiments, universal image-forensics approaches must be thoroughly researched. As a result, this review of variou
This research aims to clarify the role of Information Technology Competency (ITC), with the dimensions (IT Usage, IT Knowledge, and IT Operations), as an independent variable in activating Human Resources Management (HRM) practices as a dependent variable, with the dimensions (Training and Development, Recruitment, Job Design, and Performance Appraisal). On this basis, the correlation and effect relationships between the independent and dependent variables are determined by formulating two main hypotheses: there is a significant relationship and effect of IT competency on HRM practices within these dimensions. Furthermore, the scope and population of this research are the Informatics and Communications P
The effect of the initial pressure on the laminar flame speed of methane-air mixtures has been determined experimentally over a wide range of equivalence ratios. In this work, a measurement system was designed to measure the laminar flame speed using a constant-volume method with a thermocouple technique. The laminar burning velocity was measured using the density-ratio method. Comparison of the present results with previous ones shows good agreement, indicating that the measurements and calculations employed in this work are successful and precise.
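The density-ratio method mentioned above corrects the observed flame-front speed for the expansion of the burned gases: the unburned-gas burning velocity is the flame-front speed scaled by the burned-to-unburned density ratio, S_u = S_f * (rho_b / rho_u). The sketch below is a hedged illustration of that one relation, not the authors' measurement code, and the density and speed values are typical textbook numbers, not results from the paper.

```python
# Density-ratio method: the flame front moves faster than the unburned gas
# is consumed, because the hot burned gas expands behind it.
#   S_u = S_f * (rho_b / rho_u)
def laminar_burning_velocity(flame_front_speed, rho_unburned, rho_burned):
    # flame_front_speed in m/s, densities in kg/m^3 (any consistent units).
    return flame_front_speed * (rho_burned / rho_unburned)

# Illustrative values for a near-stoichiometric methane-air flame at 1 atm.
S_u = laminar_burning_velocity(2.8, rho_unburned=1.13, rho_burned=0.157)
```

With these illustrative inputs the result is close to the commonly quoted ~0.38-0.40 m/s laminar burning velocity of stoichiometric methane-air mixtures.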
The need to exchange large amounts of real-time data is constantly increasing in wireless communication. While traditional radio transceivers are not cost-effective and their components need to be integrated, software-defined radio (SDR) transceivers have opened up a new class of wireless technologies with high security. This study aims to design an SDR transceiver built with one type of modulation, 16-QAM, and to add a security subsystem using one type of chaos map, the logistic map, because it is a very simple nonlinear dynamical equation that generates a random key which is XORed with the originally transmitted data to protect it during transmission. At th
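The chaos-based security subsystem described above can be sketched as follows: iterating the logistic map x_{n+1} = r·x_n·(1 - x_n) produces a chaotic sequence that is quantized into a byte key stream and XORed with the data; applying the same XOR again recovers the plaintext. This is an illustrative reconstruction under assumed parameters (x0 = 0.7, r = 3.99), not the authors' exact implementation.

```python
# Logistic-map key stream XORed with the payload; the same function
# encrypts and decrypts because XOR is its own inverse.
def logistic_keystream(length, x0=0.7, r=3.99):
    # x0 and r act as the shared secret key; r near 4 keeps the map chaotic.
    key, x = [], x0
    for _ in range(length):
        x = r * x * (1 - x)
        key.append(int(x * 256) % 256)  # quantize chaotic state to one byte
    return bytes(key)

def xor_crypt(data: bytes, x0=0.7, r=3.99) -> bytes:
    key = logistic_keystream(len(data), x0, r)
    return bytes(d ^ k for d, k in zip(data, key))

cipher = xor_crypt(b"payload frame 01")
plain = xor_crypt(cipher)  # same parameters regenerate the same key stream
```

In the SDR chain this XOR stage would sit just before 16-QAM symbol mapping on the transmit side and just after demapping on the receive side.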
Software-defined networks (SDN) have proven superior in addressing ordinary network problems such as scalability, agility, and security. This advantage of SDN comes from the separation of the control plane from the data plane. Although many papers and studies focus on SDN management, monitoring, control, and QoS improvement, few of them focus on presenting what they use to generate traffic and measure network performance. The literature also lacks comparisons between the tools and
A skip list data structure is essentially a simulation of a binary search tree. Skip list algorithms are simpler, faster, and use less space. Conceptually, this data structure uses parallel sorted linked lists. Searching in a skip list is faster than searching in a regular sorted linked list. Because a skip list is a two-dimensional data structure, it is implemented using a two-dimensional network of nodes with forward pointers. The search, insert, and delete operations take expected O(log n) time. The skip list can be modified to implement the order-statistic operations RANK and SEARCH BY RANK while maintaining the same expected time. Keywords: skip list, parallel linked list, randomized algorithm, rank.
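The structure described above can be sketched in a few dozen lines: each node carries a randomly chosen tower of forward pointers, search starts at the top level and drops down whenever the next key would overshoot, and insertion splices the new node into every level of its tower. This is a generic textbook sketch, not the paper's implementation; the level cap and the 1/2 promotion probability are conventional choices.

```python
import random

class Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)  # one next-pointer per level

class SkipList:
    MAX_LEVEL = 16  # enough for lists of a few tens of thousands of keys

    def __init__(self):
        self.head = Node(None, self.MAX_LEVEL)  # sentinel, full-height tower
        self.level = 0

    def _random_level(self):
        # Flip coins: each level is kept with probability 1/2.
        lvl = 0
        while random.random() < 0.5 and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def insert(self, key):
        update = [self.head] * (self.MAX_LEVEL + 1)
        node = self.head
        for i in range(self.level, -1, -1):
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node  # last node before key at level i
        lvl = self._random_level()
        self.level = max(self.level, lvl)
        new = Node(key, lvl)
        for i in range(lvl + 1):  # splice into every level of the tower
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new

    def search(self, key):
        node = self.head
        for i in range(self.level, -1, -1):  # top level first, then drop down
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
        node = node.forward[0]
        return node is not None and node.key == key

sl = SkipList()
for k in [3, 1, 7, 5]:
    sl.insert(k)
```

Because each key is promoted with probability 1/2 per level, the expected tower height is constant and the top-down search visits O(log n) nodes in expectation, matching the bound stated in the abstract.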