A database is an organized, distributed collection of data that allows users to access stored information easily and conveniently. In the era of big data, however, traditional data-analytics methods may be unable to manage and process such large volumes of data. To develop an efficient way of handling big data, this work studies the use of the MapReduce technique on big data distributed in the cloud. The approach was evaluated on a Hadoop server, with EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The results provide EEG researchers and specialists with an easy and fast method of handling EEG big data.
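The abstract does not show the Hadoop implementation; as an illustrative sketch only, the map–shuffle–reduce pattern it relies on can be expressed in pure Python. The channel names and readings below are hypothetical stand-ins for EEG records, and the job (averaging readings per channel) is an assumed example, not the paper's actual workload:

```python
from collections import defaultdict

# Hypothetical (channel, reading) records standing in for EEG data.
records = [("Fp1", 4.0), ("Fp2", 3.0), ("Fp1", 5.0), ("Cz", 1.5), ("Cz", 0.5)]

def map_phase(records):
    # Map: emit (key, value) pairs, one per input record.
    for channel, reading in records:
        yield channel, reading

def shuffle(pairs):
    # Shuffle: group all values by key, as the framework does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate each key's values (here, the mean reading per channel).
    return {key: sum(vals) / len(vals) for key, vals in groups.items()}

averages = reduce_phase(shuffle(map_phase(records)))
print(averages)  # {'Fp1': 4.5, 'Fp2': 3.0, 'Cz': 1.0}
```

In a real Hadoop deployment the map and reduce functions run in parallel across cluster nodes, which is where the reported response-time reduction would come from; this single-process sketch only shows the data flow.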
Digital tampering detection, which identifies image modification, is a significant area of image-analysis research. Over the last five years the field has grown to achieve exceptional precision using machine-learning and deep-learning strategies. Synthesis- and reinforcement-learning techniques must now evolve to keep pace with the research. However, before undertaking any experimentation, a researcher must first understand the current state of the art in the domain. Diverse approaches, their associated outcomes, and analyses lay the groundwork for successful experimentation and superior results. Before starting experiments, general image-forensics approaches must be thoroughly researched. As a result, this review of various
The importance of the study stems from the fact that it deals with a very important subject, namely the pivotal role played by e-banking in achieving competitive advantage for banking institutions operating in Algeria. By adopting the elements of the electronic marketing mix and developing them as the banking environment requires, a banking institution will inevitably be able to distinguish itself from its competitors, as each of these elements plays an important role in achieving competitive advantage. In this study we relied on studies and research that bear directly on the problem of the study, and we have proposed steps to activate the contribution of e-banking in achieving competitive
The research aims to improve operational performance through the application of the Holonic Manufacturing System (HMS) in a rubber-products factory in Najaf. The problem diagnosed was the weakness of the factory's manufacturing system in meeting customers' demands on time within the available machine and labor resources, which led to processing and delivery delays, increased costs, and reduced flexibility. A case-study methodology was used to identify the actual state of the manufacturing system and operational performance in the factory. Simulation in Excel 2010, based on actual data, was used to represent the proposed HMS and to calculate the operational performance measures
The research examines the mechanism of applying ISO 21001:2018 in the Energy Branch of Electromechanical Engineering at the University of Technology to achieve quality educational service and to prepare the branch to obtain a certificate of conformity with the requirements of ISO 21001:2018. The necessary data were collected using a checklist based on ISO 21001:2018, field interviews, and the records of the department concerned. The researchers reached a number of results, the most prominent of which were the commitment of senior leadership to quality and their willingness to implement the standard's requirements, and that the university has a basic structure that qualifies it to implement the international standard, as
The Six Sigma methodology helps reduce defects by solving problems effectively, while Lean reduces losses by improving the flow of the manufacturing process. When these two methodologies are integrated, Lean Six Sigma gives organizations an entry point to process optimization, increasing quality and reducing lead times and costs by focusing on the needs of the customer. The process uses statistical tools and techniques to analyze and improve processes.
We conducted this research in the General Company for Electrical Industries, adopting its product (a three-tap machine cooling-water unit) as the research sample. In order to determine t
The study aims to examine the problem of forced displacement and its social and economic consequences in light of the Syrian crisis. Such an aim helps to identify the difficulties and challenges facing children of displaced families in education and the reasons for their lack of enrolment. It also examines whether there are statistically significant differences among the attitudes of the children of displaced families toward education with respect to the following variables: the work of the head of the family, the economic level of the family, and the work of the children. The study adopted the descriptive-analytical approach, with a questionnaire as the data-collection tool. The study was applied to a sample of
This paper addresses the nature of Spatial Data Infrastructure (SDI), considered one of the most important concepts for ensuring the effective functioning of a modern society. It comprises a continually developing set of methods and procedures providing the geospatial base that supports a country's governmental, environmental, economic, and social activities. In general, the SDI framework integrates various elements, including standards, policies, networks, data, end users, and application areas. The transformation of previously paper-based map data into digital formats, the emergence of GIS, and the Internet with its host of online applications (e.g., environmental impact analysis, navigation, applications of VGI data
A skip list data structure is essentially a probabilistic simulation of a balanced binary search tree. Skip-list algorithms are simpler and faster, and they use less space. Conceptually, the data structure consists of parallel sorted linked lists. Searching in a skip list is faster than searching in a regular sorted linked list: because a skip list is a two-dimensional data structure, it is implemented as a two-dimensional network of nodes, each with four pointers. The search, insert, and delete operations take expected O(log n) time. The skip list can be modified to implement the order-statistic operations RANK and SEARCH-BY-RANK while maintaining the same expected time. Keywords: skip list, parallel linked list, randomized algorithm, rank.
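The abstract's two-dimensional, four-pointer node layout is not reproduced here; as an illustrative sketch under simplified assumptions, a minimal skip list can instead give each node an array of forward pointers (one per level), with levels chosen by coin flips. The class and constant names below are invented for the example:

```python
import random

class Node:
    def __init__(self, key, level):
        self.key = key
        self.forward = [None] * (level + 1)  # one forward pointer per level

class SkipList:
    MAX_LEVEL = 8  # arbitrary cap for this sketch

    def __init__(self):
        self.head = Node(float("-inf"), self.MAX_LEVEL)
        self.level = 0

    def _random_level(self):
        # Coin flips: each additional level appears with probability 1/2,
        # which yields the expected O(log n) search path.
        lvl = 0
        while random.random() < 0.5 and lvl < self.MAX_LEVEL:
            lvl += 1
        return lvl

    def insert(self, key):
        # Record, at every level, the last node preceding the insertion point.
        update = [None] * (self.MAX_LEVEL + 1)
        node = self.head
        for i in range(self.level, -1, -1):
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
            update[i] = node
        lvl = self._random_level()
        if lvl > self.level:
            for i in range(self.level + 1, lvl + 1):
                update[i] = self.head
            self.level = lvl
        new = Node(key, lvl)
        for i in range(lvl + 1):
            new.forward[i] = update[i].forward[i]
            update[i].forward[i] = new

    def search(self, key):
        # Descend from the top level, moving right while keys are smaller.
        node = self.head
        for i in range(self.level, -1, -1):
            while node.forward[i] and node.forward[i].key < key:
                node = node.forward[i]
        node = node.forward[0]
        return node is not None and node.key == key

sl = SkipList()
for k in [3, 7, 9, 12, 19, 21]:
    sl.insert(k)
print(sl.search(9), sl.search(10))  # True False
```

The express-lane effect comes from the top levels skipping over many nodes at once before the search drops down, which is what distinguishes the skip list from a plain sorted linked list's one-node-at-a-time traversal.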