A database is an organized, distributed collection of data arranged so that users can access the stored information simply and conveniently. In the era of big data, however, traditional data-analytics methods may not be able to manage and process such large volumes of data. To develop an efficient way of handling big data, this work studies the use of the MapReduce technique to process big data distributed on the cloud. The approach was evaluated on a Hadoop server and applied to EEG big data as a case study. The proposed approach showed a clear improvement in managing and processing the EEG big data, with an average 50% reduction in response time. The results provide EEG researchers and specialists with an easy and fast method of handling EEG big data.
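As a hedged illustration of the map-reduce idea behind this approach, the sketch below computes a per-channel mean amplitude over EEG records in the Hadoop Streaming style; the input format, channel names, and choice of statistic are assumptions for illustration, not the paper's actual pipeline (on a real Hadoop cluster the map and reduce phases would run as separate scripts over distributed splits).

```python
# Minimal map-reduce sketch: per-channel mean amplitude of EEG samples.
# Assumes input lines of the form "<channel>,<amplitude>"; these names and
# the statistic are illustrative, not the paper's actual pipeline.
from collections import defaultdict

def mapper(lines):
    """Emit (channel, amplitude) pairs -- the 'map' phase."""
    for line in lines:
        channel, value = line.strip().split(",")
        yield channel, float(value)

def reducer(pairs):
    """Aggregate values per channel -- the 'reduce' phase."""
    sums, counts = defaultdict(float), defaultdict(int)
    for channel, value in pairs:
        sums[channel] += value
        counts[channel] += 1
    return {ch: sums[ch] / counts[ch] for ch in sums}

if __name__ == "__main__":
    records = ["Fp1,12.5", "Fp1,13.1", "O2,8.4", "O2,9.0"]
    print(reducer(mapper(records)))   # {'Fp1': 12.8, 'O2': 8.7}
```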
The combination of wavelet theory and neural networks has led to the development of wavelet networks: feed-forward neural networks that use wavelets as activation functions. Wavelet networks have been applied to classification and identification problems with some success.
In this work we propose a fuzzy wavenet network (FWN), trained with the standard back-propagation algorithm, to classify medical images. First, the library of medical images was analyzed. Second, two experimental rule tables provide an excellent opportunity to test the ability of the fuzzy wavenet network, given the high level of information variability often experienced with this type of image.
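As a hedged sketch of how a wavelet network's forward pass can look, the snippet below uses the Mexican-hat mother wavelet as the hidden-unit activation with per-unit translation and dilation parameters; the layer sizes and the specific mother wavelet are assumptions, since the abstract does not specify them.

```python
import numpy as np

def mexican_hat(x):
    """Mexican-hat (Ricker) mother wavelet, a common wavenet activation."""
    return (1.0 - x**2) * np.exp(-0.5 * x**2)

def wavenet_forward(x, W, t, d, v):
    """One hidden layer of a wavelet network.
    x: input features, W: input weights, t/d: per-unit translation/dilation
    (both learnable in training), v: output weights. Shapes are illustrative."""
    z = (W @ x - t) / d            # affine projection, then shift and scale
    h = mexican_hat(z)             # wavelet activation per hidden unit
    return v @ h                   # linear output layer

rng = np.random.default_rng(0)
x = rng.normal(size=8)                     # e.g. 8 image features
W = rng.normal(size=(16, 8))               # 16 wavelet units
t, d = rng.normal(size=16), np.ones(16)
v = rng.normal(size=(3, 16))               # 3 output classes
print(wavenet_forward(x, W, t, d, v))      # raw class scores
```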
The study aims to use the European Excellence Model (EFQM) to assess the institutional performance of the National Center for Administrative Development and Information Technology, determining the gap between the Center's actual performance and the standards adopted in the model. The goal is to establish the extent to which the Center pursues excellence in performance, improves the level of services provided, and adopts modern and contemporary management methods in evaluating its institutional performance.
The problem of the study was the absence of an institutional performance evaluation system at the Center through which weaknesses (areas for improvement) and strengths could be identified.
Variable selection is an essential and necessary task in statistical modeling. Several studies have tried to develop and standardize the process of variable selection, but it is difficult to do so. The first question researchers need to ask themselves is which variables are most significant for describing a given dataset's response. In this paper, a new method for variable selection using Gibbs sampler techniques has been developed. First, the model is defined and the posterior distributions for all the parameters are derived. The new variable selection method is then tested using four simulation datasets and compared with existing techniques such as Ordinary Least Squares (OLS) and the Least Absolute Shrinkage and Selection Operator (LASSO).
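As a hedged sketch of Gibbs-sampling-based variable selection, the snippet below implements a standard spike-and-slab scheme in the spirit of SSVS (George and McCulloch); the priors, the fixed noise variance, and all hyper-parameters are illustrative assumptions, not the posteriors derived in the paper.

```python
import numpy as np

def ssvs_gibbs(X, y, n_iter=2000, sigma2=1.0, tau_spike=0.01, tau_slab=10.0, pi=0.5):
    """Spike-and-slab Gibbs sampler; noise variance fixed for brevity."""
    n, p = X.shape
    gamma = np.ones(p, dtype=int)           # inclusion indicators
    keep = np.zeros(p)
    rng = np.random.default_rng(0)
    XtX, Xty = X.T @ X, X.T @ y
    for it in range(n_iter):
        # 1) beta | gamma, y  ~  N(A^{-1} Xty / sigma2, A^{-1})
        prior_var = np.where(gamma == 1, tau_slab, tau_spike)
        A = XtX / sigma2 + np.diag(1.0 / prior_var)
        cov = np.linalg.inv(A)
        beta = rng.multivariate_normal(cov @ Xty / sigma2, cov)
        # 2) gamma_j | beta_j  ~  Bernoulli from the spike/slab density ratio
        slab = pi * np.exp(-0.5 * beta**2 / tau_slab) / np.sqrt(tau_slab)
        spike = (1 - pi) * np.exp(-0.5 * beta**2 / tau_spike) / np.sqrt(tau_spike)
        gamma = (rng.random(p) < slab / (slab + spike)).astype(int)
        if it >= n_iter // 2:                # discard burn-in
            keep += gamma
    return keep / (n_iter - n_iter // 2)     # posterior inclusion probabilities

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
y = X @ np.array([2.0, 0.0, -1.5, 0.0, 0.0]) + rng.normal(size=100)
print(ssvs_gibbs(X, y))   # high probabilities expected for coefficients 1 and 3
```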
New algorithms were proposed to enhance the quality of auto-focus image fusion. The first algorithm combines two images based on the standard deviation. The second algorithm uses the contrast at edge points and a correlation method as the quality criteria for the resulting image; it considers three blocks of different sizes in a homogeneous region, moving each block 10 pixels within that region, examining the blocks' statistical properties, and deciding the next step automatically. The resulting fused image has better contrast.
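A hedged sketch of the standard-deviation rule from the first algorithm: for each block, the fused image takes the block from whichever source image has the higher local standard deviation, i.e., is locally sharper. The fixed 16-pixel block size is an assumption.

```python
import numpy as np

def fuse_by_std(img_a, img_b, block=16):
    """Auto-focus fusion: per block, copy the source with the higher std
    (higher local contrast ~ better focus). Block size is illustrative."""
    fused = np.empty_like(img_a)
    h, w = img_a.shape
    for i in range(0, h, block):
        for j in range(0, w, block):
            a = img_a[i:i+block, j:j+block]
            b = img_b[i:i+block, j:j+block]
            fused[i:i+block, j:j+block] = a if a.std() >= b.std() else b
    return fused

a, b = np.random.rand(64, 64), np.random.rand(64, 64)
print(fuse_by_std(a, b).shape)   # (64, 64)
```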
In recent years, the performance of Spatial Data Infrastructures for governments and companies has gained ample attention. Different categories of geospatial data, such as digital maps, coordinates, web maps, and aerial and satellite images, are required to realize the geospatial data components of Spatial Data Infrastructures. In general, two distinct types of geospatial data sources exist on the Internet: formal and informal. Despite the growth of informal geospatial data sources, integration between the different free sources has not been achieved effectively; addressing this task is the main contribution of this research. This article addresses the research question of how these different free geospatial data sources can be integrated.
In recent years, data centre (DC) networks have improved their rapid data-exchange abilities. Software-defined networking (SDN) changes the design of conventional networks by separating the control plane from the data plane, and it overcomes the limitations of traditional DC networks caused by the rapidly increasing numbers of applications, websites, and data-storage needs. Software-defined-networking data centres (SDN-DC), based on the OpenFlow (OF) protocol, are used to achieve superior performance when executing traffic load-balancing (LB) jobs. The LB function divides the traffic-flow demands between the end devices to avoid link congestion. In short, SDN is proposed to enable more effective network configuration.
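As a hedged sketch of the load-balancing decision an OF controller might make, the snippet below picks, among candidate paths, the one whose busiest link is least loaded and then accounts for the new flow's demand; the topology and data structures are illustrative assumptions, not the paper's controller.

```python
def pick_path(paths, link_load, demand):
    """Least-congested path selection: choose the candidate path whose
    busiest link is least loaded, then reserve the new flow's demand.
    `paths` is a list of link-lists; `link_load` maps link -> current load."""
    best = min(paths, key=lambda p: max(link_load[l] for l in p))
    for link in best:
        link_load[link] += demand       # account for the newly placed flow
    return best

link_load = {"s1-s2": 0.6, "s2-s4": 0.2, "s1-s3": 0.3, "s3-s4": 0.3}
paths = [["s1-s2", "s2-s4"], ["s1-s3", "s3-s4"]]
print(pick_path(paths, link_load, demand=0.1))   # ['s1-s3', 's3-s4']
```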
Online services in cloud computing operate on a pay-per-use basis, so service users need not enter long-term contracts with cloud service providers. Service level agreements (SLAs) are agreements signed between a cloud service provider and another party, such as a service user, an intermediary broker, or a monitoring agent. Since cloud computing is an evolving technology serving many critical business applications, adaptable systems for managing online agreements are significant. An SLA maintains the quality of service (QoS) delivered to the cloud user; if the service provider fails to maintain the required service, the SLA is considered violated. The main aim is to minimize SLA violations in order to maintain the QoS for cloud users.
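A hedged sketch of how an SLA violation might be detected from monitored response times; the response-time metric and the 200 ms threshold are illustrative assumptions, not the SLA terms considered in this research.

```python
def sla_violations(response_times_ms, sla_threshold_ms=200.0):
    """Count monitored requests whose response time exceeds the agreed
    SLA threshold; returns (violation count, violation rate)."""
    violations = [t for t in response_times_ms if t > sla_threshold_ms]
    return len(violations), len(violations) / len(response_times_ms)

count, rate = sla_violations([120.0, 250.0, 180.0, 310.0])
print(count, rate)   # 2 violations, 0.5 violation rate
```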
Penalized regression models have received considerable attention for variable selection and play an essential role in dealing with high-dimensional data. The arctangent (Atan) penalty has recently been used as an efficient method for both estimation and variable selection. However, the Atan penalty is very sensitive to outliers in the response variable and to heavy-tailed error distributions, whereas least absolute deviation (LAD) is a good method for achieving robustness in regression estimation. The specific objective of this research is to propose a robust Atan estimator that combines these two ideas. Simulation experiments and real-data applications show that the proposed estimator performs well.
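As a hedged sketch of the combination the abstract describes, the snippet below minimizes an LAD loss plus an Atan-type penalty, lam * (gamma + 2/pi) * arctan(|beta|/gamma), numerically; this penalty form, the hyper-parameters, and the derivative-free optimizer are assumptions for illustration, not the paper's estimator.

```python
import numpy as np
from scipy.optimize import minimize

def robust_atan_fit(X, y, lam=0.5, gamma=0.05):
    """LAD loss + Atan penalty: robust to outliers (L1 loss) while shrinking
    small coefficients toward zero. Hyper-parameters are illustrative."""
    def objective(beta):
        lad = np.abs(y - X @ beta).sum()                         # robust fit term
        atan = lam * (gamma + 2/np.pi) * np.arctan(np.abs(beta)/gamma).sum()
        return lad + atan
    beta0 = np.linalg.lstsq(X, y, rcond=None)[0]                 # OLS warm start
    return minimize(objective, beta0, method="Nelder-Mead").x    # derivative-free

rng = np.random.default_rng(2)
X = rng.normal(size=(80, 3))
y = X @ np.array([1.5, 0.0, -2.0]) + rng.standard_t(df=2, size=80)  # heavy tails
print(robust_atan_fit(X, y).round(2))
```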
Lattakia city faces many problems related to the mismanagement of solid waste, as disposal is limited to the uncontrolled Al-Bassa landfill without treatment. Solid waste management therefore poses a particular challenge to decision-makers: choosing an appropriate tool to support strategic decisions on municipal solid-waste treatment methods and on evaluating their management systems. Since humans are primarily responsible for generating waste, this study aims to measure the degree of environmental awareness in the Lattakia Governorate from the point of view of the research sample and to discuss the effect of the studied variables (place of residence, educational level, gender, age, and professional status) on that awareness.
The research aims to form a clear theoretical philosophy of, and perceptions about, strategic entrepreneurship through the relationship between high-involvement management practices, which form the basis for creating that entrepreneurship, and high-performance work systems as a supporting tool for achieving it, following the propositions of Hitt et al. (2011). It attempts to generalize this theoretical philosophy and show how it can be applied within the Iraqi environment. On this basis, the research problem was formulated to bridge the knowledge gap between the earlier propositions and the possibility of applying them, aiming to identify high-involvement management practices and the feasibility of high-performance work systems and their application.